Research Interests
My interests span NLP, understanding language, and machine learning.
My current work focuses on open pretraining science: model merging, BabyLM, collaborative pretraining, efficient scaling laws, and (efficient) evaluation.
I am also actively interested in other directions, such as understanding learning, (mechanistic and other) interpretability, and generalization during training.
More broadly, I am interested in cognition, computational creativity, and fields related to strong AI.
Previously:
I was part of the Project Debater group, studying argumentation mining and low-resource text classification.
My Ph.D. (HUJI) focused on using linguistic knowledge for the evaluation and training of text generation models.
