About Me
I'm a Member of Technical Staff at Liquid AI, where I work on pretraining and quantization. I earned my master's and bachelor's degrees in Computer Science from Stanford University.
My technical interests span foundation model development and the application of AI to domain-specific agents and systems. Over the past few years, I have explored problems across reasoning, pretraining, ML systems, human alignment, diffusion, and scalable engineering.
Reach me at [firstname][lastname]@gmail.com.
Work
LFM2 Technical Report
Technical report for Liquid Foundation Models 2.
arXiv preprint, 2025
33 authors in alphabetical order
Simple, Scalable Reasoning via Iterated Summarization
A method for scaling language model reasoning over long contexts.
ICML 2025 Workshop on Long Context Foundation Models
ICML 2025 Workshop on AI for Math
* Equal contribution