Aditya Tadimeti
About me
I’m a master’s student in Computer Science at Stanford University, where I also earned my bachelor’s. My technical interests span two axes: advancing core foundation model technology, and reinventing traditional domains with agentic systems.
Currently, I’m a Machine Learning Research Intern at Liquid AI, where I work on the efficient training of foundation models.
Previously:
- At Stanford, I worked with Prof. Noah Goodman on distilling reasoning traces (ICML 2025 Workshops), and with Prof. Jure Leskovec on large-scale foundation model pretraining.
- At Adobe Firefly, I trained reward models for diffusion alignment.
- At Cohere, I worked on language model reasoning.
- At Amazon (Supply Chain) and Oracle (OCI), I built high-impact engineering tools.
Feel free to reach out: [firstname][lastname]@gmail.com.
Publications
Simple, Scalable Reasoning via Iterated Summarization
ICML 2025 Workshop on Long Context Foundation Models
ICML 2025 Workshop on AI for Math
* Equal contribution