amitattia at mail dot tau dot ac dot il
I'm a PhD student at the Department of Computer Science at Tel Aviv University, advised by Prof. Tomer Koren. I completed my MSc under Tomer's supervision, and before that I obtained a BSc in Computer Science with a minor in Physics from the Hebrew University of Jerusalem.
My research focuses on optimization for machine learning, in particular the convergence and generalization of first-order methods.
For a list of publications, see below or my Google Scholar profile.
Optimal Rates in Continual Linear Regression via Increasing Regularization
Ran Levinstein, Amit Attia, Matan Schliserman, Uri Sherman, Tomer Koren, Daniel Soudry, Itay Evron
[arXiv]
Benefits of Learning Rate Annealing for Tuning-Robustness in Stochastic Optimization
Amit Attia, Tomer Koren
[arXiv]
A General Reduction for High-Probability Analysis with General Light-Tailed Distributions
Amit Attia, Tomer Koren
[arXiv]
Faster Stochastic Optimization with Arbitrary Delays via Asynchronous Mini-Batching
Amit Attia, Ofir Gaash, Tomer Koren
ICML 2025 (to appear)
[arXiv]
How Free is Parameter-Free Stochastic Optimization?
Amit Attia, Tomer Koren
ICML 2024 (Spotlight)
[arXiv]
SGD with AdaGrad Stepsizes: Full Adaptivity with High Probability to Unknown Parameters, Unbounded Gradients and Affine Variance
Amit Attia, Tomer Koren
ICML 2023
[arXiv]
Uniform Stability for First-Order Empirical Risk Minimization
Amit Attia, Tomer Koren
COLT 2022
[arXiv]
Algorithmic Instabilities of Accelerated Gradient Descent
Amit Attia, Tomer Koren
NeurIPS 2021
[arXiv]