The tremendous success of machine learning in recent years is largely due to the impressive performance of stochastic optimisation algorithms such as SGD.
Optimal stochastic rates
We study stochastic variants of classical optimisation algorithms, working to understand and improve them, and we highlight their contrasting behaviour for over-parameterised models.
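As an illustration, here is a minimal sketch (not taken from the papers below) of SGD on a synthetic least-squares problem, returning both the last iterate and the Polyak-Ruppert averaged iterate so their behaviour can be compared; the problem setup and parameter values are purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic least-squares problem (illustrative setup).
n, d = 1000, 20
X = rng.standard_normal((n, d))
w_star = rng.standard_normal(d)
y = X @ w_star + 0.1 * rng.standard_normal(n)

def sgd_least_squares(X, y, step=0.01, n_epochs=5):
    """SGD on 0.5 * (x_i^T w - y_i)^2; returns the last and the averaged iterate."""
    n, d = X.shape
    w = np.zeros(d)
    w_avg = np.zeros(d)
    t = 0
    for _ in range(n_epochs):
        for i in rng.permutation(n):
            grad = (X[i] @ w - y[i]) * X[i]   # stochastic gradient of the squared loss
            w -= step * grad
            t += 1
            w_avg += (w - w_avg) / t          # running Polyak-Ruppert average
    return w, w_avg

w_last, w_avg = sgd_least_squares(X, y)
print("last iterate error:    ", np.linalg.norm(w_last - w_star))
print("averaged iterate error:", np.linalg.norm(w_avg - w_star))
```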
A. Varre, L. Pillaud-Vivien, N. Flammarion, Last iterate convergence of SGD for Least-Squares in the Interpolation regime, NeurIPS 2021
A. Dieuleveut, N. Flammarion, F. Bach, Harder, Better, Faster, Stronger Convergence Rates for Least-Squares Regression, JMLR 2017
N. Flammarion, F. Bach, From Averaging to Acceleration, There is Only a Step-size, COLT 2015
Adaptive step-sizes
Even in the classical case of convex optimisation, where convergence rates have been studied extensively over the last 30 years and where theory recommends using the averaged iterate and provides optimal choices of learning rates, practitioners still face major challenges: (a) averaging leads to a slower decay during the early iterations, and (b) learning rates may not adapt to the difficulty of the problem or may not be robust to misspecification of their constants. Consequently, the state-of-the-art approach in practice remains to use the final iterate with a decreasing step size a/(b + t^α), with the constants a, b, α obtained by tiresome hand-tuning. Overall, there is a desperate need for adaptive algorithms.
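A minimal sketch of this hand-tuned schedule, applied here to a toy one-dimensional quadratic with noisy gradients; the constants a, b, α and the toy objective are placeholders, not values recommended by the work below.

```python
import numpy as np

def step_size(t, a=1.0, b=1.0, alpha=0.75):
    """Hand-tuned decreasing schedule a / (b + t**alpha)."""
    return a / (b + t ** alpha)

# Toy usage: SGD on f(w) = 0.5 * w**2 with additive gradient noise.
rng = np.random.default_rng(0)
w = 5.0
for t in range(1, 10_001):
    noisy_grad = w + rng.standard_normal()   # stochastic gradient of f
    w -= step_size(t) * noisy_grad
print("final iterate:", w)
```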
S. Pesme, A. Dieuleveut, N. Flammarion, On Convergence-Diagnostic based Step Sizes for Stochastic Gradient Descent, ICML 2020
Optimisation on Riemannian manifolds
Optimisation problems defined on Riemannian manifolds encompass fundamental (Euclidean) non-convex problems such as principal component analysis (PCA), dictionary learning, and low-rank matrix completion. Naive approaches to handling non-trivial geometries are suboptimal even in simple cases, whereas the geometric perspective provides a powerful lens through which to view a variety of important problems.
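For a concrete flavour of optimisation on a manifold, here is a minimal sketch (not the method of the papers below) of Riemannian gradient descent on the unit sphere for the leading-eigenvector problem behind PCA: the Euclidean gradient is projected onto the tangent space and the iterate is retracted back onto the sphere by normalisation. The matrix, step size, and iteration count are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative problem: maximise x^T A x over the unit sphere (leading eigenvector of A).
d = 10
M = rng.standard_normal((d, d))
A = M @ M.T

def riemannian_gradient_descent(A, step=0.1, n_iters=500):
    x = rng.standard_normal(A.shape[0])
    x /= np.linalg.norm(x)
    for _ in range(n_iters):
        euclid_grad = -2 * A @ x                          # gradient of -x^T A x
        riem_grad = euclid_grad - (x @ euclid_grad) * x   # project onto the tangent space at x
        x = x - step * riem_grad
        x /= np.linalg.norm(x)                            # retract back onto the sphere
    return x

x_hat = riemannian_gradient_descent(A)
eigvals, eigvecs = np.linalg.eigh(A)
print("alignment with top eigenvector:", abs(x_hat @ eigvecs[:, -1]))
```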
Y. Sun, N. Flammarion, M. Fazel, Escaping from saddle points on Riemannian manifolds, NeurIPS 2019
N. Tripuraneni, N. Flammarion, F. Bach, M. I. Jordan, Averaging stochastic gradient descent on Riemannian manifolds, COLT 2018