Biography
I received my bachelor's degree in Mathematics from the National and Kapodistrian University of Athens, Greece, and my master's degree in Optimization and Game Theory from Paris 6 (UPMC) in Paris, France.
I then received my Ph.D. on adaptive first-order methods for structured convex optimization problems from the University of Grenoble-Alpes, under the supervision of Prof. P. Mertikopoulos and Prof. E.V. Belmega.
Currently, I am a postdoctoral researcher at the Laboratory for Information and Inference Systems (LIONS), led by Prof. Volkan Cevher.
My interests lie at the intersection of Game Theory, Convex Optimization and Variational Inequalities.
Publications with LIONS
Universal Gradient Methods for Stochastic Convex Optimization
2024. 41st International Conference on Machine Learning (ICML 2024), Vienna, Austria, July 21-27, 2024.

Improving SAM Requires Rethinking its Optimization Formulation
2024. 41st International Conference on Machine Learning (ICML 2024), Vienna, Austria, July 21-27, 2024.

On the Generalization of Stochastic Gradient Descent with Momentum
Journal of Machine Learning Research. 2024. Vol. 25, p. 1-56.

Distributed Extra-Gradient With Optimal Complexity And Communication Guarantees
2023. 11th International Conference on Learning Representations (ICLR), Kigali, Rwanda, May 1-5, 2023.

ADAGRAD Avoids Saddle Points
2022. 39th International Conference on Machine Learning (ICML), Baltimore, MD, USA, July 17-23, 2022. p. 731-771.

Extra Newton: A First Approach to Noise-Adaptive Accelerated Second-Order Methods
2022. 36th Conference on Neural Information Processing Systems (NeurIPS), New Orleans, Louisiana, USA, November 28-December 9, 2022.

UNDERGRAD: A Universal Black-Box Optimization Method with Almost Dimension-Free Convergence Rate Guarantees
2022. 39th International Conference on Machine Learning (ICML), Baltimore, MD, USA, July 17-23, 2022.

Adaptive Stochastic Variance Reduction for Non-convex Finite-Sum Minimization
2022. 36th Conference on Neural Information Processing Systems (NeurIPS), New Orleans, Louisiana, USA, November 28-December 9, 2022.

No-regret learning in games with noisy feedback: Faster rates and adaptivity via learning rate separation
2022. 36th Conference on Neural Information Processing Systems (NeurIPS 2022), New Orleans, Louisiana, USA, November 28-December 9, 2022.

Sifting through the Noise: Universal First-Order Methods for Stochastic Variational Inequalities
2021. NeurIPS 2021: Thirty-fifth Conference on Neural Information Processing Systems, Sydney, Australia [virtual only], December 6-14, 2021.

Publications prior to EPFL
K. Antonakopoulos and P. Mertikopoulos, "Adaptive First-Order Methods Revisited: Convex Minimization without Lipschitz Requirements," in NeurIPS 2021: Proceedings of the 35th International Conference on Neural Information Processing Systems, 2021.
D.-Q. Vu, K. Antonakopoulos and P. Mertikopoulos, "Fast Routing in an Uncertain World: Adaptive Learning in Congestion Games via Exponential Weights," in NeurIPS 2021: Proceedings of the 35th International Conference on Neural Information Processing Systems, 2021.
Y.-G. Hsieh, K. Antonakopoulos and P. Mertikopoulos, "Adaptive Learning in Continuous Games: Optimal Regret Bounds and Convergence to Equilibrium," in COLT 2021: Proceedings of the 34th Annual Conference on Learning Theory, 2021.
K. Antonakopoulos, E.V. Belmega and P. Mertikopoulos, "Adaptive Extra-Gradient Methods for Min-Max Optimization and Games," in ICLR 2021: Proceedings of the 9th International Conference on Learning Representations, 2021.
K. Antonakopoulos, E.V. Belmega and P. Mertikopoulos, "Online and Stochastic Optimization Beyond Lipschitz Continuity: A Riemannian Approach," in ICLR 2020: Proceedings of the 8th International Conference on Learning Representations, 2020.
K. Antonakopoulos, E.V. Belmega and P. Mertikopoulos, "An Adaptive Mirror-Prox Method for Variational Inequalities with Singular Operators," in NeurIPS 2019: Proceedings of the 33rd International Conference on Neural Information Processing Systems, 2019.