Publications

Infinite-width limit of deep linear neural networks

L. Chizat; M. Colombo; X. Fernandez-Real; A. Figalli 

Communications on Pure and Applied Mathematics. 2024-05-06. DOI: 10.1002/cpa.22200.

Deep Learning Theory Through the Lens of Diagonal Linear Networks

S. W. Pesme / N. H. B. Flammarion (Dir.)  

Lausanne, EPFL, 2024. 

Displacement smoothness of entropic optimal transport

G. Carlier; L. Chizat; M. Laborde 

ESAIM: Control, Optimisation and Calculus of Variations. 2024-04-09. Vol. 30, p. 25. DOI: 10.1051/cocv/2024013.

On the Effect of Initialization: The Scaling Path of 2-Layer Neural Networks

S. J. Neumayer; L. Chizat; M. Unser 

Journal of Machine Learning Research. 2024-01-01. Vol. 25, p. 15.

Random matrix methods for high-dimensional machine learning models

A. P. M. Bodin / N. Macris (Dir.)  

Lausanne, EPFL, 2024. 

Fundamental Limits in Statistical Learning Problems: Block Models and Neural Networks

E. Cornacchia / E. Abbé (Dir.)  

Lausanne, EPFL, 2023. 

On the symmetries in the dynamics of wide two-layer neural networks

K. Hajjar; L. Chizat 

Electronic Research Archive. 2023-01-01. Vol. 31, num. 4, p. 2175-2212. DOI: 10.3934/era.2023112.

Filtered data and eigenfunction estimators for statistical inference of multiscale and interacting diffusion processes

A. Zanoni / F. Nobile; G. A. Pavliotis (Dir.)  

Lausanne, EPFL, 2022. 

Loss landscape and symmetries in Neural Networks

M. Geiger / M. Wyart (Dir.)  

Lausanne, EPFL, 2021.