Organizer | Jean Barbier |
Office | http://plan.epfl.ch/?room=INR139 |
Phone | +41 21 6938111 |
Email | [email protected] |
Room | INR-113 |
Time | 3:00pm |
Special Announcements
Objectives
After a quick introduction to familiarise ourselves with some of the basic notions of deep neural nets, we will read and discuss a sequence of fairly recent papers. Our emphasis will be on papers that explain why deep neural networks work, rather than on papers that apply neural nets to various problems. If you have suggestions for what to read, please email Jean. Also, if you are willing to present one of the papers, we are always looking for volunteers.
Detailed Schedule
Date | Topics Covered | Presenter |
---|---|---|
April 7 | Introduction | Marco Mondelli (Notes) |
April 13 | Introduction (bis) | Marco Mondelli (Notes) |
April 20 | Representation Power of Neural Nets | Olivier Leveque (Notes) |
April 27 | Representation Power of Neural Nets (bis) | Olivier Leveque |
May 4 | Ascension | |
May 11 | Iterative Optimization of Neural Nets (ter) | Olivier Leveque |
May 18 | Break | |
May 25 | Train Faster, Generalize Better | Ruediger Urbanke |
June 2 | Train Faster, Generalize Better (bis) | Ruediger Urbanke |
Resources
Michael Nielsen's online tutorial on neural nets
“Approximation by Superpositions of a Sigmoidal Function” by G. Cybenko
“Universal Approximation Bounds for Superpositions of a Sigmoidal Function” by A. Barron
“Provable bounds for learning some deep representations” by S. Arora, A. Bhaskara, R. Ge, T. Ma
“Stability and Generalization” by O. Bousquet and A. Elisseeff