Dr. Nadav Hallak

Research Interests:

  • Continuous optimization: theory and algorithms
  • Nonconvex optimization
  • First-order methods
  • Machine learning

Biography

Nadav is currently an Assistant Professor at IEM, the Technion. He was a Postdoctoral Fellow at LIONS under the guidance of Prof. Volkan Cevher, studying methods and theory related to machine learning problems. Previously, he held a postdoctoral position at Tel Aviv University, working under the guidance of Prof. Marc Teboulle on various methods for nonconvex models. Nadav completed his PhD on block-type methods for nonconvex constrained problems, and his M.Sc. on the minimization of smooth functions over sparse symmetric sets, under the supervision of Prof. Amir Beck at the Technion. He holds a B.Sc. in Industrial Engineering and Management and a B.A. in Economics, both from the Technion. His research interests include first-order methods, randomized methods, min-max problems, online learning, and nonconvex problems.

 

Publications (most recent)

 

Regret Minimization in Stochastic Non-Convex Learning via a Proximal-Gradient Approach

N. Hallak; P. Mertikopoulos; V. Cevher 

2021. International Conference on Machine Learning (ICML), Virtual, July 18-24, 2021.

Finding Second-Order Stationary Points in Constrained Minimization: A Feasible Direction Approach

N. Hallak; M. Teboulle 

Journal of Optimization Theory and Applications. 2020. Vol. 186, p. 480–503. DOI: 10.1007/s10957-020-01713-x.

On the Almost Sure Convergence of Stochastic Gradient Descent in Non-Convex Problems

P. Mertikopoulos; N. Hallak; A. Kavis; V. Cevher 

2020. 34th Conference on Neural Information Processing Systems (NeurIPS 2020), Virtual, December 6-12, 2020.

Efficient Proximal Mapping of the 1-path-norm of Shallow Networks

F. Latorre; P. T. Y. Rolland; N. Hallak; V. Cevher 

2020. 37th International Conference on Machine Learning (ICML), Virtual, July 13-18, 2020.