Second-order Optimization for Machine Learning, Made Practical

Speaker:
Tomer Koren - COLLOQUIUM LECTURE
Date:
Tuesday, 5.5.2020, 14:30
Place:
Room 337 Taub Bld.
Affiliation:
School of Computer Science at Tel Aviv University
Host:
Yuval Filmus

Optimization in machine learning, both theoretical and applied, is presently dominated by first-order gradient methods such as stochastic gradient descent. Second-order optimization methods, which involve second-order derivatives and/or second-order statistics of the data, are far less prevalent despite their strong theoretical properties, due to impractical computation, memory, and communication costs. I will present some recent theoretical, algorithmic, and infrastructural advances that overcome these challenges, making second-order methods usable at very large scale and on highly parallel computing architectures, with significant performance gains in practice.

Short Bio:
===========
Tomer Koren has been an Assistant Professor in the School of Computer Science at Tel Aviv University since Fall 2019. Previously, he was a Senior Research Scientist at Google Brain, Mountain View. He received his PhD in December 2016 from the Technion - Israel Institute of Technology, where his advisor was Prof. Elad Hazan. His research interests are in machine learning and optimization.
=====================================
Refreshments will be served from 14:15
Lecture starts at 14:30
