AM Seminar: Continuous-time Dynamics for Accelerated Optimization

Speaker: Walid Krichene (Google Research)
Time: Monday, January 28, 2019, 4:00–5:00pm
Location: BE 372
Organizer: Abhishek Halder

Abstract

Optimization algorithms can be viewed as a discretization of a continuous-time process. This point of view can simplify analysis, lead to connections with physics, and streamline the design of new algorithms.
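As a minimal illustration of this viewpoint (not taken from the talk): plain gradient descent is exactly the forward-Euler discretization of the gradient flow ODE x'(t) = −∇f(x(t)). The quadratic objective and step size below are illustrative choices.

```python
import numpy as np

# Illustrative example: gradient flow x'(t) = -grad f(x(t)) on the
# convex quadratic f(x) = 0.5 * x^T A x, discretized by forward Euler.
A = np.diag([1.0, 10.0])      # positive-definite => f is strongly convex
grad = lambda x: A @ x

x = np.array([1.0, 1.0])
h = 0.05                      # Euler step size = gradient descent step size
for _ in range(500):
    x = x - h * grad(x)       # one Euler step == one gradient descent update

print(np.linalg.norm(x))      # iterates approach the minimizer at 0
```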
We will review results from continuous-time optimization, with a focus on accelerated methods. We will give a simple interpretation of acceleration, and show how it motivates heuristics (restarting, adaptive averaging) that improve convergence in practice. We will then focus on the stochastic case and discuss acceleration in the presence of noise. We conclude with a brief review of how the same tools apply to related problems, including online learning, sampling, and non-convex optimization.
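A hedged sketch of one of the heuristics mentioned above: Nesterov's accelerated gradient with a function-value restart, which resets the accumulated momentum whenever the objective increases. The quadratic problem and parameters are illustrative assumptions, not details from the talk.

```python
import numpy as np

# Illustrative sketch: accelerated gradient with function-value restart.
A = np.diag([1.0, 100.0])     # ill-conditioned quadratic
f = lambda x: 0.5 * x @ A @ x
grad = lambda x: A @ x

h = 1.0 / 100.0               # step size ~ 1/L, L = largest eigenvalue of A
x = y = np.array([1.0, 1.0])
k = 0
for _ in range(300):
    x_new = y - h * grad(y)   # gradient step from the extrapolated point
    if f(x_new) > f(x):       # objective went up: restart the momentum
        k = 0
        y = x
        x_new = y - h * grad(y)
    k += 1
    y = x_new + (k - 1) / (k + 2) * (x_new - x)   # momentum / extrapolation
    x = x_new

print(f(x))                   # suboptimality after 300 iterations
```

Restarting discards momentum that has begun to point in a bad direction, which in the continuous-time picture corresponds to dissipating the "kinetic energy" of the dynamics.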

Bio

Walid Krichene is at Google Research, where he works on large-scale optimization and recommendation. He received his Ph.D. in EECS in 2016 from UC Berkeley, where he was advised by Alex Bayen and Peter Bartlett, an M.A. in Mathematics from UC Berkeley, and an M.S. in Engineering and Applied Math from the École des Mines ParisTech. He received the Leon Chua Award and Outstanding Instructor awards from UC Berkeley. His research interests include convex optimization, stochastic approximation, online learning, and representation learning.