Michael Jordan: "Optimization and Dynamical Systems: Variational, Hamiltonian, and Symplectic Perspectives"
High Dimensional Hamilton-Jacobi PDEs 2020
Workshop II: PDE and Inverse Problem Methods in Machine Learning
"Optimization and Dynamical Systems: Variational, Hamiltonian, and Symplectic Perspectives"
Michael Jordan - University of California, Berkeley (UC Berkeley)
Abstract: We analyze the convergence rate of various momentum-based optimization algorithms from dynamical systems and Hamiltonian points of view. The analysis exploits fundamental topological properties, such as the continuous dependence of iterates on their initial conditions, to provide a simple characterization of convergence rates. In many cases, closed-form expressions are obtained that relate algorithm parameters to the convergence rate. The analysis encompasses discrete time and continuous time, as well as time-invariant and time-variant formulations, and is not limited to a convex or Euclidean setting. In addition, we show why symplectic discretization schemes are important for momentum-based optimization algorithms, and provide a characterization of algorithms that exhibit accelerated convergence. Finally, we discuss recent work on a generalization of symplectic integrators to dissipative Hamiltonian systems that is able to preserve continuous-time rates of convergence up to a controlled error. [Joint work with Michael Muehlebach, Guilherme França, and René Vidal.]
Institute for Pure and Applied Mathematics, UCLA
April 20, 2020
For more information: https://www.ipam.ucla.edu/hjws2
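The abstract's point about symplectic discretization can be illustrated with a minimal sketch (not from the talk; the harmonic oscillator and both integrators here are chosen only for illustration): explicit Euler steadily injects energy into a Hamiltonian system, while the symplectic Euler scheme keeps the energy bounded over long horizons, which is the structural property that matters for momentum-based dynamics.

```python
# Minimal comparison of explicit Euler vs. symplectic Euler on the
# harmonic oscillator H(q, p) = (q^2 + p^2) / 2. Illustrative only;
# step size and horizon are arbitrary choices.

def explicit_euler(q, p, h, steps):
    for _ in range(steps):
        q, p = q + h * p, p - h * q  # both updates use the old state
    return q, p

def symplectic_euler(q, p, h, steps):
    for _ in range(steps):
        p = p - h * q  # update momentum first...
        q = q + h * p  # ...then position uses the new momentum
    return q, p

def energy(q, p):
    return 0.5 * (q * q + p * p)

if __name__ == "__main__":
    q0, p0, h, steps = 1.0, 0.0, 0.01, 10_000
    for name, method in [("explicit", explicit_euler),
                         ("symplectic", symplectic_euler)]:
        q, p = method(q0, p0, h, steps)
        drift = abs(energy(q, p) - energy(q0, p0))
        print(f"{name:>10}: energy drift = {drift:.4f}")
```

Running this shows the explicit scheme's energy growing without bound while the symplectic scheme's energy oscillates within a band of width O(h), the discrete analogue of the "controlled error" the abstract refers to.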
Video: Michael Jordan, "Optimization and Dynamical Systems: Variational, Hamiltonian, and Symplectic Perspectives", from the channel Institute for Pure & Applied Mathematics (IPAM)
Video information
Uploaded: July 7, 2020, 1:56:04
Duration: 00:48:20
Other videos on the channel
Dynamical, symplectic and stochastic perspectives on optimization – Michael Jordan – ICM2018
Steve Brunton: "Dynamical Systems (Part 1/2)"
Lagrangian Neural Networks | AISC
Learning for Safety-Critical Control in Dynamical Systems
Lucas Wagner - Compact representations of excited states from QMC, as data for low-energy models
Major Achievements of Peter Scholze in chronological order.
Introduction to Calculus of Variations
On Gradient-Based Optimization: Accelerated, Distributed, Asynchronous and Stochastic
Jordan Peterson's Ultimate Advice for Students and College Grads - STOP WASTING TIME
On Langevin Dynamics in Machine Learning - Michael I. Jordan
First Steps in Symplectic Dynamics - Helmut Hofer
Michael Mahoney - Dynamical systems and machine learning
The derivative isn't what you think it is.
Introduction to Dynamic Optimization: Lecture 1.mp4
AI Institute Geometry of Deep Learning 2019 [Workshop] Day 2 | Session 1
The calculus of variations - Gianni Dal Masso - 2015
Introduction to Trajectory Optimization
Peter Scholze and Fermat's Last Theorem
#036 - Max Welling: Quantum, Manifolds & Symmetries in ML