Lars Ruthotto: "Deep Neural Networks Motivated By Differential Equations (Part 2/2)"

Watch part 1/2 here: https://youtu.be/G2n2nJnh5kc

Machine Learning for Physics and the Physics of Learning Tutorials 2019

"Deep Neural Networks Motivated By Differential Equations (Part 2/2)"
Lars Ruthotto, Emory University

Abstract: In this short course, we establish the connection between residual neural networks and differential equations. We will use this interpretation to relate learning problems in data science to optimal control and parameter estimation problems in physics, engineering, and image processing. The course consists of two lectures. In the first lecture, we briefly introduce some learning problems and discuss linear models. We then extend our discussion to nonlinear models, in particular multi-layer perceptrons and residual neural networks. We demonstrate that even the training of a single-layer neural network leads to a challenging non-convex optimization problem, and we give an overview of some heuristics, such as Variable Projection and stochastic approximation schemes, that can effectively train nonlinear models. Finally, we demonstrate challenges associated with deep networks, such as their stability and the computational cost of training. In the second lecture, we show that residual neural networks can be interpreted as discretizations of a nonlinear, time-dependent ordinary differential equation that depends on unknown parameters, i.e., the network weights. We show how this insight has been used, e.g., to study the stability of neural networks, design new architectures, and apply established methods from optimal control to the training of ResNets. We extend this viewpoint to convolutional ResNets, which are popular for speech, image, and video data, and connect them to partial differential equations. Finally, we discuss open questions and opportunities for mathematical advances in this area.
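The core observation referenced in the abstract is that a residual block of the form x_{k+1} = x_k + h f(x_k, θ_k) is one forward Euler step of the ODE dx/dt = f(x(t), θ(t)), with the network weights playing the role of time-dependent parameters. The snippet below is a minimal NumPy sketch of that interpretation; it is not material from the lecture, and the tanh activation, feature dimension, number of blocks, and step size h are illustrative assumptions.

```python
# Minimal sketch: a residual network as a forward Euler discretization of
# the ODE  dx/dt = sigma(K(t) x + b(t)).  Not from the lecture; the choice
# of tanh, the layer widths, and the step size h are illustrative.
import numpy as np

def resnet_forward(x, Ks, bs, h=0.1):
    """Propagate features x through residual blocks.

    Each block computes x <- x + h * tanh(K @ x + b), i.e. one forward
    Euler step of size h; the weights (K, b) act as the time-dependent
    ODE parameters evaluated at step k.
    """
    for K, b in zip(Ks, bs):
        x = x + h * np.tanh(K @ x + b)
    return x

# Example: 8 "time steps" (residual blocks) acting on 2-dimensional features.
rng = np.random.default_rng(0)
Ks = [0.5 * rng.standard_normal((2, 2)) for _ in range(8)]
bs = [0.1 * rng.standard_normal(2) for _ in range(8)]
x0 = np.array([1.0, -1.0])
print(resnet_forward(x0, Ks, bs))
```

In this continuous view, training (choosing the weights K, b so that the final state matches labels) becomes an optimal control problem for the underlying ODE, which is the connection the lecture develops.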

Institute for Pure and Applied Mathematics, UCLA
September 10, 2019

For more information: http://www.ipam.ucla.edu/programs/workshops/machine-learning-for-physics-and-the-physics-of-learning-tutorials/

Video: Lars Ruthotto: "Deep Neural Networks Motivated By Differential Equations (Part 2/2)", from the Institute for Pure & Applied Mathematics (IPAM) channel
Video information
Uploaded: October 9, 2019, 4:22:06
Duration: 01:06:45