Can learning theory resist deep learning? Francis Bach, INRIA
Machine learning algorithms are ubiquitous in most scientific, industrial and personal domains, with many successful applications. As a scientific field, machine learning has always been characterized by constant exchanges between theory and practice, with a stream of algorithms that exhibit both good empirical performance on real-world problems and some form of theoretical guarantee. Many of the recent and well-publicized applications come from deep learning, where these exchanges are harder to make, in part because the objective functions used to train neural networks are not convex. In this talk, I will present recent results on the global convergence of gradient descent for some specific non-convex optimization problems, illustrating these difficulties and the associated pitfalls (joint work with Lénaïc Chizat and Edouard Oyallon).
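To make the setting of the abstract concrete: the sketch below is an illustration only, not material from the talk. It runs plain gradient descent on the least-squares training loss of a two-layer ReLU network, which is a non-convex function of the weights; the hidden width m, the step size lr, the number of steps, and the toy sine-regression data are all arbitrary choices made for this example.

```python
# Minimal sketch (assumed setup, not the talk's results): gradient descent
# on the non-convex least-squares loss of a two-layer ReLU network.
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D regression data (illustrative target function).
n = 200
x = rng.uniform(-1.0, 1.0, size=(n, 1))
y = np.sin(3.0 * x)

# Two-layer network: f(x) = sum_j a_j * relu(w_j * x + b_j).
m = 50                              # hidden width (arbitrary)
w = rng.normal(size=(1, m))
b = rng.normal(size=(m,))
a = rng.normal(size=(m, 1)) / m

def relu(z):
    return np.maximum(z, 0.0)

lr, steps = 0.1, 2000               # step size and iteration count (arbitrary)
for t in range(steps):
    z = x @ w + b                   # (n, m) pre-activations
    h = relu(z)                     # (n, m) hidden activations
    pred = h @ a                    # (n, 1) network output
    r = pred - y                    # residuals
    loss = 0.5 * np.mean(r ** 2)

    # Manual backpropagation through the squared loss.
    grad_a = h.T @ r / n            # (m, 1)
    dh = (r @ a.T) * (z > 0)        # (n, m), relu'(z) = 1[z > 0]
    grad_w = x.T @ dh / n           # (1, m)
    grad_b = dh.mean(axis=0)        # (m,)

    a -= lr * grad_a
    w -= lr * grad_w
    b -= lr * grad_b

print(f"final training loss: {loss:.4f}")
```

Because the loss is non-convex in (w, b, a), there is no generic guarantee that such iterations reach a global minimum; the talk concerns conditions under which global convergence can nevertheless be established for specific problems of this kind.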
---
Recent years have witnessed an increased cross-fertilisation between the fields of statistics and computer science. In the era of Big Data, statisticians increasingly face the question of guaranteeing prescribed levels of inferential accuracy within a given time budget. Computer scientists, on the other hand, are progressively modelling data as noisy measurements coming from an underlying population, exploiting the statistical regularities of the data to save on computation.
This cross-fertilisation has led to the development and understanding of many of the algorithmic paradigms that underpin modern machine learning, including gradient descent methods and generalisation guarantees, implicit regularisation strategies, and high-dimensional statistical models and algorithms.
About the event
This event will bring together experts to talk about advances at the intersection of statistics and computer science in machine learning. This two-day conference will focus on the underlying theory and the links with applications, and will feature 12 talks by leading international researchers.
The intended audience is faculty, postdoctoral researchers and Ph.D. students from the UK/EU; the aim is to introduce them to this area of research and to the Turing.
Video "Can learning theory resist deep learning? Francis Bach, INRIA" from The Alan Turing Institute channel
Video information
29 January 2020, 16:19:56
Duration: 00:42:31