
Complete Statistical Theory of Learning (Vladimir Vapnik) | MIT Deep Learning Series

A lecture by Vladimir Vapnik, given in January 2020 as part of the MIT Deep Learning Lecture Series.
Slides: http://bit.ly/2ORVofC
Associated podcast conversation: https://www.youtube.com/watch?v=bQa7hpUpMzM
Series website: https://deeplearning.mit.edu
Playlist: http://bit.ly/deep-learning-playlist

OUTLINE:
0:00 - Introduction
0:46 - Overview: Complete Statistical Theory of Learning
3:47 - Part 1: VC Theory of Generalization
11:04 - Part 2: Target Functional for Minimization
27:13 - Part 3: Selection of Admissible Set of Functions
37:26 - Part 4: Complete Solution in Reproducing Kernel Hilbert Space (RKHS)
53:16 - Part 5: LUSI Approach in Neural Networks
59:28 - Part 6: Examples of Predicates
1:10:39 - Conclusion
1:16:10 - Q&A: Overfitting
1:17:18 - Q&A: Language

CONNECT:
- If you enjoyed this video, please subscribe to this channel.
- Twitter: https://twitter.com/lexfridman
- LinkedIn: https://www.linkedin.com/in/lexfridman
- Facebook: https://www.facebook.com/lexfridman
- Instagram: https://www.instagram.com/lexfridman

Video information:
Published: February 15, 2020, 20:11:09
Duration: 01:19:21