Complete Statistical Theory of Learning (Vladimir Vapnik) | MIT Deep Learning Series
Lecture by Vladimir Vapnik in January 2020, part of the MIT Deep Learning Lecture Series.
Slides: http://bit.ly/2ORVofC
Associated podcast conversation: https://www.youtube.com/watch?v=bQa7hpUpMzM
Series website: https://deeplearning.mit.edu
Playlist: http://bit.ly/deep-learning-playlist
OUTLINE:
0:00 - Introduction
0:46 - Overview: Complete Statistical Theory of Learning
3:47 - Part 1: VC Theory of Generalization
11:04 - Part 2: Target Functional for Minimization
27:13 - Part 3: Selection of Admissible Set of Functions
37:26 - Part 4: Complete Solution in Reproducing Kernel Hilbert Space (RKHS)
53:16 - Part 5: LUSI Approach in Neural Networks
59:28 - Part 6: Examples of Predicates
1:10:39 - Conclusion
1:16:10 - Q&A: Overfitting
1:17:18 - Q&A: Language
CONNECT:
- If you enjoyed this video, please subscribe to this channel.
- Twitter: https://twitter.com/lexfridman
- LinkedIn: https://www.linkedin.com/in/lexfridman
- Facebook: https://www.facebook.com/lexfridman
- Instagram: https://www.instagram.com/lexfridman
Video "Complete Statistical Theory of Learning (Vladimir Vapnik) | MIT Deep Learning Series" from the Lex Fridman channel