
Shawe-Taylor and Rivasplata: Statistical Learning Theory - a Hitchhiker's Guide (NeurIPS 2018)

Abstract: The tutorial will showcase what statistical learning theory aims to assess, and hence deliver, for learning systems. We will highlight how algorithms can piggyback on its results to improve the performance of learning algorithms as well as to understand their limitations. The tutorial is aimed at those wishing to gain an understanding of the value and role of statistical learning theory in order to hitch a ride on its results.
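
For a flavour of the kind of guarantee such results deliver, here is a standard textbook generalization bound, given only as an illustrative sketch and not quoted from the tutorial slides (the symbols R, \widehat{R}, \mathcal{H}, n, \delta are generic notation): for a finite hypothesis class \mathcal{H}, a loss bounded in [0, 1], and n i.i.d. training examples, with probability at least 1 - \delta every h \in \mathcal{H} satisfies

\[ R(h) \;\le\; \widehat{R}(h) + \sqrt{\frac{\ln|\mathcal{H}| + \ln(1/\delta)}{2n}}, \]

where R(h) is the true risk and \widehat{R}(h) the empirical risk on the sample. Bounds of this type, and their refinements (e.g. VC, Rademacher, or PAC-Bayes bounds), are the kind of result the abstract refers to when it speaks of algorithms piggybacking on the theory.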

Speakers: John Shawe-Taylor and Omar Rivasplata

Slides: https://media.neurips.cc/Conferences/NIPS2018/Slides/stastical_learning_theory.pdf

Video "Shawe-Taylor and Rivasplata: Statistical Learning Theory - a Hitchhiker's Guide (NeurIPS 2018)" from the channel Steven Van Vaerenbergh
Video information
Published: December 10, 2018, 4:38:03
Duration: 01:58:09
Other videos from this channel
Ian Goodfellow: Adversarial Machine Learning (ICLR 2019 invited talk)
J. Frankle & M. Carbin: The Lottery Ticket Hypothesis: Finding Sparse, Trainable Neural Networks
What Bodies Think About: Bioelectric Computation Outside the Nervous System - NeurIPS 2018
Aki Vehtari: Stan and probabilistic programming (MLSP 2020 tutorial)
Structural Equation Modeling: what is it and what can we use it for? (part 1 of 6)
Ian Goodfellow: Generative Adversarial Networks (NIPS 2016 tutorial)
That's how Top AI/ML Conference looks (NeurIPS 2019, Vancouver)
Ali Rahimi - NIPS 2017 Test-of-Time Award presentation
ICLR Debate with Leslie Kaelbling (ICLR 2019)
Conditional Mean Embeddings for Reinforcement Learning - John Shawe Taylor
Statistical Learning Theory for Modern Machine Learning - John Shawe-Taylor
Random Matrices: Introduction
Variational Inference: Foundations and Modern Methods (NIPS 2016 tutorial)
Yikang Shen: Ordered Neurons: Integrating Tree Structures into Recurrent Neural Networks (ICLR2019)
J. Z. Kolter and A. Madry: Adversarial Robustness - Theory and Practice (NeurIPS 2018 Tutorial)
Fernanda Viégas and Martin Wattenberg: Visualization for Machine Learning (NeurIPS 2018 Tutorial)
Theory and Algorithms for Forecasting Non-Stationary Time Series (NIPS 2016 tutorial)
Learn English in 30 Minutes - ALL the English Basics You Need
Susan Athey: Counterfactual Inference (NeurIPS 2018 Tutorial)
Michael Unser: Splines and Machine Learning: From classical RKHS methods to DNN (MLSP 2020 keynote)