Tensor Train Decomposition for Fast Learning in Large Scale Gaussian Process Models, Dmitry Kropotov
Gaussian Process models are a popular Bayesian approach for solving various machine learning problems, including regression, classification and structured prediction. Training a full GP model scales cubically with the training set size, which prevents efficient learning on large datasets. In this case an inducing-inputs approach is usually used; it scales linearly with the training set size and cubically with the number of inducing inputs. Empirical evaluation shows that being restricted to a small number of inducing inputs leads to poor performance of GP models when the number of features is large. In this talk, we discuss a learning procedure for GP models that allows using a much larger number of inducing inputs. This procedure can be interpreted as a fast variational inference scheme with several approximations made to the variational distribution. One of them uses the Tensor Train format, a popular approach for compactly storing and efficiently operating on multidimensional tensors.
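As background for the Tensor Train format mentioned above, here is a minimal sketch of the standard TT-SVD decomposition (sequential truncated SVDs) in NumPy. This is an illustrative example only, not the talk's GP-specific code; the function names `tt_svd` and `tt_reconstruct` are made up for this sketch.

```python
import numpy as np

def tt_svd(tensor, max_rank):
    """Decompose a d-dimensional tensor into Tensor Train (TT) cores
    via sequential truncated SVDs (the TT-SVD algorithm).
    Each core has shape (r_{k-1}, n_k, r_k) with r_0 = r_d = 1."""
    shape = tensor.shape
    d = len(shape)
    cores = []
    rank = 1
    mat = tensor.reshape(rank * shape[0], -1)
    for k in range(d - 1):
        U, s, Vt = np.linalg.svd(mat, full_matrices=False)
        r_new = min(max_rank, len(s))
        # Left factor becomes the k-th TT core.
        cores.append(U[:, :r_new].reshape(rank, shape[k], r_new))
        # Carry the remainder on to the next unfolding.
        mat = (np.diag(s[:r_new]) @ Vt[:r_new]).reshape(r_new * shape[k + 1], -1)
        rank = r_new
    cores.append(mat.reshape(rank, shape[-1], 1))
    return cores

def tt_reconstruct(cores):
    """Contract TT cores back into a full tensor."""
    res = cores[0]
    for core in cores[1:]:
        res = np.tensordot(res, core, axes=([-1], [0]))
    return res.squeeze(axis=(0, -1))
```

With `max_rank` large enough the reconstruction is exact; truncating the ranks gives the compact storage (O(d n r^2) parameters instead of O(n^d)) that the talk exploits for the matrix of inducing-point variational means.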
Slides: https://bayesgroup.github.io/bmml_sem/2017/TT-GP.pdf
Video "Tensor Train Decomposition for Fast Learning in Large Scale Gaussian Process Models, Dmitry Kropotov" from the BayesGroup.ru channel.
Video info: published 18 October 2017, 10:11:43; duration 01:22:59.