Tensor Train Decomposition for Fast Learning in Large Scale Gaussian Process Models, Dmitry Kropotov

Gaussian Process models are a popular Bayesian approach for solving various machine learning problems, including regression, classification and structured prediction. Training a full GP model scales cubically with the training set size, which prevents efficient learning on large datasets. In this case an inducing-inputs approach is usually used, which scales linearly with the training set size and cubically with the number of inducing inputs. Empirical evaluation shows that being restricted to a small number of inducing inputs leads to poor performance of GP models when the number of features is large. In this talk, we discuss a learning procedure for GP models that allows using a much larger number of inducing inputs. This procedure can be interpreted as a fast variational inference scheme with several approximations applied to the variational distribution. One of them uses the Tensor Train format, a popular approach for compactly storing and efficiently operating on multidimensional tensors.
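To give a feel for the Tensor Train format mentioned above, here is a minimal sketch (not the talk's actual implementation) of TT decomposition via sequential truncated SVD, often called TT-SVD: a d-dimensional tensor is represented as a chain of small 3-way cores, so storage drops from exponential to linear in the number of dimensions. The function names are illustrative, not from any particular library.

```python
import numpy as np

def tt_decompose(tensor, max_rank):
    """Decompose a d-dimensional tensor into Tensor Train cores
    via sequential truncated SVDs (a basic TT-SVD sketch)."""
    shape = tensor.shape
    d = len(shape)
    cores = []
    r_prev = 1
    # Unfold the tensor: rows correspond to the first mode.
    mat = tensor.reshape(r_prev * shape[0], -1)
    for k in range(d - 1):
        U, S, Vt = np.linalg.svd(mat, full_matrices=False)
        r = min(max_rank, len(S))  # truncate to the TT-rank bound
        U, S, Vt = U[:, :r], S[:r], Vt[:r, :]
        cores.append(U.reshape(r_prev, shape[k], r))
        r_prev = r
        # Fold the remaining factor and expose the next mode.
        mat = (np.diag(S) @ Vt).reshape(r_prev * shape[k + 1], -1)
    cores.append(mat.reshape(r_prev, shape[-1], 1))
    return cores

def tt_reconstruct(cores):
    """Contract the TT cores back into a full tensor."""
    res = cores[0]  # shape (1, n_0, r_1)
    for core in cores[1:]:
        res = np.tensordot(res, core, axes=([-1], [0]))
    return res.reshape([c.shape[1] for c in cores])
```

With a sufficiently large `max_rank` the decomposition is exact, while for tensors with low TT-rank (e.g. a rank-1 outer product) a small `max_rank` already reconstructs the tensor exactly; this is the compression property the talk exploits for the variational distribution.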

Slides: https://bayesgroup.github.io/bmml_sem/2017/TT-GP.pdf

Video: Tensor Train Decomposition for Fast Learning in Large Scale Gaussian Process Models, Dmitry Kropotov, from the BayesGroup.ru channel.