Probabilistic Dimensional Reduction with Gaussian Process Latent Variable Model
Google Tech Talks
February 12, 2007
ABSTRACT
Density modelling in high dimensions is a very difficult problem. Traditional approaches, such as mixtures of Gaussians, typically fail to capture the structure of data sets in high dimensional spaces. In this talk we will argue that for many data sets of interest, the data can be represented as a lower dimensional manifold immersed in the higher dimensional space. We will then present the Gaussian Process Latent Variable Model (GP-LVM), a non-linear probabilistic variant of principal component analysis (PCA) which implicitly assumes that the data lies on a lower dimensional manifold.
Having introduced the GP-LVM we will review extensions to the algorithm, including dynamics, learning from large data sets and back constraints. We will demonstrate the application of the model and its extensions to a range of data sets, including human motion data, a vowel data set and a robot mapping problem.
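The GP-LVM described above places a Gaussian process prior over the mapping from latent to data space and optimises the latent positions by maximising the GP marginal likelihood. The following is a minimal illustrative sketch of that idea in NumPy/SciPy; the RBF kernel, fixed noise level, random initialisation, and toy data are all assumptions for illustration, not the talk's actual implementation (in practice one would initialise with PCA and use a dedicated library).

```python
import numpy as np
from scipy.optimize import minimize

def rbf_kernel(X, lengthscale=1.0, variance=1.0):
    # Squared-exponential kernel evaluated on latent points X (N x Q).
    sq = np.sum(X**2, 1)[:, None] + np.sum(X**2, 1)[None, :] - 2.0 * X @ X.T
    return variance * np.exp(-0.5 * sq / lengthscale**2)

def neg_log_marginal(x_flat, Y, Q, noise=0.1):
    # GP-LVM objective: -log p(Y | X), with an independent GP mapping
    # shared across each of the D data dimensions.
    N, D = Y.shape
    X = x_flat.reshape(N, Q)
    K = rbf_kernel(X) + noise * np.eye(N)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, Y))  # K^{-1} Y
    logdet = 2.0 * np.sum(np.log(np.diag(L)))            # log |K|
    return 0.5 * (D * logdet + np.sum(Y * alpha) + N * D * np.log(2 * np.pi))

# Toy data: a 1-D manifold immersed in a 5-D space (hypothetical example).
rng = np.random.default_rng(0)
t = np.linspace(0, 2 * np.pi, 30)
Y = np.column_stack([np.sin(t), np.cos(t), np.sin(2 * t), np.cos(2 * t), t / np.pi])
Y = (Y - Y.mean(0)) + 0.05 * rng.standard_normal(Y.shape)

Q = 1                                      # assumed latent dimensionality
X0 = 0.1 * rng.standard_normal(30 * Q)     # random init; PCA init is standard
res = minimize(neg_log_marginal, X0, args=(Y, Q), method="L-BFGS-B")
X_latent = res.x.reshape(30, Q)            # learned latent coordinates
```

Optimising the latent points (rather than integrating them out, as in probabilistic PCA's dual) is what makes the model non-linear while keeping the marginal likelihood tractable.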
Brief bio:
Neil Lawrence is a Senior Research Fellow in the School of Computer Science at the University of Manchester, U.K. Prior to this appointment he was a Senior Lecturer in the Department of Computer Science at the University of Sheffield, U.K., where he was head of the Machine Learning Research Group. His main research interest is machine learning through probabilistic models. He is interested in both the algorithmic side of these models and their application in areas such as bioinformatics, speech, vision and graphics.
His PhD was awarded in 2000 from the Computer Lab at the University of Cambridge. He then spent a year at Microsoft Research, Cambridge before moving to Sheffield in 2001 and then to Manchester in 2007.
Google engEDU
speaker: Neil Lawrence