Arthur Gretton - Generalized Energy-Based Models
Abstract: I will introduce Generalized Energy-Based Models (GEBM) for generative modelling. These models combine two trained components: a base distribution (generally an implicit model), which can learn the support of data with low intrinsic dimension in a high-dimensional space; and an energy function, which refines the probability mass on the learned support. The energy function and the base jointly constitute the final model, unlike GANs, which retain only the base distribution (the "generator"). In particular, while the energy function is analogous to the GAN critic function, it is not discarded after training. GEBMs are trained by alternating between learning the energy and the base. We show that both training stages are well-defined: the energy is learned by maximising a generalized likelihood, and the resulting energy-based loss provides informative gradients for learning the base. Samples from the posterior on the latent space of the trained model can be obtained via MCMC, thus finding regions of this space that produce better-quality samples. Empirically, the GEBM samples on image-generation tasks are of much better quality than those from the learned generator alone, indicating that, all else being equal, the GEBM will outperform a GAN of the same complexity. GEBMs also achieve state-of-the-art performance on density-modelling tasks when using base measures with an explicit form.
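To make the sampling step concrete: after training, GEBM samples are drawn by running MCMC in the latent space of the base, targeting the base prior reweighted by the energy evaluated through the generator. The sketch below is purely illustrative, not the paper's implementation: the linear "generator" `A`, the quadratic "energy" centred at `mu`, and the unadjusted Langevin sampler are all stand-in assumptions chosen so the exact posterior is known.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-ins (assumptions, not the trained models from the talk):
# the "generator" g maps a 2-d latent z to a 3-d observation via A,
# and the "energy" E rewards samples near a target point mu.
A = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
mu = np.array([1.0, 2.0, 3.0])

def grad_log_posterior(z):
    """Gradient of log[ p0(z) * exp(E(g(z))) ] for the toy choices
    p0 = N(0, I), g(z) = A @ z, E(x) = -0.5 * ||x - mu||^2."""
    return -z + A.T @ (mu - A @ z)

# Unadjusted Langevin dynamics in latent space:
#   z <- z + (eps/2) * grad log pi(z) + sqrt(eps) * noise
eps, n_steps, burn_in = 0.05, 30000, 5000
z = np.zeros(2)
samples = []
for t in range(n_steps):
    z = z + 0.5 * eps * grad_log_posterior(z) \
          + np.sqrt(eps) * rng.standard_normal(2)
    if t >= burn_in:
        samples.append(z.copy())

post_mean = np.mean(samples, axis=0)
print(post_mean)  # close to the exact Gaussian posterior mean [0.875, 1.375]
```

Because both toy components are Gaussian, the latent posterior is Gaussian with mean `(I + A.T A)^{-1} A.T mu`, which the chain's sample mean approximates; in the actual GEBM, the same MCMC idea runs over a learned generator and energy network instead.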
Speaker: Arthur Gretton is a Professor at the Gatsby Computational Neuroscience Unit and Director of the Centre for Computational Statistics and Machine Learning, at University College London. His personal website can be found at http://www.gatsby.ucl.ac.uk/~gretton/.
This talk was given at Secondmind Labs, as a part of our (virtual) research seminar. Our research seminar is where we exchange ideas with guest speakers, keeping you up to date with the latest developments and inspiring research topics. Occasionally, Secondmind researchers present their own work as well. You can find a complete list of speakers at https://www.secondmind.ai/labs/seminars/. Learn more about Secondmind Labs at https://www.secondmind.ai/labs/
Video "Arthur Gretton - Generalized Energy-Based Models" from the Secondmind channel
Other videos from this channel
![Sebastian Farquhar - Unbiased Active Learning and Testing](https://i.ytimg.com/vi/MHHZS6Wi8Ts/default.jpg)
![Emtiyaz Khan - Bayesian Principles for Machine Learning](https://i.ytimg.com/vi/1ZOzPFrbFWs/default.jpg)
![World Summit AI Roundtable - Making Sense of Data (Part One)](https://i.ytimg.com/vi/7P9LAkxZTFM/default.jpg)
![Roberto Calandra - Bayesian optimization for robotics](https://i.ytimg.com/vi/u38wYL6D8PY/default.jpg)
![Ítalo Gomes Gonçalves - Variational Gaussian processes for spatial modeling: the geoML project](https://i.ytimg.com/vi/JDdPZRqtyLg/default.jpg)
![Antonio Del Rio Chanona - Multi-Fidelity Bayesian Optimization in Chemical Engineering](https://i.ytimg.com/vi/qT9ju4eMLKA/default.jpg)
![François-Xavier Briol - Bayesian Estimation of Integrals: A Multi-task Approach](https://i.ytimg.com/vi/7NBrUJcyL7w/default.jpg)
![Peter Stone - Efficient Robot Skill Learning](https://i.ytimg.com/vi/qzMvLEviihM/default.jpg)
![Luigi Nardi - Harnessing new information in Bayesian optimization](https://i.ytimg.com/vi/-huaWITLyE8/default.jpg)
![Andrew G. Wilson - How do we build models that learn and generalize?](https://i.ytimg.com/vi/GvylV2KkXf0/default.jpg)
![M. E. Taylor - Reinforcement Learning in the Real-world: How to “cheat” and still feel good about it](https://i.ytimg.com/vi/KOHEefx3izY/default.jpg)
![Arno Solin - Stationary Activations for Uncertainty Calibration in Deep Learning](https://i.ytimg.com/vi/G_PVRL_wxIE/default.jpg)
![Aryan Deshwal - Bayesian Optimization over Combinatorial Structures](https://i.ytimg.com/vi/22MgClgFyHk/default.jpg)
![Vincent Adam - Sparse methods for markovian GPs](https://i.ytimg.com/vi/-Iw4whJsAhg/default.jpg)
![François Bachoc - Sequential construction and dimension reduction of GP under inequality constraints](https://i.ytimg.com/vi/SpGrecIO6o0/default.jpg)
![World Summit AI Roundtable - Making Sense of Data (Part Two)](https://i.ytimg.com/vi/PC6T8ccEcH0/default.jpg)
![Pablo Moreno-Muñoz - Model Recycling with Gaussian Processes](https://i.ytimg.com/vi/QuEmEXrnFZk/default.jpg)
![Mojmír Mutný - Optimal Experiment Design in Markov Chains](https://i.ytimg.com/vi/o59lLu8yAUM/default.jpg)
![Christopher Nemeth - Coin Sampling: Gradient-Based Bayesian Inference without Learning Rates](https://i.ytimg.com/vi/clhieMkVdI0/default.jpg)
![José Miguel Hernández-Lobato - Probabilistic Methods for Increased Robustness in Machine Learning](https://i.ytimg.com/vi/4ppFiyXJkiM/default.jpg)
![Frank Hutter - Towards Deep Learning 2.0: Going to the Meta-Level](https://i.ytimg.com/vi/RFncTuZIcac/default.jpg)