Atılım Güneş Baydin: Probabilistic Programming for Inverse Problems in the Physical Sciences
Machine learning enables new approaches to inverse problems in many fields of science. We present a novel probabilistic programming framework that couples directly to existing scientific simulators through a cross-platform probabilistic execution protocol, which allows general-purpose inference engines to record and control random number draws within simulators in a language-agnostic way. The execution of existing simulators as probabilistic programs enables highly interpretable posterior inference in the structured model defined by the simulator code base. We demonstrate the technique in particle physics, on a scientifically accurate simulation of the tau lepton decay, which is a key ingredient in establishing the properties of the Higgs boson. Inference efficiency is achieved via amortized inference where a deep recurrent neural network is trained to parameterize proposal distributions and control the stochastic simulator in a sequential importance sampling scheme, at a fraction of the computational cost of a Markov chain Monte Carlo baseline.
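The core idea — an inference engine that intercepts the simulator's random draws, replaces them with samples from a proposal distribution, and corrects with importance weights — can be sketched in a toy Gaussian setting. This is an illustrative sketch only, not the talk's actual framework or API: the simulator, the `draw` callback, and the hand-fixed proposal parameters (standing in for what the trained recurrent network would output) are all assumptions for the example.

```python
import math
import random

random.seed(0)

def normal_logpdf(x, mean, std):
    """Log density of a univariate Gaussian."""
    return -0.5 * math.log(2 * math.pi * std ** 2) - (x - mean) ** 2 / (2 * std ** 2)

def simulator(draw):
    # All of the simulator's randomness is routed through `draw`, so the
    # inference engine can record and control every random choice.
    mu = draw("mu", mean=0.0, std=5.0)   # prior over the latent of interest
    return mu

observed = 2.0     # a measured quantity we want to invert for mu
obs_std = 1.0      # assumed observation noise

# Proposal parameters a trained network would emit; fixed by hand here.
prop_mean, prop_std = 2.0, 1.0

samples, log_weights = [], []
for _ in range(20000):
    corrections = {}

    def draw(name, mean, std):
        # Sample from the proposal instead of the simulator's prior...
        val = random.gauss(prop_mean, prop_std)
        # ...and record the importance correction: log prior - log proposal.
        corrections[name] = (normal_logpdf(val, mean, std)
                             - normal_logpdf(val, prop_mean, prop_std))
        return val

    mu = simulator(draw)
    # Total log weight: prior/proposal corrections plus the data likelihood.
    lw = sum(corrections.values()) + normal_logpdf(observed, mu, obs_std)
    samples.append(mu)
    log_weights.append(lw)

# Self-normalized importance-sampling estimate of the posterior mean.
m = max(log_weights)
weights = [math.exp(lw - m) for lw in log_weights]
post_mean = sum(w * s for w, s in zip(weights, samples)) / sum(weights)
print(post_mean)  # should land near the analytic posterior mean 25/26 * 2
```

With a conjugate Gaussian prior and likelihood the exact posterior mean is `25/26 * 2 ≈ 1.92`, so the estimate can be checked directly; in the talk's setting the simulator is a black-box physics code and only the draw-interception protocol, not the analytic posterior, is available.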
Video: Atılım Güneş Baydin: Probabilistic Programming for Inverse Problems in the Physical Sciences, from the Oxford ML and Physics Seminars channel
Video information
Published: 21 January 2021, 23:14:34
Duration: 01:11:23
Other videos on the channel
- Mike Walmsley: Galaxy Zoo(m): Probabilistic Galaxy Morphology via Bayesian CNNs and Active Learning
- Arvind Neelakantan: Text and Code Embeddings
- Rachel Prudden: Probabilistic modelling for atmospheric science: beyond the noise
- Ricardo Vinuesa: Artificial Intelligence, Computational Fluid Dynamics, and Sustainability
- Ard Louis: Deep neural networks have an inbuilt Occam’s razor
- Guillaume Lample: Deep Learning for Symbolic Mathematics
- Brian Spears: Cognitive Simulation: combining simulation and experiment with artificial intelligence
- Eliu Huerta: AI for Science: Let’s talk business
- Ben Nachman: Extracting the most from collider data with deep learning
- Laure Zanna: Climate Modeling in the Age of Machine Learning
- Tim Green: Highly accurate protein structure prediction with AlphaFold
- Peter Dueben: Machine learning for weather predictions
- David Spergel: Determining the Universe’s Initial Conditions
- Adrien Gaidon: Self-supervised 3D vision
- Maurizio Pierini: Doing more with less: Deep Learning for Physics at the Large Hadron Collider
- Michael Kagan: Generative Model Based Design Optimization and Unfolding
- Séamus Davis: Machine learning in electronic-quantum-matter imaging experiments
- Stéphane Mallat: Hamiltonian Estimations by Conditional Renormalisation Group and Convolution Nets
- Phiala Shanahan: Provably exact sampling for first-principles theoretical physics
- Jonas Buchli & Federico Felici: Magnetic control of tokamak plasmas with deep reinforcement learning