Neural SDEs: Deep Generative Models in the Diffusion Limit - Maxim Raginsky
In deep generative models, the latent variable is generated by a time-inhomogeneous Markov chain, where at each time step we pass the current state through a parametric nonlinear map, such as a feedforward neural net, and add a small independent Gaussian perturbation. In this talk, based on joint work with Belinda Tzen, I will discuss the diffusion limit of such models, where we increase the number of layers while sending the step size and the noise variance to zero. I will first provide a unified viewpoint on both sampling and variational inference in such generative models through the lens of stochastic control. Then I will show how we can quantify the expressiveness of diffusion-based generative models. Specifically, I will prove that one can efficiently sample from a wide class of terminal target distributions by choosing the drift of the latent diffusion from the class of multilayer feedforward neural nets, with the accuracy of sampling measured by the Kullback-Leibler divergence to the target distribution. Finally, I will briefly discuss a scheme for unbiased, finite-variance simulation in such models. This scheme can be implemented as a deep generative model with a random number of layers.
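To make the construction concrete, here is a minimal sketch of such a layered generative model and its Euler-Maruyama view of the diffusion limit: each "layer" applies a feedforward drift and adds a Gaussian perturbation whose variance scales with the step size, so that as the number of layers grows the chain approximates the SDE dX_t = b(X_t, t) dt + sigma dW_t. The names (drift, sample, num_layers, noise_scale) and the toy one-hidden-layer drift with random weights W1, W2 are illustrative assumptions, not details from the talk.

```python
import numpy as np

rng = np.random.default_rng(0)
d, hidden = 2, 64  # latent dimension and hidden width (illustrative choices)

# Hypothetical one-hidden-layer feedforward drift b(x, t); time t is fed in
# as an extra input, making the chain time-inhomogeneous.
W1 = rng.standard_normal((hidden, d + 1)) / np.sqrt(d + 1)
W2 = rng.standard_normal((d, hidden)) / np.sqrt(hidden)

def drift(x, t):
    z = np.concatenate([x, [t]])
    return W2 @ np.tanh(W1 @ z)

def sample(x0, num_layers, T=1.0, noise_scale=1.0):
    """One forward pass through the generative chain:
        X_{k+1} = X_k + h * b(X_k, t_k) + sqrt(h) * noise_scale * N(0, I),
    with step size h = T / num_layers. Sending num_layers -> infinity
    (so h -> 0 and the per-layer noise variance -> 0) gives the
    diffusion limit dX_t = b(X_t, t) dt + noise_scale dW_t."""
    h = T / num_layers
    x = np.array(x0, dtype=float)
    for k in range(num_layers):
        x = x + h * drift(x, k * h) + np.sqrt(h) * noise_scale * rng.standard_normal(d)
    return x

# Draw samples from the terminal (time-T) distribution of the chain.
samples = np.stack([sample(np.zeros(d), num_layers=100) for _ in range(1000)])
print(samples.mean(axis=0), samples.std(axis=0))
```

In this picture, the expressiveness result in the talk corresponds to choosing the drift from the class of multilayer feedforward nets so that the terminal law of the diffusion is close, in Kullback-Leibler divergence, to a target distribution; the unbiased-simulation scheme mentioned at the end would replace the fixed num_layers with a suitably chosen random truncation level.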
Video "Neural SDEs: Deep Generative Models in the Diffusion Limit - Maxim Raginsky" from the Institute for Advanced Study channel
Other videos from the channel
Classical Turbulence as Quantum Geometry - Alexander Migdal
An Introduction to Lifted Expander Graphs - Fernando Granha Jeronimo
Scholar Spotlight: Anna Bokov
Proof complexity - an introduction - Avi Wigderson
Stability for functional and geometric inequalities - Robin Neumayer
Modular bootstrap, Segal's axioms and resolution of Liouville conformal field theory - Rhodes, Vargas
Book Trailer for "The Usefulness of Useless Knowledge"
Progress on Celestial Holography - Andrew Strominger
Non-perturbative Studies of JT Gravity and Supergravity using Minimal Strings - Clifford V. Johnson
Epsilon regularity and removable singularities - Karen Uhlenbeck
Direct and dual Information Bottleneck frameworks for Deep Learning - Tali Tishby
Entanglement Wedge Reconstruction in Infinite-Dimensional Hilbert Spaces - Monica Jinwoo Kang
Free Energy from Replica Wormholes - Netta Engelhardt
The challenges of model-based reinforcement learning and how to overcome them - Csaba Szepesvári
Sparse matrices in sparse analysis - Anna Gilbert
Dynamics of Deep Neural Networks - A Fourier Analysis Perspective - Yaoyu Zhang
Emergent linguistic structure in deep contextual neural word representations - Chris Manning
On the critic function of implicit generative models - Arthur Gretton
Introduction to Interpretable Machine Learning II - Cynthia Rudin
Lagrangians, symplectomorphisms and zeroes of moment maps - Yann Rollin
Bourgain Remembrance - Various Speakers