Controlling GANs Latent Space [in Russian]
Speaker: Vadim Titov, MIPT, AIRI
Slides: https://github.com/bayesgroup/bayesgroup.github.io/blob/master/bmml_sem/2021/Titov_Ganlatent.pdf
Modern GAN architectures generate highly realistic images across a variety of domains. Much recent work has focused on understanding how their latent space is connected to the semantics of the generated images. It has been discovered that there exist meaningful latent manipulations that allow images to be edited semantically. Many of the proposed methods rely on supervision from pretrained models, which is a strong limitation. This weakness can be eliminated by unsupervised methods, which have disadvantages of their own.
In this talk we will describe the main directions (supervised, unsupervised, text-guided) and the current state-of-the-art methods of semantic image manipulation through the GAN latent space.
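All three directions share the same core operation: take the latent code of an image, shift it along a semantic direction, and decode the result. Below is a minimal sketch of such a linear latent edit; the ToyGenerator and the random direction d are illustrative stand-ins only, since the actual methods (e.g. InterFaceGAN, GANSpace, StyleCLIP) obtain d via attribute classifiers, PCA of the latent space, or a text encoder such as CLIP.

```python
# Minimal sketch of latent-space editing, the common core of the methods
# discussed in the talk. The generator and the edit direction are toy
# stand-ins: real methods differ mainly in how they find the direction d.

import torch
import torch.nn as nn

class ToyGenerator(nn.Module):
    """Stand-in for a pretrained GAN generator G: latent code -> image."""
    def __init__(self, latent_dim=512, img_size=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(latent_dim, 3 * img_size * img_size),
            nn.Tanh(),
        )
        self.img_size = img_size

    def forward(self, z):
        x = self.net(z)
        return x.view(-1, 3, self.img_size, self.img_size)

latent_dim = 512
G = ToyGenerator(latent_dim)

z = torch.randn(1, latent_dim)   # latent code of the original image
d = torch.randn(latent_dim)      # semantic direction (hypothetical here)
d = d / d.norm()                 # directions are usually unit-normalized

# Linear edit: moving along d changes one semantic attribute of the
# image, with alpha controlling the strength and sign of the edit.
for alpha in (-3.0, 0.0, 3.0):
    edited = G(z + alpha * d)
    print(alpha, edited.shape)
```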