Hyperbolic Deep Learning [in Russian]
Slides: http://bayesgroup.github.io/bmml_sem/2020/Kochurov_Hyperbolic%20Deep%20Learning.pdf
Chat: https://drive.google.com/file/d/1n4afS4WvHwGDsiaUHPWDuddc5Px_BGva/view?usp=sharing
Hyperbolic deep learning has gained attention for its ability to represent hierarchical relations. However, the tooling for working in non-Euclidean spaces is still limited. Several works present proof-of-concept results on a range of tasks: word embeddings, text classification, node classification, link prediction, and others. Methods discussed include hyperbolic GloVe, GRU, VAE, graph embeddings, and graph neural networks. These works introduce new concepts and connect Euclidean models to their hyperbolic extensions. Although the baselines are fairly simple, they provide some evidence of where hyperbolic geometry may be more suitable.
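To make the setting concrete, below is a minimal sketch, not taken from the talk, of the basic operation behind Poincaré-ball embeddings: the hyperbolic distance and a projection that keeps points inside the unit ball. It uses plain PyTorch with an illustrative, assumed toy objective and hyperparameters; real models (e.g. via Riemannian optimization libraries such as geoopt) use proper Riemannian optimizers and negative sampling rather than the simplified projected step shown here.

```python
# Minimal sketch (illustrative, not from the talk) of Poincare-ball embeddings in plain PyTorch.
import torch

def poincare_distance(u, v, eps=1e-5):
    """Geodesic distance on the Poincare ball of curvature -1:
    d(u, v) = arccosh(1 + 2 * ||u - v||^2 / ((1 - ||u||^2) * (1 - ||v||^2)))."""
    uu = u.pow(2).sum(-1)
    vv = v.pow(2).sum(-1)
    uv = (u - v).pow(2).sum(-1)
    x = 1 + 2 * uv / ((1 - uu).clamp_min(eps) * (1 - vv).clamp_min(eps))
    return torch.acosh(x.clamp_min(1 + eps))

def project_to_ball(x, eps=1e-5):
    """Clip points back inside the unit ball after a gradient step."""
    norm = x.norm(dim=-1, keepdim=True).clamp_min(eps)
    factor = torch.where(norm >= 1 - eps, (1 - eps) / norm, torch.ones_like(norm))
    return x * factor

# Toy hierarchy: node 0 is the parent of nodes 1..4; pull connected pairs together.
edges = torch.tensor([[0, 1], [0, 2], [0, 3], [0, 4]])
emb = (torch.randn(5, 2) * 1e-2).requires_grad_(True)  # start near the origin
opt = torch.optim.SGD([emb], lr=0.1)

for step in range(200):
    opt.zero_grad()
    d = poincare_distance(emb[edges[:, 0]], emb[edges[:, 1]])
    loss = d.pow(2).mean()  # crude attraction-only loss; real training adds negatives
    loss.backward()
    opt.step()              # plain Euclidean step, a simplification of Riemannian SGD
    with torch.no_grad():
        emb.copy_(project_to_ball(emb))
```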
"Hyperbolic Deep Learning [in Russian]", a video from the BayesGroup.ru channel.
Video information
Published: April 24, 2020, 21:44:05
Duration: 01:28:39