Yann LeCun | May 18, 2021 | The Energy-Based Learning Model
Title: The Energy-Based Learning Model
Speaker: Yann LeCun
Abstract: One of the hottest sub-topics of machine learning in recent times has been Self-Supervised Learning (SSL). In SSL, a learning machine captures the dependencies between input variables, some of which may be observed, denoted X, and others not always observed, denoted Y. SSL pre-training has revolutionized natural language processing and is making very fast progress in speech and image recognition. SSL may enable machines to learn predictive models of the world through observation, and to learn representations of the perceptual world, thereby reducing the number of labeled samples or rewarded trials to learn a downstream task. In the Energy-Based Model framework (EBM), both X and Y are inputs, and the model outputs a scalar energy that measures the degree of incompatibility between X and Y. EBMs are implicit functions that can represent complex and multimodal dependencies between X and Y. EBM architectures belong to two main families: joint embedding architectures and latent-variable generative architectures. There are two main families of methods to train EBMs: contrastive methods, and volume regularization methods. Much of the underlying mathematics of EBM is borrowed from statistical physics, including concepts of partition function, free energy, and variational approximations thereof.
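The statistical-physics quantities named at the end of the abstract have standard definitions in the EBM setting. As a sketch (assuming the usual Gibbs distribution with inverse temperature $\beta$), the energy induces a conditional distribution via the partition function, and the free energy is its log-normalizer:

```latex
P(y \mid x) = \frac{e^{-\beta E(x,y)}}{Z(x)}, \qquad
Z(x) = \int e^{-\beta E(x,y)}\, dy, \qquad
F(x) = -\frac{1}{\beta} \log Z(x).
```

Low energy $E(x,y)$ corresponds to high compatibility between $x$ and $y$; contrastive training can be read as pushing down the energy of observed pairs while pushing up the energy elsewhere, which controls $Z(x)$ without computing it explicitly.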
Video: Yann LeCun | May 18, 2021 | The Energy-Based Learning Model, from the Mathematical Picture Language channel
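To make the contrastive idea from the abstract concrete, here is a minimal toy sketch (a hypothetical example, not an architecture from the talk): a bilinear energy E(x, y) = -xᵀWy trained with a hinge-style contrastive loss that pushes the energy of compatible pairs below that of mismatched pairs.

```python
import numpy as np

rng = np.random.default_rng(0)

def energy(W, x, y):
    """Scalar energy: low values mean x and y are compatible."""
    return -x @ W @ y

def contrastive_loss(W, x, y_pos, y_neg, margin=1.0):
    """Hinge contrastive loss: E(x, y+) should sit below E(x, y-) by `margin`."""
    return max(0.0, margin + energy(W, x, y_pos) - energy(W, x, y_neg))

# Toy data: compatible y is a linear function of x (an assumption of this sketch).
d = 4
W_true = rng.normal(size=(d, d))
W = np.zeros((d, d))
lr = 0.1

for step in range(500):
    x = rng.normal(size=d)
    y_pos = W_true.T @ x           # compatible pair
    y_neg = rng.normal(size=d)     # random incompatible pair
    if contrastive_loss(W, x, y_pos, y_neg) > 0.0:
        # Since dE(x, y)/dW = -outer(x, y), this gradient step lowers
        # E(x, y+) and raises E(x, y-).
        W -= lr * (-np.outer(x, y_pos) + np.outer(x, y_neg))

# After training, compatible pairs should receive lower average energy
# than random mismatched pairs.
xs = rng.normal(size=(200, d))
pos = np.mean([energy(W, x, W_true.T @ x) for x in xs])
neg = np.mean([energy(W, x, rng.normal(size=d)) for x in xs])
print(pos < neg)
```

The hinge margin is one of the "contrastive methods" mentioned in the abstract; the alternative family regularizes the volume of low-energy space instead of explicitly sampling negative pairs.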
Other videos from this channel:
- Zhenghan Wang | June 20, 2022 | Spaces of Hamiltonians, symmetries of topological order, & Floquet codes
- Bill Helton | Mar 23, 2021 | Noncommutative real algebraic geometry and quantum games
- Stefaan Vaes | Feb 2, 2021 | W*-rigidity paradigms for embeddings of II1 factors
- Peter Love | March 21, 2023 | Contextual Subspace Variational Quantum Eigensolver
- Xun Gao | May 11, 2021 | Understanding Linear Cross Entropy Benchmark thru Statistical Physics Model
- Yury Polyanskiy | Feb. 22, 2022 | Uniqueness of BP fixed point for Ising models
- Daniel Loss | June 20, 2022 | From Fractional Spin to Topological Magnons
- Avi Wigderson | Sept 15, 2020 | Optimization, Complexity and Math (Prove P!=NP by gradient descent?)
- Eric Carlen | Nov. 1, 2022 | Quantum Entropy Inequalities and Reversible Quantum Markov Semigroups...
- Daniel Spielman | October 31, 2023 | Discrepancy Theory and Randomized Controlled Trials
- Marius Junge | June 23, 2022 | Entropy decay in large open systems
- Matthew Hastings | June 16, 2020 | The power of adiabatic quantum computation with no sign problem
- Xiao-Gang Wen | June 20, 2022 | Phases and Phase transition from categorical symmetry
- Haribabu Arthanari | June 23, 2022 | Computational Drug Discovery
- Zhenghan Wang | Oct 6, 2020 | Reconstructing CFTs from TQFTs
- Jacob Fox | February 21, 2024 | Subset Sums
- Elizabeth Crosson | June 9, 2020 | The sign problem and its relation to the spectral gap
- Vaughan Jones | July 21, 2020 | Applied von Neumann algebra
- Yury Polyanskiy | June 21, 2022 | Information propagation in low dimensions
- Liang Kong | Nov 3, 2020 | On the classification of topological orders with finite internal symmetries
- Roy J. Garcia | Nov. 15, 2022 | Resource theory of quantum scrambling