Peng Chen - Projected Variational Methods for High-dimensional Bayesian Inference
This talk was part of the Workshop on "PDE-constrained Bayesian inverse problems: interplay of spatial statistical models with advanced PDE discretizations" held at the ESI May 16 to 20, 2022.
Bayesian inference provides an optimal framework to learn models from data with quantified uncertainty. The dimension of the model parameters is often very high or infinite in many practical applications with models represented by, e.g., differential equations or deep neural networks. It is a longstanding challenge to accurately and efficiently solve high-dimensional Bayesian inference problems due to the curse of dimensionality—the computational complexity grows rapidly (often exponentially) with respect to the parameter dimension. In this talk, I will present a class of transport-based projected variational methods to tackle the curse of dimensionality. We project the high-dimensional parameters to intrinsically low-dimensional data-informed subspaces and employ transport-based variational methods to push samples drawn from the prior to a projected posterior. I will present error bounds for the projected posterior distribution measured in Kullback–Leibler divergence. Numerical experiments will be presented to demonstrate the properties of our methods, including improved accuracy, fast convergence with complexity independent of the parameter dimension and the number of samples, strong parallel scalability in processor cores, and weak data scalability in data dimension.
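The core idea in the abstract, projecting high-dimensional parameters onto an intrinsically low-dimensional data-informed subspace, can be illustrated on a linear-Gaussian toy problem. The sketch below is a minimal NumPy illustration, not the speaker's exact construction: it builds the subspace basis `Vr` from the dominant eigenvectors of the prior-preconditioned Gauss-Newton Hessian (one common choice of "data-informed" directions) and then solves the posterior exactly in the reduced coordinates.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linear-Gaussian inverse problem: y = J x + noise,
# prior x ~ N(0, I), noise ~ N(0, sigma^2 I).
d, m, r = 200, 20, 5                            # parameter dim, data dim, subspace rank
J = rng.standard_normal((m, d)) / np.sqrt(d)    # linearized forward map (illustrative)
sigma = 0.1
x_true = rng.standard_normal(d)
y = J @ x_true + sigma * rng.standard_normal(m)

# Data-informed directions: dominant eigenvectors of the
# prior-preconditioned Gauss-Newton Hessian H = J^T J / sigma^2
# (prior covariance is the identity here, so no extra preconditioning).
H = J.T @ J / sigma**2
eigvals, eigvecs = np.linalg.eigh(H)
order = np.argsort(eigvals)[::-1]
Vr = eigvecs[:, order[:r]]                      # orthonormal basis of the r-dim subspace

# Projected posterior in reduced coordinates z, where x ≈ Vr z.
# For the linear-Gaussian case this is available in closed form.
Jr = J @ Vr                                     # reduced forward map
Hr = Jr.T @ Jr / sigma**2 + np.eye(r)           # reduced posterior precision
cov_r = np.linalg.inv(Hr)
mean_r = cov_r @ (Jr.T @ y) / sigma**2

# Lift the projected posterior mean back to the full parameter space.
x_map = Vr @ mean_r
print("data misfit at projected MAP :", np.linalg.norm(y - J @ x_map))
print("data misfit at prior mean    :", np.linalg.norm(y))
```

In the nonlinear, non-Gaussian setting the talk addresses, the closed-form reduced posterior above is replaced by a transport-based variational approximation in the subspace, with the complement of `Vr` handled by the prior, which is what yields the KL error bounds mentioned in the abstract.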
Video: Peng Chen - Projected Variational Methods for High-dimensional Bayesian Inference, from the channel of the Erwin Schrödinger International Institute for Mathematics and Physics (ESI).