Mathematical Models of the Genetic Architecture in Complex Human Disorders
Slides: https://bayesgroup.github.io/bmml_sem/2019/Frei_Genetics.pdf
Oleksandr Frei, Researcher at NORMENT (Norwegian Centre for Mental Disorders Research), University of Oslo, Norway
Modern studies of the genetics of complex human disorders collect large samples, often exceeding N=10^6 individuals and M=10^7 genetic variants, posing challenging mathematical problems such as solving a system of linear equations with a huge N×M design matrix. In this presentation we describe the Gaussian mixture model (MiXeR [1], [2]) and three approaches for estimating its probability density function: (1) random sampling, (2) Fourier convolution, and (3) moment-preserving approximations. Further, we discuss our optimization protocol, based on direct maximization of the likelihood function using the differential evolution and Nelder-Mead algorithms. Finally, we derive posterior estimates for some quantities of interest. If time allows, we may also discuss related work [3] based on mixed linear models, REML (restricted maximum likelihood), and variational iteration for Bayesian linear regression with a Gaussian mixture prior.
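The two-stage optimization protocol mentioned above (a global search with differential evolution followed by local refinement with Nelder-Mead) can be sketched on a toy problem. This is not the MiXeR implementation; it is a minimal illustration, assuming a simple two-component "spike and slab" Gaussian mixture with a known narrow null component, fit by direct maximization of the likelihood with `scipy.optimize`:

```python
import numpy as np
from scipy.optimize import differential_evolution, minimize
from scipy.stats import norm

rng = np.random.default_rng(0)

# Toy data: with probability pi an effect is drawn from N(0, sigma^2),
# otherwise from a narrow "null" component N(0, 0.1^2) (kept Gaussian
# so the likelihood stays smooth).
pi_true, sigma_true = 0.3, 2.0
is_causal = rng.random(5000) < pi_true
z = np.where(is_causal,
             rng.normal(0.0, sigma_true, 5000),
             rng.normal(0.0, 0.1, 5000))

def neg_log_lik(params):
    """Negative log-likelihood of the two-component mixture."""
    pi, sigma = params
    pdf = pi * norm.pdf(z, 0.0, sigma) + (1 - pi) * norm.pdf(z, 0.0, 0.1)
    return -np.sum(np.log(pdf + 1e-300))  # guard against log(0)

# Stage 1: global search with differential evolution over box bounds.
bounds = [(0.01, 0.99), (0.5, 5.0)]
de = differential_evolution(neg_log_lik, bounds, seed=0)

# Stage 2: local refinement with Nelder-Mead from the DE solution.
nm = minimize(neg_log_lik, de.x, method="Nelder-Mead")
pi_hat, sigma_hat = nm.x
print(f"pi_hat={pi_hat:.3f}, sigma_hat={sigma_hat:.3f}")
```

The design rationale is the one from the talk: the likelihood surface of a mixture model can be multimodal, so a derivative-free global method locates the basin of the maximum and a simplex method then polishes the estimate.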
Video "Mathematical Models of the Genetic Architecture in Complex Human Disorders" from the BayesGroup.ru channel.
Published: 31 January 2020, 0:07:42 · Duration: 01:28:15