Maximum Likelihood Estimate by Automatic Differentiation | Bernoulli Distribution
Let's use TensorFlow Probability's capability of automatically computing gradients to solve the MLE optimization problem with a gradient-descent optimizer. You can find the notes here: https://raw.githubusercontent.com/Ceyron/machine-learning-and-simulation/main/english/essential_pmf_pdf/bernoulli_maximum_likelihood_estimate_by_automatic_differentiation.pdf
Automatic Differentiation allows for cheap evaluation of derivatives of previously defined algorithmic graphs. We can use this to find the Maximum Likelihood Estimate (MLE) with a gradient-based optimization scheme.
Admittedly, in the case of the Bernoulli, where a simple closed-form solution exists, there is no need for gradient-based optimization. However, Automatic Differentiation takes away the most complicated part of the MLE: taking the derivative and setting it to zero. Hence, thinking in terms of differentiable programming (and therefore Automatic Differentiation) can make hard problems easy.
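For readers who prefer code to prose, here is a minimal sketch of the idea. This is not the exact notebook from the video; the dataset size, true probability, learning rate, and variable names are illustrative assumptions.

import tensorflow as tf
import tensorflow_probability as tfp

tfd = tfp.distributions

# Synthetic coin-flip data from an assumed "true" Bernoulli with p = 0.7
data = tfd.Bernoulli(probs=0.7).sample(1000, seed=42)

# Trainable parameter, initialized at an uninformed guess of 0.5.
# For simplicity we optimize the probability directly; the learning rate
# is chosen small enough that the iterate stays inside (0, 1).
p_hat = tf.Variable(0.5)

optimizer = tf.optimizers.SGD(learning_rate=0.1)

for step in range(200):
    with tf.GradientTape() as tape:
        # Negative mean log-likelihood: minimizing it yields the MLE
        loss = -tf.reduce_mean(tfd.Bernoulli(probs=p_hat).log_prob(data))
    grad = tape.gradient(loss, p_hat)
    optimizer.apply_gradients([(grad, p_hat)])

print("MLE via gradient descent:", p_hat.numpy())
# Closed-form MLE for the Bernoulli is just the sample mean
print("Closed-form MLE:", tf.reduce_mean(tf.cast(data, tf.float32)).numpy())

Under these assumptions, the gradient-descent estimate should land very close to the sample mean, which is exactly the closed-form result discussed in the video.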
-------
📝 : Check out the GitHub Repository of the channel, where I upload all the handwritten notes and source-code files (contributions are very welcome): https://github.com/Ceyron/machine-learning-and-simulation
📢 : Follow me on LinkedIn or Twitter for updates on the channel and other cool Machine Learning & Simulation stuff: https://www.linkedin.com/in/felix-koehler and https://twitter.com/felix_m_koehler
💸 : If you want to support my work on the channel, you can become a Patron here: https://www.patreon.com/MLsim
-------
⚙️ My Gear:
(Below are affiliate links to Amazon. If you decide to purchase the product or something else on Amazon through this link, I earn a small commission.)
- 🎙️ Microphone: Blue Yeti: https://amzn.to/3NU7OAs
- ⌨️ Logitech TKL Mechanical Keyboard: https://amzn.to/3JhEtwp
- 🎨 Gaomon Drawing Tablet (similar to a WACOM Tablet, but cheaper, works flawlessly under Linux): https://amzn.to/37katmf
- 🔌 Laptop Charger: https://amzn.to/3ja0imP
- 💻 My Laptop (generally I like the Dell XPS series): https://amzn.to/38xrABL
- 📱 My Phone: Fairphone 4 (I love the sustainability and repairability aspect of it): https://amzn.to/3Jr4ZmV
If I had to purchase these items again, I would probably change the following:
- 🎙️ Rode NT: https://amzn.to/3NUIGtw
- 💻 Framework Laptop (I do not get a commission here, but I love the vision of Framework. It will definitely be my next Ultrabook): https://frame.work
As an Amazon Associate I earn from qualifying purchases.
-------
Timestamps:
00:00 Opening
00:17 The Bernoulli Model
00:49 Task of parameter inference
01:14 Log-Likelihood and MLE
02:13 Motivation for Automatic Differentiation
03:27 Reformulation as Minimization
04:20 TFP: Setup
04:41 TFP: Creating a dataset
05:37 TFP: Creating a model distribution with variables
07:09 TFP: Defining the loss
08:17 TFP: The optimization
09:29 TFP: Discussing the result