Backpropagation calculus | Chapter 4, Deep learning
Help fund future projects: https://www.patreon.com/3blue1brown
An equally valuable form of support is to simply share some of the videos.
Special thanks to these supporters: http://3b1b.co/nn3-thanks
Written/interactive form of this series: https://www.3blue1brown.com/topics/neural-networks
This one is a bit more symbol-heavy, and that's actually the point. The goal here is to represent in somewhat more formal terms the intuition for how backpropagation works in part 3 of the series, hopefully providing some connection between that video and other texts/code that you come across later.
For more on backpropagation:
http://neuralnetworksanddeeplearning.com/chap2.html
https://github.com/mnielsen/neural-networks-and-deep-learning
http://colah.github.io/posts/2015-08-Backprop/
Music by Vincent Rubinetti:
https://vincerubinetti.bandcamp.com/album/the-music-of-3blue1brown
------------------
Video timeline
0:00 - Introduction
0:38 - The Chain Rule in networks
3:56 - Computing relevant derivatives
4:45 - What do the derivatives mean?
5:39 - Sensitivity to weights/biases
6:42 - Layers with additional neurons
9:13 - Recap
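The chain-rule computation the video walks through (for a network with one neuron per layer) can be sketched in a few lines. This is an illustrative sketch, not code from the video: the names `sigmoid` and `grads`, and the specific cost C = (a − y)², follow the video's single-neuron example, but all identifiers here are made up for demonstration.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def grads(a_prev, w, b, y):
    """Chain rule for one layer with one neuron:
    z = w*a_prev + b,  a = sigmoid(z),  C = (a - y)^2."""
    z = w * a_prev + b
    a = sigmoid(z)
    dC_da = 2 * (a - y)        # dC/da, from the squared-error cost
    da_dz = a * (1 - a)        # sigmoid'(z)
    dC_dz = dC_da * da_dz      # chain the two together
    return {
        "dC_dw": dC_dz * a_prev,  # dz/dw = a_prev
        "dC_db": dC_dz,           # dz/db = 1
        "dC_da_prev": dC_dz * w,  # propagated back to the previous layer
    }
```

The last entry, `dC_da_prev`, is what lets the same step repeat layer by layer backwards through the network, which is the "backwards" in backpropagation.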
------------------
3blue1brown is a channel about animating math, in all senses of the word animate. And you know the drill with YouTube, if you want to stay posted on new videos, subscribe, and click the bell to receive notifications (if you're into that): http://3b1b.co/subscribe
If you are new to this channel and want to see more, a good place to start is this playlist: http://3b1b.co/recommended
Various social media stuffs:
Website: https://www.3blue1brown.com
Twitter: https://twitter.com/3Blue1Brown
Patreon: https://patreon.com/3blue1brown
Facebook: https://www.facebook.com/3blue1brown
Reddit: https://www.reddit.com/r/3Blue1Brown