Back propagation through Cross Entropy and Softmax

#maths #machinelearning #deeplearning #neuralnetworks #derivatives #gradientdescent #backpropagation

In this video, I will surgically dissect back-propagation through the cross-entropy error function and the softmax output unit.
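
For reference, here is a brief sketch of the setup and the result the derivation arrives at; the notation (z for the logits, p for the softmax outputs, y for the one-hot target) is my own and may differ from what is used in the video:

```latex
% Softmax output, cross-entropy error, and the resulting gradient
% with respect to the logits z_k (notation assumed, not taken from the video).
p_k = \frac{e^{z_k}}{\sum_{j=1}^{K} e^{z_j}}, \qquad
E = -\sum_{k=1}^{K} y_k \log p_k, \qquad
\frac{\partial E}{\partial z_k} = p_k - y_k
```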

1:38 Describing a Simple Neural Network
9:20 Numerical Example of Cross-Entropy Error Function
15:05 The Big Picture of What We are Going to Do!
17:50 Actual Back-Propagation Derivation Starts!
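
As a companion to the numerical-example section above, here is a minimal NumPy sketch (my own, not taken from the video) that computes the cross-entropy loss for a softmax output and checks the analytic gradient p − y against finite differences; the logits and target values are purely illustrative:

```python
# Minimal numerical sketch: for a softmax output with cross-entropy loss,
# the gradient of the loss w.r.t. the pre-softmax logits z is p - y.
import numpy as np

def softmax(z):
    # Shift by the max for numerical stability before exponentiating.
    e = np.exp(z - np.max(z))
    return e / e.sum()

def cross_entropy(p, y):
    # y is a one-hot target vector; a small epsilon avoids log(0).
    return -np.sum(y * np.log(p + 1e-12))

z = np.array([2.0, 1.0, 0.1])   # example logits (illustrative values)
y = np.array([1.0, 0.0, 0.0])   # one-hot target
p = softmax(z)

print("loss:", cross_entropy(p, y))
print("analytic gradient (p - y):", p - y)

# Finite-difference check of dE/dz against the analytic result.
eps = 1e-6
num_grad = np.zeros_like(z)
for i in range(len(z)):
    z_plus, z_minus = z.copy(), z.copy()
    z_plus[i] += eps
    z_minus[i] -= eps
    num_grad[i] = (cross_entropy(softmax(z_plus), y) -
                   cross_entropy(softmax(z_minus), y)) / (2 * eps)
print("numerical gradient:       ", num_grad)
```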

The link to the blog post: https://www.mldawn.com/back-propagation-with-cross-entropy-and-softmax/

You can visit our Website: https://www.mldawn.com/

You can follow us on Twitter: https://twitter.com/MLDawn2018

You can join us on Linked In: https://www.linkedin.com/in/mehran-bazargani-14b352176/

You can join us on Facebook: https://www.facebook.com/ml.dawn.3

Keep up the good work and good luck!

Video "Back propagation through Cross Entropy and Softmax" from the MLDawn 2018 channel
Video information
May 26, 2020, 14:31:58
Duration: 00:53:33