
Neural Networks from Scratch - P.7 Calculating Loss with Categorical Cross-Entropy

In order to do backpropagation and optimization, we need to have some measure of how wrong the model is. For this, we use a loss function. In our case, with a softmax classifier, we'll be using categorical cross-entropy.
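The idea can be sketched in a few lines of NumPy: take the softmax probability the model assigned to each sample's correct class, negate its log, and average over the batch. The probabilities and targets below are illustrative placeholders, not values from the video.

```python
import numpy as np

# Softmax outputs (predicted class probabilities) for a batch of 3 samples
softmax_outputs = np.array([[0.7,  0.1, 0.2],
                            [0.1,  0.5, 0.4],
                            [0.02, 0.9, 0.08]])

# Ground-truth class index for each sample
class_targets = np.array([0, 1, 1])

# Probability assigned to the correct class for each sample,
# clipped so log(0) can never occur
correct_confidences = softmax_outputs[np.arange(len(softmax_outputs)), class_targets]
clipped = np.clip(correct_confidences, 1e-7, 1 - 1e-7)

# Per-sample categorical cross-entropy, then the batch mean
sample_losses = -np.log(clipped)
loss = np.mean(sample_losses)
print(loss)  # ≈ 0.3851
```

Note how a confident correct prediction (0.9) contributes a small loss, while a hesitant one (0.5) contributes much more; the clipping guards against infinite loss when a correct class gets probability 0.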

Next video: https://youtu.be/levekYbxauw

Neural Networks from Scratch book: https://nnfs.io

Playlist for this series: https://www.youtube.com/playlist?list=PLQVvvaa0QuDcjD5BAw2DxE6OF2tius3V3

Channel membership: https://www.youtube.com/channel/UCfzlCWGWYyIQ0aLC5w48gBQ/join
Discord: https://discord.gg/sentdex
Support the content: https://pythonprogramming.net/support-donate/
Twitter: https://twitter.com/sentdex
Instagram: https://instagram.com/sentdex
Facebook: https://www.facebook.com/pythonprogramming.net/
Twitch: https://www.twitch.tv/sentdex

#nnfs #python #neuralnetworks

Video "Neural Networks from Scratch - P.7 Calculating Loss with Categorical Cross-Entropy" by the sentdex channel
Video info
Published: January 23, 2021, 20:47:17
Duration: 00:16:19