LSTM Networks - The Math of Intelligence (Week 8)

Recurrent networks can be improved to remember long-range dependencies by using what's called a Long Short-Term Memory (LSTM) cell. Let's build one using just numpy! I'll go over the cell components as well as the forward and backward pass logic.
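As a taste of what the video covers, here is a minimal sketch of a single LSTM cell's forward step in plain numpy. The function and parameter names (`lstm_cell_forward`, a stacked weight matrix `W` holding all four gates) are my own for illustration and may not match the repo's actual code:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_cell_forward(x, h_prev, c_prev, params):
    """One forward step of an LSTM cell.

    x: input vector (D,), h_prev/c_prev: previous hidden and cell state (H,).
    params["W"]: stacked gate weights (4*H, D+H), params["b"]: biases (4*H,).
    """
    W, b = params["W"], params["b"]
    H = h_prev.shape[0]
    z = W @ np.concatenate([x, h_prev]) + b
    f = sigmoid(z[0:H])        # forget gate: what to discard from c_prev
    i = sigmoid(z[H:2*H])      # input gate: how much new info to write
    o = sigmoid(z[2*H:3*H])    # output gate: how much of the cell to expose
    g = np.tanh(z[3*H:4*H])    # candidate cell state
    c = f * c_prev + i * g     # new cell state
    h = o * np.tanh(c)         # new hidden state
    return h, c
```

The key idea is the additive cell-state update `c = f * c_prev + i * g`: gradients can flow through it across many time steps without vanishing the way they do in a plain RNN.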

Code for this video:
https://github.com/llSourcell/LSTM_Networks

Please subscribe! And like. And comment. That's what keeps me going.

More learning resources:
https://www.youtube.com/watch?v=ftMq5ps503w
https://www.youtube.com/watch?v=cdLUzrjnlr4
https://www.youtube.com/watch?v=hWgGJeAvLws
http://www.wildml.com/2015/09/recurrent-neural-networks-tutorial-part-1-introduction-to-rnns/
https://iamtrask.github.io/2015/11/15/anyone-can-code-lstm/

Join us in the Wizards Slack channel:
http://wizards.herokuapp.com/

And please support me on Patreon:
https://www.patreon.com/user?u=3191693
Follow me:
Twitter: https://twitter.com/sirajraval
Facebook: https://www.facebook.com/sirajology
Instagram: https://www.instagram.com/sirajraval/
Signup for my newsletter for exciting updates in the field of AI:
https://goo.gl/FZzJ5w
Hit the Join button above to sign up to become a member of my channel for access to exclusive content!

Video info:
"LSTM Networks - The Math of Intelligence (Week 8)" from the Siraj Raval channel
Published: August 9, 2017, 21:19:20
Duration: 00:45:03