LSTM Networks - EXPLAINED!

Recurrent neural nets are very versatile, but they don't work well on longer sequences. Why is that? You'll understand by the end of this video. We also delve into one of the most common recurrent neural network architectures: the LSTM. Finally, we build a text generator in Keras that generates State of the Union speeches.

Code for this video: https://github.com/ajhalthor/Keras_LSTM_Text_Generator
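
The repository above has the full implementation; below is a minimal character-level sketch of the same idea. The corpus file name (speeches.txt) and the hyperparameters are illustrative assumptions, not the video's exact settings.

# Minimal character-level LSTM text generator in Keras (a sketch, not
# the repo's exact code). Assumes a plain-text corpus in speeches.txt.
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense

text = open("speeches.txt").read().lower()  # assumed corpus file
chars = sorted(set(text))
char_to_idx = {c: i for i, c in enumerate(chars)}

# Cut the corpus into fixed-length windows; the target is the next char.
seq_len, step = 40, 3
inputs, targets = [], []
for i in range(0, len(text) - seq_len, step):
    inputs.append(text[i:i + seq_len])
    targets.append(text[i + seq_len])

# One-hot encode: X is (samples, seq_len, vocab), y is (samples, vocab).
X = np.zeros((len(inputs), seq_len, len(chars)), dtype=np.float32)
y = np.zeros((len(inputs), len(chars)), dtype=np.float32)
for i, seq in enumerate(inputs):
    for t, c in enumerate(seq):
        X[i, t, char_to_idx[c]] = 1.0
    y[i, char_to_idx[targets[i]]] = 1.0

# A single LSTM layer feeding a softmax over the character vocabulary.
model = Sequential([
    LSTM(128, input_shape=(seq_len, len(chars))),
    Dense(len(chars), activation="softmax"),
])
model.compile(loss="categorical_crossentropy", optimizer="adam")
model.fit(X, y, batch_size=128, epochs=20)

# Sample the next character with temperature, then slide the window.
def sample(preds, temperature=0.5):
    preds = np.log(np.asarray(preds, dtype=np.float64) + 1e-8) / temperature
    probs = np.exp(preds) / np.sum(np.exp(preds))
    return np.argmax(np.random.multinomial(1, probs))

seed = text[:seq_len]
generated = seed
for _ in range(400):
    x = np.zeros((1, seq_len, len(chars)), dtype=np.float32)
    for t, c in enumerate(seed):
        x[0, t, char_to_idx[c]] = 1.0
    next_char = chars[sample(model.predict(x, verbose=0)[0])]
    generated += next_char
    seed = seed[1:] + next_char
print(generated)

Lower sampling temperatures give safer, more repetitive text; higher ones give more varied but noisier output.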

REFERENCES
[1] LSTM landmark paper (Hochreiter & Schmidhuber, 1997): https://www.bioinf.jku.at/publications/older/2604.pdf
[2] Slides on RNNs from the Deep Learning book: https://www.deeplearningbook.org/slides/10_rnn.pdf
[3] Andrej Karpathy's blog + code (you can probably understand more from this now!): http://karpathy.github.io/2015/05/21/rnn-effectiveness/
[4] The Deep Learning book chapter on sequence modeling: https://www.deeplearningbook.org/contents/rnn.html
[5] Colah's blog on LSTMs: http://colah.github.io/posts/2015-08-Understanding-LSTMs/
[6] Visualizing and Understanding Recurrent Networks: https://arxiv.org/pdf/1506.02078.pdf

Video "LSTM Networks - EXPLAINED!" from the CodeEmporium channel.
Video info
Published: December 7, 2018
Duration: 00:16:12