Programming LSTM with Keras and TensorFlow (10.2)

Long Short-Term Memory (LSTM) and Gated Recurrent Unit (GRU) are two layer types commonly used to build recurrent neural networks in Keras. This video introduces these two layer types as a foundation for Natural Language Processing (NLP) and time-series prediction.
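
The linked notebook contains the full worked examples. As a rough orientation only, a minimal Keras LSTM model for sequence regression might look like the sketch below; the layer sizes, sequence length, and toy sum-of-sequence task are illustrative assumptions, not code from the video.

# A minimal sketch (not code from the video): a single-LSTM regression model in Keras.
# Layer size, sequence length, and the toy "sum of the sequence" task are illustrative.
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense

SEQ_LEN = 10       # timesteps per sample (illustrative)
N_FEATURES = 1     # features per timestep (illustrative)

model = Sequential([
    LSTM(64, input_shape=(SEQ_LEN, N_FEATURES)),   # GRU(64) is a drop-in swap for a GRU-based network
    Dense(1)                                        # one regression output per sequence
])
model.compile(optimizer="adam", loss="mse")

# Toy data: learn to predict the sum of each random sequence.
x = np.random.rand(256, SEQ_LEN, N_FEATURES).astype("float32")
y = x.sum(axis=1)                                   # shape (256, 1)
model.fit(x, y, epochs=2, verbose=0)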

Code for This Video:
https://github.com/jeffheaton/t81_558_deep_learning/blob/master/t81_558_class_10_2_lstm.ipynb
Course Homepage: https://sites.wustl.edu/jeffheaton/t81-558/

Follow Me/Subscribe:
https://www.youtube.com/user/HeatonResearch
https://github.com/jeffheaton
https://twitter.com/jeffheaton

Support Me on Patreon: https://www.patreon.com/jeffheaton

Video "Programming LSTM with Keras and TensorFlow (10.2)" from the Jeff Heaton channel
Video information
Published: July 23, 2019, 22:00:06
Duration: 00:27:53