
Long Short Term Memory (LSTM) Networks in 20 minutes

This tutorial covers the basics of Long Short Term Memory (LSTM) networks, a type of Recurrent Neural Network (RNN).

A Recurrent Neural Network (RNN) contains a temporal loop in which the hidden layer not only produces an output but also feeds back into itself.

An RNN can recall what happened at previous time steps, so it works well with sequences of text.
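The recurrence described above can be sketched in a few lines. This is a minimal illustration, not the tutorial's own code: scalar weights are used instead of matrices for clarity, and the names (`rnn_step`, `W_xh`, `W_hh`) are illustrative.

```python
import math

def rnn_step(x, h_prev, W_xh, W_hh, b):
    # One recurrent step: the new hidden state depends on the current
    # input AND the previous hidden state (the "temporal loop").
    return math.tanh(W_xh * x + W_hh * h_prev + b)

# Feed the hidden state back into itself across a short sequence.
h = 0.0
for x in [1.0, 0.5, -1.0]:
    h = rnn_step(x, h, W_xh=0.5, W_hh=0.8, b=0.0)
```

After the loop, `h` summarizes the whole sequence seen so far, which is why RNNs suit sequential data.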

Long Short Term Memory (LSTM) networks are a type of RNN that performs better than a vanilla RNN because they overcome the vanishing gradient problem.

In practice, vanilla RNNs fail to establish long-term dependencies, so LSTM networks are a type of RNN designed to remember long-term dependencies by default.

An LSTM contains gates that can allow or block information from passing through. Each gate consists of a sigmoid neural net layer followed by a pointwise multiplication operation.

Sigmoid output ranges from 0 to 1:

0 = Don’t allow any data to flow
1 = Allow everything to flow!
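The gating idea above can be demonstrated directly. This is a minimal sketch, not code from the video; the function names are made up for illustration.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def gate(value, z):
    # Pointwise-multiply the value by the sigmoid activation:
    # z very negative -> sigmoid ~ 0, the gate blocks the data;
    # z very positive -> sigmoid ~ 1, the gate lets everything through.
    return sigmoid(z) * value

blocked = gate(5.0, -10.0)  # gate output near 0
passed = gate(5.0, 10.0)    # gate output near the original 5.0
```

Because the sigmoid is smooth rather than a hard 0/1 switch, the gate can also pass a fraction of the signal, and the network learns how much to let through.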

I hope you enjoy this tutorial!

If you like it, please give it a like and subscribe to the channel for more videos.

Thanks and Happy Learning

Video "Long Short Term Memory (LSTM) Networks in 20 minutes" from the channel Professor Ryan
Video info: September 23, 2019, 1:02:02 · 00:18:35