
Practical 4.1 – RNN forward and backward

Recurrent Neural Networks – Forward and backward
Full project: https://github.com/Atcold/torch-Video-Tutorials

Notes:
13:22 – x[t] is concatenated with h[t−1]; at least, that is what is written in green on the board...
21:28 – Not quite. The unrolling length T sets how many steps you back-propagate through when processing your input; it does not necessarily need to be equal to or greater than the maximum length of the dependencies you want to capture, provided the state is preserved across sequence chunks (h[3].new_sequence = h[3].previous_sequence) and not zeroed.
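Both notes can be sketched in a few lines of NumPy (the shapes, names, and seed are illustrative assumptions, not taken from the project, which uses Torch): the per-step concatenation of x[t] with h[t−1], and the fact that carrying the hidden state across chunk boundaries makes chunked processing match a single pass over the full sequence.

```python
import numpy as np

rng = np.random.default_rng(1)
input_size, hidden_size, T = 2, 3, 4           # T: unrolling length per chunk
W = rng.standard_normal((hidden_size, input_size + hidden_size))
b = np.zeros(hidden_size)
xs = rng.standard_normal((8, input_size))      # a sequence longer than T

def step(x_t, h):
    # Concatenate x[t] with h[t-1], then apply one affine map + tanh.
    return np.tanh(W @ np.concatenate([x_t, h]) + b)

# Forward pass over the whole sequence at once.
h_full = np.zeros(hidden_size)
for x_t in xs:
    h_full = step(x_t, h_full)

# Forward pass in chunks of length T, carrying h across chunk
# boundaries instead of zeroing it.
h_chunk = np.zeros(hidden_size)
for start in range(0, len(xs), T):
    for x_t in xs[start:start + T]:
        h_chunk = step(x_t, h_chunk)

print(np.allclose(h_full, h_chunk))  # True
```

The final states agree because the forward recurrence is identical either way; what chunking changes is only how far gradients flow backward (truncated BPTT), not the hidden state itself.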

Video "Practical 4.1 – RNN forward and backward" from Alfredo Canziani's channel
Video information
Published: October 17, 2016, 20:28:42
Duration: 00:24:16