
torch.nn.Embedding explained (+ Character-level language model)

In this video, I will talk about the Embedding module of PyTorch. It has many applications in natural language processing and also when working with categorical variables. I will explain some of its features, such as the padding index and the maximum norm. In the second part of the video, I will use the Embedding module to represent the characters of the English alphabet and build a text-generating model. Once we train the model, we will look into how the character embeddings evolved over the epochs.
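The two features mentioned above can be sketched in a few lines. This is a minimal illustration of the documented `nn.Embedding` behavior, not code from the video:

```python
import torch
import torch.nn as nn

# A vocabulary of 10 tokens, each mapped to a 4-dimensional vector.
# padding_idx=0: the row for index 0 is zero-initialized and gets no gradient.
emb = nn.Embedding(num_embeddings=10, embedding_dim=4, padding_idx=0)
assert emb.weight[0].abs().sum().item() == 0

ids = torch.tensor([[1, 2, 0, 0]])  # a padded sequence of token ids
out = emb(ids)
assert out.shape == (1, 4, 4)       # (batch, seq_len, embedding_dim)

# max_norm: embedding vectors are renormalized during lookup so that
# their norm never exceeds the given value.
emb_norm = nn.Embedding(10, 4, max_norm=1.0)
vecs = emb_norm(torch.arange(10))
assert (vecs.norm(dim=1) <= 1.0 + 1e-6).all()
```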

Code: https://github.com/jankrepl/mildlyoverfitted/tree/master/mini_tutorials/embedding

00:00 Intro
01:23 BERT example
01:56 Behavior explained (IPython)
04:25 Intro character-level model
05:29 Dataset implementation
08:53 Network implementation
12:12 Text generating function
14:00 Training script implementation
17:55 Launching and analyzing results
18:31 Visualization of results
20:31 Outro
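The pipeline in the chapters above (embedding layer, network, next-character prediction) can be sketched roughly as follows. This is a hypothetical outline with illustrative names, not the code from the linked repository:

```python
import torch
import torch.nn as nn

class CharModel(nn.Module):
    """Embed character ids, run them through an LSTM, predict the next character."""

    def __init__(self, vocab_size=27, emb_dim=8, hidden_dim=32):
        super().__init__()
        self.emb = nn.Embedding(vocab_size, emb_dim)
        self.lstm = nn.LSTM(emb_dim, hidden_dim, batch_first=True)
        self.fc = nn.Linear(hidden_dim, vocab_size)

    def forward(self, x):              # x: (batch, seq_len) of character ids
        h, _ = self.lstm(self.emb(x))  # (batch, seq_len, hidden_dim)
        return self.fc(h)              # logits over the next character

model = CharModel()
logits = model(torch.randint(0, 27, (2, 10)))
assert logits.shape == (2, 10, 27)
```

Training would minimize cross-entropy between these logits and the shifted input sequence; sampling from the softmaxed logits generates text one character at a time.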

If you have any video suggestions or you just wanna chat feel free to join the discord server: https://discord.gg/a8Va9tZsG5

Twitter: https://twitter.com/moverfitted

Credits logo animation
Title: Conjungation · Author: Uncle Milk · Source: https://soundcloud.com/unclemilk · License: https://creativecommons.org/licenses/... · Download (9MB): https://auboutdufil.com/?id=600

Video "torch.nn.Embedding explained (+ Character-level language model)" from the channel mildlyoverfitted
Video information
February 22, 2021, 1:02:04
Duration: 00:20:47