
Rasa Algorithm Whiteboard - Attention 3: Multi Head Attention

This is the third video on attention mechanisms. In the previous video we introduced keys, queries and values, and in this video we introduce the concept of multiple heads.
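
To make the idea concrete before watching: in multi-head attention the same keys/queries/values mechanism from the previous video is run several times in parallel, each "head" with its own learned projections, and the head outputs are concatenated and projected back down. Below is a minimal NumPy sketch of this; variable names, shapes, and the toy usage are illustrative assumptions, not the notebook's actual code.

```python
# Minimal sketch of multi-head self-attention (illustrative only).
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention(Q, K, V):
    # Scaled dot-product attention: softmax(Q K^T / sqrt(d_k)) V
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)
    return softmax(scores) @ V

def multi_head_attention(X, W_q, W_k, W_v, W_o):
    # X: (seq_len, d_model). Each head has its own projection matrices,
    # so each head can attend to different aspects of the sequence.
    heads = []
    for Wq, Wk, Wv in zip(W_q, W_k, W_v):
        heads.append(attention(X @ Wq, X @ Wk, X @ Wv))
    # Concatenate per-head outputs and mix them with one final projection.
    return np.concatenate(heads, axis=-1) @ W_o

# Toy usage: 4 tokens, model dim 8, 2 heads of dim 4 each.
seq_len, d_model, num_heads = 4, 8, 2
d_head = d_model // num_heads
rng = np.random.default_rng(0)
X   = rng.normal(size=(seq_len, d_model))
W_q = rng.normal(size=(num_heads, d_model, d_head))
W_k = rng.normal(size=(num_heads, d_model, d_head))
W_v = rng.normal(size=(num_heads, d_model, d_head))
W_o = rng.normal(size=(num_heads * d_head, d_model))
print(multi_head_attention(X, W_q, W_k, W_v, W_o).shape)  # (4, 8)
```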

The Colab notebook with the interactive visualisation can be found here:

https://colab.research.google.com/github/tensorflow/tensor2tensor/blob/master/tensor2tensor/notebooks/hello_t2t.ipynb#scrollTo=T7UJzFf6fmhp

We're going through it step by step, but if you'd rather read all about it in full detail right away, we recommend these online resources:

- http://www.peterbloem.nl/blog/transformers
- http://jalammar.github.io/illustrated-transformer/
- http://d2l.ai/chapter_attention-mechanisms/attention.html

The general GitHub repo for this playlist can be found here: https://github.com/RasaHQ/algorithm-whiteboard-resources.

Published: May 4, 2020 · Duration: 00:10:56