Positional embeddings in transformers EXPLAINED | Demystifying positional encodings.
What are positional embeddings / encodings?
📺 Follow-up video: Concatenate or add positional encodings? Learned positional embeddings. https://youtu.be/M2ToEXF6Olw
► Outline:
00:00 What are positional embeddings?
03:39 Requirements for positional embeddings
04:23 Sines, cosines explained: The original solution from the “Attention is all you need” paper
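For reference, the sinusoidal encoding covered at 04:23 follows the formula from the "Attention is all you need" paper: PE(pos, 2i) = sin(pos / 10000^(2i/d_model)) and PE(pos, 2i+1) = cos(pos / 10000^(2i/d_model)). A minimal NumPy sketch (function and variable names are my own, assuming an even d_model):

```python
import numpy as np

def sinusoidal_positional_encoding(seq_len, d_model):
    """Sinusoidal positional encodings from "Attention is all you need".

    PE[pos, 2i]   = sin(pos / 10000**(2i / d_model))
    PE[pos, 2i+1] = cos(pos / 10000**(2i / d_model))
    """
    positions = np.arange(seq_len)[:, np.newaxis]              # shape (seq_len, 1)
    div_terms = 10000 ** (np.arange(0, d_model, 2) / d_model)  # shape (d_model/2,)
    pe = np.zeros((seq_len, d_model))
    pe[:, 0::2] = np.sin(positions / div_terms)  # even dimensions
    pe[:, 1::2] = np.cos(positions / div_terms)  # odd dimensions
    return pe

# Each row is the encoding for one position; values stay in [-1, 1],
# so they can be added to token embeddings without rescaling.
pe = sinusoidal_positional_encoding(seq_len=50, d_model=128)
```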
📺 Transformer explained: https://youtu.be/FWFA4DGuzSc
▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
NEW (channel update):
🔥 Optionally, buy us a coffee to boost our Coffee Bean production! ☕
Patreon: https://www.patreon.com/AICoffeeBreak
Ko-fi: https://ko-fi.com/aicoffeebreak
▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
Paper 📄
Vaswani, Ashish, Noam Shazeer, Niki Parmar, Jakob Uszkoreit, Llion Jones, Aidan N. Gomez, Łukasz Kaiser, and Illia Polosukhin. "Attention is all you need." In Advances in neural information processing systems, pp. 5998-6008. 2017. https://proceedings.neurips.cc/paper/2017/file/3f5ee243547dee91fbd053c1c4a845aa-Paper.pdf
Music 🎵 :
Discovery Hit by Kevin MacLeod is licensed under a Creative Commons Attribution 4.0 licence. https://creativecommons.org/licenses/by/4.0/
Source: http://incompetech.com/music/royalty-free/index.html?isrc=USUAN1300023
Artist: http://incompetech.com/
---------------------------
🔗 Links:
AICoffeeBreakQuiz: https://www.youtube.com/c/AICoffeeBreak/community
Twitter: https://twitter.com/AICoffeeBreak
Reddit: https://www.reddit.com/r/AICoffeeBreak/
YouTube: https://www.youtube.com/AICoffeeBreak
#AICoffeeBreak #MsCoffeeBean #MachineLearning #AI #research
Video "Positional embeddings in transformers EXPLAINED | Demystifying positional encodings." from the channel AI Coffee Break with Letitia
Published: 12 July 2021, 15:00:00
Duration: 00:09:40