Fastformer: Additive Attention Can Be All You Need | Paper Explained
👨👩👧👦 JOIN OUR DISCORD COMMUNITY:
Discord ► https://discord.gg/peBrCpheKE
📢 SUBSCRIBE TO MY MONTHLY AI NEWSLETTER:
Substack ► https://aiepiphany.substack.com/
❤️ Become The AI Epiphany Patreon ❤️ ► https://www.patreon.com/theaiepiphany
In this video I cover the "Fastformer: Additive Attention Can Be All You Need" paper, which introduces a novel, linear-complexity transformer model!
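To give a feel for the idea covered in the video: instead of computing all-pairs dot products between tokens (quadratic in sequence length), additive attention scores each token with a single scalar and softmax-pools the sequence into one global vector, which is linear in sequence length. Below is a minimal NumPy sketch of that pooling step, not the authors' code; the function and variable names are illustrative.

```python
# Minimal sketch of additive-attention pooling (the core trick behind
# Fastformer's linear complexity). Each token gets ONE scalar score,
# so the whole pooling is O(N*d) rather than the O(N^2*d) of full
# self-attention. Names here are illustrative, not from the paper's code.
import numpy as np

def additive_attention_pool(x, w):
    """Pool a sequence x of shape (N, d) into a single (d,) global vector.

    w is a learned (d,) scoring vector (here just random for the demo).
    """
    d = x.shape[1]
    scores = x @ w / np.sqrt(d)      # (N,) -- one scalar score per token
    scores -= scores.max()           # subtract max for numerical stability
    alpha = np.exp(scores)
    alpha /= alpha.sum()             # softmax over the sequence dimension
    return alpha @ x                 # attention-weighted sum of tokens

rng = np.random.default_rng(0)
x = rng.standard_normal((16, 8))     # toy sequence: 16 tokens, dim 8
w = rng.standard_normal(8)
g = additive_attention_pool(x, w)
print(g.shape)                       # (8,) -- one global vector
```

In the full Fastformer, a global query vector pooled this way interacts element-wise with the keys, and an analogous pooling produces a global key that interacts with the values, keeping every step linear in sequence length.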
▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬
✅ Paper: https://arxiv.org/abs/2108.09084
▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬
⌚️ Timetable:
00:00 Intro
01:00 Previous work and problems
03:10 Fastformer method explained
07:10 Param sharing and complexity
09:10 Results: Fastformer is effective
11:20 Results: Fastformer is efficient
12:45 Ablations
14:10 Outro
▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬
💰 BECOME A PATREON OF THE AI EPIPHANY ❤️
If these videos, GitHub projects, and blogs help you,
consider helping me out by supporting me on Patreon!
The AI Epiphany ► https://www.patreon.com/theaiepiphany
One-time donation:
https://www.paypal.com/paypalme/theaiepiphany
Much love! ❤️
Huge thank you to these AI Epiphany patrons:
Eli Mahler
Petar Veličković
Zvonimir Sabljic
▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬
💡 The AI Epiphany is a channel dedicated to simplifying the field of AI through creative visualizations and, in general, a stronger focus on geometric and visual intuition rather than algebraic and numerical "intuition".
▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬
👋 CONNECT WITH ME ON SOCIAL
LinkedIn ► https://www.linkedin.com/in/aleksagordic/
Twitter ► https://twitter.com/gordic_aleksa
Instagram ► https://www.instagram.com/aiepiphany/
Facebook ► https://www.facebook.com/aiepiphany/
👨👩👧👦 JOIN OUR DISCORD COMMUNITY:
Discord ► https://discord.gg/peBrCpheKE
📢 SUBSCRIBE TO MY MONTHLY AI NEWSLETTER:
Substack ► https://aiepiphany.substack.com/
💻 FOLLOW ME ON GITHUB FOR ML PROJECTS:
GitHub ► https://github.com/gordicaleksa
📚 FOLLOW ME ON MEDIUM:
Medium ► https://gordicaleksa.medium.com/
▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬
#fastformer #transformers #linearcomplexity
Video "Fastformer: Additive Attention Can Be All You Need | Paper Explained" from the channel Aleksa Gordić - The AI Epiphany
Published: August 24, 2021, 18:48:54 · Duration: 00:15:22