GPT-3 - Language Models are Few-Shot Learners | Paper Explained
❤️ Become The AI Epiphany Patreon ❤️ ► https://www.patreon.com/theaiepiphany
▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬
In this video, I cover the famous GPT-3 model.
I first give some context on what has happened since the paper was published in May 2020 (hype, anti-hype, limitations, and cool apps), and then dive deep into the paper itself.
You'll learn about:
✔️ Useful resources on GPT-3
✔️ Main takeaways from the paper
▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬
✅ "anti-hype" blog: https://lacker.io/ai/2020/07/06/giving-gpt-3-a-turing-test.html
✅ Gwern's blog: https://www.gwern.net/GPT-3
✅ My transformer implementation: https://github.com/gordicaleksa/pytorch-original-transformer
✅ Cool "GPT game": https://play.aidungeon.io/
▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬
⌚️ Timetable:
00:00 GPT (anti)hype, Gwern, prompt programming
04:30 Abstract of the paper
06:50 Architecture, data, compute
12:15 Zero-shot, one-shot, and few-shot learning
18:45 Power-law chart (more compute please)
20:35 Results (machine translation)
23:05 NLI (reasoning is hard)
24:40 Arithmetic
26:25 Word unscrambling
28:40 SAT analogies (how smart are humans?)
30:45 Fake news generation
32:05 Data contamination
35:05 Limitations of the model
37:35 Bias, fairness (broader impact)
44:30 Final thoughts, are we heading towards AGI?
▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬
💰 BECOME A PATRON OF THE AI EPIPHANY ❤️
If these videos, GitHub projects, and blogs help you,
consider helping me out by supporting me on Patreon!
The AI Epiphany ► https://www.patreon.com/theaiepiphany
One-time donation:
https://www.paypal.com/paypalme/theaiepiphany
Much love! ❤️
▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬
💡 The AI Epiphany is a channel dedicated to simplifying the field of AI through creative visualizations, with a stronger focus on geometric and visual intuition than on algebraic and numerical "intuition".
▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬
👋 CONNECT WITH ME ON SOCIAL
LinkedIn ► https://www.linkedin.com/in/aleksagordic/
Twitter ► https://twitter.com/gordic_aleksa
Instagram ► https://www.instagram.com/aiepiphany/
Facebook ► https://www.facebook.com/aiepiphany/
👨👩👧👦 JOIN OUR DISCORD COMMUNITY:
Discord ► https://discord.gg/peBrCpheKE
📢 SUBSCRIBE TO MY MONTHLY AI NEWSLETTER:
Substack ► https://aiepiphany.substack.com/
💻 FOLLOW ME ON GITHUB FOR COOL PROJECTS:
GitHub ► https://github.com/gordicaleksa
📚 FOLLOW ME ON MEDIUM:
Medium ► https://gordicaleksa.medium.com/
▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬
#gpt3 #fewshotlearning #deeplearning
Video: GPT-3 - Language Models are Few-Shot Learners | Paper Explained, from the channel The AI Epiphany