High-Level Tutorial of OpenAI's GPT-3 | GPT-3 VS BERT Family of NLP Models
GPT-3 is the biggest language model ever built, and it has been attracting a lot of attention. Rather than argue about whether GPT-3 is overhyped, we wanted to dig into the literature and understand what GPT-3 is (and is not) in light of its predecessors and alternative transformer models. In this video we share some of what we've learned. What is GPT-3 really good at? What are its constraints? How useful is it for business? Enjoy!
⏰ Time Stamps ⏰
00:40 - Comparison of latest Natural Language Processing Models
01:09 - What is a Transformer Model
01:50 - The Two Types of Transformer Models
02:15 - Difference between bi-directional encoders (BERT) and autoregressive decoders (GPT)
04:40 - GPT-3 is HUGE, does size matter?
05:24 - Presentation of size differences between GPT-3 relative to BERT, RoBERTa, GPT-2, and T5
07:40 - What does GPT do and how is it different than the BERT family?
18:05 - Is GPT-3 a Child Prodigy or a Parlor Trick?
18:44 - Back to the Issue of GPT-3's size
19:30 - Final thoughts on GPT-3 vs BERT
Video: High-Level Tutorial of OpenAI's GPT-3 | GPT-3 VS BERT Family of NLP Models, from the Prolego channel