Should you switch from BERT to ALBERT?
From the summaries you’ll find online, it sounds like ALBERT is both faster and more accurate than BERT, so we should probably switch over to ALBERT as our new go-to model for fine-tuning, right? In reality, if your plan is to apply a pre-trained BERT model to your own NLP application, you probably won’t find ALBERT to be any faster or more accurate...
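To see why ALBERT’s smaller size doesn’t translate into faster fine-tuning, note that ALBERT shares one set of weights across all of its transformer layers: the checkpoint shrinks, but the forward pass still runs every layer. Here’s a rough back-of-the-envelope sketch (BERT-base-like dimensions; embeddings, pooler, and task head ignored) of how cross-layer sharing cuts parameter count without cutting compute:

```python
def layer_params(h=768, ffn=3072):
    """Approximate parameter count of one transformer encoder layer."""
    attn = 4 * (h * h + h)            # Q, K, V, O projections + biases
    ff = (h * ffn + ffn) + (ffn * h + h)  # two feed-forward projections + biases
    ln = 2 * (2 * h)                  # two LayerNorms (gain + bias each)
    return attn + ff + ln

def encoder_params(n_layers=12, shared=False):
    """Total encoder parameters, with or without cross-layer sharing."""
    per_layer = layer_params()
    return per_layer if shared else n_layers * per_layer

bert_like = encoder_params(shared=False)    # 12 distinct layers, BERT-style
albert_like = encoder_params(shared=True)   # one layer reused 12x, ALBERT-style

print(f"BERT-like encoder:   {bert_like:,} params")
print(f"ALBERT-like encoder: {albert_like:,} params")
# The parameter count drops ~12x, but both models still execute
# 12 layers of attention + feed-forward at fine-tuning and inference time.
```

This is only a sketch of the sharing idea, not ALBERT’s full recipe (it also factorizes the embedding matrix), but it captures why memory savings don’t speed up your fine-tuning runs.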
In this video, we’ll make sense of this apparent contradiction, and I’ll cover the key take-aways from ALBERT, which is still a very important and fascinating model!
For a complete tutorial on ALBERT, I’ve published a ~25-page eBook here:
https://www.chrismccormick.ai/offers/HaABTJQH
Video “Should you switch from BERT to ALBERT?” from the ChrisMcCormickAI channel