Hugging Face Transformers: the basics. Practical coding guides SE1E1. NLP Models (BERT/RoBERTa)
### Practical Python Coding Guide - BERT in PyTorch
In this first episode of the practical coding guide series, I discuss the basics of the Hugging Face Transformers library: what it is, how it works, and what you can do with it. This episode focuses on high-level concepts, navigating the Hugging Face website, and implementing some out-of-the-box functionality.
Intro: 00:00
What is Hugging Face's Transformer Library: 1:12
Hugging Face models: 2:00
Navigating the Transformers documentation: 8:56
Coding with Transformers - installation: 11:55
Using pre-defined pipelines: 12:45
Implementing a model through PyTorch: 14:08
Tokenisers, Token IDs and Attention Masks: 16:28
Output from the model: 25:26
Outro: 27:26
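As a taste of the pipeline, tokenizer, and model-output sections above, here is a minimal sketch of the out-of-the-box workflow. It assumes `transformers` and `torch` are installed and uses the standard public `bert-base-uncased` checkpoint; the exact pipeline scores will vary:

```python
import torch
from transformers import pipeline, AutoTokenizer, AutoModel

# A pre-defined pipeline: sentiment analysis with the default checkpoint
classifier = pipeline("sentiment-analysis")
result = classifier("Hugging Face makes NLP easy!")
print(result)  # e.g. [{'label': 'POSITIVE', 'score': ...}]

# The same building blocks used manually: tokenizer -> token IDs + attention mask
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
encoded = tokenizer("Hello, Transformers!", return_tensors="pt")
print(encoded["input_ids"])       # integer token IDs, incl. [CLS]/[SEP]
print(encoded["attention_mask"])  # 1 for real tokens, 0 for padding

# Feed the encoding to a bare BERT model and inspect the output
model = AutoModel.from_pretrained("bert-base-uncased")
with torch.no_grad():
    output = model(**encoded)
print(output.last_hidden_state.shape)  # (batch, seq_len, 768 for BERT-base)
```

The pipeline hides the tokenizer/model pair behind one call; the manual version exposes the token IDs and attention mask discussed at 16:28 and the hidden-state output discussed at 25:26.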
This series attempts to offer a casual guide to Hugging Face and Transformer models, focused on implementation rather than theory. Let me know if you enjoy it!
In future episodes, I will be retraining a model from the Transformers library (RoBERTa) on a downstream task: a multi-label classification problem, in an attempt to spot subtle sentiment attributes in online comments. Make sure to subscribe if you are interested.
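For a preview of what that multi-label setup looks like, here is a hedged sketch using `roberta-base`. The label count (6) and the single example comment are placeholders, not the dataset from the series; `problem_type="multi_label_classification"` tells Transformers to use a sigmoid-per-label BCE loss instead of softmax cross-entropy:

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

NUM_LABELS = 6  # hypothetical number of sentiment attributes per comment

tokenizer = AutoTokenizer.from_pretrained("roberta-base")
model = AutoModelForSequenceClassification.from_pretrained(
    "roberta-base",
    num_labels=NUM_LABELS,
    problem_type="multi_label_classification",  # BCEWithLogitsLoss per label
)

# One placeholder comment with a multi-hot target vector (all zeros here)
batch = tokenizer(["an example online comment"], return_tensors="pt", padding=True)
labels = torch.zeros((1, NUM_LABELS))
out = model(**batch, labels=labels)

print(out.loss)          # scalar BCE loss over the 6 labels
print(out.logits.shape)  # (1, NUM_LABELS) - one logit per label, not a softmax
```

Because each label gets its own sigmoid, a comment can score high on several attributes at once, which is exactly what a multi-label problem requires.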
Check out my website: https://www.rupert.digital
----- Good learning material for theory (Transformers / BERT)
Attention Is All You Need paper: https://arxiv.org/abs/1706.03762
BERT paper: https://arxiv.org/abs/1810.04805
RoBERTa paper: https://arxiv.org/abs/1907.11692
Jay Alammar illustrated articles: https://jalammar.github.io/illustrated-transformer/ (check out his BERT one too)
Chris McCormick: https://mccormickml.com/ (check out his youtube series on BERT / Transformers)
Video "Hugging Face Transformers: the basics. Practical coding guides SE1E1. NLP Models (BERT/RoBERTa)" from the rupert ai channel