
Hugging Face Transformers: the basics. Practical coding guides SE1E1. NLP Models (BERT/RoBERTa)

### Practical Python Coding Guide - BERT in PyTorch

In this first episode of the practical coding guide series, I discuss the basics of the Hugging Face Transformers Library. What is it? How does it work? What can you do with it? This episode focuses on high-level concepts, navigating their website and implementing some out-of-the-box functionality (a couple of minimal code sketches follow the chapter list below).

Intro: 00:00
What is Hugging Face's Transformers Library: 1:12
Hugging Face models: 2:00
Navigating the Transformers documentation: 8:56
Coding with Transformers - installation: 11:55
Using pre-defined pipelines: 12:45
Implementing a model through PyTorch: 14:08
Tokenisers, Token IDs and Attention Masks: 16:28
Output from the model: 25:26
Outro: 27:26
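
For reference, here is a minimal sketch of the installation and pre-defined pipeline steps covered in the chapters above. The example sentence is just a placeholder, and the exact checkpoint a default pipeline downloads may differ between library versions:

```python
# Install the library first (terminal or notebook cell):
#   pip install transformers torch

from transformers import pipeline

# A pre-defined pipeline bundles a tokenizer, a model and post-processing.
# With no model name given, it downloads a default sentiment-analysis checkpoint.
classifier = pipeline("sentiment-analysis")

print(classifier("I really enjoyed this episode!"))
# e.g. [{'label': 'POSITIVE', 'score': 0.99...}]
```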
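And a sketch of loading a model and tokenizer directly in PyTorch, along the lines of the tokeniser / token ID / attention mask and model output chapters. The checkpoint name and input sentences here are assumptions for illustration only:

```python
import torch
from transformers import AutoTokenizer, AutoModel

# Load a pre-trained tokenizer and model checkpoint from the Hugging Face Hub.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

# The tokenizer turns raw text into token IDs plus an attention mask.
batch = tokenizer(
    ["Hugging Face makes Transformers easy.", "Short sentence."],
    padding=True,           # pad to the longest sequence in the batch
    truncation=True,
    return_tensors="pt",    # return PyTorch tensors
)
print(batch["input_ids"])       # token IDs
print(batch["attention_mask"])  # 1 = real token, 0 = padding

# Forward pass; the base model returns hidden states, not task predictions.
with torch.no_grad():
    outputs = model(**batch)

print(outputs.last_hidden_state.shape)  # (batch_size, seq_len, hidden_size)
```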

This series attempts to offer a casual guide to Hugging Face and Transformer models, focused on implementation rather than theory. Let me know if you enjoy these episodes!

In future episodes, I will be retraining a model from the Transformers Library (RoBERTa) on a downstream task: a multi-label classification problem, in an attempt to spot subtle sentiment attributes in online comments. Make sure to subscribe if you are interested.
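
As a rough preview of that setup, here is a minimal sketch of attaching a multi-label classification head to RoBERTa. The checkpoint name, the label count and the example comment are placeholders, not the exact configuration used in the series:

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

NUM_LABELS = 6  # hypothetical number of sentiment attributes

tokenizer = AutoTokenizer.from_pretrained("roberta-base")
model = AutoModelForSequenceClassification.from_pretrained(
    "roberta-base",
    num_labels=NUM_LABELS,
    # multi-label setup: BCE-with-logits loss is used when labels are provided
    problem_type="multi_label_classification",
)

batch = tokenizer(["An example online comment."], return_tensors="pt")
with torch.no_grad():
    logits = model(**batch).logits

# Independent per-label probabilities via a sigmoid (multi-label, not softmax).
probs = torch.sigmoid(logits)
print(probs)
```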

Check out my website: https://www.rupert.digital

----- Good learning material for theory (Transformers / BERT)
Attention is all you need paper: https://arxiv.org/abs/1706.03762
BERT paper: https://arxiv.org/abs/1810.04805
RoBERTa paper: https://arxiv.org/abs/1907.11692
Jay Alammar's illustrated articles: https://jalammar.github.io/illustrated-transformer/ (check out his BERT one too)
Chris McCormick: https://mccormickml.com/ (check out his YouTube series on BERT / Transformers)

Video information
Channel: rupert ai
Uploaded: 15 September 2021, 21:30:02
Duration: 00:29:53