Solving NLP Problems with BERT, N-gram, Embedding, LSTM, GRU, Self-Attention, Transformer 🔥🔥🔥🔥

Join the channel membership:
https://www.youtube.com/c/AIPursuit/join

Subscribe to the channel:
https://www.youtube.com/c/AIPursuit?sub_confirmation=1

Support and Donation:
Paypal ⇢ https://paypal.me/tayhengee
Patreon ⇢ https://www.patreon.com/hengee
BTC ⇢ bc1q2r7eymlf20576alvcmryn28tgrvxqw5r30cmpu
ETH ⇢ 0x58c4bD4244686F3b4e636EfeBD159258A5513744
Doge ⇢ DSGNbzuS1s6x81ZSbSHHV5uGDxJXePeyKy

Earn up to a $170 welcome bonus on Huobi with my crypto affiliate links:
Binance ⇢ https://accounts.binance.com/en/register?ref=27700065
Huobi ⇢ https://www.huobi.com/en-us/topic/welcome-bonus/?invite_code=xj9pc

The video was published under the Creative Commons Attribution license (reuse allowed) and is reposted for educational purposes.
Source: https://youtu.be/MtP-UAyVuZY

In this talk, Álvaro introduces the concept of language models and reviews some state-of-the-art approaches to building them (BERT, GPT-2, and XLNet), delving into their network architectures and training strategies. He then shows how these pre-trained language models can be fine-tuned on small datasets to produce high-quality results in downstream NLP tasks, using the open-source PyTorch-Transformers library (https://github.com/huggingface/transformers). The library is built on top of the PyTorch deep learning framework and makes it easy to load pre-trained language models and fine-tune them. The talk covers both the theoretical grounds of these methods and Álvaro's practical experience in applying them.
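
The fine-tuning workflow described above can be sketched in a few lines of Python. The snippet below is an illustrative assumption, not code from the talk: it uses a recent version of the Hugging Face transformers library (the successor of PyTorch-Transformers) to load a pre-trained BERT with a freshly initialized classification head and fine-tune it on a tiny made-up dataset. The example texts, labels, and hyperparameters are placeholders.

import torch
from torch.optim import AdamW
from transformers import BertTokenizer, BertForSequenceClassification

# Load a pre-trained language model and its tokenizer; the classification
# head on top is randomly initialized and learned during fine-tuning.
tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2
)

# Toy labeled dataset standing in for a small downstream task (assumed).
texts = ["a great talk on language models", "the audio quality was poor"]
labels = torch.tensor([1, 0])

# Tokenize into input IDs and attention masks, padded to equal length.
batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")

optimizer = AdamW(model.parameters(), lr=2e-5)
model.train()
for epoch in range(3):
    optimizer.zero_grad()
    # Passing labels makes the model return the cross-entropy loss directly.
    outputs = model(**batch, labels=labels)
    outputs.loss.backward()
    optimizer.step()
    print(f"epoch {epoch}: loss = {outputs.loss.item():.4f}")

Passing labels to the forward call lets the model compute the loss itself, which keeps the training loop short; for a real dataset one would batch the inputs with a DataLoader and evaluate on held-out data.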

Channel: AIP - Pursuing SoTA AI for everyone
Video information
Published: July 17, 2020, 21:00:23
Duration: 00:40:50