Solving NLP Problems with BERT, N-gram, Embedding, LSTM, GRU, Self-Attention, Transformer 🔥🔥🔥🔥
Join the channel membership:
https://www.youtube.com/c/AIPursuit/join
Subscribe to the channel:
https://www.youtube.com/c/AIPursuit?sub_confirmation=1
Support and Donation:
Paypal ⇢ https://paypal.me/tayhengee
Patreon ⇢ https://www.patreon.com/hengee
BTC ⇢ bc1q2r7eymlf20576alvcmryn28tgrvxqw5r30cmpu
ETH ⇢ 0x58c4bD4244686F3b4e636EfeBD159258A5513744
Doge ⇢ DSGNbzuS1s6x81ZSbSHHV5uGDxJXePeyKy
Earn up to a $170 welcome bonus on Huobi with my crypto affiliate links:
Binance ⇢ https://accounts.binance.com/en/register?ref=27700065
Huobi ⇢ https://www.huobi.com/en-us/topic/welcome-bonus/?invite_code=xj9pc
The video was published under the Creative Commons Attribution license (reuse allowed) and is reposted for educational purposes.
Source: https://youtu.be/MtP-UAyVuZY
In this talk, Álvaro will introduce the concept of language models and review some of the state-of-the-art approaches to building them (BERT, GPT-2, and XLNet), delving into the network architectures and training strategies they use. He will then show how these pre-trained language models can be fine-tuned on small datasets to produce high-quality results in downstream NLP tasks, using the open-source PyTorch-Transformers library (https://github.com/huggingface/transformers). This library is built on top of the PyTorch deep learning framework and makes it easy to load pre-trained language models and fine-tune them. The talk focuses on the theoretical grounds of these methods and on his practical experience applying them.
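The fine-tuning workflow described above can be sketched with the Transformers library. As a minimal, self-contained illustration, the snippet below builds a tiny randomly initialized BERT classifier from a fresh config (so it runs offline; in real use you would instead call `BertForSequenceClassification.from_pretrained("bert-base-uncased", num_labels=2)` to load the pre-trained weights) and runs a few gradient steps on a fake batch standing in for a tokenized dataset. All sizes and data here are illustrative assumptions, not from the talk.

```python
import torch
from transformers import BertConfig, BertForSequenceClassification

# Tiny BERT built from scratch so the sketch needs no model download.
# For actual fine-tuning, replace these two lines with
# model = BertForSequenceClassification.from_pretrained("bert-base-uncased", num_labels=2)
config = BertConfig(vocab_size=100, hidden_size=32, num_hidden_layers=2,
                    num_attention_heads=2, intermediate_size=64, num_labels=2)
model = BertForSequenceClassification(config)

# Fake batch: 4 sequences of 16 token ids, with binary labels.
input_ids = torch.randint(0, 100, (4, 16))
labels = torch.tensor([0, 1, 0, 1])

optimizer = torch.optim.AdamW(model.parameters(), lr=5e-5)
model.train()
for _ in range(3):  # a few fine-tuning steps
    outputs = model(input_ids=input_ids, labels=labels)
    outputs.loss.backward()  # the model computes the classification loss itself
    optimizer.step()
    optimizer.zero_grad()

print(float(outputs.loss))  # cross-entropy loss after the last step
```

In practice the same loop is applied to a real tokenized dataset (via `BertTokenizer` and a `DataLoader`), and only a small number of epochs is typically needed because the heavy lifting was done during pre-training.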
Video "Solving NLP Problems with BERT, N-gram, Embedding, LSTM, GRU, Self-Attention, Transformer" from the channel AIP - Pursuing SoTA AI for everyone
Video information
Published: July 17, 2020, 21:00:23
Duration: 00:40:50
Other videos on the channel
LSTM is dead. Long Live Transformers!
Sending BERT to Med School – Injecting Medical Knowledge into BERT | Healthcare NLP Summit 2021
Predicting the Future of the Web Development (2020 and 2025)
Text Preprocessing | Sentiment Analysis with BERT using huggingface, PyTorch and Python Tutorial
An introduction to Reinforcement Learning
Rasa Algorithm Whiteboard - Transformers & Attention 1: Self Attention
BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding
MIT 6.S191: Recurrent Neural Networks
It's Rocket Science! with Professor Chris Bishop
Five Natural Language Processing Research Trends to Watch in 2021
Recent Breakthroughs and Uphill Battles in Modern Natural Language Processing
Making the best NLU with Rasa and BERT, Rasa Developer Summit 2019
Mathematics is the queen of Sciences
Natural Language Processing In 10 Minutes | NLP Tutorial For Beginners | NLP Training | Simplilearn
A.I. teaches itself to drive in Trackmania
RoBERTa: A Robustly Optimized BERT Pretraining Approach
Break into NLP hosted by deeplearning.ai
NLP is going to be the most transformational tech of the decade! | NLP 2020
Transformer Neural Networks - EXPLAINED! (Attention is all you need)
NLP for Developers: BERT | Rasa