Transformers EXPLAINED! Neural Networks | Encoder | Decoder | Attention
===== Likes: 16 👍: Dislikes: 0 👎: 100.0% : Updated on 01-21-2023 11:57:17 EST =====
Transformers Explained! This architecture comes from the amazing paper "Attention Is All You Need". Link here: https://arxiv.org/abs/1706.03762
Parts of this architecture are used in state-of-the-art technologies such as GPT-3 and variants of BERT. So, you must know what a transformer model is if you want to dive further into the more advanced methods, since they all build on the principles of the transformer model!
I explain what you NEED to know and nothing more!
Feel free to support me! Do know that just viewing my content is plenty of support! 😍
☕Consider supporting me! https://ko-fi.com/spencerpao ☕
Watch Next?
Named Entity Recognition → https://youtu.be/4pCB1lZrBcQ
Text Cleaning and Preprocessing → https://youtu.be/ZucclQNVBlo
🔗 My Links 🔗
Github: https://github.com/SpencerPao/spencerpao.github.io
My Website: https://spencerpao.github.io/
Notebook: https://github.com/SpencerPao/Natural-Language-Processing/tree/main/Transformers
📓 Requirements 🧐
Python
Jupyter notebook
⌛ Timeline ⌛
0:00 - What is a Transformer? Additional Resources
1:41 - Why use a Transformer architecture?
3:17 - Encoder Block
7:52 - Decoder Block
10:20 - Multi-head Attention Mechanisms Explained Further
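For the attention segment at 10:20, here is a minimal sketch of scaled dot-product attention, the building block of multi-head attention, in NumPy. The function names and toy shapes are illustrative assumptions, not taken from the video's notebook:

```python
# Sketch of scaled dot-product attention:
#   Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V
# Names and shapes here are illustrative, not from the video's notebook.
import numpy as np

def softmax(x, axis=-1):
    # Subtract the max for numerical stability before exponentiating.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_attention(Q, K, V):
    d_k = Q.shape[-1]
    # Similarity of each query to each key, scaled by sqrt(d_k).
    scores = Q @ K.swapaxes(-1, -2) / np.sqrt(d_k)
    weights = softmax(scores, axis=-1)  # each row sums to 1
    return weights @ V, weights

# Toy self-attention: 4 tokens, embedding dimension 8.
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))
out, w = scaled_dot_product_attention(x, x, x)
print(out.shape, w.shape)  # (4, 8) and (4, 4)
```

Multi-head attention just runs several of these in parallel on learned linear projections of Q, K, and V, then concatenates the results.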
🏷️Tags🏷️:
Python,Transformers,Transformer,Machine Learning, Deep Learning, Encoder, Decoder, Attention, multi-head attention, embedding, Natural Language Processing, natural, language, processing, word, token, Normalization,Positional encoding,layers,softmax,output,probabilities,feed-forward network,network,tutorial,how to,instruction,gpt-3,building block,BERT
🔔Current Subs🔔:
2,906
Video "Transformers EXPLAINED! Neural Networks | Encoder | Decoder | Attention" from the channel Spencer Pao