
Unlock the Power of Self-Attention in Python: A Beginner-Friendly Guide!

In this video, we'll be diving into the world of self-attention in Python. We will dive into the code and implement a simple self-attention mechanism using the popular NumPy library. The Transformer architecture was proposed in the paper "Attention Is All You Need". By the end of this video, you'll have a solid understanding of how to use self-attention in your own projects and how it can improve your models' performance. Whether you're a beginner or an experienced developer, this video is for you! A self-attention module takes in n inputs and returns n outputs. In simple terms, the self-attention mechanism allows the inputs to interact with each other ("self") and find out who they should pay more attention to ("attention"). The outputs are aggregates of these interactions, weighted by the attention scores.
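The mechanism described above can be sketched in a few lines of NumPy. This is a minimal illustration, not the exact code from the notebook: the inputs `X` and the projection matrices `W_q`, `W_k`, `W_v` are made-up values standing in for learned parameters.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

# Three inputs (n = 3), each a 4-dimensional vector.
X = np.array([[1., 0., 1., 0.],
              [0., 2., 0., 2.],
              [1., 1., 1., 1.]])

# Stand-ins for learned projection matrices (fixed here for illustration).
rng = np.random.default_rng(0)
W_q, W_k, W_v = (rng.normal(size=(4, 4)) for _ in range(3))

Q, K, V = X @ W_q, X @ W_k, X @ W_v   # queries, keys, values
scores = Q @ K.T                      # pairwise interaction scores
weights = softmax(scores, axis=-1)    # attention weights; each row sums to 1
output = weights @ V                  # n outputs, one per input

print(output.shape)  # (3, 4): same number of outputs as inputs
```

Note that each output row is a weighted combination of all the value vectors, which is exactly the "aggregate of interactions" described above.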

▶ Link to the notebook : https://github.com/bhattbhavesh91/self-attention-python

Self-attention compares all members of the input sequence with each other and modifies the corresponding output sequence positions. In other words, a self-attention layer performs a differentiable key-value search over the input sequence for each input and adds the results to the output sequence. Self-attention is very commonly used in deep learning these days. For example, it is one of the main building blocks of the Transformer ("Attention Is All You Need"), which is fast becoming the go-to deep learning architecture for several problems in both computer vision and language processing. Additionally, famous models like BERT, GPT, XLM, and Performer all use some variation of the Transformer, which in turn is built using self-attention layers as building blocks.
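The Transformer paper uses a specific form of this idea, scaled dot-product attention, which divides the scores by the square root of the key dimension before the softmax. A hedged sketch, assuming a single head and no masking; the toy input `X` is invented for illustration:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    # Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V, as in the
    # Transformer paper; scaling keeps the softmax gradients well-behaved.
    d_k = K.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)
    scores -= scores.max(axis=-1, keepdims=True)  # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V, weights

# In self-attention every position is its own query: it "searches" the
# keys of all positions and aggregates the matching values.
X = np.eye(3, 4)  # toy sequence of 3 inputs
out, w = scaled_dot_product_attention(X, X, X)

print(out.shape)  # (3, 4)
```

Passing the same tensor as queries, keys, and values is what makes it *self*-attention, matching the key-value search described above.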
▶ Sponsor me on GitHub : https://github.com/sponsors/bhattbhavesh91/
▶ Join this channel to get access to perks: https://bit.ly/BhaveshBhattJoin
▶ Join the Telegram channel for regular updates: https://t.me/bhattbhavesh91
▶ If you like my work, you can buy me a coffee : https://bit.ly/BuyBhaveshCoffee

*I use affiliate links on the products that I recommend. These give me a small portion of the sales price at no cost to you. I appreciate the proceeds and they help me to improve my channel!

▶ Best Book for Python : https://amzn.to/3qYThqu
▶ Best Book for PyTorch & Machine Learning : https://amzn.to/3PyUkdy
▶ Best Book for Statistics : https://amzn.to/3vzvHEn
▶ Best Book for BERT: https://amzn.to/3lpX0fz
▶ Best Book for Machine Learning : https://amzn.to/2P6aZuT
▶ Best Book for Deep Learning : https://amzn.to/30UMTGl
▶ Best Intro Book for MLOps : https://amzn.to/3AoPZmM

Equipment I use for recording the videos:
▶ 1st Laptop I use : https://amzn.to/3AqI8Fp
▶ 2nd Laptop I use : https://amzn.to/3KAiYsB
▶ Microphone : https://amzn.to/3qUPxtz
▶ Camera : https://amzn.to/3rKQsM2
▶ Mobile Phone : https://amzn.to/3nRHP1f
▶ Ring Light : https://amzn.to/33LedM5
▶ RGB Light : https://amzn.to/3KzLgmS
▶ Bag I use : https://amzn.to/3AsM3RZ

If you have any questions about what we covered in this video, feel free to ask in the comment section below & I'll do my best to answer them.

If you enjoy these tutorials & would like to support them, the easiest way is to simply like the video & give it a thumbs up. It's also a huge help to share these videos with anyone who you think would find them useful.

Please consider clicking the SUBSCRIBE button to be notified of future videos & thank you all for watching.

You can find me on:
▶ Blog - https://bhattbhavesh91.github.io
▶ Twitter - https://twitter.com/_bhaveshbhatt
▶ GitHub - https://github.com/bhattbhavesh91
▶ Medium - https://medium.com/@bhattbhavesh91
▶ About.me - https://about.me/bhattbhavesh91
▶ Linktree - https://linktr.ee/bhattbhavesh91
▶ DEV Community - https://dev.to/bhattbhavesh91
▶ Telegram - https://t.me/bhattbhavesh91

#attention #naturallanguageprocessing #deeplearning

Video: Unlock the Power of Self-Attention in Python: A Beginner-Friendly Guide! — channel: Bhavesh Bhatt
Video info: published January 19, 2023, 16:45:02 · Duration: 00:24:56