Ivan Bilan: Understanding and Applying Self-Attention for NLP | PyData Berlin 2018
PyData Berlin 2018
Understanding attention mechanisms and self-attention, introduced in Google's "Attention Is All You Need" paper, is a beneficial skill for anyone who works on complex NLP problems. In this talk, we will go over the main parts of the Google Transformer self-attention model and the intuition behind it. Then we will look at how this architecture can be used for other NLP tasks, e.g. slot filling.
Slides: https://www.dropbox.com/s/hri8veio4rep5g4/Self-Attention_for_NLP_by_Ivan_Bilan.pptx?dl=0
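For a feel of the core operation the talk covers, here is a minimal sketch of single-head scaled dot-product self-attention as described in the "Attention Is All You Need" paper. The function name and the projection matrices `W_q`, `W_k`, `W_v` are illustrative placeholders, not code from the talk:

```python
import numpy as np

def scaled_dot_product_attention(X, W_q, W_k, W_v):
    """Single-head self-attention over a sequence of token vectors.

    X:             (seq_len, d_model) input embeddings
    W_q, W_k, W_v: (d_model, d_k) learned projection matrices
    """
    Q = X @ W_q                      # queries
    K = X @ W_k                      # keys
    V = X @ W_v                      # values
    d_k = K.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)  # scaled dot-product similarities
    # softmax over the key axis: each row of weights sums to 1
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V               # attention-weighted mix of values

# Toy usage: 4 tokens, model dim 8, head dim 4
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))
W_q, W_k, W_v = (rng.normal(size=(8, 4)) for _ in range(3))
out = scaled_dot_product_attention(X, W_q, W_k, W_v)
print(out.shape)  # (4, 4)
```

The full Transformer runs several such heads in parallel (multi-head attention) and stacks them with feed-forward layers; this sketch only shows the attention step itself.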
---
www.pydata.org
PyData is an educational program of NumFOCUS, a 501(c)3 non-profit organization in the United States. PyData provides a forum for the international community of users and developers of data analysis tools to share ideas and learn from each other. The global PyData network promotes discussion of best practices, new approaches, and emerging technologies for data management, processing, analytics, and visualization. PyData communities approach data science using many languages, including (but not limited to) Python, Julia, and R.
PyData conferences aim to be accessible and community-driven, with novice to advanced level presentations. PyData tutorials and talks bring attendees the latest project features along with cutting-edge use cases.