Making the best NLU with Rasa and BERT, Rasa Developer Summit 2019
Mady Mantha, AI Platform Leader at Sirius Computer Solutions, shares how to build highly performant NLP by integrating BERT with a custom NLU pipeline.
Bidirectional Encoder Representations from Transformers (BERT) is an NLP pre-training technique released by Google. BERT's key innovation is that it pre-trains bidirectional, contextual language representations on a large text corpus; the resulting model can then be fine-tuned for downstream NLP tasks such as Natural Language Understanding (NLU) and question answering. Named Entity Recognition (NER) is a subtask of NLU that identifies and classifies entities in a given text into pre-defined categories such as names, places, organizations, currencies, and quantities. An NER model can be trained using BERT. Integrating BERT NER with Rasa through a custom pipeline resulted in highly performant NLP and engaging conversations between humans and Rasa agents.
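The integration described above can be sketched in miniature. The following is a hypothetical, self-contained illustration of the idea of a custom entity-extractor component: the `BertNerStub` class stands in for a fine-tuned BERT NER model, and `BertEntityExtractor` shows how its predictions could be attached to a message in the dictionary shape Rasa entity extractors produce. A real component would subclass Rasa's component API and load an actual BERT checkpoint; all names here are illustrative.

```python
class BertNerStub:
    """Stand-in for a fine-tuned BERT NER model.

    Assumption for this sketch: the model returns
    (entity_text, label, start, end) tuples for each entity it finds.
    """

    # Toy lexicon in place of learned predictions.
    LEXICON = {"London": "place", "Google": "organization"}

    def predict(self, text):
        entities = []
        for token, label in self.LEXICON.items():
            start = text.find(token)
            if start != -1:
                entities.append((token, label, start, start + len(token)))
        return entities


class BertEntityExtractor:
    """Illustrative custom pipeline component: attaches NER output to a
    message dict under the "entities" key, mirroring the structure Rasa
    entity extractors emit (value, entity, start, end, extractor)."""

    def __init__(self):
        self.model = BertNerStub()

    def process(self, message):
        message.setdefault("entities", [])
        for value, label, start, end in self.model.predict(message["text"]):
            message["entities"].append({
                "value": value,
                "entity": label,
                "start": start,
                "end": end,
                "extractor": "BertEntityExtractor",
            })
        return message


msg = BertEntityExtractor().process(
    {"text": "Book a flight to London with Google Flights"}
)
```

In a real pipeline, the component would be registered in the Rasa NLU configuration so that downstream components (dialogue policies, slot filling) can consume the extracted entities.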
Video: Making the best NLU with Rasa and BERT, Rasa Developer Summit 2019, from the Rasa channel.