From Paper to Product – How we implemented BERT | Christoph Henkelmann
BERT is a state-of-the-art natural language processing (NLP) model that can be pretrained on unlabelled text data and later transferred to a variety of NLP tasks. Due to its promising novel ideas and impressive performance, we chose it as a core component of a new natural language generation product. Reading a paper and perhaps following a tutorial with example code is, however, a very different thing from putting a working piece of software into production.
In this session, we will tell you how we trained a custom version of the BERT network and integrated it into a natural language generation (NLG) application. You will hear how we arrived at the decision to use BERT and which other approaches we tried. We will tell you about the failures and mistakes we made so that you do not have to repeat them, but also about the surprises, successes, and lessons learned.
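For readers unfamiliar with the pretrain/fine-tune workflow the talk refers to, here is a minimal sketch of the idea, assuming the open-source Hugging Face transformers library (this is not the speakers' actual code): a publicly pretrained BERT checkpoint is loaded with a fresh task head, which then needs fine-tuning on labelled, task-specific data.

# A minimal sketch of BERT's pretrain/fine-tune workflow, assuming the
# Hugging Face "transformers" library (not the pipeline used in the talk).
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Load a publicly pretrained BERT checkpoint; the classification head on
# top is freshly initialised and still needs fine-tuning on labelled data.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2
)

# The pretrained encoder already produces useful features for new tasks.
inputs = tokenizer("BERT learns from unlabelled text.", return_tensors="pt")
logits = model(**inputs).logits  # raw scores from the (untrained) task head
print(logits.shape)  # torch.Size([1, 2])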
Christoph Henkelmann (DIVISIO) | https://mlconference.ai/speaker/christoph-henkelmann/
🤗 Join us at the next ML Conference | The Conference for Machine Learning Innovation | https://mlconference.ai
👍 Like us on Facebook | https://www.facebook.com/mlconference/
👉 Follow us on Twitter | https://twitter.com/mlconference
Video information
Published: April 1, 2020, 19:30:20
Duration: 00:49:08