Natural Language Processing, GPT-2 and BERT | Interview with Christoph Henkelmann
Christoph Henkelmann (DIVISIO) | https://mlconference.ai/speaker/christoph-henkelmann/
What sets Google’s natural language processing model BERT apart from other language models, how can a custom version be implemented, and what is the so-called “ImageNet moment”? We spoke to Christoph Henkelmann about these topics and asked him for his opinion on OpenAI’s GPT-2.
🤗 Join us at the next ML Conference | The Conference for Machine Learning Innovation | https://mlconference.ai
👍 Like us on Facebook | https://www.facebook.com/mlconference/
👉 Follow us on Twitter | https://twitter.com/mlconference
Video: Natural Language Processing, GPT-2 and BERT | Interview with Christoph Henkelmann, from the Machine Learning Conference channel
Video information
Published: December 19, 2019, 12:16:44
Duration: 00:08:31