
Multilingual BERT - Part 1 - Intro and Concepts

This series will provide an introduction to some of the key concepts and techniques around applying BERT to another language, as well as example code implementations (built on PyTorch and huggingface/transformers).

In Part 1, I'll explain the difference between a "monolingual" model and a "multilingual" model (and why monolingual isn't just the obvious choice!), and introduce the concept of Cross-Lingual Transfer. I'll also describe a couple of ways in which we can leverage Machine Translation.

I've also published a blog post version of this tutorial here: http://mccormickml.com/2020/10/05/multilingual-bert/

==== Playlist ====
https://www.youtube.com/playlist?list=PLam9sigHPGwM27p3FQpLK1nt0eioiM-cq

==== Receive Updates ====
Sign up to hear about new content across my blog and channel: https://www.chrismccormick.ai/subscribe

Video: Multilingual BERT - Part 1 - Intro and Concepts, from the ChrisMcCormickAI channel
Video information
Published: October 6, 2020, 20:50:43
Duration: 00:11:29