
Download pre-trained BERT models - at HuggingFace - incl. Sentence Transformers Models (SBERT 21)

New to coding artificial intelligence? Bidirectional Encoder Representations from Transformers (BERT) is a transformer-based machine learning technique for natural language processing (NLP).

Unfamiliar with the benefits of HuggingFace? Learn to apply its pretrained models and accelerate your ML training!

No problem. This is a short intro to choosing a pretrained transformer model for NLP. Thousands of pretrained models are available on HuggingFace, an open-source AI platform for natural language processing.

If you are a beginner, there is a simple way to explore the different models: pretrained transformer models are searchable by their properties and intended applications.
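Besides browsing the Hub website, the same search can be done programmatically. A minimal sketch, assuming the `huggingface_hub` package is installed; the search term and result limit are arbitrary examples:

```python
# Search the HuggingFace Hub for model checkpoints from Python.
# Assumes: pip install huggingface_hub
from huggingface_hub import HfApi

api = HfApi()
# Find checkpoints whose name or description mentions "bert";
# `limit` caps the number of results returned by the API.
for model in api.list_models(search="bert", limit=5):
    print(model.id)
```

The website's filter sidebar (task, library, language) maps onto the same API parameters, so anything you can click on the Hub you can also script.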

Pretrained BERT models and their characteristics are available to choose from; you can filter by whether you use TensorFlow or PyTorch and by your specific language.

Pretrained BERT models like bert-base-uncased or roberta-large are available, plus GPT-2, XLM, XLNet, and more.
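Loading one of these checkpoints takes only a few lines. A minimal sketch, assuming the `transformers` and `torch` packages are installed; the input sentence is just an example:

```python
# Download a pretrained BERT checkpoint from the Hub and run one forward pass.
# Assumes: pip install transformers torch
from transformers import AutoTokenizer, AutoModel
import torch

model_name = "bert-base-uncased"  # any Hub checkpoint name works here
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(model_name)

inputs = tokenizer("HuggingFace makes NLP accessible.", return_tensors="pt")
with torch.no_grad():  # inference only, no gradients needed
    outputs = model(**inputs)

# last_hidden_state holds one 768-dim vector per token for bert-base models
print(outputs.last_hidden_state.shape)
```

The `Auto*` classes pick the right architecture from the checkpoint's config, so the same two calls work for RoBERTa, XLNet, and the other model families above.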

You will find out what datasets these models have been trained on and their specific Transformer architecture.
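The architecture details are also stored in each checkpoint's config, which you can inspect without downloading the weights. A minimal sketch, assuming `transformers` is installed:

```python
# Inspect a checkpoint's architecture from its config alone (no weights needed).
# Assumes: pip install transformers
from transformers import AutoConfig

config = AutoConfig.from_pretrained("bert-base-uncased")
# bert-base checkpoints use 12 layers, hidden size 768, 12 attention heads
print(config.num_hidden_layers, config.hidden_size, config.num_attention_heads)
```

Comparing configs this way is a quick sanity check before committing to a large download.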

00:00 Welcome
02:10 HuggingFace BERT models
06:05 Sentence Transformer models
In this video:
- Free pre-trained Transformer models on HuggingFace.
- Apply the knowledge encoded in already pre-trained BERT models.
- Limitations of pre-trained transformer models.
- Open source in natural language processing.

#Open_Source
#HuggingFace
#BERT

Video "Download pre-trained BERT models - at HuggingFace - incl. Sentence Transformers Models (SBERT 21)" from the channel code_your_own_AI
Video information
Published: October 22, 2021, 16:15:01
Duration: 00:09:33