
Intro to Sentence Embeddings with Transformers

Transformers have wholly rebuilt the landscape of natural language processing (NLP). Before transformers, recurrent neural networks (RNNs) gave us passable translation and language classification, but their language comprehension was limited, they made many minor mistakes, and coherence over larger chunks of text was practically impossible.

Since the introduction of the first transformer model in the 2017 paper 'Attention Is All You Need', NLP has moved from RNNs to models like BERT and GPT. These new models can answer questions, write articles (maybe GPT-3 wrote this), enable incredibly intuitive semantic search, and much more.

In this video, we will explore how transformer embeddings have been adapted and applied to a range of semantic similarity applications using a new breed of transformers called 'sentence transformers'.
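As a quick taste of the topic, here is a minimal sketch (not taken from the video itself) of producing sentence embeddings with the sentence-transformers library and comparing them with cosine similarity; the model name 'all-MiniLM-L6-v2' is just one illustrative choice.

# Minimal sketch, assuming `pip install sentence-transformers`.
# The model name below is an illustrative assumption, not necessarily
# the model used in the video.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2")

sentences = [
    "Transformers have rebuilt the landscape of NLP.",
    "NLP has been transformed by attention-based models.",
    "I had pasta for dinner last night.",
]

# Encode each sentence into a fixed-size dense vector (a sentence embedding).
embeddings = model.encode(sentences, convert_to_tensor=True)

# Pairwise cosine similarity: semantically similar sentences score higher.
scores = util.cos_sim(embeddings, embeddings)
print(scores)

In this sketch the first two sentences should receive a noticeably higher similarity score with each other than either does with the third, which is the core idea behind semantic search with sentence embeddings.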

🌲 Pinecone article:
https://www.pinecone.io/learn/sentence-embeddings/

Vectors in ML:
https://www.youtube.com/playlist?list=PLIUOU7oqGTLgz-BI8bNMVGwQxIMuQddJO

🤖 70% Discount on the NLP With Transformers in Python course:
https://bit.ly/3DFvvY5

🎉 Subscribe for Article and Video Updates!
https://jamescalam.medium.com/subscribe
https://medium.com/@jamescalam/membership

👾 Discord:
https://discord.gg/c5QtDB9RAP

Video 'Intro to Sentence Embeddings with Transformers' from the James Briggs channel
Video information
Published 20 October 2021, 22:06:20
Duration: 00:31:06