Sentence Similarity With Sentence-Transformers in Python
Full sentence similarity playlist: https://www.youtube.com/watch?v=NNS5pOpjvAQ&list=PLIUOU7oqGTLgz-BI8bNMVGwQxIMuQddJO&index=5
Hard mode: https://youtu.be/jVPd7lEvjtg
All we ever seem to talk about nowadays is BERT this, BERT that. I want to talk about something else, but BERT is just too good - so this video will be about BERT for sentence similarity.
A big part of NLP relies on similarity in high-dimensional spaces. Typically an NLP solution will take some text, process it to create a big vector/array representing said text - then perform several transformations.
It's high-dimensional magic.
Sentence similarity is one of the clearest examples of how powerful this high-dimensional magic can be.
The logic is this:
- Take a sentence, convert it into a vector.
- Take many other sentences, and convert them into vectors.
- Find the sentences whose vectors have the smallest distance (Euclidean) or smallest angle (cosine similarity) between them - more on that here.
- We now have a measure of semantic similarity between sentences - easy!
At a high level, there's not much else to it. But of course, we want to understand what is happening in a little more detail and implement this in Python too.
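The steps above can be sketched in a few lines. The toy vectors below stand in for real embeddings so the example is self-contained, and the `all-MiniLM-L6-v2` model named in the comment is just one popular sentence-transformers checkpoint - an illustrative choice, not something specified here.

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine similarity between two vectors: 1.0 = same direction, 0 = orthogonal."""
    return np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))

# In practice the vectors come from a sentence-transformers model, e.g.:
#   from sentence_transformers import SentenceTransformer
#   model = SentenceTransformer('all-MiniLM-L6-v2')   # one example checkpoint
#   vecs = model.encode(sentences)                    # one dense vector per sentence
# Here, small hand-made vectors play the role of those embeddings.
query = np.array([0.9, 0.1, 0.3])
candidates = {
    "similar sentence": np.array([0.8, 0.2, 0.25]),
    "unrelated sentence": np.array([-0.5, 0.9, -0.1]),
}

# Rank candidate sentences by cosine similarity to the query vector.
for text, vec in candidates.items():
    print(text, round(float(cosine_similarity(query, vec)), 3))
```

The same ranking could be done with Euclidean distance (smallest wins); cosine similarity is the more common choice because it ignores vector magnitude and compares direction only.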
🤖 70% Discount on the NLP With Transformers in Python course:
https://bit.ly/3DFvvY5
Medium article:
https://towardsdatascience.com/bert-for-measuring-text-similarity-eec91c6bf9e1
🎉 Sign-up For New Articles Every Week on Medium!
https://medium.com/@jamescalam/membership
📖 If membership is too expensive - here's a free link:
https://towardsdatascience.com/bert-for-measuring-text-similarity-eec91c6bf9e1?sk=c0f2990b4660210b447e52d55bd0f4e5
👾 Discord
https://discord.gg/c5QtDB9RAP
🕹️ Free AI-Powered Code Refactoring with Sourcery:
https://sourcery.ai/?utm_source=YouTub&utm_campaign=JBriggs&utm_medium=aff
Video "Sentence Similarity With Sentence-Transformers in Python" from the James Briggs channel