Lecture 3 | GloVe: Global Vectors for Word Representation
Lecture 3 introduces the GloVe model for training word vectors. It then extends our discussion of word vectors (interchangeably called word embeddings) by examining how they can be evaluated intrinsically and extrinsically. Along the way, we discuss word analogies as an intrinsic evaluation technique and how they can be used to tune word embedding methods. We then discuss training model weights/parameters and word vectors for extrinsic tasks. Lastly, we motivate artificial neural networks as a class of models for natural language processing tasks.
Key phrases: Global Vectors for Word Representation (GloVe). Intrinsic and extrinsic evaluations. Effect of hyperparameters on analogy evaluation tasks. Correlation of human judgment with word vector distances. Dealing with word ambiguity using contexts. Window classification.
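The analogy evaluation mentioned above works by vector arithmetic: to solve "a is to b as c is to ?", find the word whose vector is closest (by cosine similarity) to b - a + c. The following is a minimal sketch of that idea using hand-made toy vectors; real GloVe embeddings are trained on large corpora and typically have 50-300 dimensions, so the vectors and vocabulary here are illustrative assumptions only.

```python
import numpy as np

# Toy word vectors for illustration only -- real GloVe vectors are
# learned from corpus co-occurrence statistics, not chosen by hand.
vectors = {
    "king":  np.array([0.8, 0.9, 0.1]),
    "queen": np.array([0.8, 0.1, 0.9]),
    "man":   np.array([0.2, 0.9, 0.1]),
    "woman": np.array([0.2, 0.1, 0.9]),
    "apple": np.array([0.9, 0.5, 0.5]),  # distractor word
}

def cosine(u, v):
    """Cosine similarity between two vectors."""
    return np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))

def analogy(a, b, c, vectors):
    """Solve 'a : b :: c : ?' by finding the vocabulary word whose
    vector is most similar to v_b - v_a + v_c (excluding a, b, c)."""
    target = vectors[b] - vectors[a] + vectors[c]
    candidates = [w for w in vectors if w not in (a, b, c)]
    return max(candidates, key=lambda w: cosine(vectors[w], target))

print(analogy("man", "king", "woman", vectors))  # -> queen
```

An intrinsic evaluation runs thousands of such analogy questions against a labeled test set and reports accuracy, which is how the effect of hyperparameters (dimension, window size, corpus) is measured in the lecture.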
-------------------------------------------------------------------------------
Natural Language Processing with Deep Learning
Instructors:
- Chris Manning
- Richard Socher
Natural language processing (NLP) deals with the key artificial intelligence technology of understanding complex human language communication. This lecture series provides a thorough introduction to the cutting-edge research in deep learning applied to NLP, an approach that has recently obtained very high performance across many different NLP tasks including question answering and machine translation. It emphasizes how to implement, train, debug, visualize, and design neural network models, covering the main technologies of word vectors, feed-forward models, recurrent neural networks, recursive neural networks, convolutional neural networks, and recent models involving a memory component.
For additional learning opportunities please visit:
http://online.stanford.edu/
Video: Lecture 3 | GloVe: Global Vectors for Word Representation, from the Stanford University School of Engineering channel.
Video information: published April 4, 2017, 0:49:56. Duration: 01:18:40.