Lecture 2 | Word Vector Representations: word2vec
Lecture 2 continues the discussion of representing words as numeric vectors and surveys popular approaches to designing word vectors.
Key phrases: Natural Language Processing, Word Vectors, Singular Value Decomposition, Skip-gram, Continuous Bag of Words (CBOW), Negative Sampling, Hierarchical Softmax, Word2Vec.
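Among these, the skip-gram model with negative sampling is the lecture's centerpiece: each center word is trained to score its observed context words above a handful of randomly sampled "negative" words. The sketch below is a minimal NumPy rendering of that idea on a toy corpus; the corpus, hyperparameters, and the uniform negative-sampling distribution are assumptions made for brevity, not details taken from the lecture.

```python
# A minimal sketch of skip-gram with negative sampling in NumPy.
# The toy corpus, dimensions, and hyperparameters are illustrative
# assumptions; word2vec proper draws negatives from a smoothed unigram
# distribution rather than uniformly, and trains on far larger corpora.
import numpy as np

rng = np.random.default_rng(0)

corpus = "the quick brown fox jumps over the lazy dog".split()
vocab = sorted(set(corpus))
word2id = {w: i for i, w in enumerate(vocab)}
V, D = len(vocab), 10                 # vocabulary size, embedding dimension

W_in = rng.normal(0.0, 0.1, (V, D))   # "center" word vectors (v)
W_out = rng.normal(0.0, 0.1, (V, D))  # "outside"/context word vectors (u)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

lr, window, k = 0.05, 2, 3            # step size, context window, negatives per pair
for epoch in range(200):
    for pos, center in enumerate(corpus):
        c = word2id[center]
        for off in range(-window, window + 1):
            j = pos + off
            if off == 0 or j < 0 or j >= len(corpus):
                continue
            o = word2id[corpus[j]]                     # observed context word
            v_c = W_in[c]
            grad_c = np.zeros(D)
            # Minimize -log sigma(u_o . v_c) - sum_n log sigma(-u_n . v_c)
            g = sigmoid(W_out[o] @ v_c) - 1.0          # positive-pair error
            grad_c += g * W_out[o]
            W_out[o] -= lr * g * v_c
            for n in rng.integers(0, V, size=k):       # k uniform negatives
                g = sigmoid(W_out[n] @ v_c)            # negative-pair error
                grad_c += g * W_out[n]
                W_out[n] -= lr * g * v_c
            W_in[c] -= lr * grad_c

# Rows of W_in are the learned word vectors; inspect a nearest neighbor.
q = W_in[word2id["quick"]]
sims = (W_in @ q) / (np.linalg.norm(W_in, axis=1) * np.linalg.norm(q))
sims[word2id["quick"]] = -np.inf
print("nearest to 'quick':", vocab[int(np.argmax(sims))])
```

Note that the model keeps two vectors per word (W_in and W_out here); word2vec typically reports the center vectors, or the sum of the two, as the final embeddings.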
-------------------------------------------------------------------------------
Natural Language Processing with Deep Learning
Instructors:
- Chris Manning
- Richard Socher
Natural language processing (NLP) deals with the key artificial intelligence technology of understanding complex human language communication. This lecture series provides a thorough introduction to the cutting-edge research in deep learning applied to NLP, an approach that has recently obtained very high performance across many different NLP tasks including question answering and machine translation. It emphasizes how to implement, train, debug, visualize, and design neural network models, covering the main technologies of word vectors, feed-forward models, recurrent neural networks, recursive neural networks, convolutional neural networks, and recent models involving a memory component.
For additional learning opportunities please visit:
http://stanfordonline.stanford.edu/
Video: Lecture 2 | Word Vector Representations: word2vec, from the Stanford University School of Engineering channel.
-------------------------------------------------------------------------------
Video information: published April 4, 2017, 0:49:56; duration 01:18:17.