Word2Vec | Calculating Gradients for Word Embedding Optimization with the Skip-Gram Model | Part 1
In this video I cover how cross-entropy is used as a loss function, then walk step by step through the math needed to calculate the gradients used to optimize our weights.
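The gradient derivation described above can be sketched in code. The following is a minimal, hypothetical skip-gram example (variable names are my own, not taken from the video): it predicts one context word from one center word, computes the softmax and cross-entropy loss, and forms the gradients by hand using the fact that the gradient of the loss with respect to the scores is `y_hat - y` for a one-hot target.

```python
import numpy as np

rng = np.random.default_rng(0)
V, D = 5, 3                         # vocabulary size, embedding dimension
W_in = rng.normal(0, 0.1, (V, D))   # input (embedding) weight matrix
W_out = rng.normal(0, 0.1, (D, V))  # output weight matrix

center, context = 1, 3              # indices of one (center, context) word pair

def forward(W_in, W_out, center):
    h = W_in[center]                # hidden layer = the center word's embedding row
    scores = h @ W_out              # raw scores over the vocabulary
    e = np.exp(scores - scores.max())
    y_hat = e / e.sum()             # softmax probabilities
    return h, y_hat

h, y_hat = forward(W_in, W_out, center)
loss = -np.log(y_hat[context])      # cross-entropy against a one-hot target

# Backward pass: d(loss)/d(scores) = y_hat - y, where y is one-hot at `context`.
dscores = y_hat.copy()
dscores[context] -= 1.0
dW_out = np.outer(h, dscores)       # d(loss)/d(W_out)
dh = W_out @ dscores                # gradient flowing back into the embedding
dW_in = np.zeros_like(W_in)
dW_in[center] = dh                  # only the center word's row receives a gradient

# One gradient-descent step should lower the loss on this pair.
lr = 0.1
W_in -= lr * dW_in
W_out -= lr * dW_out
_, y_hat2 = forward(W_in, W_out, center)
new_loss = -np.log(y_hat2[context])
```

Because the target is one-hot, `dscores` sums to zero, and only the center word's embedding row is updated on each step, which is what makes skip-gram training cheap per example.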
Video: Word2Vec | Calculating Gradients for Word Embedding Optimization with the Skip-Gram Model | Part 1, from the Omnology channel
Other videos from the Omnology channel:
- Coding Tidbit | Visualizing Insertion Sort in Python
- Coding Tidbit | Visualizing Bubble Sort in Python
- Semantics with Word2Vec | The Skip-Gram Model, Some Probability, and Softmax Activation
- Intro to Natural Language Processing | Word Embeddings and Math With Words
- Entropy, Cross Entropy, and Kullback-Leibler Divergence
- Coding Tidbit | Visualizing Merge Sort in Python