Optimizers - EXPLAINED!

From gradient descent to Adam: here are the optimizers you should know, and an easy way to remember them.
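As a quick reference for the two endpoints of that progression, here is a minimal sketch of plain gradient descent next to Adam (the update rule from the Adam paper, reference [2] below). The objective function and hyperparameters are illustrative, not taken from the video.

```python
import math

# Illustrative 1-D objective: f(w) = (w - 3)^2, so grad f(w) = 2 * (w - 3)
# and the minimum sits at w = 3.
def grad(w):
    return 2.0 * (w - 3.0)

def gradient_descent(w, lr=0.1, steps=100):
    """Vanilla gradient descent: w <- w - lr * grad(w)."""
    for _ in range(steps):
        w -= lr * grad(w)
    return w

def adam(w, lr=0.1, beta1=0.9, beta2=0.999, eps=1e-8, steps=200):
    """Adam: momentum on the gradient (m) plus a per-parameter step
    scale from the squared gradient (v), both bias-corrected."""
    m = v = 0.0
    for t in range(1, steps + 1):
        g = grad(w)
        m = beta1 * m + (1 - beta1) * g       # first moment (running mean)
        v = beta2 * v + (1 - beta2) * g * g   # second moment (running sq. mean)
        m_hat = m / (1 - beta1 ** t)          # bias correction for warm-up
        v_hat = v / (1 - beta2 ** t)
        w -= lr * m_hat / (math.sqrt(v_hat) + eps)
    return w

# Both start at w = 0 and head toward the minimum at w = 3.
print(gradient_descent(0.0))
print(adam(0.0))
```

Gradient descent scales its step by the raw gradient, so it slows as the gradient shrinks; Adam's normalization by `sqrt(v_hat)` keeps step sizes near `lr` regardless of gradient magnitude, which is why it can oscillate slightly around the minimum.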

SUBSCRIBE to my channel for more good stuff!

REFERENCES

[1] Have fun plotting equations : https://academo.org/demos/3d-surface-plotter
[2] Original paper on the Adam optimizer: https://arxiv.org/pdf/1412.6980.pdf
[3] Blog on types of optimizers: https://towardsdatascience.com/types-of-optimization-algorithms-used-in-neural-networks-and-ways-to-optimize-gradient-95ae5d39529f
[4] Blog on optimizing gradient descent: https://ruder.io/optimizing-gradient-descent/index.html#adagrad
[5] GitHub gist of code for rendering an animation of a math function: https://gist.github.com/ajhalthor/33533b4673ad6955e08a4005850b512f
[6] Another blog to quench your thirst for knowledge on optimizers, in case the links above weren't enough: https://machinelearningmastery.com/adam-optimization-algorithm-for-deep-learning/

Video "Optimizers - EXPLAINED!" from the CodeEmporium channel.
Video information
Published: February 10, 2020, 20:00:18
Duration: 00:07:23