
Competition Winning Learning Rates

Leslie Smith, Senior Research Scientist, US Naval Research Laboratory
Presented at MLconf 2018
Abstract: It is well known that the learning rate is the most important hyper-parameter to tune for training deep neural networks. Surprisingly, training with dynamic learning rates can lead to an order-of-magnitude speedup in training time. This talk will discuss my path from static learning rates to dynamic cyclical learning rates and finally to fast training with very large learning rates (I named this technique “super-convergence”). In particular, I will show that very large learning rates are the preferred method for regularizing the training because they provide the twin benefits of training speed and good generalization. The super-convergence method was integrated into the fast.ai library and the fast.ai team used it to win the DAWNBench and Kaggle’s iMaterialist challenges.

See Leslie's presentation slides on our slideshare page here: https://www.slideshare.net/SessionsEvents/competition-winning-learning-rates
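
The talk itself is video and slides only, but the schedules it describes are available in standard libraries. Below is a minimal sketch, not taken from the talk, of a one-cycle / super-convergence style schedule using PyTorch's torch.optim.lr_scheduler.OneCycleLR; the model, data, and hyper-parameter values are placeholders chosen only to make the example runnable.

import torch
from torch import nn
from torch.optim.lr_scheduler import OneCycleLR

# Placeholder model and optimizer, just to exercise the schedule end to end.
model = nn.Linear(10, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1, momentum=0.9)

epochs, steps_per_epoch = 3, 50
# OneCycleLR ramps the learning rate up to max_lr and then anneals it back
# down over the full run; max_lr=1.0 is an illustrative value, not a
# recommendation from the talk.
scheduler = OneCycleLR(optimizer, max_lr=1.0,
                       epochs=epochs, steps_per_epoch=steps_per_epoch)

for epoch in range(epochs):
    for _ in range(steps_per_epoch):
        # Synthetic batch standing in for real training data.
        x = torch.randn(32, 10)
        y = torch.randint(0, 2, (32,))
        loss = nn.functional.cross_entropy(model(x), y)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
        scheduler.step()  # advance the cyclical schedule once per batch

A plain cyclical schedule is exposed the same way via torch.optim.lr_scheduler.CyclicLR, and the fast.ai library provides the one-cycle policy through Learner.fit_one_cycle.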

Video: “Competition Winning Learning Rates”, from the MLconf channel
Video information
Published December 5, 2018, 1:56:17
Duration: 00:24:29