End-to-End: Automated Hyperparameter Tuning For Deep Neural Networks

In this video, I am going to show you how you can do #HyperparameterOptimization for a #NeuralNetwork automatically using Optuna. This is an end-to-end video in which I select a problem, design a neural network in #PyTorch, and then find the optimal number of layers, dropout, learning rate, and other parameters using Optuna.
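
To give you a taste of the idea before watching, here is a minimal, self-contained sketch of tuning a PyTorch network with Optuna. This is not the code from the video: the MoA data is replaced with random tensors, and all parameter names and search ranges are illustrative.

```python
import optuna
import torch
import torch.nn as nn

def build_model(trial, n_features, n_targets):
    # Let Optuna pick the depth, hidden width, and dropout rate.
    n_layers = trial.suggest_int("num_layers", 1, 5)
    hidden = trial.suggest_int("hidden_size", 16, 512)
    dropout = trial.suggest_float("dropout", 0.1, 0.5)
    layers, in_size = [], n_features
    for _ in range(n_layers):
        layers += [nn.Linear(in_size, hidden), nn.BatchNorm1d(hidden),
                   nn.Dropout(dropout), nn.ReLU()]
        in_size = hidden
    layers.append(nn.Linear(in_size, n_targets))
    return nn.Sequential(*layers)

def objective(trial):
    # Synthetic stand-in data; the video uses the MoA dataset instead.
    X = torch.randn(512, 20)
    y = (torch.rand(512, 3) > 0.5).float()  # multi-label targets
    model = build_model(trial, n_features=20, n_targets=3)
    lr = trial.suggest_float("learning_rate", 1e-5, 1e-2, log=True)
    optimizer = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.BCEWithLogitsLoss()
    for _ in range(30):  # a few quick steps, just for the sketch
        optimizer.zero_grad()
        loss = loss_fn(model(X), y)
        loss.backward()
        optimizer.step()
    return loss.item()  # Optuna minimizes this value

if __name__ == "__main__":
    study = optuna.create_study(direction="minimize")
    study.optimize(objective, n_trials=20)
    print(study.best_trial.params)
```

Running it prints the best combination of layers, hidden size, dropout, and learning rate found over 20 trials; the video builds the same loop out of a real dataset class, an engine, and cross-validation folds.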

The dataset used in this video can be found here: https://www.kaggle.com/c/lish-moa

Please subscribe and like the video to help keep me motivated to make awesome videos like this one. :)

00:00 Introduction
01:56 Dataset class
06:17 Start with train.py
08:19 Cross-validation folds
13:38 Reading the data
24:10 Engine
29:48 Model
35:10 Add model and engine to training
43:05 Optuna
49:02 Start tuning with Optuna
52:50 Training, suggestions and outro
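
As a rough preview of the "Dataset class" and "Engine" chapters above, here is how those two pieces commonly look in a PyTorch project. The names (MoaDataset, train_one_epoch) are my placeholders, not necessarily what the video uses.

```python
import torch
from torch.utils.data import Dataset

class MoaDataset(Dataset):
    # Wraps feature and target arrays so a DataLoader can batch them.
    def __init__(self, features, targets):
        self.features = features
        self.targets = targets

    def __len__(self):
        return len(self.features)

    def __getitem__(self, idx):
        return {
            "x": torch.tensor(self.features[idx], dtype=torch.float),
            "y": torch.tensor(self.targets[idx], dtype=torch.float),
        }

def train_one_epoch(model, loader, optimizer, loss_fn, device):
    # The "engine": one pass over the training data, returning mean loss.
    model.train()
    total = 0.0
    for batch in loader:
        x, y = batch["x"].to(device), batch["y"].to(device)
        optimizer.zero_grad()
        loss = loss_fn(model(x), y)
        loss.backward()
        optimizer.step()
        total += loss.item()
    return total / len(loader)
```

Wrapped in a DataLoader, the dataset feeds the engine once per epoch inside the Optuna objective, with one model trained and scored per fold.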

To buy my book, Approaching (Almost) Any Machine Learning Problem, please visit: https://bit.ly/buyaaml

Follow me on:
Twitter: https://twitter.com/abhi1thakur
LinkedIn: https://www.linkedin.com/in/abhi1thakur/
Kaggle: https://kaggle.com/abhishek
Instagram: https://instagram.com/abhi4ml

Published: September 25, 2020, 21:15:13
Length: 00:55:37