TidyTuesday: Neural Network Regularization with Keras

In this week's #TidyTuesday video, I go over some common techniques to prevent overfitting in neural networks. I demonstrate what an overfitted neural network looks like by inspecting a model's loss history, then show how to regularize the model by reducing the number of parameters, adding L1 and L2 penalties, adding dropout and batch normalization, and using callbacks.
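The sketch below shows, in rough form, how these regularization techniques combine in the R keras package. It is a minimal illustration, not the code from the video (linked below): the data placeholders x_train and y_train, the layer sizes, the penalty values, and the callback settings are all assumptions chosen for illustration.

library(keras)

# A smaller network (fewer units) is itself a form of regularization.
model <- keras_model_sequential() %>%
  layer_dense(
    units = 32, activation = "relu", input_shape = c(10),
    # L1 and L2 penalties on this layer's weights
    kernel_regularizer = regularizer_l1_l2(l1 = 0.001, l2 = 0.001)
  ) %>%
  layer_batch_normalization() %>%   # normalize activations between layers
  layer_dropout(rate = 0.3) %>%     # randomly drop 30% of units each batch
  layer_dense(units = 1)

model %>% compile(optimizer = "adam", loss = "mse")

# Callbacks: stop early when validation loss stalls, and lower the learning rate.
# x_train and y_train are assumed placeholders for the training data.
history <- model %>% fit(
  x_train, y_train,
  epochs = 100,
  validation_split = 0.2,
  callbacks = list(
    callback_early_stopping(monitor = "val_loss", patience = 10,
                            restore_best_weights = TRUE),
    callback_reduce_lr_on_plateau(monitor = "val_loss", factor = 0.5, patience = 5)
  )
)

# Plotting the loss history shows whether training and validation loss diverge,
# which is the overfitting signature described above.
plot(history)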

#NeuralNetworks #DataScience

Connect with me on LinkedIn: https://www.linkedin.com/in/andrew-couch/
Q&A Submission Form: https://forms.gle/6EzU4GCR9VnJx8gg7
Code for this video: https://github.com/andrew-couch/Tidy-Tuesday/blob/master/Season%201/Scripts/TidyTuesdayNNetRegularization.Rmd
TidyTuesday: https://github.com/rfordatascience/tidytuesday

PC Setup (Amazon Affiliates)
Keyboard: https://amzn.to/3Bbbk3T
Mouse: https://amzn.to/3BcRGVo
Microphone: https://amzn.to/3ePo9JS
Audio Interface: https://amzn.to/3qTAmjz
Webcam: https://amzn.to/3L9Ql6j
CPU: https://amzn.to/3qGa6Zu
GPU: https://amzn.to/3DnhMHL
RAM: https://amzn.to/3LdTxh7

TidyTuesday: Neural Network Regularization with Keras, a video from the Andrew Couch channel
Video information
Published: November 18, 2020, 6:30:02
Duration: 00:11:50