
Distilling the Knowledge in a Neural Network

This is the foundational paper that started the research area of Knowledge Distillation.

Knowledge Distillation is the study of methods and techniques for extracting the knowledge learned by a cumbersome model (also called the Teacher model) and transferring it to a simpler model (also called the Student model). Student models are the ones used for inference (especially on resource-constrained devices) and are expected to deliver both high accuracy and fast prediction. A minimal sketch of the core training objective is included below.
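
For reference, here is a minimal sketch of the distillation objective described in the paper, written assuming PyTorch; the function name, the temperature T, and the weighting factor alpha are illustrative choices rather than values prescribed by the paper:

import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.7):
    # Soften both output distributions with temperature T (the paper's key idea).
    soft_teacher = F.softmax(teacher_logits / T, dim=-1)
    log_soft_student = F.log_softmax(student_logits / T, dim=-1)
    # KL divergence between the softened distributions, scaled by T^2 so its
    # gradients stay on the same scale as the hard-label term (Section 2 of the paper).
    soft_loss = F.kl_div(log_soft_student, soft_teacher, reduction="batchmean") * (T * T)
    # Standard cross-entropy against the ground-truth labels.
    hard_loss = F.cross_entropy(student_logits, labels)
    # Weighted combination of the soft-target and hard-label objectives.
    return alpha * soft_loss + (1.0 - alpha) * hard_loss

The teacher's logits are computed with the teacher in evaluation mode; only the student's parameters are updated with this loss.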

Link to the paper:
https://arxiv.org/abs/1503.02531

Link to the summary of the paper:
https://towardsdatascience.com/paper-summary-distilling-the-knowledge-in-a-neural-network-dc8efd9813cc

#KnowledgeDistillation
#deeplearning
#softmax
#machinelearning

Video "Distilling the Knowledge in a Neural Network" from the channel Kapil Sachdeva

Video information: published June 28, 2020, 6:12:52. Duration: 00:19:05