
Neural Network Pruning for Compression & Understanding | Facebook AI Research | Dr. Michela Paganini

In order to counter the explosion in the size of state-of-the-art machine learning models, and driven by the need to deploy fast, sustainable, and private models on resource-constrained devices, techniques such as pruning, quantization, and distillation have emerged as strategies for model compression.
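As a minimal sketch of the pruning idea (assuming a PyTorch workflow, which the talk's context suggests but the description does not specify), magnitude-based pruning can be applied to a single layer with torch.nn.utils.prune:

```python
import torch.nn as nn
import torch.nn.utils.prune as prune

# A small example layer; in practice this would be a layer of a trained model.
layer = nn.Linear(in_features=128, out_features=64)

# Remove the 30% of weights with the smallest L1 magnitude (unstructured pruning).
prune.l1_unstructured(layer, name="weight", amount=0.3)

# Pruning is applied via a mask: "weight" is now computed as weight_orig * weight_mask.
kept = int(layer.weight_mask.sum().item())
print(f"{kept} of {layer.weight_mask.numel()} weights kept")

# Optionally make the pruning permanent by folding the mask into the parameter.
prune.remove(layer, "weight")
```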

A fundamental scientific understanding of the inner workings of neural networks is necessary to build a path toward robust, efficient AI. I will introduce research and open-source work that has facilitated the investigation of the behavior of pruned models.
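Such investigation often starts by measuring how sparse each layer actually becomes after pruning. A short sketch (again assuming PyTorch; the `model` below is a hypothetical stand-in for a pruned network):

```python
import torch
import torch.nn as nn

def layerwise_sparsity(model: nn.Module) -> None:
    """Report the fraction of exactly-zero weights in each layer that has a weight tensor."""
    for name, module in model.named_modules():
        weight = getattr(module, "weight", None)
        if isinstance(weight, torch.Tensor):
            zeros = int((weight == 0).sum().item())
            total = weight.numel()
            print(f"{name or 'model'}: {zeros / total:.1%} sparse ({zeros}/{total})")

# Hypothetical usage on a small model after pruning has been applied:
model = nn.Sequential(nn.Linear(128, 64), nn.ReLU(), nn.Linear(64, 10))
layerwise_sparsity(model)
```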

#Clarifai provides a platform for #datascientists, developers, researchers, and enterprises to master the entire #artificialintelligence lifecycle. Try our free API and get started with 1,000 free operations each month. Request a free API key at our website, https://www.clarifai.com/, and start building today.

Learn more about Clarifai at: https://www.clarifai.com/.
Sign up for a free account: https://portal.clarifai.com/signup

Video "Neural Network Pruning for Compression & Understanding | Facebook AI Research | Dr. Michela Paganini" from the Clarifai channel
Video information
Published: October 22, 2020, 5:08:05
Duration: 00:45:07