
Hogwild for Machine Learning on Multicore

This talk presents both theoretical and experimental evidence that linear speedups are achievable on multicore workstations for several benchmark optimization problems. Stochastic Gradient Descent (SGD) is a popular optimization algorithm for solving data-driven machine learning problems such as classification, model selection, sequence labeling, and recommendation. SGD is well suited to processing large amounts of data due to its robustness against noise, rapid convergence rates, and predictable memory footprint. Learn how SGD can be implemented in parallel with minimal communication, no locking or synchronization, and strong spatial locality. Finally, hear a discussion of a challenging problem raised by our implementations, relating the arithmetic and geometric means of positive definite matrices.
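
As a rough illustration of the lock-free scheme the abstract describes, here is a minimal sketch of Hogwild-style SGD: several processes apply gradient updates to a single shared weight vector with no locks, atomics, or synchronization of any kind. The toy least-squares problem, the sizes, and all names below are assumptions for illustration, not the authors' implementation; note also that Hogwild's convergence analysis relies on individual updates being sparse, which this dense toy does not model.

import numpy as np
from multiprocessing import Array, Process

D, N = 10, 1000            # features / examples (assumed toy sizes)
STEPS, LR = 20000, 0.005   # SGD steps per worker, step size

def worker(shared_w, X, y, seed):
    # View the shared buffer as a NumPy vector: every update below is
    # an unsynchronized write that other workers may race with.
    w = np.frombuffer(shared_w, dtype=np.float64)
    rng = np.random.default_rng(seed)
    for _ in range(STEPS):
        i = rng.integers(N)                  # sample one example
        grad = (X[i] @ w - y[i]) * X[i]      # grad of 0.5*(x_i.w - y_i)^2
        w -= LR * grad                       # Hogwild: no lock taken

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    w_true = rng.standard_normal(D)
    X = rng.standard_normal((N, D))
    y = X @ w_true                           # noiseless targets
    shared_w = Array("d", D, lock=False)     # lock=False: races by design
    workers = [Process(target=worker, args=(shared_w, X, y, s))
               for s in range(4)]
    for p in workers:
        p.start()
    for p in workers:
        p.join()
    w = np.frombuffer(shared_w, dtype=np.float64)
    print("parameter error:", np.linalg.norm(w - w_true))

Because every worker writes through the same shared-memory view, updates from different processes interleave arbitrarily; the result discussed in the talk is that, for sufficiently sparse problems, such unsynchronized SGD still converges at essentially the serial rate.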

Produced 5/3/2012

Video "Hogwild for Machine Learning on Multicore" from the UW Video channel
Video information
June 12, 2014, 6:13:55
Duration: 00:51:23