
Understanding Coordinate Descent

Course link: https://www.coursera.org/learn/ml-regression

Let's just have a little aside on the coordinate descent algorithm, and then we're gonna describe how to apply coordinate descent to solving our lasso objective. So, our goal here is to minimize some function g. This is the same objective we have whether we're talking about our closed-form solution, gradient descent, or this coordinate descent algorithm. But let me just be very explicit here: we're saying we want to minimize over all possible w some g(w), where we're assuming g(w) is a function of multiple variables. Let's call it g(w0, w1, ..., wD), and this w we're trying to write in bold font here. Often, minimizing over a large set of variables can be a very challenging problem. But in contrast, it's often possible to optimize just a single dimension, keeping all of the other dimensions fixed. So it's easy to minimize over each coordinate when keeping the others fixed, because that turns into just a 1D optimization problem. And that's the motivation behind coordinate descent, where the coordinate descent algorithm is really intuitive.
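To make the idea concrete, here is a minimal sketch of cyclic coordinate descent in Python; it is not the lecture's own code. It repeatedly minimizes g along one coordinate at a time with a generic 1D solver, cycling through coordinates until the updates stop changing, and is illustrated on a simple least-squares objective. The names (coordinate_descent, g_1d) and the choice of scipy.optimize.minimize_scalar as the 1D solver are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def coordinate_descent(g, w0, n_passes=100, tol=1e-8):
    """Cyclic coordinate descent: repeatedly minimize g over one
    coordinate at a time, holding all other coordinates fixed,
    so each step is just a 1D optimization problem."""
    w = np.array(w0, dtype=float)
    for _ in range(n_passes):
        w_old = w.copy()
        for j in range(len(w)):
            # 1D slice of g along coordinate j, others held fixed
            def g_1d(wj, j=j):
                w_try = w.copy()
                w_try[j] = wj
                return g(w_try)
            w[j] = minimize_scalar(g_1d).x  # solve the 1D problem
        if np.max(np.abs(w - w_old)) < tol:  # no coordinate moved much
            break
    return w

# Example: a smooth convex g(w) = ||y - Xw||^2 (least squares)
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true
g = lambda w: np.sum((y - X @ w) ** 2)
print(coordinate_descent(g, w0=np.zeros(3)))  # approaches w_true
```

Because each inner step only has to solve a 1D problem, no step size needs to be chosen, which is one reason coordinate descent is attractive for objectives like the lasso.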

Video "Understanding Coordinate Descent" from the Machine Learning TV channel
Video information: published 21 October 2022, 23:29:51 · Duration: 00:05:59