
CSCI 3151 - M20 - Support vector machines & margins

This module introduces support vector machines (SVMs) as maximum-margin classifiers, building on linear models and logistic regression. We develop the geometric idea of a margin and support vectors, then derive and interpret the hard-margin and soft-margin SVM optimization problems, connecting the penalty parameter C to regularization, margin width, and training error. In the dual view, we show that only support vectors matter and that kernels (linear and RBF) enter through inner products, revisiting the kernel trick from the previous module.

Using low-dimensional synthetic datasets, we visualize decision boundaries, margins, and support vectors while exploring how choices of C and gamma change model complexity, number of support vectors, and generalization, including a small grid search over hyperparameters. We also examine a failure case where extreme hyperparameters lead to overfitting, and emphasize practical issues such as feature scaling, leakage-free pipelines, proper train/validation/test splits, and appropriate evaluation metrics.

By the end, students should be able to implement and tune linear and kernel SVMs in scikit-learn and understand when margin-based methods are an appropriate choice.
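The workflow described above can be sketched in scikit-learn. This is a minimal illustration, not the course's actual code: the dataset (`make_moons`), the parameter grid, and the split sizes are assumptions chosen for demonstration. It combines a leakage-free pipeline (scaling fit only on training folds), an RBF-kernel SVM, and a small grid search over C and gamma:

```python
# Minimal sketch of the SVM workflow: scaling pipeline + grid search.
# Dataset, grid values, and split sizes are illustrative assumptions.
from sklearn.datasets import make_moons
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Synthetic 2-D data, in the spirit of the module's visualizations.
X, y = make_moons(n_samples=300, noise=0.25, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y)

# Putting the scaler inside the pipeline keeps the workflow leakage-free:
# during cross-validation it is refit on each training fold only.
pipe = Pipeline([
    ("scale", StandardScaler()),
    ("svm", SVC(kernel="rbf")),
])

# Small grid over C (margin/regularization trade-off) and
# gamma (RBF kernel width); larger values of either increase complexity.
grid = GridSearchCV(
    pipe,
    param_grid={"svm__C": [0.1, 1, 10, 100],
                "svm__gamma": [0.01, 0.1, 1, 10]},
    cv=5)
grid.fit(X_train, y_train)

best = grid.best_estimator_
n_sv = int(best.named_steps["svm"].n_support_.sum())  # support-vector count
print("best params:", grid.best_params_)
print("test accuracy:", best.score(X_test, y_test))
print("support vectors:", n_sv)
```

Inspecting `n_support_` across grid points makes the lecture's point concrete: small C or large gamma tends to change how many training points end up as support vectors, which in turn tracks model complexity and generalization.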

Course module page:
https://web.cs.dal.ca/~rudzicz/Teaching/CSCI3151/2026/index.html#module=3151-M20-svm-margin

Video "CSCI 3151 - M20 - Support vector machines & margins" from the Atlantic AI Institute channel.