CSCI 3151 - M45 - Modern CNNs: ResNets and beyond

This module builds on the basic CNN concepts from M44 to explore modern convolutional architectures, focusing on residual networks (ResNets). We start from the degradation problem in deep plain CNNs—where adding layers can make training worse—and show how identity skip connections reshape gradient flow so that very deep models become trainable.
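The identity skip connection described above can be sketched as a minimal PyTorch basic block; the exact layer widths and the use of BatchNorm here are illustrative assumptions, not the module's specific code:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class BasicResidualBlock(nn.Module):
    """Minimal ResNet-style basic block: two 3x3 convs plus an identity skip."""
    def __init__(self, channels: int):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, kernel_size=3, padding=1, bias=False)
        self.bn1 = nn.BatchNorm2d(channels)
        self.conv2 = nn.Conv2d(channels, channels, kernel_size=3, padding=1, bias=False)
        self.bn2 = nn.BatchNorm2d(channels)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        out = F.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        # Identity skip: the convs only need to learn a residual correction,
        # and gradients flow unchanged through the "+ x" path.
        return F.relu(out + x)

# CIFAR-10-sized feature maps after a hypothetical initial 16-channel conv.
x = torch.randn(4, 16, 32, 32)
y = BasicResidualBlock(16)(x)
print(y.shape)  # torch.Size([4, 16, 32, 32]) -- padding=1 preserves spatial size
```

Because the skip path adds `x` unmodified, the block's input and output shapes must match, which is why both convs keep the channel count and spatial size fixed.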

Using small PyTorch examples on CIFAR-10, we compare a plain CNN and a simple ResNet-style model in terms of parameter count, training and validation behaviour, and gradient norms, connecting these observations back to earlier modules on vanishing/exploding gradients and generalization. We then survey a tiny set of design patterns—basic and bottleneck residual blocks, and one efficiency-oriented variant—to illustrate how modern CNNs trade off depth, width, and computation. By the end, students should be able to explain the role of residual connections, interpret simple ResNet diagrams, and reason about when deeper CNNs are likely to help or hurt in practice.
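The depth/width/computation trade-off mentioned above can be made concrete by counting parameters in a bottleneck block (1x1 reduce, 3x3, 1x1 expand) versus two plain 3x3 convs at the same width; the channel sizes and reduction factor below are illustrative assumptions:

```python
import torch
import torch.nn as nn

class BottleneckBlock(nn.Module):
    """ResNet-style bottleneck: 1x1 reduce -> 3x3 -> 1x1 expand, plus identity skip."""
    def __init__(self, channels: int, reduction: int = 4):
        super().__init__()
        mid = channels // reduction  # narrow middle width (hypothetical choice)
        self.body = nn.Sequential(
            nn.Conv2d(channels, mid, 1, bias=False), nn.BatchNorm2d(mid), nn.ReLU(),
            nn.Conv2d(mid, mid, 3, padding=1, bias=False), nn.BatchNorm2d(mid), nn.ReLU(),
            nn.Conv2d(mid, channels, 1, bias=False), nn.BatchNorm2d(channels),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return torch.relu(self.body(x) + x)

def param_count(m: nn.Module) -> int:
    return sum(p.numel() for p in m.parameters())

# Two plain 3x3 convs at full width, i.e. the body of a "basic" block.
basic = nn.Sequential(
    nn.Conv2d(64, 64, 3, padding=1, bias=False),
    nn.Conv2d(64, 64, 3, padding=1, bias=False),
)
bottleneck = BottleneckBlock(64)
print(param_count(basic), param_count(bottleneck))  # 73728 vs 4544
```

The expensive 3x3 convolution runs at a quarter of the width, so the bottleneck gets roughly an order of magnitude fewer parameters per block, which is what lets very deep ResNets stay computationally affordable.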

Course module page:
https://web.cs.dal.ca/~rudzicz/Teaching/CSCI3151/2026/index.html#module=3151-M45-modern-cnns

Video: CSCI 3151 - M45 - Modern CNNs: ResNets and beyond, from the Atlantic AI Institute channel.