225 - Attention U-net. What is attention and why is it needed for U-Net?

Attention in U-Net is a mechanism that highlights only the relevant activations during training. It reduces the computation wasted on irrelevant activations and gives the network better generalization.

Two types of attention:

1. Hard attention
Highlights relevant regions by cropping the image.
It attends to one region of the image at a time, so the selection is non-differentiable and must be trained with reinforcement learning.
The network either pays attention to a region or it does not; there is nothing in between.
Backpropagation cannot be used.
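As a minimal sketch of why hard attention blocks backpropagation: the crop is chosen with an argmax-style selection, which has no gradient. The 4x4 "image" and the 2x2 crop grid below are illustrative assumptions, not from the video.

```python
import numpy as np

rng = np.random.default_rng(0)
image = rng.random((4, 4))  # hypothetical tiny feature map

# Enumerate four non-overlapping 2x2 regions and crop the one with the
# highest mean activation. The max/argmax choice is the binary,
# non-differentiable step that would need reinforcement learning to train.
regions = {(r, c): image[r:r + 2, c:c + 2] for r in (0, 2) for c in (0, 2)}
best = max(regions, key=lambda k: regions[k].mean())
crop = regions[best]  # only this region is passed on; the rest is discarded
```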

2. Soft attention
Weights different parts of the image: relevant parts receive large weights and less relevant parts receive small weights.
It can be trained with backpropagation.
During training the attention weights are also learned, making the model pay progressively more attention to the relevant regions.
In summary, it assigns weights to pixels based on their relevance.
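The weighting described above can be sketched in a few lines of numpy. The feature values and relevance scores here are made-up numbers; the point is that every pixel gets a continuous weight in (0, 1), so the whole operation stays differentiable.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

features = np.array([[0.2, 1.5],
                     [3.0, -0.5]])   # hypothetical feature map
scores = np.array([[-2.0, 0.0],
                   [4.0, -3.0]])     # learned relevance scores (assumed)

weights = sigmoid(scores)            # every weight lies strictly in (0, 1)
attended = weights * features        # element-wise re-weighting of pixels
```

Because `sigmoid` is smooth, gradients flow through `weights` back into the scores, which is what lets the weights themselves be trained.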

Why is attention needed in U-Net?
U-Net's skip connections combine spatial information from the down-sampling path with the up-sampling path to retain fine spatial detail. However, this also carries over the poor feature representations of the initial layers. Soft attention applied at the skip connections actively suppresses activations in irrelevant regions.
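A minimal numpy sketch of an additive attention gate at a skip connection, in the spirit of Attention U-Net (Oktay et al., 2018): the decoder's gating signal and the encoder's skip features are projected, combined, and squashed into one coefficient per spatial location. The shapes and the per-pixel linear maps (standing in for 1x1 convolutions) are illustrative assumptions.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def attention_gate(skip, gate, W_x, W_g, psi):
    """skip, gate: (H, W, C) feature maps; W_x, W_g: (C, C); psi: (C, 1)."""
    # ReLU(W_x·x + W_g·g): additive combination of both projections.
    q = np.maximum(skip @ W_x + gate @ W_g, 0.0)
    # One attention coefficient in (0, 1) per spatial location.
    alpha = sigmoid(q @ psi)              # shape (H, W, 1)
    # Scale the skip features: irrelevant regions are suppressed toward 0.
    return skip * alpha

rng = np.random.default_rng(0)
H, W, C = 4, 4, 8
skip = rng.standard_normal((H, W, C))     # encoder features (assumed shapes)
gate = rng.standard_normal((H, W, C))     # decoder gating signal
out = attention_gate(skip, gate,
                     rng.standard_normal((C, C)),
                     rng.standard_normal((C, C)),
                     rng.standard_normal((C, 1)))
```

Only the gated `out`, not the raw `skip`, would be concatenated with the up-sampled decoder features, which is how the poor early-layer activations get suppressed.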

Video 225 - Attention U-net. What is attention and why is it needed for U-Net? from the DigitalSreeni channel.
Video information: published July 7, 2021, 12:00:04; duration 00:14:56.