
Evolving Normalization-Activation Layers

Normalization and activation layers have seen a long history of hand-crafted variants with varying results. This paper proposes an evolutionary search to determine the ultimate, final, and best combined normalization-activation layer... in a very specific setting.

https://arxiv.org/abs/2004.02967

Abstract:
Normalization layers and activation functions are critical components in deep neural networks that frequently co-locate with each other. Instead of designing them separately, we unify them into a single computation graph, and evolve its structure starting from low-level primitives. Our layer search algorithm leads to the discovery of EvoNorms, a set of new normalization-activation layers that go beyond existing design patterns. Several of these layers enjoy the property of being independent of batch statistics. Our experiments show that EvoNorms not only excel on a variety of image classification models including ResNets, MobileNets and EfficientNets, but also transfer well to Mask R-CNN for instance segmentation and BigGAN for image synthesis, outperforming BatchNorm and GroupNorm based layers by a significant margin in many cases.
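
For concreteness, here is a minimal PyTorch sketch of EvoNorm-S0, one of the discovered batch-independent layers. The expression x * sigmoid(v * x) / group_std(x), with a learned per-channel affine transform, follows the paper; the group count, epsilon, and parameter shapes used here are illustrative assumptions rather than the authors' exact configuration.

import torch
import torch.nn as nn

class EvoNormS0(nn.Module):
    # Sketch of the batch-independent EvoNorm-S0 layer:
    #   y = x * sigmoid(v * x) / group_std(x) * gamma + beta
    # gamma, beta, and v are learned per-channel; group_std is computed
    # per sample, so no running batch statistics are needed.
    def __init__(self, channels: int, groups: int = 8, eps: float = 1e-5):
        super().__init__()
        assert channels % groups == 0, "channels must be divisible by groups"
        self.groups, self.eps = groups, eps
        self.gamma = nn.Parameter(torch.ones(1, channels, 1, 1))
        self.beta = nn.Parameter(torch.zeros(1, channels, 1, 1))
        self.v = nn.Parameter(torch.ones(1, channels, 1, 1))

    def _group_std(self, x: torch.Tensor) -> torch.Tensor:
        n, c, h, w = x.shape
        grouped = x.view(n, self.groups, c // self.groups, h, w)
        std = torch.sqrt(grouped.var(dim=(2, 3, 4), keepdim=True) + self.eps)
        return std.expand_as(grouped).reshape(n, c, h, w)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Activation and normalization fused in a single expression;
        # behaves identically at train and eval time.
        return x * torch.sigmoid(self.v * x) / self._group_std(x) * self.gamma + self.beta

Usage: y = EvoNormS0(64)(torch.randn(4, 64, 32, 32)) returns a tensor of the same shape. Because only per-sample group statistics are used, the layer works at any batch size, which is the property the abstract highlights.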

Authors: Hanxiao Liu, Andrew Brock, Karen Simonyan, Quoc V. Le

Links:
YouTube: https://www.youtube.com/c/yannickilcher
Twitter: https://twitter.com/ykilcher
BitChute: https://www.bitchute.com/channel/yannic-kilcher
Minds: https://www.minds.com/ykilcher

Video information
Channel: Yannic Kilcher
Published: April 9, 2020, 16:10:44
Duration: 00:30:38