
CURL: Contrastive Unsupervised Representations for Reinforcement Learning

Contrastive learning is an established method in NLP and image classification. The authors show that, with relatively minor adjustments, it can be used to dramatically augment and improve pixel-based RL.

Paper: https://arxiv.org/abs/2004.04136
Code: https://github.com/MishaLaskin/curl

Abstract:
We present CURL: Contrastive Unsupervised Representations for Reinforcement Learning. CURL extracts high-level features from raw pixels using contrastive learning and performs off-policy control on top of the extracted features. CURL outperforms prior pixel-based methods, both model-based and model-free, on complex tasks in the DeepMind Control Suite and Atari Games showing 2.8x and 1.6x performance gains respectively at the 100K interaction steps benchmark. On the DeepMind Control Suite, CURL is the first image-based algorithm to nearly match the sample-efficiency and performance of methods that use state-based features.
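At its core, CURL adds an InfoNCE contrastive loss on top of the RL encoder: two random crops of the same stack of frames serve as query and positive key, the other observations in the batch serve as negatives, similarity is a learned bilinear product, and the key encoder is a momentum (EMA) copy of the query encoder. Below is a minimal PyTorch sketch of that objective only; the encoder architecture, dimensions, and names are illustrative assumptions, not the authors' implementation (see the repo linked above for the real code).

import torch
import torch.nn as nn
import torch.nn.functional as F

class PixelEncoder(nn.Module):
    # Toy convolutional encoder mapping image observations to a feature vector.
    def __init__(self, feature_dim=50):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2), nn.ReLU(),
            nn.Conv2d(32, 32, 3, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.fc = nn.Linear(32, feature_dim)

    def forward(self, x):
        return self.fc(self.conv(x).flatten(start_dim=1))

def curl_info_nce(query_encoder, key_encoder, W, crop_a, crop_b):
    # InfoNCE with a bilinear similarity: the matching crop is the positive,
    # every other observation in the batch is a negative.
    q = query_encoder(crop_a)                   # queries, shape (B, D)
    with torch.no_grad():
        k = key_encoder(crop_b)                 # keys from the momentum encoder, (B, D)
    logits = q @ W @ k.t()                      # bilinear similarities, (B, B)
    logits = logits - logits.max(dim=1, keepdim=True).values  # numerical stability
    labels = torch.arange(q.size(0))            # positives lie on the diagonal
    return F.cross_entropy(logits, labels)

def momentum_update(query_encoder, key_encoder, tau=0.05):
    # Key encoder is an exponential moving average of the query encoder.
    for p_q, p_k in zip(query_encoder.parameters(), key_encoder.parameters()):
        p_k.data.mul_(1 - tau).add_(tau * p_q.data)

# Usage with stand-in data (random tensors in place of two crops of the same batch):
enc_q, enc_k = PixelEncoder(), PixelEncoder()
enc_k.load_state_dict(enc_q.state_dict())
W = nn.Parameter(0.01 * torch.randn(50, 50))
crop_a, crop_b = torch.randn(8, 3, 64, 64), torch.randn(8, 3, 64, 64)
loss = curl_info_nce(enc_q, enc_k, W, crop_a, crop_b)
loss.backward()
momentum_update(enc_q, enc_k)
print(float(loss))

In the full method, the query encoder is shared with the off-policy RL algorithm (SAC on DeepMind Control, a data-efficient Rainbow variant on Atari), so the contrastive loss and the RL loss train the same representation.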

Authors: Aravind Srinivas, Michael Laskin, Pieter Abbeel

Links:
YouTube: https://www.youtube.com/c/yannickilcher
Twitter: https://twitter.com/ykilcher
BitChute: https://www.bitchute.com/channel/yannic-kilcher
Minds: https://www.minds.com/ykilcher

Video information
Date: 11 April 2020, 16:04:57
Duration: 00:28:45