Rethinking Pre-training and Self-Training
**ERRATA**: At 9:31 I call the large-scale jittering "color jittering"; large-scale jittering is not an operation specifically on colors.
This video explores an interesting paper from researchers at Google AI. They show that self-training outperforms both supervised and self-supervised (SimCLR) pre-training. The video explains what self-training is and how all of these methods attempt to utilize extra data (labeled or not) for better performance on downstream tasks.
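As a rough illustration of the self-training idea described above, the sketch below runs one teacher/student round: a model is trained on labeled data, pseudo-labels the extra unlabeled data, and a student is retrained on the combined set. The toy nearest-centroid classifier and the synthetic data are illustrative assumptions, standing in for the paper's actual detection models.

```python
import numpy as np

def fit_centroids(X, y):
    """Toy stand-in model: one centroid per class (illustrative only)."""
    return np.array([X[y == c].mean(axis=0) for c in np.unique(y)])

def predict(centroids, X):
    """Assign each point to its nearest class centroid."""
    dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
    return dists.argmin(axis=1)

def self_train(X_lab, y_lab, X_unlab):
    # 1. Train a teacher on the labeled data.
    teacher = fit_centroids(X_lab, y_lab)
    # 2. Pseudo-label the extra, unlabeled data with the teacher.
    pseudo = predict(teacher, X_unlab)
    # 3. Train a student on labeled + pseudo-labeled data combined.
    X_all = np.vstack([X_lab, X_unlab])
    y_all = np.concatenate([y_lab, pseudo])
    return fit_centroids(X_all, y_all)

# Toy 2-class data: class 0 near (0, 0), class 1 near (5, 5).
rng = np.random.default_rng(0)
X_lab = np.vstack([rng.normal(0, 0.5, (5, 2)), rng.normal(5, 0.5, (5, 2))])
y_lab = np.array([0] * 5 + [1] * 5)
X_unlab = np.vstack([rng.normal(0, 0.5, (20, 2)), rng.normal(5, 0.5, (20, 2))])

student = self_train(X_lab, y_lab, X_unlab)
print(predict(student, np.array([[0.1, 0.2], [4.9, 5.1]])))
```

The key point the paper stresses is step 2: the unlabeled data enters training through pseudo-labels on the target task itself, rather than through a separate pre-training objective.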
Thanks for watching! Please Subscribe!
Paper Links:
Rethinking Pre-training and Self-training: https://arxiv.org/pdf/2006.06882.pdf
OpenImages Dataset: https://storage.googleapis.com/openimages/web/index.html
RetinaNet: https://arxiv.org/pdf/1708.02002.pdf
Rethinking ImageNet Pre-training: https://arxiv.org/pdf/1811.08883.pdf
Image Classification State-of-the-Art: https://paperswithcode.com/sota/image-classification-on-imagenet
Self-Training with Noisy Student: https://arxiv.org/pdf/1911.04252.pdf
Rotation Self-Supervised Learning: https://arxiv.org/pdf/1803.07728.pdf
POET: https://arxiv.org/pdf/1901.01753.pdf
ImageGPT: https://openai.com/blog/image-gpt/
Video "Rethinking Pre-training and Self-Training" from the Connor Shorten channel