FastNeRF: High-Fidelity Neural Rendering at 200FPS [Condensed]
Watch the extended version: https://youtu.be/mi5b142WEmw
Recent work on Neural Radiance Fields (NeRF) showed how neural networks can be used to encode complex 3D environments that can be rendered photorealistically from novel viewpoints. Rendering these images is very computationally demanding, and recent improvements are still a long way from enabling interactive rates, even on high-end hardware. Motivated by scenarios on mobile and mixed reality devices, we propose FastNeRF, the first NeRF-based system capable of rendering high-fidelity photorealistic images at 200Hz on a high-end consumer GPU. The core of our method is a graphics-inspired factorization that allows for (i) compactly caching a deep radiance map at each position in space, and (ii) efficiently querying that map using ray directions to estimate the pixel values in the rendered image. Extensive experiments show that the proposed method is 3000 times faster than the original NeRF algorithm and at least an order of magnitude faster than existing work on accelerating NeRF, while maintaining visual quality and extensibility.
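The factorization described above can be sketched in a few lines: color at a point is approximated as an inner product of position-dependent components and direction-dependent weights, so each factor can be precomputed on its own grid and rendering reduces to cheap cache lookups. This is a toy illustration only; the component count, cache resolutions, and random "cached" values are assumptions, not the paper's configuration.

```python
import numpy as np

# Toy sketch of a FastNeRF-style factorization (illustrative sizes only).
# Color c(p, d) is approximated as sum_i beta_i(d) * u_i(p), so the
# position-dependent components u_i and the direction-dependent weights
# beta_i can each be cached independently on a dense grid.

D = 8      # number of factorized components (assumption)
GRID = 16  # cache resolution per axis (assumption)

rng = np.random.default_rng(0)
# Stand-ins for trained network outputs, precomputed over the grids:
pos_cache = rng.random((GRID, GRID, GRID, D, 3))  # u_i(p), RGB per component
dir_cache = rng.random((GRID, GRID, D))           # beta_i(d), one weight each

def cached_color(p_idx, d_idx):
    """Combine cached factors at a grid cell: c = sum_i beta_i(d) * u_i(p)."""
    u = pos_cache[p_idx]     # shape (D, 3)
    beta = dir_cache[d_idx]  # shape (D,)
    return beta @ u          # shape (3,) RGB

rgb = cached_color((3, 4, 5), (2, 7))
print(rgb.shape)  # (3,)
```

Because the position cache never depends on view direction, it is stored once per scene; a new viewpoint only re-indexes the much smaller direction cache, which is what makes per-frame rendering fast.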
Project website: https://microsoft.github.io/FastNeRF/
Mixed Reality & AI Lab: https://www.microsoft.com/en-us/research/lab/mixed-reality-ai-lab-cambridge/
Video "FastNeRF: High-Fidelity Neural Rendering at 200FPS [Condensed]" from the Microsoft Research channel
Other videos from the channel:
Jon Barron - Understanding and Extending Neural Radiance Fields
This Neural Network Creates 3D Objects From Your Photos
Neural Sparse Voxel Fields (NeurIPS 2020)
KiloNeRF: Speeding up Neural Radiance Fields with Thousands of Tiny MLPs
Differentiable Rendering is Amazing!
Local Optimization for Robust Signed Distance Field Collision
Why Neural Rendering is getting more amazing every day! -- Matthias Niessner (12/08/2020)
NIANTIC — self supervised multi frame monocular depth
Mechanical principles part 01
NeRF: Representing Scenes as Neural Radiance Fields for View Synthesis (ML Research Paper Explained)
pixelNeRF: Neural Radiance Fields from One or Few Images
BARF 🤮: Bundle-Adjusting Neural Radiance Fields (ICCV 2021 oral)
NeRD: Neural Reflectance Decomposition from Image Collections
PlenOctrees for Real-time Rendering of Neural Radiance Fields
The incredible inventions of intuitive AI | Maurice Conti
This Neural Network Learned To Look Around In Real Scenes!
NeRF: Neural Radiance Fields
AI Generates 3D high-resolution reconstructions of people from 2D images | Introduction to PIFuHD
FastNeRF: High-Fidelity Neural Rendering at 200FPS [Extended]