FastNeRF: High-Fidelity Neural Rendering at 200FPS [Extended]
Watch the condensed version: https://youtu.be/JS5H-Usiphg
Recent work on Neural Radiance Fields (NeRF) showed how neural networks can be used to encode complex 3D environments that can be rendered photorealistically from novel viewpoints. Rendering these images is very computationally demanding, and recent improvements are still a long way from enabling interactive rates, even on high-end hardware. Motivated by scenarios on mobile and mixed reality devices, we propose FastNeRF, the first NeRF-based system capable of rendering high-fidelity photorealistic images at 200Hz on a high-end consumer GPU. The core of our method is a graphics-inspired factorization that allows for (i) compactly caching a deep radiance map at each position in space, and (ii) efficiently querying that map using ray directions to estimate the pixel values in the rendered image. Extensive experiments show that the proposed method is 3000 times faster than the original NeRF algorithm and at least an order of magnitude faster than existing work on accelerating NeRF, while maintaining visual quality and extensibility.
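The factorization described above can be sketched in a few lines. This is a minimal illustrative mock-up, not the paper's implementation: the networks are stand-ins (here seeded random functions in place of trained MLPs), and the names `position_network`, `direction_network`, and the component count `D` are assumptions for illustration. The key idea it demonstrates is that the position-dependent output can be cached per 3D location, and the final color is a cheap inner product with direction-dependent weights:

```python
import numpy as np

D = 8  # number of factorized components (a free hyperparameter, assumed here)

def position_network(pos):
    # Stand-in for the position-dependent network: for a 3D position it
    # returns a density sigma and D color component vectors. In FastNeRF
    # this output is what gets compactly cached per position in space.
    rng = np.random.default_rng(int(abs(pos).sum() * 1000) % (2**32))
    sigma = rng.uniform(0.0, 1.0)
    components = rng.uniform(0.0, 1.0, size=(D, 3))  # D RGB components
    return sigma, components

def direction_network(view_dir):
    # Stand-in for the view-dependent network: for a ray direction it
    # returns D scalar weights, which can likewise be cached over directions.
    rng = np.random.default_rng(int(abs(view_dir).sum() * 1000) % (2**32))
    return rng.uniform(0.0, 1.0, size=D)

def radiance(pos, view_dir):
    # Factorized radiance query: the two networks never see each other's
    # input, so each can be evaluated (or looked up from its cache)
    # independently; the color is just a weighted sum of components.
    sigma, components = position_network(pos)
    beta = direction_network(view_dir)
    color = beta @ components  # (D,) @ (D, 3) -> (3,) RGB value
    return sigma, color
```

Because position and direction are decoupled, a dense grid of `position_network` outputs and a grid of `direction_network` weights can be precomputed once; rendering then reduces to cache lookups and inner products instead of full MLP evaluations per sample, which is what makes the 200Hz rates plausible.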
Project website: https://microsoft.github.io/FastNeRF/
Mixed Reality & AI Lab: https://www.microsoft.com/en-us/research/lab/mixed-reality-ai-lab-cambridge/
Video "FastNeRF: High-Fidelity Neural Rendering at 200FPS [Extended]" from the Microsoft Research channel.