Time-multiplexed Neural Holography | SIGGRAPH 2022
Project website: https://www.computationalimaging.org/publications/time-multiplexed-neural-holography/
Abstract:
Holographic near-eye displays offer unprecedented capabilities for virtual and augmented reality systems, including perceptually important focus cues. Although artificial intelligence–driven algorithms for computer-generated holography (CGH) have recently made much progress in improving the image quality and synthesis efficiency of holograms, these algorithms are not directly applicable to emerging phase-only spatial light modulators (SLM) that are extremely fast but offer phase control with very limited precision. The speed of these SLMs offers time multiplexing capabilities, essentially enabling partially-coherent holographic display modes. Here we report advances in camera-calibrated wave propagation models for these types of near-eye holographic displays and we develop a CGH framework that robustly optimizes the heavily quantized phase patterns of fast SLMs. Our framework is flexible in supporting runtime supervision with different types of content, including 2D and 2.5D RGBD images, 3D focal stacks, and 4D light fields. Using our framework, we demonstrate state-of-the-art results for all of these scenarios in simulation and experiment.
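The abstract's core idea can be illustrated in a few lines: optimize several phase-only SLM frames whose phase is snapped to a small number of levels, then average their reconstructed intensities, so quantization speckle averages out across the time-multiplexed frames. This is only a minimal sketch under simplifying assumptions, not the paper's method (which uses a camera-calibrated learned propagation model and gradient-based optimization); here a plain Gerchberg-Saxton loop with a single-FFT far-field propagation stands in, and `levels`, `n_frames`, and `iters` are illustrative parameters.

```python
import numpy as np

def quantize_phase(phi, levels=4):
    # Snap continuous phase to the SLM's limited set of phase levels.
    step = 2 * np.pi / levels
    return np.round(phi / step) * step

def gs_phase(target_amp, iters=30, levels=4, rng=None):
    # Gerchberg-Saxton with phase quantization enforced at the SLM plane.
    if rng is None:
        rng = np.random.default_rng()
    phi = rng.uniform(0.0, 2.0 * np.pi, target_amp.shape)
    for _ in range(iters):
        slm_field = np.exp(1j * quantize_phase(phi, levels))
        img_field = np.fft.fft2(slm_field)              # far-field (Fraunhofer) propagation
        img_field = target_amp * np.exp(1j * np.angle(img_field))  # impose target amplitude
        phi = np.angle(np.fft.ifft2(img_field))         # back to the SLM plane
    return quantize_phase(phi, levels)

def time_multiplexed_recon(target_amp, n_frames=8, **kw):
    # Average the intensities of independently optimized frames: each frame
    # carries different quantization speckle, which washes out in the mean.
    seed_rng = np.random.default_rng(0)
    acc = np.zeros_like(target_amp)
    for _ in range(n_frames):
        frame_rng = np.random.default_rng(int(seed_rng.integers(1 << 31)))
        phi = gs_phase(target_amp, rng=frame_rng, **kw)
        img = np.fft.fft2(np.exp(1j * phi))
        acc += np.abs(img) ** 2
    return acc / n_frames
```

Increasing `n_frames` trades display time for smoother reconstructions, which is the partially coherent, time-multiplexed regime the fast SLMs in the abstract enable.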
Video "Time-multiplexed Neural Holography | SIGGRAPH 2022" from the Stanford Computational Imaging Lab channel
Video information
Published: May 4, 2022, 21:00:25
Duration: 00:06:26
Other videos on this channel
Holographic Near-Eye Displays Based on Overlap-Add Stereograms
Eccentricity-dependent Spatio-temporal Flicker Fusion for Foveated Graphics | SIGGRAPH 2021
Semantic Implicit Neural Scene Representations with Semi-supervised Training | 3DV 2020
Gaze-contingent Stereo Rendering for VR/AR | SIGGRAPH Asia 2020
Partially-coherent Neural Holography | Science Advances 2021
EE267 Getting Started with Unity
DeepVoxels: Learning Persistent 3D Feature Embeddings
Homework 3: Foveated Rendering, Depth of Field, Anaglyph Stereo Rendering
Time Multiplexed Coded Aperture Imaging | ICCV 2021
Learning to Solve PDE-constrained Inverse Problems with Graph Networks | ICML 2022
Fast Training of Neural Lumigraph Representations using Meta Learning | NeurIPS 2021
PixelRNN | CVPR 2024
Neural Holography | SIGGRAPH 2020 ETech - 3 min overview
Towards Transient Imaging at Interactive Rates with Single-photon Detectors | ICCP 2018
Factored Occlusion AR
Stanford Computational Imaging Lab - Overview 06/2020
Gaze Contingent Stereo Rendering | SIGGRAPH Asia 2020
Homework 5: Inertial Measurement Units and Sensor Fusion
Neural Holography | SIGGRAPH 2020 ETech - 15 min tech talk
Factored Occlusion AR | IEEE VR 2020