Event-based Near-eye Gaze Tracking at 10,000 Hz | IEEE VR 2021
Project website: https://www.computationalimaging.org/publications/event-based-eye-tracking/
The cameras in modern gaze-tracking systems suffer from fundamental bandwidth and power limitations, realistically constraining data acquisition speed to about 300 Hz. This obstructs the use of mobile eye trackers for tasks such as low-latency predictive rendering, or for studying quick and subtle eye motions like microsaccades with head-mounted devices in the wild. Here, we propose a hybrid frame- and event-based near-eye gaze tracking system that offers update rates beyond 10,000 Hz with an accuracy matching that of high-end desktop-mounted commercial trackers when evaluated under the same conditions. Our system builds on emerging event cameras that simultaneously acquire regularly sampled frames and adaptively sampled events. We develop an online 2D pupil fitting method that updates a parametric model every one or few events. Moreover, we propose a polynomial regressor for estimating the point of gaze from the parametric pupil model in real time. Using the first event-based gaze dataset, we demonstrate that our system achieves accuracies of 0.45–1.75 degrees for fields of view from 45 to 98 degrees. With this technology, we hope to enable a new generation of ultra-low-latency gaze-contingent rendering and display techniques for virtual and augmented reality.
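The abstract describes an online 2D pupil fitting method whose parametric model is refreshed every one or few events. The paper's actual model is more elaborate; as a minimal illustrative sketch (not the authors' implementation), the idea of per-event updates can be shown with a circular pupil model fit in closed form via the linear Kåsa formulation, where each incoming event only accumulates into the normal equations, so re-solving after every event or small batch is cheap. The class name `OnlinePupilFit` and the circle simplification are assumptions for illustration.

```python
import numpy as np

class OnlinePupilFit:
    """Illustrative sketch: incrementally fit a circle (a simplified
    parametric pupil model) to event coordinates.

    Kasa linearization: a boundary event (x, y) on a circle with center
    (cx, cy) and radius r satisfies
        x^2 + y^2 = 2*cx*x + 2*cy*y + k,   k = r^2 - cx^2 - cy^2,
    which is linear in (cx, cy, k). Each event adds one row to the
    least-squares system; we accumulate A^T A and A^T b so the model
    can be re-solved after every one or few events at constant cost.
    """

    def __init__(self):
        self.AtA = np.zeros((3, 3))  # accumulated normal matrix
        self.Atb = np.zeros(3)       # accumulated right-hand side

    def add_event(self, x, y):
        """Fold a single event (pixel coordinate) into the fit."""
        row = np.array([2.0 * x, 2.0 * y, 1.0])
        self.AtA += np.outer(row, row)
        self.Atb += row * (x * x + y * y)

    def solve(self):
        """Return the current (cx, cy, r) estimate.

        Requires at least three non-collinear events accumulated.
        """
        cx, cy, k = np.linalg.solve(self.AtA, self.Atb)
        r = np.sqrt(k + cx * cx + cy * cy)
        return cx, cy, r
```

Because only the 3×3 normal equations are stored, the per-event update is a handful of multiply-adds, which is what makes update rates in the multi-kHz range plausible; a full ellipse model would extend the same idea to more parameters.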
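The second component the abstract names is a polynomial regressor mapping the parametric pupil model to a point of gaze. A common way to realize such a regressor (sketched here under assumptions: quadratic degree, pupil center as the input feature, and hypothetical function names; the paper's exact feature set and degree may differ) is an ordinary least-squares fit over polynomial features collected during a calibration sequence:

```python
import numpy as np

def poly_features(px, py):
    """Quadratic polynomial features of a 2D pupil-center estimate.

    Degree 2 is an assumption for illustration; shape (..., 6).
    """
    px, py = np.asarray(px, dtype=float), np.asarray(py, dtype=float)
    return np.stack(
        [np.ones_like(px), px, py, px * py, px ** 2, py ** 2], axis=-1
    )

def fit_gaze_regressor(pupil_xy, gaze_xy):
    """Least-squares fit of the pupil-to-gaze polynomial map.

    pupil_xy: (N, 2) pupil centers from calibration frames/events.
    gaze_xy:  (N, 2) corresponding known gaze targets.
    Returns a (6, 2) coefficient matrix.
    """
    A = poly_features(pupil_xy[:, 0], pupil_xy[:, 1])
    coef, *_ = np.linalg.lstsq(A, gaze_xy, rcond=None)
    return coef

def predict_gaze(coef, px, py):
    """Evaluate the regressor; cheap enough to run per pupil update."""
    return poly_features(px, py) @ coef
```

At run time the regressor is a single small matrix product per pupil update, so it adds essentially no latency on top of the event-driven pupil fit.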
Video from the Stanford Computational Imaging Lab channel.
Video information
Published: March 22, 2021, 23:46:11
Duration: 00:03:33
Other videos from this channel
Holographic Near-Eye Displays Based on Overlap-Add Stereograms
Eccentricity-dependent Spatio-temporal Flicker Fusion for Foveated Graphics | SIGGRAPH 2021
Semantic Implicit Neural Scene Representations with Semi-supervised Training | 3DV 2020
Gaze-contingent Stereo Rendering for VR/AR | SIGGRAPH Asia 2020
Partially-coherent Neural Holography | Science Advances 2021
EE267 Getting Started with Unity
DeepVoxels: Learning Persistent 3D Feature Embeddings
Homework 3: Foveated Rendering, Depth of Field, Anaglyph Stereo Rendering
Time Multiplexed Coded Aperture Imaging | ICCV 2021
Learning to Solve PDE-constrained Inverse Problems with Graph Networks | ICML 2022
Fast Training of Neural Lumigraph Representations using Meta Learning | NeurIPS 2021
PixelRNN | CVPR 2024
Neural Holography | SIGGRAPH 2020 ETech - 3 min overview
Towards Transient Imaging at Interactive Rates with Single-photon Detectors | ICCP 2018
Factored Occlusion AR
Stanford Computational Imaging Lab - Overview 06/2020
Gaze Contingent Stereo Rendering | SIGGRAPH Asia 2020
Homework 5: Inertial Measurement Units and Sensor Fusion
Time-multiplexed Neural Holography | SIGGRAPH 2022
Neural Holography | SIGGRAPH 2020 ETech - 15 min tech talk
Factored Occlusion AR | IEEE VR 2020