Real-time Visual-Inertial Odometry for Event Cameras using Keyframe-based Nonlinear Optimization
Event cameras are bio-inspired vision sensors that output pixel-level brightness changes instead of standard intensity frames. They offer significant advantages over standard cameras, namely a very high dynamic range, no motion blur, and a latency in the order of microseconds. We propose a novel, accurate tightly-coupled visual-inertial odometry pipeline for such cameras that leverages their outstanding properties to estimate the camera ego-motion in challenging conditions, such as high-speed motion or high dynamic range scenes. The method tracks a set of features (extracted on the image plane) through time. To achieve that, we consider events in overlapping spatio-temporal windows and align them using the current camera motion and scene structure, yielding motion-compensated event frames. We then combine these feature tracks in a keyframe-based, visual-inertial odometry algorithm based on nonlinear optimization to estimate the camera’s 6-DOF pose, velocity, and IMU biases. The proposed method is evaluated quantitatively on the public Event Camera Dataset and significantly outperforms the state-of-the-art, while being computationally much more efficient: our pipeline can run much faster than real-time on a laptop and even on a smartphone processor. Furthermore, we demonstrate qualitatively the accuracy and robustness of our pipeline on a large-scale dataset, and an extremely high-speed dataset recorded by spinning an event camera on a leash at 850 deg/s.
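To give a concrete picture of the event-alignment step, below is a minimal Python/NumPy sketch of building a motion-compensated event frame. It is an illustration under simplifying assumptions, not the paper's implementation: it warps each event to a reference time assuming purely rotational motion with a known constant angular velocity, whereas the actual pipeline uses the full estimated camera motion and scene structure. All function and variable names here are hypothetical.

```python
import numpy as np

def motion_compensated_frame(events, t_ref, omega, K, K_inv, shape):
    """Accumulate events into a motion-compensated frame.

    Illustrative sketch only: events are warped to the reference time
    t_ref assuming purely rotational motion with constant angular
    velocity `omega` (rad/s). The actual pipeline warps events using
    the full estimated camera motion and scene structure.

    events : (N, 4) array of (x, y, t, polarity)
    K, K_inv : 3x3 camera intrinsics matrix and its inverse
    shape : (height, width) of the output frame
    """
    frame = np.zeros(shape, dtype=np.float32)
    for x, y, t, p in events:
        dt = t_ref - t
        # First-order (small-angle) rotation over dt: R = I + [omega*dt]_x
        w = omega * dt
        R = np.array([[1.0, -w[2],  w[1]],
                      [w[2],  1.0, -w[0]],
                      [-w[1], w[0],  1.0]])
        # Back-project the event pixel, rotate, and re-project.
        ray = R @ (K_inv @ np.array([x, y, 1.0]))
        u = K @ (ray / ray[2])
        xi, yi = int(round(u[0])), int(round(u[1]))
        if 0 <= yi < shape[0] and 0 <= xi < shape[1]:
            frame[yi, xi] += 1.0  # or += p to keep polarity
    return frame
```

Feature tracking can then operate on such frames as if they were conventional images: the better the motion estimate used for warping, the sharper the accumulated frame.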
Reference:
H. Rebecq, T. Horstschaefer, D. Scaramuzza, "Real-time Visual-Inertial Odometry for Event Cameras using Keyframe-based Nonlinear Optimization," British Machine Vision Conference (BMVC), 2017.
PDF: http://rpg.ifi.uzh.ch/docs/BMVC17_Rebecq.pdf
Our research page on event-based vision:
http://rpg.ifi.uzh.ch/research_dvs.html
For event camera datasets and an event camera simulator, see: http://rpg.ifi.uzh.ch/davis_data.html
Other resources on event cameras (publications, software, drivers, where to buy, etc.):
https://github.com/uzh-rpg/event-based_vision_resources
Robotics and Perception Group, University of Zurich, 2017
http://rpg.ifi.uzh.ch/