Real-Time Visual Localisation and Mapping with a Single Camera
In my work over the past five years I have generalised the Simultaneous Localisation and Mapping (SLAM) methodology of sequential probabilistic mapping, developed to enable mobile robots to navigate in unknown environments, to demonstrate real-time 3D motion estimation and visual scene mapping with an agile single camera. Via my MonoSLAM algorithm, a webcam attached to a laptop becomes a low-cost but high-performance position and mapping sensor, which can be used by advanced mobile robots or coupled to a wearable computer for personal localisation. When hard real-time performance (e.g. 30 Hz or more) is required, the limited processing resources of practical computers mean that fundamental issues of uncertainty propagation and selective visual attention must be addressed via the rigorous application of methods from probabilistic inference. The result is an approach which harnesses background knowledge of the scenario with the aim of obtaining maximum value from visual processing. I will present recent advances in information-theoretic active search, mosaicing, feature initialisation and surface estimation, and compare my work with other state-of-the-art approaches to real-time tracking and mapping. Practically, this technology has a host of interesting potential applications in many areas of robotics, wearable computing and augmented reality. The presentation will include a live demonstration.
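The abstract's core idea, propagating uncertainty jointly over the camera and the mapped features with probabilistic inference, is the extended-Kalman-filter machinery underlying MonoSLAM. The following is only a minimal illustrative sketch, not the MonoSLAM formulation itself: a 1-D toy state holding one camera position and one landmark position, with all function names and noise values invented for the example. It shows how repeated relative measurements shrink the landmark's uncertainty even as motion noise keeps injecting uncertainty into the camera state.

```python
import numpy as np

# Toy 1-D SLAM-style EKF sketch (hypothetical, for illustration only).
# State x = [camera position, landmark position].

def predict(x, P, u, Q):
    """Camera moves by u; the landmark is static. Process noise Q grows P."""
    F = np.eye(2)                       # static map, unit-velocity camera model
    x = F @ x + np.array([u, 0.0])
    P = F @ P @ F.T + Q
    return x, P

def update(x, P, z, R):
    """Measure the landmark relative to the camera: z = landmark - camera."""
    H = np.array([[-1.0, 1.0]])         # measurement Jacobian
    y = z - H @ x                       # innovation
    S = H @ P @ H.T + R                 # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)      # Kalman gain
    x = x + (K @ y).ravel()
    P = (np.eye(2) - K @ H) @ P
    return x, P

x = np.array([0.0, 5.0])                # camera at 0; landmark believed near 5
P = np.diag([0.01, 4.0])                # landmark initially poorly known
Q = np.diag([0.05, 0.0])                # motion noise on the camera only
R = np.array([[0.1]])

true_cam, true_lm = 0.0, 5.0
for _ in range(10):
    true_cam += 0.5                     # simulated ground-truth motion
    x, P = predict(x, P, u=0.5, Q=Q)
    x, P = update(x, P, z=np.array([true_lm - true_cam]), R=R)
```

After a few prediction/update cycles the landmark variance `P[1, 1]` collapses from 4.0 to well under 1.0, the correlation structure in `P` being exactly what makes joint camera-and-map filtering work; the real system does this in 3D with many features and an active choice of which to measure.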
Video "Real-Time Visual Localisation and Mapping with a Single Camera" from the Microsoft Research channel.