Scenes of the UM Campus seen through the eyes of a bipedal robot: InEKF LiDAR mapping on Cassie Blue
The pose of Cassie is estimated using an invariant extended Kalman filter (InEKF) that combines inertial, contact, and forward kinematic measurements.** The filter exploits the geometry of the problem (via Lie groups) to eliminate linearization errors and reduce drift.
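The key idea behind the InEKF is that the estimation error is defined as a group element rather than a vector difference, which makes the error dynamics independent of the state trajectory. A minimal numpy sketch (all pose values hypothetical) of the right-invariant error on SE(3) and its defining invariance property:

```python
import numpy as np

def se3(R, p):
    # Build a 4x4 homogeneous SE(3) matrix from rotation R and position p.
    X = np.eye(4)
    X[:3, :3] = R
    X[:3, 3] = p
    return X

def rot_z(theta):
    # Rotation about the z-axis by angle theta (radians).
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

# True pose X and filter estimate Xhat (hypothetical values for illustration).
X    = se3(rot_z(0.30), np.array([1.0, 2.0, 0.9]))
Xhat = se3(rot_z(0.32), np.array([1.1, 1.9, 0.9]))

# Right-invariant error: eta = Xhat * X^{-1}, itself an element of SE(3).
eta = Xhat @ np.linalg.inv(X)

# Invariance: right-multiplying both poses by any common group element G
# (e.g., a change of world frame) leaves eta unchanged, since
# (Xhat G)(X G)^{-1} = Xhat X^{-1}.
G = se3(rot_z(1.0), np.array([5.0, -3.0, 2.0]))
eta_shifted = (Xhat @ G) @ np.linalg.inv(X @ G)
print(np.allclose(eta, eta_shifted))  # True
```

This trajectory-independence of the error is what lets the filter avoid the linearization problems of a standard EKF.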
To create this LiDAR map, the past 10 seconds of data (100 point clouds) are projected into the estimated frame of the sensor. The camera image in the lower-right corner gives you an idea of the lateral sway (roll motion) of the robot. The fact that the LiDAR map is (nearly) perfectly still is a testament to the quality of the pose estimate produced by the InEKF.
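The projection step above amounts to transforming each scan by the InEKF pose estimate at its timestamp and accumulating the results. A minimal sketch, assuming each pose is given as a 4x4 homogeneous matrix (the function name and sample values are hypothetical):

```python
import numpy as np

def project_to_world(points_sensor, T_world_sensor):
    # points_sensor: (N, 3) LiDAR points in the sensor frame.
    # T_world_sensor: 4x4 homogeneous pose of the sensor in the world frame.
    homog = np.hstack([points_sensor, np.ones((len(points_sensor), 1))])
    return (T_world_sensor @ homog.T).T[:, :3]

# Toy example: a single cloud with the sensor translated 1 m along x.
T = np.eye(4)
T[0, 3] = 1.0
cloud = np.array([[0.0, 0.0, 0.0], [2.0, 0.0, 0.0]])
world = project_to_world(cloud, T)
# world -> [[1, 0, 0], [3, 0, 0]]

# A 10-second map is then the concatenation of ~100 such transformed scans,
# one per (cloud, pose) pair from the estimator.
```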
**Ross Hartley, Maani Ghaffari Jadidi, Jessy W. Grizzle, and Ryan M. Eustice, "Contact-Aided Invariant Extended Kalman Filtering for Legged Robot State Estimation," RSS 2018. Paper: https://arxiv.org/abs/1805.10410; source code: https://github.com/RossHartley/invariant-ekf
Sensors: joint encoders, VectorNav VN-100 inertial measurement unit, Velodyne VLP-32C LiDAR
Cassie Blue is walking with control laws developed at Michigan in this publication: Yukai Gong, Ross Hartley, Xingye Da, Ayonga Hereid, Omar Harib, Jiunn-Kai Huang, and Jessy Grizzle, "Feedback Control of a Cassie Bipedal Robot: Walking, Standing, and Riding a Segway," submitted September 2018; available at https://arxiv.org/abs/1809.07279
Why is Cassie walking a bit "gingerly"? We have not yet redesigned the controller to account for the new torso, which adds approximately 10 kg to the robot's 31 kg body. That shows how robust Yukai's controller design is!
Cassie was built by Agility Robotics. The robot's purchase was enabled by funding from NSF Inspire Grant ECCS-1343720 and Toyota Research Institute (TRI). The work on the control law was funded by NSF Grant NRI-1525006.
Video "Scenes of the UM Campus seen through the eyes of a bipedal robot: InEKF LiDAR mapping on Cassie Blue" from the channel Michigan Robotics: Dynamic Legged Locomotion Lab
Video information
Published: November 14, 2018, 17:56:33
Duration: 00:02:01