3D Collision-Force-Map for Safe Human-Robot Collaboration
Svarny, P.; Rozlivek, J.; Rustler, L. & Hoffmann, M. (2021), 3D Collision-Force-Map for Safe Human-Robot Collaboration, in 'IEEE International Conference on Robotics and Automation (ICRA)', pp. 3829-3835.
https://doi.org/10.1109/ICRA48506.2021.9561845
https://arxiv.org/abs/2009.01036
The need to guarantee safety of collaborative robots limits their performance, in particular their speed and hence cycle time. The standard ISO/TS 15066 defines the Power and Force Limiting operation mode and prescribes force thresholds that a moving robot is allowed to exert on human body parts during impact, along with a simple formula to obtain the maximum allowed speed of the robot in the whole workspace. In this work, we measure the forces exerted by two collaborative manipulators (UR10e and KUKA LBR iiwa) moving downward against an impact measuring device. First, we empirically show that the impact forces can vary by more than 100 percent within the robot workspace. The forces are negatively correlated with the distance from the robot base and the height in the workspace. Second, we present a data-driven model, the 3D Collision-Force-Map, that predicts impact forces from distance, height, and velocity, and demonstrate that it can be trained on a limited number of data points. Third, we analyze the force evolution upon impact and find that clamping never occurs for the UR10e. We show that the formulas relating robot mass, velocity, and impact forces from ISO/TS 15066 are insufficient: they lead to both significant underestimation and overestimation of impact forces, and thus to unnecessarily long cycle times or even dangerous applications. We propose an empirical method that can be deployed to quickly determine the optimal speed and position at which a task can be safely performed with maximum efficiency.
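For context, the ISO/TS 15066 speed-limit formula the paper critiques derives from a two-body elastic contact model: the transferable energy E = F²/(2k) is equated with the kinetic energy ½·μ·v², giving v_max = F_max / √(μ·k), where μ is the reduced mass of the human body part and the moving robot. The sketch below is a minimal, illustrative implementation of that standard formula (not the paper's data-driven model); the numerical values (chest force limit 140 N, stiffness 25 kN/m, effective human mass 40 kg, UR10e moving mass ~33.5 kg) are assumptions chosen for illustration, not measurements from the paper.

```python
def max_allowed_speed(f_max, k, m_h, m_robot_moving, payload=0.0):
    """Maximum relative speed per the ISO/TS 15066-style two-body model.

    f_max          -- force threshold for the body region [N]
    k              -- effective spring constant of the body region [N/m]
    m_h            -- effective mass of the human body region [kg]
    m_robot_moving -- total mass of the moving robot parts [kg]
    payload        -- payload mass carried by the robot [kg]
    """
    m_r = m_robot_moving / 2.0 + payload        # effective robot mass (M/2 + m_L)
    mu = 1.0 / (1.0 / m_h + 1.0 / m_r)          # reduced mass of the two-body system
    return f_max / (mu * k) ** 0.5              # v = F / sqrt(mu * k)

# Illustrative chest-contact example (assumed values, see lead-in):
v = max_allowed_speed(f_max=140.0, k=25e3, m_h=40.0, m_robot_moving=33.5)
print(f"max allowed speed ~ {v:.2f} m/s")
```

Note that this formula yields a single speed limit regardless of where in the workspace the contact occurs; the paper's central empirical finding is that actual impact forces vary by more than 100 percent with distance and height, which is precisely why such a position-independent limit both under- and overestimates real forces.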
Video "3D Collision-Force-Map for Safe Human-Robot Collaboration" from the channel Body representations: from humans to humanoids.
Video information: published 26 March 2021, 14:15:15. Duration: 00:02:19.