D. Araiza-Illan and A. de San Bernabé Clemente, "Dynamic Regions to Enhance Safety in Human-Robot Interactions," 2018 IEEE 23rd International Conference on Emerging Technologies and Factory Automation (ETFA), Turin, 2018, pp. 693-698. doi: 10.1109/ETFA.2018.8502453
The adoption of robots for collaborative tasks strongly depends on ensuring operator safety, through both internal sensors for collision detection and protective stops, and external sensors to monitor human presence. External industrial safety sensors (e.g. laser scanners) are expensive, do not distinguish between a trained operator and a bystander, and can increase cycle time considerably, since the robot stops or reduces its speed without taking the behaviour of nearby people into account. We present a dynamic safety solution for human-robot collaboration that tracks human behaviour, based on RGB-D cameras. Our solution updates in real time the stopping and speed-reduction areas of a robot, according to the robot's speed and the spatial relation between the robot and operators or bystanders. The data from a commercial RGB-D camera provides richer information about the people in the space than industrial safety sensors do. The presented solution is first evaluated through a case study of a collaborative robot performing a pick-and-place task in a manufacturing workshop. The solution is then compared with off-the-shelf industrial safety sensors, and finally the system's capabilities are characterized experimentally. The results indicate that the proposed system significantly reduces the total average cycle time of the task compared to traditional industrial safety setups.
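To make the idea of speed-dependent safety regions concrete, the sketch below sizes a stop region and a slow-down region from the robot's current speed, in the spirit of speed-and-separation monitoring. This is an illustrative assumption, not the paper's actual method: the function name, the specific formula, and all parameter values (human walking speed, reaction and stopping times, margin, the factor of two for the slow-down region) are hypothetical choices for the sketch.

```python
# Illustrative sketch (assumed, not the paper's algorithm): a
# speed-and-separation style rule for sizing dynamic stop and
# slow-down regions around a robot.

def dynamic_region_radii(robot_speed, human_speed=1.6,
                         reaction_time=0.1, stop_time=0.5,
                         margin=0.2):
    """Return (stop_radius, slow_radius) in metres.

    Assumed rule: the protective stop distance covers the ground the
    human and the robot can travel during the sensing reaction time
    plus the robot's stopping time, with a fixed safety margin added.
    The slow-down region is taken as twice the stop region, purely as
    an illustrative choice.
    """
    t = reaction_time + stop_time
    stop_radius = human_speed * t + robot_speed * t + margin
    slow_radius = 2.0 * stop_radius
    return stop_radius, slow_radius

# A faster robot gets larger regions; a slow or stopped robot lets
# people approach more closely, which is what shrinks cycle time.
print(dynamic_region_radii(0.25))  # slow robot
print(dynamic_region_radii(1.0))   # faster robot, larger radii
```

In a live system these radii would be recomputed every control cycle from the measured robot speed and the tracked positions of people from the RGB-D data, so the regions grow and shrink as the robot and the humans move.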