Towards a Programming-Free Robotic System for Assembly Tasks Using Intuitive Interactions

Title:
Towards a Programming-Free Robotic System for Assembly Tasks Using Intuitive Interactions
Journal Title:
Lecture Notes in Computer Science
Publication Date:
02 November 2021
Citation:
Gauthier, N., Liang, W., Xu, Q., Fang, F., Li, L., Gao, R., … Lim, J. H. (2021). Towards a Programming-Free Robotic System for Assembly Tasks Using Intuitive Interactions. Lecture Notes in Computer Science, 203–215. doi:10.1007/978-3-030-90525-5_18
Abstract:
Although industrial robots are successfully deployed in many assembly processes, high-mix, low-volume applications are still difficult to automate, as they involve small batches of frequently changing parts. Setting up a robotic system for these tasks requires repeated re-programming by expert users, incurring extra time and costs. In this paper, we present a solution that enables a robot to learn new objects and new tasks from non-expert users without the need for programming. The use case presented here is the assembly of a gearbox mechanism. In the proposed solution, first, the robot autonomously registers new objects using a visual exploration routine and trains a deep learning model for object detection accordingly. Second, the user can teach new tasks to the system via visual demonstration in a natural manner. Finally, using multimodal perception from RGB-D (color and depth) cameras and a tactile sensor, the robot executes the taught tasks, adapting to changing configurations. Depending on the task requirements, it can also activate human-robot collaboration capabilities. In summary, these three main modules enable any non-expert user to configure a robot for new applications quickly and intuitively.
License type:
Publisher Copyright
Funding Info:
This research/project is supported by the A*STAR AME Programmatic Funding Scheme (Grant Reference No. A18A2b0046).
Description:
This is a post-peer-review, pre-copyedit version of an article published in Social Robotics. The final authenticated version is available online at: https://doi.org/10.1007/978-3-030-90525-5_18
ISSN:
1611-3349
0302-9743