Self-Teaching Strategy for Learning to Recognize Novel Objects in Collaborative Robots

Title:
Self-Teaching Strategy for Learning to Recognize Novel Objects in Collaborative Robots
Other Titles:
ICRAI '19: Proceedings of the 2019 5th International Conference on Robotics and Artificial Intelligence
Publication Date:
23 November 2019
Citation:
Abstract:
A collaborative robot (cobot) is designed to be flexibly deployed to different tasks. For a new task, the cobot must be trained to detect and recognize novel objects. With a dominant object detector based on Faster R-CNN, a user has to train it on a large number of manually annotated samples, which is inefficient and expensive. In this paper, we propose a self-teaching strategy that allows a cobot to learn to recognize novel objects efficiently and effectively. As in human-to-human teaching, the user provides only a few examples of a novel object captured by an RGB-D camera. The cobot obtains the ground-truth annotation of the object automatically through depth segmentation. To achieve robust object detection in real-world scenes, it generates augmented training samples by virtually placing the object in various backgrounds with changing scales and orientations (2D augmentation) and by varying the viewpoint through projective transformation (3D augmentation). A state-of-the-art Faster R-CNN is re-trained and evaluated on real-world scenarios for a gearbox assembly task. A comparison with conventional training approaches shows the superiority of the proposed approach in terms of efficiency and robustness for novel object detection.
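
The following is a minimal sketch, not the authors' code, of the 2D augmentation step described in the abstract. It assumes OpenCV and NumPy, and an RGBA object crop whose alpha channel is the depth-segmentation mask: the segmented object is composited onto a background at a random scale and orientation, and the bounding-box annotation is derived automatically from the warped mask.

import cv2
import numpy as np

def augment_2d(obj_rgba, background, rng=np.random.default_rng()):
    # obj_rgba: H x W x 4 uint8 crop, alpha channel = depth-segmentation mask
    # background: Hb x Wb x 3 uint8 scene image (assumed larger than the scaled patch)
    h, w = obj_rgba.shape[:2]
    scale = rng.uniform(0.5, 1.5)
    angle = rng.uniform(0.0, 360.0)

    # Rotate and scale the object about its centre on a canvas large enough
    # to hold any rotation of the scaled patch.
    side = int(np.ceil(np.hypot(h, w) * scale))
    M = cv2.getRotationMatrix2D((w / 2, h / 2), angle, scale)
    M[0, 2] += side / 2 - w / 2
    M[1, 2] += side / 2 - h / 2
    patch = cv2.warpAffine(obj_rgba, M, (side, side))

    rgb = patch[..., :3].astype(np.float32)
    alpha = patch[..., 3:4].astype(np.float32) / 255.0

    # Random placement inside the background.
    Hb, Wb = background.shape[:2]
    assert Hb > side and Wb > side, "background must be larger than the augmented patch"
    x0 = int(rng.integers(0, Wb - side + 1))
    y0 = int(rng.integers(0, Hb - side + 1))

    out = background.copy()
    roi = out[y0:y0 + side, x0:x0 + side].astype(np.float32)
    out[y0:y0 + side, x0:x0 + side] = (alpha * rgb + (1.0 - alpha) * roi).astype(np.uint8)

    # Bounding box from the warped mask: the automatically obtained annotation.
    ys, xs = np.nonzero(patch[..., 3] > 0)
    box = (x0 + int(xs.min()), y0 + int(ys.min()), x0 + int(xs.max()), y0 + int(ys.max()))
    return out, box

The 3D augmentation described in the abstract could be sketched analogously by warping the crop with a random homography (e.g. cv2.warpPerspective) before compositing, to simulate viewpoint changes.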
License type:
Funding Info:
Agency for Science, Technology and Research (A*STAR) under its AME Programmatic Funding Scheme (Project#A18A2b0046).
Description:
The full paper can be downloaded from the publisher's URL here: https://doi.org/10.1145/3373724.3373732
ISBN:
