Discrete Task-Space Automatic Curriculum Learning for Robotic Grasping

Title:
Discrete Task-Space Automatic Curriculum Learning for Robotic Grasping
Journal Title:
2021 21st International Conference on Control, Automation and Systems (ICCAS)
Publication Date:
28 December 2021
Citation:
Kurkcu, A., Acar, C., Campolo, D., & Tee, K. P. (2021). Discrete Task-Space Automatic Curriculum Learning for Robotic Grasping. 2021 21st International Conference on Control, Automation and Systems (ICCAS). doi:10.23919/iccas52745.2021.9649917
Abstract:
Deep reinforcement learning algorithms struggle in the domain of robotics, where data collection is time-consuming and in some cases safety-constrained. Curriculum learning has improved sample efficiency in deep learning-based methods; however, the difficulty lies in generating the curriculum itself, which the field of automatic curriculum learning seeks to address. We present an automatic curriculum learning algorithm for discrete task-space scenarios. Our curriculum generation is based on a difficulty measure between tasks and a learning-progress metric within each task. We apply the algorithm to a grasp-learning problem involving 49 diverse objects. Our results show that a policy trained with the curriculum is more sample-efficient than learning from scratch and can learn tasks that the latter could not learn within a reasonable amount of time.
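The abstract's curriculum generation combines two signals: a difficulty measure between tasks and a learning-progress metric within each task. A minimal sketch of that idea is below; this is a hypothetical illustration under assumed definitions (tasks ordered by a given difficulty score, learning progress measured as the change in mean episode return between two recent windows), not the paper's actual algorithm, and all class, method, and parameter names are invented:

```python
import random


class DiscreteCurriculum:
    """Sketch of learning-progress-driven task selection over a
    discrete task set ordered by a per-task difficulty score.
    (Hypothetical illustration, not the published algorithm.)"""

    def __init__(self, task_difficulties, window=10):
        # task_difficulties: dict mapping task_id -> difficulty score
        # (assumed to be provided by some external difficulty measure).
        self.tasks = sorted(task_difficulties, key=task_difficulties.get)
        self.window = window
        self.returns = {t: [] for t in self.tasks}

    def record(self, task, episode_return):
        # Log the return of one training episode on `task`.
        self.returns[task].append(episode_return)

    def learning_progress(self, task):
        # Compare the mean return of the most recent window against the
        # window before it; large |difference| means the policy is still
        # changing on this task.
        r = self.returns[task][-2 * self.window:]
        if len(r) < 2 * self.window:
            return 1.0  # under-explored tasks get maximal progress
        recent = sum(r[self.window:]) / self.window
        older = sum(r[:self.window]) / self.window
        return abs(recent - older)

    def next_task(self):
        # Sample the next training task proportionally to learning
        # progress, so effort concentrates where the policy is improving.
        weights = [self.learning_progress(t) for t in self.tasks]
        total = sum(weights)
        if total == 0:
            return random.choice(self.tasks)
        return random.choices(self.tasks, weights=weights, k=1)[0]
```

In this sketch, the difficulty measure only fixes the task ordering, while the learning-progress metric drives which task is trained next; the published method may combine the two signals differently.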
License type:
Publisher Copyright
Funding Info:
This research/project is supported by the A*STAR RIE 2020 Advanced Manufacturing and Engineering grant.
Grant Reference no.: A19E4a0101
Description:
© 2021 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.
ISBN:
978-89-93215-21-2