Haddeler, G., Palanivelu, H. P., Ng, Y. C., Colonnier, F., Adiwahono, A. H., Li, Z., Chew, C.-M., & Chuah, M. Y. (2022). Real-time Digital Double Framework to Predict Collapsible Terrains for Legged Robots. 2022 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). https://doi.org/10.1109/IROS47612.2022.9981613
Abstract:
Inspired by digital twinning systems, a novel real-time digital double framework is developed to enhance robot perception of terrain conditions. Built on the same physical model and motion control, this work exploits a simulated digital double synchronized with the real robot to capture and extract discrepancy information between the two systems, which provides high-dimensional cues across multiple physical quantities to represent differences between the modelled and the real world. Soft, non-rigid terrains cause common failures in legged locomotion, and visual perception alone is insufficient for estimating such physical properties of terrain. We used the digital double to estimate terrain collapsibility, addressing this issue through physical interaction during dynamic walking. The discrepancy in sensory measurements between the real robot and its digital double is used as input to a learning-based algorithm for terrain collapsibility analysis. Although trained only in simulation, the learned model performs collapsibility estimation successfully in both simulation and the real world. Our evaluation showed generalization to different scenarios and the advantage of the digital double in reliably detecting nuances in ground conditions.
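As a rough illustration of the discrepancy signal described in the abstract (not the authors' implementation, which the paper does not publish here), the Python sketch below shows one plausible way to stack per-channel residuals between a real robot's sensor stream and its time-synchronized digital double into a feature vector for a learned collapsibility estimator. The channel names, array shapes, and logging interface are all hypothetical.

```python
import numpy as np

# Hypothetical sensor channels; the paper does not enumerate the exact set.
CHANNELS = ["joint_torque", "imu_accel", "foot_contact_force"]

def discrepancy_features(real_log, double_log, channels=CHANNELS):
    """Flatten per-channel (real - simulated) residuals over a time window
    into one feature vector for a learned collapsibility estimator.

    real_log / double_log: dict mapping channel -> (T, d) array,
    time-synchronized between the robot and its digital double.
    """
    residuals = [np.asarray(real_log[ch]) - np.asarray(double_log[ch])
                 for ch in channels]
    return np.concatenate([r.ravel() for r in residuals])

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    T, d = 50, 3  # 50 synchronized timesteps, 3 values per channel
    real = {ch: rng.normal(size=(T, d)) for ch in CHANNELS}
    # On rigid ground the double tracks the robot closely, so residuals
    # are small; on collapsible terrain they would grow.
    double = {ch: real[ch] + rng.normal(scale=0.01, size=(T, d))
              for ch in CHANNELS}
    x = discrepancy_features(real, double)
    print(x.shape)  # (450,) feature vector fed to the trained model
```

In this reading, a model trained purely in simulation would consume such residual vectors, which is consistent with the abstract's claim that the learned estimator transfers to the real world.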
License type:
Publisher Copyright
Funding Info:
This research/project is supported by A*STAR under the Research, Innovation and Enterprise 2020 Plan (RIE2020), Advanced Manufacturing and Engineering (AME) domain.
Grant Reference no.: A1687b0033
This research is also supported by core funding from the Institute for Infocomm Research (I2R), Robotics & Autonomous Systems Department.
Grant Reference no.: