You, Y., Cheong, S., Chen, T. P., Chen, Y., Zhang, K., Acar, C., … Tee, K. P. (2021). State Estimation for Hybrid Wheeled-Legged Robots Performing Mobile Manipulation Tasks. 2021 IEEE International Conference on Robotics and Automation (ICRA). doi:10.1109/icra48506.2021.9560948
This paper introduces a general state estimation framework that fuses information from multiple sensors for hybrid wheeled-legged robots performing mobile manipulation tasks. At its core is a novel unified odometry for hybrid locomotion that seamlessly maintains tracking without needing to switch between stepping and rolling modes; to the best of the authors' knowledge, it is the first odometry of this kind. It is computed from the robot kinematics and the instantaneous wheel contact points, using sensor inputs from an IMU, joint encoders, and joint torque sensors (to estimate wheel contact status), as well as an RGB-D camera that detects geometric features of the terrain (e.g., elevation and surface normal vectors). The odometry output is then used as the motion model of a 3D-lidar map-based Monte Carlo Localization module for drift-free state estimation. The framework also integrates visual localization to provide high-precision guidance for robot movement relative to an object of interest. The approach was verified thoroughly in two experiments on the Pholus robot, with OptiTrack measurements as ground truth.
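To make the pipeline concrete, the sketch below illustrates the general Monte Carlo Localization pattern the paper builds on: a particle filter whose motion update is driven by odometry increments and whose measurement update weights particles against a map. This is a toy illustration, not the authors' implementation; the single-landmark range likelihood is a hypothetical stand-in for the paper's 3D-lidar map matching, and all function names, the landmark position, and the noise parameters are assumptions.

```python
import math
import random

def motion_update(particles, d_x, d_y, d_theta, noise=0.05):
    """Propagate each particle (x, y, theta) by an odometry increment.
    In the paper's framework this increment would come from the unified
    wheeled-legged odometry; here it is just a given (d_x, d_y, d_theta)."""
    moved = []
    for x, y, th in particles:
        x_new = x + d_x * math.cos(th) - d_y * math.sin(th) + random.gauss(0.0, noise)
        y_new = y + d_x * math.sin(th) + d_y * math.cos(th) + random.gauss(0.0, noise)
        moved.append((x_new, y_new, th + d_theta + random.gauss(0.0, noise)))
    return moved

def measurement_weights(particles, observed_range, landmark=(5.0, 0.0), sigma=0.3):
    """Weight particles by agreement between an observed range to a toy
    landmark and the range expected from each particle's pose. A real MCL
    system would instead score a lidar scan against the prior map."""
    weights = []
    for x, y, _ in particles:
        expected = math.hypot(landmark[0] - x, landmark[1] - y)
        err = observed_range - expected
        weights.append(math.exp(-0.5 * (err / sigma) ** 2))
    total = sum(weights) or 1.0
    return [w / total for w in weights]

def resample(particles, weights):
    """Low-variance (systematic) resampling of the particle set."""
    n = len(particles)
    step = 1.0 / n
    r = random.uniform(0.0, step)
    c, i, out = weights[0], 0, []
    for m in range(n):
        u = r + m * step
        while u > c:
            i += 1
            c += weights[i]
        out.append(particles[i])
    return out
```

A typical cycle applies `motion_update` with the latest odometry delta, computes `measurement_weights` from the current observation, and then calls `resample`, so drift accumulated by the odometry motion model is continually corrected by the map-based measurement update.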
No specific funding was received for this research.