Accurate motion estimation plays a crucial role in
state estimation of an unmanned aerial vehicle (UAV). This
is usually carried out by fusing the kinematics of an inertial
measurement unit (IMU) with the video output of a camera.
However, the accuracy of existing approaches is limited by the
discretization of the motion model, even at high IMU sampling
rates. To improve accuracy, we propose a new motion integration
model that treats the IMU kinematics in continuous time,
modelling them as a switched linear system.
A closed-form discrete formulation is derived to compute the
measurement mean, the covariance matrix and the Jacobian
matrix. The model is therefore more accurate and more efficient
for online estimation in visual-inertial odometry (VIO),
particularly when the agent's motion is highly dynamic or the
agent travels at high speed. The proposed IMU factor framework
is evaluated on public real-world datasets and in an indoor
motion-capture environment under different motion scenarios.
Our evaluation shows that the proposed framework outperforms
a state-of-the-art VIO approach, improving accuracy by up to
22.71% on the EuRoC dataset and by 38.15% for motion estimation
in the indoor environment.
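To make the underlying idea concrete, the following is a minimal sketch of standard discrete IMU preintegration, which the paper's closed-form switched-linear model refines. It is not the authors' formulation: between consecutive samples the angular rate and specific force are held constant, so the kinematics reduce to a linear system per interval with a closed-form solution over that interval. The function names (`so3_exp`, `preintegrate`) and the omission of biases, noise and covariance propagation are simplifying assumptions for illustration.

```python
import numpy as np

def skew(w):
    """Skew-symmetric matrix such that skew(w) @ v == np.cross(w, v)."""
    wx, wy, wz = w
    return np.array([[0.0, -wz, wy],
                     [wz, 0.0, -wx],
                     [-wy, wx, 0.0]])

def so3_exp(phi):
    """SO(3) exponential map via Rodrigues' formula."""
    theta = np.linalg.norm(phi)
    S = skew(phi)
    if theta < 1e-8:
        return np.eye(3) + S  # first-order approximation near identity
    return (np.eye(3)
            + (np.sin(theta) / theta) * S
            + ((1.0 - np.cos(theta)) / theta**2) * (S @ S))

def preintegrate(gyro, accel, dt):
    """Propagate preintegrated rotation, velocity and position deltas.

    gyro, accel: sequences of 3-vectors (rad/s, m/s^2), bias- and
    gravity-compensated for simplicity. Each sample interval is treated
    as a linear system with constant input, integrated in closed form.
    """
    dR = np.eye(3)    # rotation delta
    dv = np.zeros(3)  # velocity delta
    dp = np.zeros(3)  # position delta
    for w, a in zip(gyro, accel):
        a_frame = dR @ a
        # closed-form position/velocity update over one constant-input interval
        dp = dp + dv * dt + 0.5 * a_frame * dt**2
        dv = dv + a_frame * dt
        dR = dR @ so3_exp(np.asarray(w) * dt)
    return dR, dv, dp
```

In a full VIO pipeline these deltas would be paired with first-order covariance and bias-Jacobian propagation; the paper's contribution is computing such quantities in closed form for the continuous-time switched-linear model rather than by per-sample Euler discretization.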
License type:
Funding Info:
This work was supported in part by the National
Robotics Programme (NRP) under SERC Grants 162 25 00036 and 192
25 00049.