Bhujel, N., Yun, Y. W., Wang, H., & Dwivedi, V. P. (2021). Self-critical Learning of Influencing Factors for Trajectory Prediction using Gated Graph Convolutional Network. 2021 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). doi:10.1109/iros51168.2021.9636641
Abstract:
Forecasting the future trajectories of multiple pedestrians in a crowded environment is a challenging problem due to the complex interactions among the pedestrians. These interactions can be asymmetric, and their influence may vary over time. Moreover, each pedestrian can exhibit different behavior at any given time and in any given context, so a trajectory can have multiple possible futures. In this work, we present a Gated Graph Convolutional Network (GatedGCN)-based trajectory prediction model that explicitly handles the asymmetric influences among adjacent pedestrians through an edge-wise gating mechanism. With GatedGCN alone, an overall average improvement of 16% and 18% on the two performance metrics was achieved over state-of-the-art trajectory forecasting methods. Next, we tackle the problem of learning the multi-modal nature of each pedestrian trajectory using variational autoencoders (VAEs). Although VAEs have been shown to be powerful for generating a multi-modal trajectory distribution, trajectories sampled from the learned distribution usually ignore influencing factors of pedestrian motion such as collision avoidance and the final destination. While many existing approaches focus on learning such factors during trajectory encoding, we propose a novel self-critical learning approach based on the Actor-Critic framework to learn the influencing factors of pedestrian motion in the trajectory generation process as well. We show empirically that our method produces fewer collisions than existing methods on popular trajectory forecasting benchmarks.
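To make the edge-wise gating idea concrete, here is a minimal sketch of one edge-gated graph convolution step. This is an illustrative reconstruction, not the paper's exact formulation: the gate function, weight shapes, and the tanh update (`W`, `W_gate`, `gated_gcn_layer`) are all assumptions. The key property it demonstrates is that the gate is computed from the ordered pair of node features, so pedestrian j's influence on i can differ from i's influence on j.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gated_gcn_layer(H, A, W, W_gate):
    """One edge-gated graph convolution step (illustrative sketch).

    H      : (N, d) node (pedestrian) features
    A      : (N, N) binary adjacency; A[i, j] = 1 if pedestrian j influences i
    W      : (d, d) shared feature transform
    W_gate : (2 * d, 1) edge-gate parameters
    """
    N, d = H.shape
    H_out = np.zeros_like(H)
    for i in range(N):
        msg = np.zeros(d)
        for j in range(N):
            if A[i, j]:
                # The gate depends on the ordered pair (i, j), so the
                # influence of j on i can differ from that of i on j,
                # capturing asymmetric interactions.
                gate = sigmoid(np.concatenate([H[i], H[j]]) @ W_gate)
                msg += gate.item() * (H[j] @ W)
        H_out[i] = np.tanh(H[i] @ W + msg)
    return H_out
```

In a trajectory model, such a layer would sit between a per-pedestrian motion encoder and the decoder, with the gates modulating how much each neighbor's state contributes at each time step.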
License type:
Publisher Copyright
Funding Info:
This research/project is supported by the Agency for Science, Technology and Research - National Robotics Programme, SERC grant.
Grant Reference No.: 162 25 00036