Efficient motion feature aggregation for optical flow via locality-sensitive hashing

Title:
Efficient motion feature aggregation for optical flow via locality-sensitive hashing
Journal Title:
Neurocomputing
Keywords:
Publication Date:
14 November 2024
Citation:
Chen, W., Liu, Z., Wu, X., Liu, Z., & Li, Z. (2025). Efficient motion feature aggregation for optical flow via locality-sensitive hashing. Neurocomputing, 616, 128870. https://doi.org/10.1016/j.neucom.2024.128870
Abstract:
Optical flow estimation aims to find the 2D motion field by identifying corresponding pixels between two images. With the tremendous progress of deep learning, previous works rely on CNNs to regress the pixelwise correspondence. However, it is still challenging to estimate displacements at occlusions, where a point is imaged in the current frame but not in the next. To address this problem, recent work proposes to aggregate motion features at a global scale. The aggregation allows motion information to be passed from non-occluded pixels to occluded pixels, which helps resolve ambiguities caused by occlusions. Though global motion aggregation (GMA) works well, its computational complexity is quadratic in the input resolution, which limits its applicability at higher resolutions. In this paper, we propose an efficient motion feature aggregation (EMFA) module that approximates the original GMA at a much lower computational cost. The core insight of EMFA originates from a clustering scheme, in which the motion information of a point is aggregated from the points in the same cluster. The clustering is implemented by a traditional hashing algorithm, locality-sensitive hashing (LSH). Finally, we apply the EMFA module to optical flow estimation at high resolution. Experimental results on publicly available datasets show that our method outperforms state-of-the-art approaches.
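The abstract's core idea, replacing quadratic all-pairs attention with aggregation inside LSH clusters, can be illustrated with a minimal NumPy sketch. This is not the authors' implementation; the random-hyperplane hash, the function name `lsh_aggregate`, and mean-pooling within each bucket are all illustrative assumptions standing in for the paper's EMFA module.

```python
import numpy as np

def lsh_aggregate(context, motion, n_bits=8, seed=0):
    """Approximate global motion aggregation via locality-sensitive hashing.

    context: (N, D) appearance/context features used to measure similarity.
    motion:  (N, C) motion features to be aggregated.

    Points whose context features hash to the same bucket (sign pattern
    under random hyperplanes) exchange motion information, so the cost is
    linear in N instead of quadratic as in full attention.
    """
    rng = np.random.default_rng(seed)
    # Random hyperplanes define the LSH family for cosine similarity.
    planes = rng.standard_normal((context.shape[1], n_bits))
    # Sign of each projection gives one hash bit; pack bits into an integer code.
    bits = (context @ planes > 0).astype(np.int64)
    codes = bits @ (1 << np.arange(n_bits))
    out = np.empty_like(motion)
    for b in np.unique(codes):
        idx = codes == b
        # Mean-pool motion features within the cluster (a simple stand-in
        # for attention-weighted aggregation restricted to one bucket).
        out[idx] = motion[idx].mean(axis=0)
    return out
```

Because similar context vectors tend to share all hash bits, occluded pixels inherit motion features from nearby non-occluded pixels of similar appearance, which is the qualitative effect the abstract describes.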
License type:
Attribution-NonCommercial-NoDerivatives 4.0 International (CC BY-NC-ND 4.0)
Funding Info:
No specific funding was received for this research.
Description:
ISSN:
0925-2312
Files uploaded:

File: nc-accepted.pdf (PDF, 2.12 MB)