J. T. Zhou, J. Du, H. Zhu, X. Peng, Y. Liu and R. S. M. Goh, "AnomalyNet: An Anomaly Detection Network for Video Surveillance," in IEEE Transactions on Information Forensics and Security, vol. 14, no. 10, pp. 2537-2550, Oct. 2019, doi: 10.1109/TIFS.2019.2900907.
Abstract:
Sparse coding based anomaly detection has shown promising performance; its key components are feature learning, sparse representation, and dictionary learning. In this work, we propose a new neural network for anomaly detection (termed AnomalyNet) that jointly performs feature learning, sparse representation, and dictionary learning in three neural processing blocks. Specifically, to learn better features, we design a motion fusion block accompanied by a feature transfer block, which together eliminate noisy background, capture motion, and alleviate data deficiency. Furthermore, to address some disadvantages (e.g., non-adaptive updating) of existing sparse coding optimizers and to embrace the merits of neural networks (e.g., parallel computing), we design a novel recurrent neural network that learns the sparse representation and the dictionary by proposing an adaptive iterative hard-thresholding algorithm (adaptive ISTA) and reformulating the adaptive ISTA as a new long short-term memory (LSTM). To the best of our knowledge, this could be one of the first works to bridge the ℓ1-solver and LSTM, and it may provide novel insight into understanding LSTM and model-based optimization (also known as differentiable programming), as well as sparse coding based anomaly detection. Extensive experiments show the state-of-the-art performance of our method on the abnormal event detection task.
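For context on the optimizer the abstract refers to, classical (non-adaptive) ISTA solves the ℓ1-regularized sparse coding problem min_z 0.5·||x − Dz||² + λ·||z||₁ by alternating a gradient step on the quadratic term with elementwise soft-thresholding; the paper's adaptive ISTA and its LSTM reformulation build on this template by making the update state-dependent and learnable. Below is a minimal NumPy sketch of the standard algorithm only, not the authors' implementation; the names (`ista`, `soft_threshold`) and the fixed step size 1/L are illustrative assumptions.

```python
import numpy as np

def soft_threshold(v, tau):
    # Elementwise soft-thresholding: the proximal operator of tau * ||.||_1.
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def ista(x, D, lam=0.1, n_iters=100):
    """Classical ISTA for min_z 0.5*||x - D z||^2 + lam*||z||_1.

    x: (m,) signal, D: (m, k) dictionary; returns sparse code z: (k,).
    This is the standard non-adaptive solver; the paper's adaptive ISTA
    replaces the fixed step/threshold with learned, state-dependent updates.
    """
    L = np.linalg.norm(D, 2) ** 2          # Lipschitz constant of the gradient
    z = np.zeros(D.shape[1])
    for _ in range(n_iters):
        grad = D.T @ (D @ z - x)           # gradient of the quadratic term
        z = soft_threshold(z - grad / L, lam / L)
    return z

# Toy usage: recover a sparse code under a random dictionary.
rng = np.random.default_rng(0)
D = rng.standard_normal((32, 64))
z_true = np.zeros(64)
z_true[[3, 17, 42]] = [1.5, -2.0, 0.8]
x = D @ z_true
z_hat = ista(x, D, lam=0.05, n_iters=500)
print(np.nonzero(np.abs(z_hat) > 0.1)[0])  # indices of the recovered support
```

Each ISTA iteration has the form z ← shrink(W·z + U·x), which is structurally close to a recurrent cell; this resemblance is what motivates unrolling the solver into an LSTM-style network as described in the abstract.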
License type:
Publisher Copyright
Funding Info:
This work was supported in part by the Singapore government's Research, Innovation and Enterprise 2020 plan (Advanced Manufacturing and Engineering domain) through the Programmatic Grant under Grant A1687b0033, in part by the Fundamental Research Funds for the Central Universities under Grant YJ201748, in part by the NSFC under Grant 61806135 and Grant 61876211, in part by the NSFC for Distinguished Young Scholars under Grant 61625204, and in part by the Key Program of the NSFC under Grant 61836006.