Summarization of Egocentric Videos: A Comprehensive Survey

Published in:
IEEE Transactions on Human-Machine Systems
Publication Date:
1 February 2017
Citation:
A. G. del Molino, C. Tan, J. H. Lim and A. H. Tan, "Summarization of Egocentric Videos: A Comprehensive Survey," in IEEE Transactions on Human-Machine Systems, vol. 47, no. 1, pp. 65-76, Feb. 2017. doi: 10.1109/THMS.2016.2623480
Abstract:
The introduction of wearable video cameras (e.g., GoPro) in the consumer market has promoted video life-logging, motivating users to generate large amounts of video data. This increasing flow of first-person video has led to a growing need for automatic video summarization adapted to the characteristics and applications of egocentric video. In this paper, we provide the first comprehensive survey of techniques used specifically to summarize egocentric videos. We present a framework for first-person-view summarization and compare the segmentation methods and selection algorithms used in the related literature. Next, we describe the existing egocentric video datasets suitable for summarization and the various evaluation methods. Finally, we analyze the challenges and opportunities in the field and propose new lines of research.
Copyright:
(c) 2017 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works for resale or redistribution to servers or lists, or reuse of any copyrighted components of this work in other works.