Describing Lifelogs with Convolutional Neural Networks: a Comparative Study

Other Titles:
LTA '16 Proceedings of the first Workshop on Lifelogging Tools and Applications
Publication Date:
16 October 2016
Abstract:
Life-logging technologies, e.g., wearable cameras that take pictures at fixed intervals, can be used as a means of memory preservation (in digital form), caregiver monitoring and even cognitive therapy to train our brains. Yet such a large amount of data needs to be processed and edited to be of use. Automatic summarization of the lifelogs into short storyboards is a possible solution. But how good are these summaries? Are the selected key-frames informative and representative enough to be good memory cues? The proposed approach (i) filters uninformative images by analyzing their ratio of edges and (ii) describes the images using the available Convolutional Neural Network (CNN) models for objects and places with egocentric-driven data augmentation. We perform a comparative study to evaluate different summarization methods in terms of coverage, informativeness and representativeness on two different datasets, both with annotated ground truth and an online user study. Results show that filtering uninformative images improves user satisfaction: users would request to change fewer frames of the original summary than without filtering. Moreover, the proposed egocentric image descriptor generates more diverse content than the standard cropping strategy used by most CNN-based approaches.
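The abstract describes filtering uninformative frames by their "ratio of edges", but does not specify the edge detector or thresholds used. The sketch below is a minimal, hypothetical illustration of the idea: it estimates the fraction of edge pixels via a finite-difference gradient and discards nearly uniform frames (e.g., the camera pointing at a wall or occluded by clothing). The function names and threshold values are assumptions for illustration, not the paper's implementation.

```python
import numpy as np

def edge_ratio(image, grad_threshold=0.2):
    """Fraction of pixels whose gradient magnitude exceeds a threshold.

    `image` is a 2-D grayscale array with values in [0, 1]. The exact
    edge detector used by the paper is unspecified; a simple
    finite-difference gradient stands in here.
    """
    img = image.astype(float)
    gy, gx = np.gradient(img)          # per-axis central differences
    magnitude = np.hypot(gx, gy)       # gradient magnitude per pixel
    return float((magnitude > grad_threshold).mean())

def is_informative(image, min_ratio=0.01):
    """Keep a frame only if it contains enough edge structure.

    `min_ratio` is an illustrative cutoff, not the paper's value.
    """
    return edge_ratio(image) >= min_ratio

# A flat (uninformative) frame vs. a striped (textured) frame.
flat = np.zeros((32, 32))
textured = np.zeros((32, 32))
textured[:, ::4] = 1.0                 # vertical stripes every 4 pixels
```

A summarizer would apply such a filter to the raw image stream before key-frame selection, so that blurred or near-uniform frames never compete for a slot in the storyboard.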
License type:
PublisherCopyrights
Description:
© ACM 2016. This is the author's version of the work. It is posted here for your personal use. Not for redistribution. The definitive Version of Record was published in LTA '16 Proceedings of the first Workshop on Lifelogging Tools and Applications, http://dx.doi.org/10.1145/2983576.2983579.
ISBN:
978-1-4503-4517-0