Unconstrained ego-centric videos with eye-tracking data

Title:
Unconstrained ego-centric videos with eye-tracking data
Journal Title:
IEEE Conference on Computer Vision and Pattern Recognition Workshops
DOI:
Publication URL:
Publication Date:
01 June 2015
Citation:
Abstract:
We present the first eye-tracking dataset for unconstrained ego-centric videos. The dataset captures over 6 hours of subjects performing common daily activities, which are manually annotated as socializing, walking, object manipulating, transiting, and observing.
License type:
Publisher Copyrights
Funding Info:
Reverse Engineering Visual Intelligence for cognitiVe Enhancement (REVIVE) programme funded by the Joint Council Office (JCO) of A*STAR. Grant No: 1335h00098
Description:
(c) 2015 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.
ISBN:

Files uploaded:

18-ma-sunw.pdf (115.49 KB, PDF)