Gesture-based attention direction for a telepresence robot: Design and experimental study

Title:
Gesture-based attention direction for a telepresence robot: Design and experimental study
Journal Title:
2014 IEEE/RSJ International Conference on Intelligent Robots and Systems
Publication Date:
14 September 2014
Citation:
K. P. Tee, R. Yan, Y. Chua, Z. Huang and S. Liemhetcharat, "Gesture-based attention direction for a telepresence robot: Design and experimental study," 2014 IEEE/RSJ International Conference on Intelligent Robots and Systems, Chicago, IL, 2014, pp. 4090-4095. doi: 10.1109/IROS.2014.6943138
Abstract:
The application of robotics to telepresence can enhance the user interaction experience by providing embodiment, engaging behaviors, automatic control, and human perception. This paper presents a new telepresence robot with gesture-based attention direction, which orients the robot towards attention targets according to human deictic gestures. Gesture-based attention direction is realized by combining a Localist Attractor Network (LAN) and Short-Term Memory (STM). We also propose audio-visual fusion based on context-dependent prioritization among three types of audio-visual cues (gesture, speech source location, head location). Experimental results are promising and show that i) the average gesture recognition rate is 92%, ii) the gesture-based attention direction success rate is 90%, and iii) only by considering the three types of audio-visual cues together can the robot perform on par with a human in directing attention to the correct person in a meeting scenario.
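To make the fusion idea in the abstract concrete, the sketch below illustrates one plausible reading of context-dependent prioritization among the three cue types: in a given interaction context, cue types are ranked, and the robot orients toward the best available cue of the highest-ranked type. This is a minimal Python illustration only; the cue names, thresholds, and priority tables are assumptions for exposition, not the authors' implementation.

```python
# Hypothetical sketch of context-dependent prioritization of audio-visual cues.
# Cue types, priority orders, and the 0.5 confidence threshold are illustrative
# assumptions, not taken from the paper.

from dataclasses import dataclass
from typing import List, Optional

# Assumed priority order per interaction context (illustrative only).
PRIORITY_BY_CONTEXT = {
    "meeting":      ["gesture", "speech", "head"],
    "conversation": ["speech", "gesture", "head"],
}


@dataclass
class Cue:
    kind: str            # "gesture", "speech", or "head"
    bearing_deg: float   # direction of the attention target relative to the robot
    confidence: float    # detection confidence in [0, 1]


def select_attention_target(cues: List[Cue], context: str) -> Optional[Cue]:
    """Return the cue the robot should orient toward, or None if nothing is detected."""
    order = PRIORITY_BY_CONTEXT.get(context, ["gesture", "speech", "head"])
    for kind in order:
        candidates = [c for c in cues if c.kind == kind and c.confidence > 0.5]
        if candidates:
            # Among cues of the same type, prefer the most confident detection.
            return max(candidates, key=lambda c: c.confidence)
    return None


if __name__ == "__main__":
    cues = [
        Cue("head", 30.0, 0.9),
        Cue("speech", -45.0, 0.8),
        Cue("gesture", 10.0, 0.7),
    ]
    target = select_attention_target(cues, "meeting")
    print(f"Orient toward {target.kind} cue at {target.bearing_deg:.0f} deg")
```

Under this reading, a pointing gesture would override speech and head cues in a meeting context, which is consistent with the abstract's finding that all three cue types must be considered together for human-level attention direction.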
License type:
Publisher Copyright
Funding Info:
Description:
(c) 2014 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other users, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works for resale or redistribution to servers or lists, or reuse of any copyrighted components of this work in other works.
ISSN:
2153-0858
2153-0866
ISBN:
978-1-4799-6934-0
978-1-4799-6931-9