A Wearable Virtual Guide for Context-Aware Cognitive Indoor Navigation
Title:
A Wearable Virtual Guide for Context-Aware Cognitive Indoor Navigation
Other Titles:
MobileHCI'14
Publication Date:
01 September 2014
Citation:
Qianli Xu, Liyuan Li, Joo Hwee Lim, Cheston Yin Chet Tan, Michal Mukawa, and Gang Wang. 2014. A wearable virtual guide for context-aware cognitive indoor navigation. In Proceedings of the 16th International Conference on Human-Computer Interaction with Mobile Devices & Services (MobileHCI '14). ACM, New York, NY, USA, 111-120.
Abstract:
In this paper, we explore a new way to provide context-aware assistance for indoor navigation using a wearable vision system. We investigate how to represent the cognitive knowledge of wayfinding based on first-person-view videos in real-time and how to provide context-aware navigation instructions in a human-like manner. Inspired by the human cognitive process of wayfinding, we propose a novel cognitive model that represents visual concepts as a hierarchical structure. It facilitates efficient and robust localization based on cognitive visual concepts. Next, we design a prototype system that provides intelligent context-aware assistance based on the cognitive indoor navigation knowledge model. We conduct field tests to evaluate the system's efficacy by benchmarking it against traditional 2D maps and human guidance. The results show that context-awareness built on cognitive visual perception enables the system to emulate the efficacy of a human guide, leading to positive user experience.
License type:
Publisher Copyrights
ISBN:
978-1-4503-3004-6
Files uploaded:

mhci0246-xu.pdf (1.73 MB, PDF)