WiHGR: A Robust WiFi-Based Human Gesture Recognition System via Sparse Recovery and Modified Attention-based BGRU

Title:
WiHGR: A Robust WiFi-Based Human Gesture Recognition System via Sparse Recovery and Modified Attention-based BGRU
Journal Title:
IEEE Internet of Things Journal
Publication Date:
25 October 2021
Citation:
Meng, W., Chen, X., Cui, W., & Guo, J. (2021). WiHGR: A Robust WiFi-Based Human Gesture Recognition System via Sparse Recovery and Modified Attention-based BGRU. IEEE Internet of Things Journal, 1–1. https://doi.org/10.1109/jiot.2021.3122435
Abstract:
Gesture recognition is an essential part of human-computer interaction (HCI) and Internet of Things (IoT) systems. Compared with existing technologies based on wearable sensors and dedicated devices, approaches using WiFi channel state information (CSI) are more desirable for passive and fine-grained gesture recognition. However, existing CSI-based gesture recognition systems usually suffer from high model complexity and low accuracy caused by environmental dynamics. To address these issues, we propose a robust gesture recognition system (WiHGR) in this paper. WiHGR starts with a sparse recovery method that identifies the dominant paths, i.e., the main propagation paths disturbed by a human gesture, from the multipath profile observed in the orthogonal frequency division multiplexing (OFDM) CSI. A phase difference matrix is then constructed from the phase differences between adjacent receiving antennas along the dominant paths. We propose a modified attention-based bi-directional gated recurrent unit (ABGRU) network to automatically learn and extract discriminative features from the phase difference matrix. The proposed attention mechanism assigns higher weights to the more important features, thus achieving better recognition performance. Experimental results show that WiHGR not only achieves high accuracy for gesture recognition in the training environment, but also performs remarkably well in new environments without retraining.
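The sketch below illustrates the two learning-related steps described in the abstract: forming a phase difference matrix between adjacent receive antennas and classifying it with an attention-pooled bidirectional GRU. It is a minimal illustration, not the authors' implementation: the CSI shapes, layer sizes, and the simple additive attention pooling are assumptions, and the sparse recovery step for isolating dominant paths is omitted.

```python
# Illustrative sketch (not the WiHGR code): phase-difference matrix from CSI
# followed by an attention-based bidirectional GRU classifier.
# All shapes and hyperparameters below are assumptions for demonstration.
import numpy as np
import torch
import torch.nn as nn


def phase_difference_matrix(csi: np.ndarray) -> np.ndarray:
    """csi: complex array of shape (time, rx_antennas, subcarriers).

    Returns phase differences between adjacent receive antennas,
    shape (time, rx_antennas - 1, subcarriers)."""
    phase = np.unwrap(np.angle(csi), axis=-1)      # per-subcarrier phase
    return phase[:, 1:, :] - phase[:, :-1, :]      # adjacent-antenna difference


class AttentionBiGRU(nn.Module):
    """Bidirectional GRU whose hidden states are pooled by a learned
    attention layer that emphasizes the more informative time steps."""

    def __init__(self, n_features: int, n_hidden: int, n_classes: int):
        super().__init__()
        self.gru = nn.GRU(n_features, n_hidden, batch_first=True,
                          bidirectional=True)
        self.attn = nn.Linear(2 * n_hidden, 1)     # scalar score per time step
        self.fc = nn.Linear(2 * n_hidden, n_classes)

    def forward(self, x):                          # x: (batch, time, features)
        h, _ = self.gru(x)                         # (batch, time, 2 * hidden)
        w = torch.softmax(self.attn(h), dim=1)     # attention weights over time
        context = (w * h).sum(dim=1)               # weighted sum of hidden states
        return self.fc(context)                    # class logits


if __name__ == "__main__":
    # Synthetic CSI: 200 time samples, 3 receive antennas, 30 subcarriers.
    csi = np.random.randn(200, 3, 30) + 1j * np.random.randn(200, 3, 30)
    pdm = phase_difference_matrix(csi)             # (200, 2, 30)
    x = torch.tensor(pdm.reshape(1, 200, -1), dtype=torch.float32)
    model = AttentionBiGRU(n_features=x.shape[-1], n_hidden=64, n_classes=6)
    print(model(x).shape)                          # torch.Size([1, 6])
```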
License type:
Publisher Copyright
Funding Info:
There was no specific funding for this research.
Description:
© 2021 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.
ISSN:
2372-2541
2327-4662