A multimodal human-robot sign language interaction framework applied in social robots
Li, Jie; Zhong, Junpei; Wang, Ning
Abstract
Deaf-mute people face many difficulties in daily interactions with hearing people through spoken language. Sign language is an important means of expression and communication for the deaf-mute community, so breaking the communication barrier between the deaf-mute and hearing communities is significant for facilitating their integration into society. To help them integrate into social life better, we propose a multimodal Chinese sign language (CSL) gesture interaction framework based on social robots. CSL gesture information, covering both static and dynamic gestures, is captured by two sensors of different modalities: a wearable Myo armband collects surface electromyography (sEMG) signals from the human arm, and a Leap Motion sensor collects 3D hand vectors. The two modalities of gesture data are preprocessed and fused before being sent to the classifier, which improves recognition accuracy and reduces the processing time of the network. Since the inputs to the proposed framework are temporal gesture sequences, a long short-term memory (LSTM) recurrent neural network is used to classify them. Comparative experiments are performed on a NAO robot to evaluate our method. The results show that our method effectively improves CSL gesture recognition accuracy and has potential applications in a variety of gesture interaction scenarios beyond social robots.
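The abstract does not give implementation details, but the described pipeline (feature-level fusion of sEMG and Leap Motion sequences, classified by an LSTM) can be sketched as follows. This is a minimal illustration, not the authors' code: the feature dimensions, class count, and network sizes below are assumptions chosen for the example.

```python
import torch
import torch.nn as nn

# Assumed feature sizes (not taken from the paper): the Myo armband
# provides 8 sEMG channels per time step; 63 Leap Motion features
# (21 hand keypoints x 3D) and 10 gesture classes are placeholders.
SEMG_DIM = 8
LEAP_DIM = 63
NUM_CLASSES = 10

class FusedGestureLSTM(nn.Module):
    """LSTM classifier over feature-level fused sEMG + Leap Motion sequences."""

    def __init__(self, hidden_size: int = 128, num_layers: int = 2):
        super().__init__()
        self.lstm = nn.LSTM(
            input_size=SEMG_DIM + LEAP_DIM,
            hidden_size=hidden_size,
            num_layers=num_layers,
            batch_first=True,
        )
        self.fc = nn.Linear(hidden_size, NUM_CLASSES)

    def forward(self, semg: torch.Tensor, leap: torch.Tensor) -> torch.Tensor:
        # semg: (batch, T, SEMG_DIM); leap: (batch, T, LEAP_DIM).
        # Both streams are assumed resampled to a common length T.
        fused = torch.cat([semg, leap], dim=-1)  # feature-level fusion
        out, _ = self.lstm(fused)                # (batch, T, hidden_size)
        return self.fc(out[:, -1, :])            # classify from last time step

# Usage with random stand-in data:
model = FusedGestureLSTM()
semg = torch.randn(4, 50, SEMG_DIM)  # batch of 4 sequences, 50 time steps
leap = torch.randn(4, 50, LEAP_DIM)
logits = model(semg, leap)           # (4, NUM_CLASSES)
```

Fusing the two streams before the recurrent layer, as here, lets a single LSTM model cross-modal temporal dependencies; the paper's actual preprocessing and fusion strategy may differ.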
| Journal Article Type | Article |
| --- | --- |
| Acceptance Date | Mar 20, 2023 |
| Online Publication Date | Apr 11, 2023 |
| Publication Date | Apr 11, 2023 |
| Deposit Date | May 3, 2023 |
| Publicly Available Date | May 3, 2023 |
| Journal | Frontiers in Neuroscience |
| Print ISSN | 1662-4548 |
| Electronic ISSN | 1662-453X |
| Publisher | Frontiers Media |
| Peer Reviewed | Peer Reviewed |
| Volume | 17 |
| Article Number | 1168888 |
| DOI | https://doi.org/10.3389/fnins.2023.1168888 |
| Keywords | Neuroscience, social robots, sign language, gesture recognition, multimodal sensors, human-robot interaction |
| Public URL | https://uwe-repository.worktribe.com/output/10723766 |
| Publisher URL | https://www.frontiersin.org/articles/10.3389/fnins.2023.1168888/full |
Files

A multimodal human-robot sign language interaction framework applied in social robots (PDF, 5.1 MB)

Licence: http://creativecommons.org/licenses/by/4.0/
Publisher Licence URL: http://creativecommons.org/licenses/by/4.0/