Abstract
Hearing-impaired communities around the world communicate via sign languages. The focus of this work is to develop a human-robot interaction system that can act as a sign language interpreter in public places. To this end, we combine several technologies: depth sensors (a Leap Motion sensor and a Microsoft Kinect), the humanoid robots NAO and Pepper, and deep learning approaches for classification.
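The abstract does not specify the model, but a minimal sketch of the kind of pipeline it describes is shown below: a recurrent network classifying sign gestures from per-frame hand-keypoint vectors such as those a Leap Motion sensor reports. All names, dimensions, and the architecture here are illustrative assumptions, not the authors' method.

```python
# Illustrative sketch only: a sequence classifier over depth-sensor
# hand tracking, standing in for the unspecified deep learning model.
import torch
import torch.nn as nn

class SignClassifier(nn.Module):
    """LSTM over per-frame hand-keypoint vectors -> sign-class logits."""
    def __init__(self, n_features=63, hidden=128, n_classes=20):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_classes)

    def forward(self, x):          # x: (batch, frames, n_features)
        _, (h, _) = self.lstm(x)   # h: (num_layers, batch, hidden)
        return self.head(h[-1])    # logits: (batch, n_classes)

# Example: a batch of 4 clips, 30 frames each, with 63 features per
# frame (assumed here as 21 hand joints x 3 coordinates per hand).
model = SignClassifier()
clips = torch.randn(4, 30, 63)
logits = model(clips)
print(logits.shape)  # torch.Size([4, 20])
```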
| Original language | English |
|---|---|
| Title of host publication | HRI 2018 - Companion of the 2018 ACM/IEEE International Conference on Human-Robot Interaction |
| Publisher | IEEE Computer Society |
| Number of pages | 1 |
| ISBN (Electronic) | 9781450356152 |
| DOIs | |
| Publication status | Published - Mar 1 2018 |
| Event | 13th Annual ACM/IEEE International Conference on Human Robot Interaction, HRI 2018 - Chicago, United States. Duration: Mar 5 2018 → Mar 8 2018 |
Conference
| Conference | 13th Annual ACM/IEEE International Conference on Human Robot Interaction, HRI 2018 |
|---|---|
| Country | United States |
| City | Chicago |
| Period | 3/5/18 → 3/8/18 |
Keywords
- human-robot interaction
- sign language recognition
- social robotics
ASJC Scopus subject areas
- Artificial Intelligence
- Human-Computer Interaction
- Electrical and Electronic Engineering