Abstract
Deaf-mute communities around the world need an effective human-robot interaction system that can act as an interpreter in public places such as banks, hospitals, and police stations. The focus of this work is to address the communication challenges faced by hearing-impaired people by developing an interpreting robotic system for effective communication in public places. To this end, we utilize a previously developed neural network-based learning architecture to recognize the Cyrillic manual alphabet, which is used for fingerspelling in Kazakhstan. To train and test the recognition system, we collected a depth dataset from ten people and applied it to a learning-based gesture recognition method that models motion data. We report an average accuracy of 77.2% for recognition of the complete 33-letter alphabet.
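The abstract does not describe the authors' actual network architecture or preprocessing pipeline, so the following is only a minimal sketch of one plausible setup: a softmax classifier over temporally pooled depth-frame features for a 33-class fingerspelling alphabet. All names, shapes, and the pooling scheme (`NUM_LETTERS`, `FRAME_SIZE`, `pool_sequence`) are illustrative assumptions, not the paper's method.

```python
# Minimal sketch (NOT the authors' architecture): classify fingerspelled
# letters from short depth-frame sequences with a softmax classifier.
# All shapes and names below are illustrative assumptions.
import numpy as np

NUM_LETTERS = 33          # size of the Cyrillic manual alphabet (from the paper)
FRAME_SIZE = 32 * 32      # assumed downsampled depth frame, flattened

def pool_sequence(frames: np.ndarray) -> np.ndarray:
    """Collapse a (T, FRAME_SIZE) depth sequence into one feature vector:
    the temporal mean concatenated with the mean absolute frame-to-frame
    difference (a crude stand-in for 'modeling motion data')."""
    mean = frames.mean(axis=0)
    motion = (np.abs(np.diff(frames, axis=0)).mean(axis=0)
              if len(frames) > 1 else np.zeros_like(mean))
    return np.concatenate([mean, motion])

def softmax(z: np.ndarray) -> np.ndarray:
    z = z - z.max(axis=1, keepdims=True)   # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def train(X: np.ndarray, y: np.ndarray, epochs: int = 200, lr: float = 0.1) -> np.ndarray:
    """Multinomial logistic regression fitted by batch gradient descent."""
    W = np.zeros((X.shape[1], NUM_LETTERS))
    Y = np.eye(NUM_LETTERS)[y]                  # one-hot labels
    for _ in range(epochs):
        P = softmax(X @ W)
        W -= lr * X.T @ (P - Y) / len(X)        # cross-entropy gradient step
    return W

def predict(W: np.ndarray, X: np.ndarray) -> np.ndarray:
    return softmax(X @ W).argmax(axis=1)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Synthetic stand-in data: 10 signers x 33 letters, ~15 frames per gesture.
    seqs = [rng.random((15, FRAME_SIZE)) + 0.3 * (label / NUM_LETTERS)
            for label in range(NUM_LETTERS) for _ in range(10)]
    labels = np.repeat(np.arange(NUM_LETTERS), 10)
    X = np.stack([pool_sequence(s) for s in seqs])
    W = train(X, labels)
    acc = (predict(W, X) == labels).mean()
    print(f"training accuracy on synthetic data: {acc:.3f}")
```

In a real pipeline the synthetic arrays would be replaced by recorded depth sequences, and a stronger sequence model would likely be needed to reach accuracies like the 77.2% reported in the paper.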
| Original language | English |
| --- | --- |
| Title of host publication | FS-16-01 |
| Subtitle of host publication | Artificial Intelligence for Human-Robot Interaction; FS-16-02: Cognitive Assistance in Government and Public Sector Applications; FS-16-03: Cross-Disciplinary Challenges for Autonomous Systems; FS-16-04: Privacy and Language Technologies; FS-16-05: Shared Autonomy in Research and Practice |
| Publisher | AI Access Foundation |
| Pages | 94-99 |
| Number of pages | 6 |
| Volume | FS-16-01 - FS-16-05 |
| ISBN (Electronic) | 9781577357759 |
| Publication status | Published - Jan 1 2016 |
| Event | 2016 AAAI Fall Symposium, Arlington, United States; Duration: Nov 17 2016 → Nov 19 2016 |
Other
| Other | 2016 AAAI Fall Symposium |
| --- | --- |
| Country/Territory | United States |
| City | Arlington |
| Period | 11/17/16 → 11/19/16 |
ASJC Scopus subject areas
- General Engineering