SLIRS: Sign language interpreting system for human-robot interaction

Nazgul Tazhigaliyeva, Yerniyaz Nurgabylov, German I. Parisi, Anara Sandygulova

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

4 Citations (Scopus)


Deaf-mute communities around the world need an effective human-robot interaction system that can act as an interpreter in public places such as banks, hospitals, and police stations. This work addresses the challenges faced by hearing-impaired people by developing an interpreting robotic system for effective communication in public places. To this end, we utilize a previously developed neural-network-based learning architecture to recognize the Cyrillic manual alphabet, which is used for fingerspelling in Kazakhstan. To train and test the recognition system, we collected a depth data set from ten people and applied a learning-based gesture recognition method that models motion data. Our results show an average accuracy of 77.2% for recognition of the complete 33-letter alphabet.
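The paper's neural-network pipeline is not reproduced in this record. As a purely illustrative sketch of the general idea of classifying fingerspelled letters from motion trajectories (all names and data below are synthetic assumptions, not from the paper), a minimal nearest-neighbour classifier over fixed-length hand trajectories might look like:

```python
# Toy sketch: nearest-neighbour classification of motion trajectories.
# The actual SLIRS system uses a neural-network architecture on depth
# data; this stand-in only illustrates matching an observed trajectory
# against per-letter templates.
import math

def seq_distance(a, b):
    """Mean Euclidean distance between two equal-length 2-D trajectories."""
    return sum(math.dist(p, q) for p, q in zip(a, b)) / len(a)

def classify(sample, templates):
    """Return the label whose template trajectory is closest to `sample`."""
    return min(templates, key=lambda label: seq_distance(sample, templates[label]))

# Synthetic 'letter' templates: short hand trajectories, one (x, y) per frame.
templates = {
    "A": [(0.0, 0.0), (0.1, 0.1), (0.2, 0.2)],
    "B": [(0.0, 0.0), (0.0, 0.2), (0.0, 0.4)],
}
observed = [(0.01, 0.0), (0.11, 0.09), (0.19, 0.21)]
print(classify(observed, templates))  # closest to template "A"
```

A real system would replace the hand-crafted distance with a learned model trained on depth sequences, as the paper does.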

Original language: English
Title of host publication: FS-16-01
Subtitle of host publication: Artificial Intelligence for Human-Robot Interaction; FS-16-02: Cognitive Assistance in Government and Public Sector Applications; FS-16-03: Cross-Disciplinary Challenges for Autonomous Systems; FS-16-04: Privacy and Language Technologies; FS-16-05: Shared Autonomy in Research and Practice
Publisher: AI Access Foundation
Number of pages: 6
Volume: FS-16-01 - FS-16-05
ISBN (Electronic): 9781577357759
Publication status: Published - Jan 1 2016
Event: 2016 AAAI Fall Symposium - Arlington, United States
Duration: Nov 17 2016 - Nov 19 2016


Other: 2016 AAAI Fall Symposium
Country/Territory: United States

ASJC Scopus subject areas

  • Engineering (all)

