Towards Interpreting Robotic System for Fingerspelling Recognition in Real Time

Nazerke Kalidolda, Anara Sandygulova

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Hearing-impaired communities around the world communicate via sign languages. The focus of this work is to develop an interpreting human-robot interaction system that could act as a sign language interpreter in public places. This paper presents ongoing work that aims to recognize fingerspelling gestures in real time. To this end, we utilize a deep learning method to classify the 33 gestures used for fingerspelling by the local deaf community. To train and test the recognition system, we utilize a previously collected dataset of RGB-D motion data for the 33 manual gestures. With this deep learning method, we achieved an average offline accuracy of 75% for recognition of the complete alphabet; in real time, accuracy was only 24.72%. In addition, we integrated a form of auto-correction to perform spell-checking on the recognized letters. Of 35 tested words, four were recognized correctly (11.4%). Finally, we conducted an exploratory study in which ten deaf individuals interacted with our sign language interpreting robotic system.
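The abstract outlines a two-stage pipeline: a deep learning classifier over RGB-D frames, followed by auto-correction of the recognized letter sequence. The paper's actual architecture and dictionary are not given on this page, so the two Python sketches below are illustrative only. First, a hypothetical 33-way CNN over 4-channel (RGB-D) frames, assuming 64x64 inputs; all layer sizes are placeholders, not the authors' model.

import torch
import torch.nn as nn

class FingerspellingCNN(nn.Module):
    """Illustrative 33-class classifier; not the authors' model."""
    def __init__(self, num_classes: int = 33):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(4, 16, kernel_size=3, padding=1),  # 4 channels: RGB + depth
            nn.ReLU(),
            nn.MaxPool2d(2),                             # 64x64 -> 32x32
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),                             # 32x32 -> 16x16
        )
        self.classifier = nn.Linear(32 * 16 * 16, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x).flatten(1))

logits = FingerspellingCNN()(torch.randn(1, 4, 64, 64))  # smoke test
print(logits.shape)  # torch.Size([1, 33])

Second, one plausible reading of the "form of auto-correction" mentioned in the abstract: match the recognized letter sequence against a lexicon by Levenshtein distance. The lexicon and function names here are hypothetical, not from the paper.

def levenshtein(a: str, b: str) -> int:
    """Dynamic-programming edit distance between two strings."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(prev[j] + 1,                 # deletion
                            curr[j - 1] + 1,             # insertion
                            prev[j - 1] + (ca != cb)))   # substitution
        prev = curr
    return prev[-1]

def autocorrect(recognized: str, lexicon: list[str]) -> str:
    """Return the lexicon word closest to the recognized letters."""
    return min(lexicon, key=lambda w: levenshtein(recognized, w))

print(autocorrect("helo", ["hello", "robot", "sign"]))   # -> hello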

Original language: English
Title of host publication: HRI 2018 - Companion of the 2018 ACM/IEEE International Conference on Human-Robot Interaction
Publisher: IEEE Computer Society
Pages: 141-142
Number of pages: 2
ISBN (Electronic): 9781450356152
DOI: 10.1145/3173386.3177085
Publication status: Published - Mar 1, 2018
Event: 13th Annual ACM/IEEE International Conference on Human Robot Interaction, HRI 2018 - Chicago, United States
Duration: Mar 5, 2018 - Mar 8, 2018

Conference

Conference: 13th Annual ACM/IEEE International Conference on Human Robot Interaction, HRI 2018
Country: United States
City: Chicago
Period: 3/5/18 - 3/8/18

Fingerprint

  • Robotics
  • Human robot interaction
  • Audition
  • Deep learning

Keywords

  • human-robot interaction
  • sign language recognition
  • social robotics

ASJC Scopus subject areas

  • Artificial Intelligence
  • Human-Computer Interaction
  • Electrical and Electronic Engineering

Cite this

Kalidolda, N., & Sandygulova, A. (2018). Towards Interpreting Robotic System for Fingerspelling Recognition in Real Time. In HRI 2018 - Companion of the 2018 ACM/IEEE International Conference on Human-Robot Interaction (pp. 141-142). IEEE Computer Society. https://doi.org/10.1145/3173386.3177085
