Towards Sign Language Interpreting Robotic System

Nazerke Kalidolda, Anara Sandygulova

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Hearing-impaired communities around the world communicate via sign languages. The focus of this work is to develop an interpreting human-robot interaction system that could act as a sign language interpreter in public places. To this end, we utilize a number of technologies: depth cameras (a Leap Motion sensor and a Microsoft Kinect), the humanoid robots NAO and Pepper, and deep learning approaches for classification.

Original language: English
Title of host publication: HRI 2018 - Companion of the 2018 ACM/IEEE International Conference on Human-Robot Interaction
Publisher: IEEE Computer Society
Number of pages: 1
ISBN (Electronic): 9781450356152
DOIs
Publication status: Published - Mar 1 2018
Event: 13th Annual ACM/IEEE International Conference on Human Robot Interaction, HRI 2018 - Chicago, United States
Duration: Mar 5 2018 - Mar 8 2018

Conference

Conference: 13th Annual ACM/IEEE International Conference on Human Robot Interaction, HRI 2018
Country: United States
City: Chicago
Period: 3/5/18 - 3/8/18

Keywords

  • human-robot interaction
  • sign language recognition
  • social robotics

ASJC Scopus subject areas

  • Artificial Intelligence
  • Human-Computer Interaction
  • Electrical and Electronic Engineering
