TY - GEN
T1 - Real-time gesture recognition for the high-level teleoperation interface of a mobile manipulator
AU - Khassanov, Yerbolat
AU - Imanberdiyev, Nursultan
AU - Varol, Huseyin Atakan
N1 - Copyright:
Copyright 2014 Elsevier B.V., All rights reserved.
PY - 2014
Y1 - 2014
N2 - This paper describes an inertial motion capture-based arm gesture recognition system for the high-level control of a mobile manipulator. Left-arm kinematic data of the user is acquired by an inertial motion capture system (Xsens MVN) in real time and processed to extract supervisory user interface commands such as "Manipulator On/Off", "Base On/Off" and "Operation Pause/Resume" for a mobile manipulator system (KUKA youBot). Principal Component Analysis and Linear Discriminant Analysis are employed for dimension reduction and classification of the user kinematic data, respectively. The classification accuracy for the six-class gesture recognition problem is 95.6 percent. To increase the reliability of the gesture recognition framework in real-time operation, a consensus voting scheme involving the last ten classification results is implemented. During the five-minute-long teleoperation experiment, a total of 25 high-level commands were recognized correctly by the consensus-voting-enhanced gesture recognizer. The experimental subject stated that the user interface was easy to learn and did not require extensive mental effort to operate.
KW - Gesture recognition
KW - Human-robot interaction
KW - Inertial motion capture
KW - Mobile manipulator
KW - Teleoperation
UR - http://www.scopus.com/inward/record.url?scp=84896911298&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=84896911298&partnerID=8YFLogxK
U2 - 10.1145/2559636.2563712
DO - 10.1145/2559636.2563712
M3 - Conference contribution
AN - SCOPUS:84896911298
SN - 9781450326582
T3 - ACM/IEEE International Conference on Human-Robot Interaction
SP - 204
EP - 205
BT - HRI 2014 - Proceedings of the 2014 ACM/IEEE International Conference on Human-Robot Interaction
PB - IEEE Computer Society
T2 - 9th Annual ACM/IEEE International Conference on Human-Robot Interaction, HRI 2014
Y2 - 3 March 2014 through 6 March 2014
ER -