TY - JOUR
T1 - Human–robot handover with prior-to-pass soft/rigid object classification via tactile glove
AU - Mazhitov, Ayan
AU - Syrymova, Togzhan
AU - Kappassov, Zhanat
AU - Rubagotti, Matteo
N1 - Funding Information:
The first two authors contributed equally. This work was funded by Nazarbayev University under Collaborative Research Project no. 091019CRP2118 and by the Ministry of Education and Science of the Republic of Kazakhstan under grant no. AP09058050.
Publisher Copyright:
© 2022 Elsevier B.V.
PY - 2023/1
Y1 - 2023/1
N2 - Human–robot handovers constitute a challenging and fundamental aspect of physical human–robot interaction. This paper describes the design and implementation of a human–robot handover pipeline in the case in which both soft and rigid objects are passed by the human to the robot. These objects require different profiles of grasping torques by the robot hand fingers, so as to avoid damaging them. As a viable solution to this problem, a tactile glove worn by the human is used to provide real-time information to a deep neural network, which classifies each object as soft or rigid in the pre-handover phase: this information is passed to the robot, which applies the grasping torque profile suitable for the specific type of object. The proposed method is designed and validated based on experiments with eight human participants and 24 objects. The outcomes of these experiments regarding classification accuracy, force and torque profiles, and evaluation of the subjective experiences via questionnaires, are described and discussed.
AB - Human–robot handovers constitute a challenging and fundamental aspect of physical human–robot interaction. This paper describes the design and implementation of a human–robot handover pipeline in the case in which both soft and rigid objects are passed by the human to the robot. These objects require different profiles of grasping torques by the robot hand fingers, so as to avoid damaging them. As a viable solution to this problem, a tactile glove worn by the human is used to provide real-time information to a deep neural network, which classifies each object as soft or rigid in the pre-handover phase: this information is passed to the robot, which applies the grasping torque profile suitable for the specific type of object. The proposed method is designed and validated based on experiments with eight human participants and 24 objects. The outcomes of these experiments regarding classification accuracy, force and torque profiles, and evaluation of the subjective experiences via questionnaires, are described and discussed.
KW - Deep learning
KW - Human–robot handover
KW - Human–robot interaction
KW - Tactile sensors
UR - http://www.scopus.com/inward/record.url?scp=85142751220&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85142751220&partnerID=8YFLogxK
U2 - 10.1016/j.robot.2022.104311
DO - 10.1016/j.robot.2022.104311
M3 - Article
AN - SCOPUS:85142751220
SN - 0921-8890
VL - 159
JO - Robotics and Autonomous Systems
JF - Robotics and Autonomous Systems
M1 - 104311
ER -