Human–robot handovers are a fundamental and challenging aspect of physical human–robot interaction. This paper describes the design and implementation of a human–robot handover pipeline for the case in which both soft and rigid objects are passed from the human to the robot. These objects require different grasping-torque profiles from the robot hand's fingers to avoid damaging them. To address this problem, a tactile glove worn by the human provides real-time information to a deep neural network, which classifies each object as soft or rigid during the pre-handover phase; this classification is passed to the robot, which applies the grasping-torque profile suited to that type of object. The proposed method is designed and validated through experiments with eight human participants and 24 objects. The outcomes of these experiments, including classification accuracy, force and torque profiles, and subjective experiences evaluated via questionnaires, are described and discussed.
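The abstract's pipeline (tactile-glove readings → soft/rigid classification → torque-profile selection) can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation: the network architecture, weights, feature shapes, and torque values below are all hypothetical placeholders.

```python
import math

def mlp_classify(tactile, w_hidden, w_out):
    """Tiny feed-forward network (stand-in for the paper's deep model):
    maps a tactile feature vector to P(object is soft)."""
    # One ReLU hidden layer followed by a sigmoid output.
    hidden = [max(0.0, sum(w * x for w, x in zip(row, tactile)))
              for row in w_hidden]
    logit = sum(w * h for w, h in zip(w_out, hidden))
    return 1.0 / (1.0 + math.exp(-logit))

# Illustrative grasping-torque profiles (N·m); placeholder values,
# not the calibrated settings used in the experiments.
TORQUE_PROFILE = {"soft": 0.15, "rigid": 0.60}

def select_profile(tactile, w_hidden, w_out, threshold=0.5):
    """Classify the held object and return the matching torque profile."""
    p_soft = mlp_classify(tactile, w_hidden, w_out)
    label = "soft" if p_soft >= threshold else "rigid"
    return label, TORQUE_PROFILE[label]
```

In use, the robot would call `select_profile` with the latest glove readings before closing its fingers, e.g. `select_profile([1.0, 1.0], [[1.0, 0.0], [0.0, 1.0]], [1.0, 1.0])` yields the `"soft"` profile with these toy weights.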
Keywords
- Deep learning
- Human–robot handover
- Human–robot interaction
- Tactile sensors
ASJC Scopus subject areas
- Control and Systems Engineering
- Computer Science Applications