Event-based Robot Skin for Intelligent Robot – Environment Physical Interaction

  • Kappassov, Zhanat (PI)
  • Syrymova, Togzhan (Other Faculty/Researcher)
  • Baimukashev, Daulet (Other Faculty/Researcher)
  • Zhakatayev, Altay (Other Faculty/Researcher)
  • Adilkhanov, Adilzhan (Other Faculty/Researcher)

Project: MES RK

Project Details

Grant Program

Young researchers 2021-2023
Ministry of Education and Science of the Republic of Kazakhstan

Project Description

The main objective of the project is to incorporate the proposed event-based sensor signals into contact-driven manipulation, enhancing industrial robots for physical object exploration and manipulation tasks.

Project Relevance

The event-driven tactile robot skin for intelligent physical interaction with the environment (E-Skin) is a project motivated by the need for physical interaction: it will create an anthropomorphic robotic system for performing contact-rich manipulation on unstructured, human-intensive production lines. The proposed system rests on three main pillars: an artificial robot skin (an innovative event-based touch sensor), a dexterous robot platform (advanced tactile servoing and learning methods), and an advanced haptic sensor arm (an innovative mechanical version of the skin for manipulation). The tactile skin is the key factor enabling the handling of various assembly parts with a high degree of dexterity and strength. The robot controller will be designed to operate safely in close proximity to human operators. This will move existing autonomous robot systems to the next level, where contact-heavy tasks will lead to higher productivity and a more ergonomic work environment. Consequently, small-scale production in Kazakhstan will grow as localized turnkey solutions become more affordable.

Project Impact

The technical results of the project are the following:
1. Design and implementation of the event-based tactile sensing system;
2. Fabrication know-how for a functional, distributed, dynamic-sensing artificial tactile sensor incorporating an event-based camera;
3. Manufacturing of a tactile sensing system that enables manipulation skills beyond the current state of the art;
4. Manipulation with the event-based sensor: control algorithms, relying on a reinforcement learning approach, for the exploration and manipulation of soft and rigid objects.
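The results above center on turning event-based camera output into contact signals a controller can act on. As a minimal, hypothetical sketch of that idea (the `Event` structure, `contact_detected` function, and the rate threshold are illustrative assumptions, not the project's actual sensor interface or algorithm), contact can be flagged when the rate of events in a recent time window spikes:

```python
from dataclasses import dataclass


@dataclass
class Event:
    """A single event-camera event (illustrative structure)."""
    x: int        # pixel column
    y: int        # pixel row
    t: float      # timestamp in seconds
    polarity: int # +1 brightness increase, -1 decrease


def contact_detected(events, window=0.01, rate_threshold=500.0):
    """Flag contact when the event rate in the most recent `window`
    seconds exceeds `rate_threshold` (events per second).

    Assumes `events` is sorted by timestamp; thresholds are
    placeholder values, not calibrated figures from the project.
    """
    if not events:
        return False
    t_end = events[-1].t
    recent = [e for e in events if e.t >= t_end - window]
    return len(recent) / window > rate_threshold
```

A dense burst of events within the window (as produced when the skin's surface deforms on touch) trips the detector, while sparse background events do not; a real pipeline would also use the spatial (`x`, `y`) distribution to localize the contact patch.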
Effective start/end date: 1/1/21 to 12/31/23