Locomotion strategy selection for a hybrid mobile robot using time of flight depth sensor

Artur Saudabayev, Farabi Kungozhin, Damir Nurseitov, Huseyin Atakan Varol

Research output: Contribution to journal › Article

8 Citations (Scopus)

Abstract

The performance of a mobile robot can be improved by utilizing different locomotion modes in various terrain conditions. This creates the necessity of having a supervisory controller capable of recognizing different terrain types and changing the locomotion mode of the robot accordingly. This work focuses on the locomotion strategy selection problem for a hybrid legged-wheeled mobile robot. Supervisory control of the robot is accomplished by a terrain recognizer, which classifies depth images obtained from a commercial time-of-flight depth sensor and selects different locomotion mode subcontrollers based on the recognized terrain type. For the terrain recognizer, a database is generated consisting of five terrain classes (Uneven, Level Ground, Stair Up, Stair Down, and Nontraversable). Depth images are enhanced using confidence-map-based filtering. The accuracy of the terrain classification using a Support Vector Machine classifier on the testing database for the five-class terrain recognition problem is 97%. Real-world experiments assess the locomotion abilities of the quadruped and the capability of the terrain recognizer in real-time settings. The results of these experiments show that depth images processed in real time using machine learning algorithms can be used for the supervisory control of hybrid robots with legged and wheeled locomotion capabilities.
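
The abstract describes a pipeline of confidence-map-based filtering, feature extraction, SVM classification over the five terrain classes, and selection of a locomotion-mode subcontroller. The sketch below illustrates that flow in Python with scikit-learn. It is a minimal illustration, not the paper's implementation: the filtering rule (median fill of low-confidence pixels), the downsampled-depth feature vector, the `MODE_FOR_CLASS` mapping, the confidence threshold, and the synthetic training data are all assumptions, since the abstract does not specify these details.

```python
# Sketch of the supervisory terrain-recognition loop described in the abstract.
# All names (filter_depth, extract_features, MODE_FOR_CLASS) and the training
# data are illustrative assumptions, not the authors' actual implementation.
import numpy as np
from sklearn.svm import SVC

TERRAIN_CLASSES = ["Uneven", "Level Ground", "Stair Up",
                   "Stair Down", "Nontraversable"]

# Hypothetical terrain-to-subcontroller mapping; the abstract does not
# state which locomotion mode is chosen for each class.
MODE_FOR_CLASS = {
    "Level Ground": "wheeled",
    "Uneven": "legged",
    "Stair Up": "legged",
    "Stair Down": "legged",
    "Nontraversable": "stop",
}


def filter_depth(depth, confidence, threshold=0.5):
    """Confidence-map-based filtering (sketch): replace pixels whose
    confidence falls below the threshold with the median valid depth."""
    filtered = depth.copy()
    invalid = confidence < threshold
    if invalid.any() and not invalid.all():
        filtered[invalid] = np.median(depth[~invalid])
    return filtered


def extract_features(depth, grid=(32, 32)):
    """Downsample the depth image to a fixed grid and flatten it into a
    feature vector; a stand-in for the paper's actual features."""
    rows = np.linspace(0, depth.shape[0] - 1, grid[0]).astype(int)
    cols = np.linspace(0, depth.shape[1] - 1, grid[1]).astype(int)
    return depth[np.ix_(rows, cols)].ravel()


def select_locomotion_mode(depth, confidence, clf):
    """Supervisory step: filter the frame, classify it, and map the
    predicted terrain class to a locomotion-mode subcontroller."""
    feats = extract_features(filter_depth(depth, confidence))
    label = int(clf.predict(feats.reshape(1, -1))[0])
    return MODE_FOR_CLASS[TERRAIN_CLASSES[label]]


# Synthetic placeholder data so the sketch runs end to end; in the paper
# the classifier is trained on the labeled five-class depth-image database.
rng = np.random.default_rng(0)
X_train = rng.random((100, 32 * 32))
y_train = rng.integers(0, len(TERRAIN_CLASSES), 100)
clf = SVC(kernel="rbf").fit(X_train, y_train)

depth_frame = rng.random((240, 320))   # fake time-of-flight depth image
conf_map = rng.random((240, 320))      # fake per-pixel confidence map
print(select_locomotion_mode(depth_frame, conf_map, clf))
```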

Original language: English
Article number: 425732
Journal: Journal of Sensors
Volume: 2015
DOIs
Publication status: Published - 2015

ASJC Scopus subject areas

  • Control and Systems Engineering
  • Instrumentation
  • Electrical and Electronic Engineering
