Data Descriptor: Human grasping database for activities of daily living with depth, color and kinematic data streams

Artur Saudabayev, Zhanibek Rysbek, Raykhan Khassenova, Huseyin Atakan Varol

Research output: Contribution to journal › Article

2 Citations (Scopus)

Abstract

This paper presents a grasping database collected from multiple human subjects performing activities of daily living in unstructured environments. The main strength of this database is its use of three different sensing modalities: color images from a head-mounted action camera, distance data from a depth sensor on the dominant arm, and upper-body kinematic data acquired from an inertial motion capture suit. A total of 3826 grasps were identified in the data collected during 9 hours of experiments. The grasps were grouped according to a hierarchical taxonomy into 35 different grasp types. The database contains information related to each grasp and the associated sensor data acquired from the three sensor modalities. We also provide our data annotation software, written in MATLAB, as an open-source tool. The size of the database is 172 GB. We believe this database can serve as a stepping stone for developing big data and machine learning techniques for grasping and manipulation, with potential applications in rehabilitation robotics and intelligent automation.
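The descriptor itself ships no code here, but a short sketch may clarify how a per-grasp index spanning the three sensor modalities could be consumed. Everything below is a hypothetical assumption made for illustration, not the published layout: the annotations.csv file, its column names, and the grasp_db directory are invented, and only the 35-type taxonomy, the three modalities, and the per-grasp annotations are taken from the abstract. (The MATLAB annotation tool mentioned above is the authors' actual software; this sketch is unrelated to it.)

```python
# Hypothetical sketch: indexing a multi-modal grasp database.
# File names, column names, and directory layout below are illustrative
# assumptions; consult the actual data descriptor for the real structure.
import csv
from dataclasses import dataclass
from pathlib import Path


@dataclass
class GraspRecord:
    subject_id: str    # anonymized participant identifier
    grasp_type: str    # one of the 35 taxonomy labels
    start_s: float     # grasp onset, seconds into the recording
    end_s: float       # grasp offset
    color_video: Path  # head-mounted action-camera footage
    depth_frames: Path # depth-sensor data from the dominant arm
    kinematics: Path   # inertial motion-capture stream


def load_index(root: Path, index_csv: str = "annotations.csv") -> list[GraspRecord]:
    """Read a (hypothetical) per-grasp annotation table into records."""
    records = []
    with open(root / index_csv, newline="") as f:
        for row in csv.DictReader(f):
            records.append(GraspRecord(
                subject_id=row["subject"],
                grasp_type=row["grasp_type"],
                start_s=float(row["start_s"]),
                end_s=float(row["end_s"]),
                color_video=root / row["color_video"],
                depth_frames=root / row["depth_frames"],
                kinematics=root / row["kinematics"],
            ))
    return records


if __name__ == "__main__":
    index = load_index(Path("grasp_db"))
    print(f"{len(index)} grasps, {len({r.grasp_type for r in index})} grasp types")
```

Keeping only per-grasp metadata in memory and leaving the raw sensor streams on disk is the natural design for a 172 GB collection: the index stays small while each modality is loaded lazily per grasp.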

Original language: English
Article number: 180101
Journal: Scientific Data
Volume: 5
DOI: 10.1038/sdata.2018.101
Publication status: Published - May 29, 2018

Fingerprint

Grasping
Data Streams
Descriptors
Kinematics
Color
Sensors
Modality
Taxonomy
Motion Capture
Rehabilitation
Learning systems
Color Image
Automation
Robotics
Open Source

ASJC Scopus subject areas

  • Statistics and Probability
  • Information Systems
  • Education
  • Computer Science Applications
  • Statistics, Probability and Uncertainty
  • Library and Information Sciences

Cite this

Data Descriptor: Human grasping database for activities of daily living with depth, color and kinematic data streams. / Saudabayev, Artur; Rysbek, Zhanibek; Khassenova, Raykhan; Atakan Varol, Huseyin.

In: Scientific data, Vol. 5, 180101, 29.05.2018.

Research output: Contribution to journal › Article

@article{d9f4020e46504e05a20f531836f36dc5,
title = "Data Descriptor: Human grasping database for activities of daily living with depth, color and kinematic data streams",
abstract = "This paper presents a grasping database collected from multiple human subjects performing activities of daily living in unstructured environments. The main strength of this database is its use of three different sensing modalities: color images from a head-mounted action camera, distance data from a depth sensor on the dominant arm, and upper-body kinematic data acquired from an inertial motion capture suit. A total of 3826 grasps were identified in the data collected during 9 hours of experiments. The grasps were grouped according to a hierarchical taxonomy into 35 different grasp types. The database contains information related to each grasp and the associated sensor data acquired from the three sensor modalities. We also provide our data annotation software, written in MATLAB, as an open-source tool. The size of the database is 172 GB. We believe this database can serve as a stepping stone for developing big data and machine learning techniques for grasping and manipulation, with potential applications in rehabilitation robotics and intelligent automation.",
author = "Artur Saudabayev and Zhanibek Rysbek and Raykhan Khassenova and {Atakan Varol}, Huseyin",
year = "2018",
month = "5",
day = "29",
doi = "10.1038/sdata.2018.101",
language = "English",
volume = "5",
journal = "Scientific data",
issn = "2052-4463",
publisher = "Nature Publishing Group",

}

TY - JOUR

T1 - Data Descriptor

T2 - Human grasping database for activities of daily living with depth, color and kinematic data streams

AU - Saudabayev, Artur

AU - Rysbek, Zhanibek

AU - Khassenova, Raykhan

AU - Atakan Varol, Huseyin

PY - 2018/5/29

Y1 - 2018/5/29

N2 - This paper presents a grasping database collected from multiple human subjects performing activities of daily living in unstructured environments. The main strength of this database is its use of three different sensing modalities: color images from a head-mounted action camera, distance data from a depth sensor on the dominant arm, and upper-body kinematic data acquired from an inertial motion capture suit. A total of 3826 grasps were identified in the data collected during 9 hours of experiments. The grasps were grouped according to a hierarchical taxonomy into 35 different grasp types. The database contains information related to each grasp and the associated sensor data acquired from the three sensor modalities. We also provide our data annotation software, written in MATLAB, as an open-source tool. The size of the database is 172 GB. We believe this database can serve as a stepping stone for developing big data and machine learning techniques for grasping and manipulation, with potential applications in rehabilitation robotics and intelligent automation.

AB - This paper presents a grasping database collected from multiple human subjects performing activities of daily living in unstructured environments. The main strength of this database is its use of three different sensing modalities: color images from a head-mounted action camera, distance data from a depth sensor on the dominant arm, and upper-body kinematic data acquired from an inertial motion capture suit. A total of 3826 grasps were identified in the data collected during 9 hours of experiments. The grasps were grouped according to a hierarchical taxonomy into 35 different grasp types. The database contains information related to each grasp and the associated sensor data acquired from the three sensor modalities. We also provide our data annotation software, written in MATLAB, as an open-source tool. The size of the database is 172 GB. We believe this database can serve as a stepping stone for developing big data and machine learning techniques for grasping and manipulation, with potential applications in rehabilitation robotics and intelligent automation.

UR - http://www.scopus.com/inward/record.url?scp=85047853213&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=85047853213&partnerID=8YFLogxK

U2 - 10.1038/sdata.2018.101

DO - 10.1038/sdata.2018.101

M3 - Article

AN - SCOPUS:85047853213

VL - 5

JO - Scientific data

JF - Scientific data

SN - 2052-4463

M1 - 180101

ER -