TY - GEN
T1 - Robot programming by demonstration of multiple tasks within a common environment
AU - Alizadeh, Tohid
AU - Saduanov, Batyrkhan
N1 - Publisher Copyright:
© 2017 IEEE.
PY - 2017/12/7
Y1 - 2017/12/7
N2 - Most of the available robot programming by demonstration (PbD) approaches focus on learning a single task in a given environmental situation. In this paper, we propose to learn multiple tasks together, within a common environment, using one of the available PbD approaches. A task-parameterized Gaussian mixture model (TP-GMM) is used at the core of the proposed approach. A database of TP-GMMs is constructed for the tasks and used to provide reproductions when needed. The environment is shared between the different tasks; in other words, all the available objects are considered as external task parameters (TPs), as they may modulate a task. During the learning phase, the relevance of the task parameters is extracted for each task, and this information is stored together with the parameters of the corresponding updated TP-GMM. For reproduction, the end user specifies the task, and the robot picks the relevant TP-GMM and task parameters and reproduces the movement. The proposed approach is tested both in simulation and in a robotic experiment.
AB - Most of the available robot programming by demonstration (PbD) approaches focus on learning a single task in a given environmental situation. In this paper, we propose to learn multiple tasks together, within a common environment, using one of the available PbD approaches. A task-parameterized Gaussian mixture model (TP-GMM) is used at the core of the proposed approach. A database of TP-GMMs is constructed for the tasks and used to provide reproductions when needed. The environment is shared between the different tasks; in other words, all the available objects are considered as external task parameters (TPs), as they may modulate a task. During the learning phase, the relevance of the task parameters is extracted for each task, and this information is stored together with the parameters of the corresponding updated TP-GMM. For reproduction, the end user specifies the task, and the robot picks the relevant TP-GMM and task parameters and reproduces the movement. The proposed approach is tested both in simulation and in a robotic experiment.
UR - http://www.scopus.com/inward/record.url?scp=85042371239&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85042371239&partnerID=8YFLogxK
U2 - 10.1109/MFI.2017.8170389
DO - 10.1109/MFI.2017.8170389
M3 - Conference contribution
AN - SCOPUS:85042371239
T3 - IEEE International Conference on Multisensor Fusion and Integration for Intelligent Systems
SP - 608
EP - 613
BT - MFI 2017 - 2017 IEEE International Conference on Multisensor Fusion and Integration for Intelligent Systems
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 13th IEEE International Conference on Multisensor Fusion and Integration for Intelligent Systems, MFI 2017
Y2 - 16 November 2017 through 18 November 2017
ER -