TY - JOUR
T1 - User-Independent Intent Recognition for Lower-Limb Prostheses Using Depth Sensing
AU - Massalin, Yerzhan
AU - Abdrakhmanova, Madina
AU - Varol, Huseyin Atakan
PY - 2017/11/20
Y1 - 2017/11/20
N2 - Objective: The intent recognizers of advanced lower-limb prostheses utilize mechanical sensors on the prosthesis and/or electromyographic measurements from the residual limb. In addition to the delay introduced by these signals, such systems require user-specific databases to train the recognizers. In this work, our objective is the development and validation of a user-independent intent recognition framework utilizing depth sensing. Methods: We collected a depth image dataset from 12 healthy subjects engaging in a variety of routine activities. After filtering the depth images, we extracted simple features employing a recursive strategy. The feature vectors were classified using a support vector machine. For robust activity mode switching, we implemented a voting filter scheme. Results: Model selection showed that the support vector machine classifier with no dimensionality reduction had the highest classification accuracy. Specifically, it reached 94.1% accuracy on the testing data from four subjects. We also observed a positive trend in the accuracy of classifiers trained with data from an increasing number of subjects. Activity mode switching using a voting filter detected 732 out of 778 activity mode transitions of the four users while initiating 70 erroneous transitions during steady-state activities. Conclusion: An intent recognizer trained on multiple subjects can be used for any other subject, providing a promising solution for supervisory control of powered lower-limb prostheses. Significance: A user-independent intent recognition framework has the potential to decrease or eliminate the time required for extensive data collection regimens for intent recognizer training. This could accelerate the introduction of robotic lower-limb prostheses to the market.
AB - Objective: The intent recognizers of advanced lower-limb prostheses utilize mechanical sensors on the prosthesis and/or electromyographic measurements from the residual limb. In addition to the delay introduced by these signals, such systems require user-specific databases to train the recognizers. In this work, our objective is the development and validation of a user-independent intent recognition framework utilizing depth sensing. Methods: We collected a depth image dataset from 12 healthy subjects engaging in a variety of routine activities. After filtering the depth images, we extracted simple features employing a recursive strategy. The feature vectors were classified using a support vector machine. For robust activity mode switching, we implemented a voting filter scheme. Results: Model selection showed that the support vector machine classifier with no dimensionality reduction had the highest classification accuracy. Specifically, it reached 94.1% accuracy on the testing data from four subjects. We also observed a positive trend in the accuracy of classifiers trained with data from an increasing number of subjects. Activity mode switching using a voting filter detected 732 out of 778 activity mode transitions of the four users while initiating 70 erroneous transitions during steady-state activities. Conclusion: An intent recognizer trained on multiple subjects can be used for any other subject, providing a promising solution for supervisory control of powered lower-limb prostheses. Significance: A user-independent intent recognition framework has the potential to decrease or eliminate the time required for extensive data collection regimens for intent recognizer training. This could accelerate the introduction of robotic lower-limb prostheses to the market.
KW - big data
KW - Cameras
KW - Databases
KW - Depth image processing
KW - Feature extraction
KW - intent recognition
KW - lower-limb prosthesis
KW - pattern recognition
KW - Prosthetics
KW - Real-time systems
KW - Robot sensing systems
KW - Three-dimensional displays
UR - http://www.scopus.com/inward/record.url?scp=85035808915&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85035808915&partnerID=8YFLogxK
U2 - 10.1109/TBME.2017.2776157
DO - 10.1109/TBME.2017.2776157
M3 - Article
AN - SCOPUS:85035808915
SN - 0018-9294
JO - IEEE Transactions on Biomedical Engineering
JF - IEEE Transactions on Biomedical Engineering
ER -