Article

EMG-based decoding of grasp gestures in reaching-to-grasping motions

Journal

ROBOTICS AND AUTONOMOUS SYSTEMS
Volume 91, Pages 59-70

Publisher

ELSEVIER SCIENCE BV
DOI: 10.1016/j.robot.2016.12.014

Keywords

Reach-to-grasp; Grasp planning; Machine learning; Electromyographic (EMG) signals; Prosthesis

Funding

  1. Swiss National Science Foundation through the National Centre of Competence in Research in Robotics [51NF40-160592]
  2. Bertarelli Foundation
  3. Swiss National Science Foundation (SNF) [51NF40-160592]


Predicting the grasping function during reach-to-grasp motions is essential for controlling a prosthetic hand or a robotic assistive device. An early, accurate prediction increases the usability and comfort of a prosthetic device. This work proposes an electromyography (EMG)-based learning approach that decodes the grasping intention at an early stage of the reach-to-grasp motion, i.e. before the final grasp/hand pre-shape takes place. Surface electrodes and a Cyberglove were used to record arm muscle activity and finger joint angles during reach-to-grasp motions. Our results show 90% accuracy in detecting the final grasp about 0.5 s after motion onset. This paper also examines the effect of different object distances and motion speeds on the detection time and accuracy of the classifier. Using our learning approach to control a 16-degree-of-freedom robotic hand confirmed its suitability for the real-time control of robotic devices. © 2017 Elsevier B.V. All rights reserved.
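
The abstract does not specify the feature set or classifier, so the sketch below is only illustrative: it assumes sliding-window time-domain EMG features (mean absolute value, waveform length, RMS) and an off-the-shelf linear discriminant classifier from scikit-learn, with synthetic placeholder data standing in for the recorded reach-to-grasp trials. Window lengths, sampling rate, and channel count are assumptions, not the authors' actual settings.

    # Illustrative sketch of early grasp-intention decoding from EMG.
    # Assumptions (not from the paper): 1 kHz sampling, 8 channels,
    # 200 ms windows with 50 ms increments, LDA classifier.
    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.metrics import accuracy_score
    from sklearn.model_selection import train_test_split

    FS = 1000                # assumed EMG sampling rate (Hz)
    WIN = int(0.2 * FS)      # 200 ms analysis window (assumption)
    STEP = int(0.05 * FS)    # 50 ms window increment (assumption)

    def window_features(emg):
        """Simple time-domain features (MAV, waveform length, RMS)
        per channel of a (samples x channels) EMG window."""
        mav = np.mean(np.abs(emg), axis=0)
        wl = np.sum(np.abs(np.diff(emg, axis=0)), axis=0)
        rms = np.sqrt(np.mean(emg ** 2, axis=0))
        return np.concatenate([mav, wl, rms])

    def build_dataset(trials, labels):
        """Slice each reach-to-grasp trial into overlapping windows and
        label every window with the trial's final grasp type."""
        X, y = [], []
        for emg, grasp in zip(trials, labels):
            for start in range(0, emg.shape[0] - WIN + 1, STEP):
                X.append(window_features(emg[start:start + WIN]))
                y.append(grasp)
        return np.array(X), np.array(y)

    if __name__ == "__main__":
        # Synthetic placeholder data: 40 trials of 2 s, 8 EMG channels,
        # 3 grasp classes. Real data would come from the recorded sessions.
        rng = np.random.default_rng(0)
        trials = [rng.standard_normal((2 * FS, 8)) for _ in range(40)]
        labels = rng.integers(0, 3, size=40)

        X, y = build_dataset(trials, labels)
        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3,
                                                  random_state=0)
        clf = LinearDiscriminantAnalysis().fit(X_tr, y_tr)
        print("window-level accuracy:",
              accuracy_score(y_te, clf.predict(X_te)))

In an online setting, the same windowing would run on the live EMG stream, and the per-window predictions (possibly accumulated over successive windows) would drive the pre-shaping of the prosthetic or robotic hand; the paper reports this decision becoming reliable about 0.5 s after motion onset.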
