Article

Object classification using bimodal perception data extracted from single-touch robotic grasps

Publisher

UNIV POLITECNICA VALENCIA, EDITORIAL UPV
DOI: 10.4995/riai.2019.10923

Keywords

Robotic manipulators; Proprioceptive-tactile perception; Proprioceptive-tactile learning; Object classification; Object recognition

Abstract

This work presents a method to classify grasped objects with a multi-fingered robotic hand by combining proprioceptive and tactile data in a hybrid descriptor. The proprioceptive data are obtained from the joint positions of the hand, and the tactile data from the contacts registered by pressure cells mounted on the phalanges. The proposed approach identifies the grasped object by learning the contact geometry and stiffness from the sensor readings. We show that combining bimodal data of different nature with supervised learning techniques improves the recognition rate. In the experiments, more than 3000 grasps of up to 7 different domestic objects were carried out, obtaining an average F1 score of around 95% from just a single grasp. In addition, the generalization of the method has been verified by training the system with certain objects and classifying new, similar ones without any prior knowledge.
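As a rough illustration of the idea described in the abstract, the sketch below concatenates proprioceptive (joint position) and tactile (pressure cell) readings into a single hybrid descriptor and feeds it to a supervised classifier. The joint and cell counts, the choice of an SVM, and the toy data are assumptions for illustration only; the paper's actual feature layout and learning method are not detailed on this page.

```python
# Minimal sketch of a bimodal (proprioceptive + tactile) grasp descriptor and a
# supervised classifier. Feature sizes, the SVM choice, and the random data are
# illustrative assumptions, not the authors' exact pipeline.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

N_JOINTS = 16          # assumed proprioceptive channels: joint positions at grasp closure
N_TACTILE_CELLS = 12   # assumed tactile channels: pressure readings on the phalanges

def hybrid_descriptor(joint_positions: np.ndarray, pressures: np.ndarray) -> np.ndarray:
    """Concatenate both modalities into one fixed-length feature vector."""
    return np.concatenate([joint_positions, pressures])

# Toy training data: one descriptor per grasp, one label per grasped object.
rng = np.random.default_rng(0)
X = rng.random((300, N_JOINTS + N_TACTILE_CELLS))
y = rng.integers(0, 7, size=300)   # 7 domestic object classes

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
clf.fit(X, y)

# Classify a new object from a single grasp.
new_grasp = hybrid_descriptor(rng.random(N_JOINTS), rng.random(N_TACTILE_CELLS))
print(clf.predict(new_grasp.reshape(1, -1)))
```

Scaling the concatenated vector before classification matters here because joint angles and pressure readings live on different numeric ranges; any supervised learner could be substituted for the SVM in this sketch.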
