Journal
IEEE TRANSACTIONS ON INSTRUMENTATION AND MEASUREMENT
Volume 69, Issue 5, Pages 1881-1893
Publisher
IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
DOI: 10.1109/TIM.2019.2919354
Keywords
Dynamic vision sensor (DVS); force estimation; haptics; material classification; vision-based measurements (VBMs)
Funding
- Kingston University London
- Khalifa University of Science and Technology [RC1-2018-KUCARS]
Abstract
In this paper, a novel vision-based measurement (VBM) approach is proposed to estimate the contact force and classify materials in a single grasp. This approach is the first event-based tactile sensor that utilizes the recent technology of neuromorphic cameras. It provides higher sensitivity, lower latency, and lower computational cost and power consumption compared to conventional vision-based techniques. Moreover, the dynamic vision sensor (DVS) has a higher dynamic range, which increases the sensor's sensitivity and performance in poor lighting conditions. Two time-series machine learning methods, namely, a time delay neural network (TDNN) and a Gaussian process (GP), are developed to estimate the contact force in a grasp. A deep neural network (DNN) is proposed to classify the object materials. Forty-eight experiments are conducted on four different materials to validate the proposed methods and compare them against measurements from a piezoresistive force sensor. A leave-one-out cross-validation technique is implemented to evaluate and analyze the performance of the proposed machine learning methods. The contact force is successfully estimated with a mean squared error of 0.16 and 0.17 N for the TDNN and GP, respectively. Four materials are classified with an average accuracy of 79.17% using unseen experimental data. The results show the applicability of event-based sensors for grasping applications.
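The abstract's evaluation pipeline — regress force from time-delayed input features (the TDNN idea of feeding current and past samples to the estimator) and score it with leave-one-out cross-validation via mean squared error — can be sketched as follows. This is an illustrative toy, not the authors' implementation: it uses ordinary least squares in place of a trained neural network, leaves one *sample* out rather than one of the 48 *experiments*, and the names `make_delay_features` and `loo_mse` are hypothetical.

```python
import numpy as np

def make_delay_features(x, delays=3):
    """Stack each sample with its `delays - 1` predecessors (TDNN-style input window).

    The first `delays` rows are dropped to discard wrap-around entries from np.roll.
    """
    X = np.column_stack([np.roll(x, d) for d in range(delays)])
    return X[delays:], delays

def loo_mse(X, y):
    """Leave-one-out cross-validation MSE for an ordinary least-squares regressor."""
    n = len(y)
    errs = []
    for i in range(n):
        mask = np.arange(n) != i          # hold out sample i
        w, *_ = np.linalg.lstsq(X[mask], y[mask], rcond=None)
        errs.append((X[i] @ w - y[i]) ** 2)
    return float(np.mean(errs))

# Toy data: a synthetic "event rate" signal and a force that depends on the
# current and previous samples (coefficients chosen arbitrarily for the demo).
rng = np.random.default_rng(0)
rate = rng.random(60)
force = 0.5 * rate + 0.3 * np.roll(rate, 1)

X, d = make_delay_features(rate, delays=3)
mse = loo_mse(X, force[d:])   # near zero here, since the toy relation is linear
```

Leaving out whole experiments instead of single samples, as the paper does, only changes how `mask` is built: group the rows by experiment index and hold out one group per fold.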