Proceedings Paper

Online Social Touch Pattern Recognition with Multi-modal-sensing Modular Tactile Interface

Publisher

IEEE
DOI: 10.1109/urai.2019.8768706

Keywords

-

Abstract

The capability to recognize various social touch patterns is necessary for robots engaged in touch-based social interaction, which is valuable in many robot applications. Prior literature has focused on the novelty of the recognition system or on improving classification accuracy on publicly available datasets. In this paper, we propose an integrated framework for implementing a social touch recognition system on various robots, consisting of three complementary principles: 1) multi-modal tactile sensing, 2) a modular design, and 3) a social touch pattern classifier capable of learning temporal features. The approach is evaluated with an implemented Multi-modal-sensing Modular Tactile Interface prototype, and three learning methods (HMM, LSTM, and 3D-CNN) are tested as classifiers. The trained classifiers, which can run online on a robot's embedded system, predict 18 classes of social touch patterns. Results of the online validation test show that all three methods are promising, with a best accuracy of 88.86%. In particular, the stable performance of the 3D-CNN indicates that learning spatiotemporal features from tactile data is more effective. Through this validation process, we confirm that our framework can be easily adopted and delivers robust performance for social touch pattern recognition.
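As a rough illustration of the spatiotemporal approach described in the abstract, the following is a minimal sketch (not the authors' implementation) of a 3D-CNN that classifies a short window of tactile frames into 18 social touch patterns. The taxel grid size (16x16), window length (32 frames), and layer sizes are assumptions made for illustration; the abstract only states that spatiotemporal features are learned from multi-modal tactile data and that the classifier runs online on an embedded system.

# Minimal sketch, assuming a 16x16 taxel grid sampled as 32-frame windows.
# Not the authors' architecture; shapes and layer sizes are illustrative only.
import torch
import torch.nn as nn

class TouchPattern3DCNN(nn.Module):
    def __init__(self, num_classes: int = 18):
        super().__init__()
        self.features = nn.Sequential(
            # Input: (batch, 1 channel, T=32 frames, H=16, W=16 taxels)
            nn.Conv3d(1, 16, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool3d(kernel_size=2),   # -> (16, 16, 8, 8)
            nn.Conv3d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool3d(kernel_size=2),   # -> (32, 8, 4, 4)
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 8 * 4 * 4, 128),
            nn.ReLU(),
            nn.Linear(128, num_classes),   # 18 social touch pattern classes
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x))

if __name__ == "__main__":
    model = TouchPattern3DCNN()
    window = torch.rand(1, 1, 32, 16, 16)  # one sliding window of tactile frames
    logits = model(window)
    print(logits.shape)  # torch.Size([1, 18])

For online use, such a model would be applied to a sliding window of the most recent tactile frames, emitting a prediction each time the window advances; the HMM and LSTM baselines mentioned in the abstract would consume the same windowed data as sequences rather than 3D volumes.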

