Article

Realizing whole-body tactile interactions with a self-organizing, multi-modal artificial skin on a humanoid robot

Journal

ADVANCED ROBOTICS
Volume 29, Issue 1, Pages 51-67

Publisher

TAYLOR & FRANCIS LTD
DOI: 10.1080/01691864.2014.952493

Keywords

whole-body tactile interaction; artificial skin; self-organization; humanoid robots

Funding

  1. DFG cluster of excellence Cognition for Technical systems - CoTeSys
  2. Joint Robotics Laboratory, Tsukuba, Japan [UMI3218/CRT]

Abstract

In this paper, we present a new approach to realizing whole-body tactile interactions with a self-organizing, multi-modal artificial skin on a humanoid robot. To this end, we equipped the whole upper body of the humanoid HRP-2 with patches of CellulARSkin, a modular artificial skin. To automatically handle a potentially large number of tactile sensor cells and motor units, the robot uses open-loop exploration motions and the distributed accelerometers in the artificial skin cells to acquire self-centered sensory-motor knowledge. This body self-knowledge is then used to transform multi-modal tactile stimulation into reactive body motions. Tactile events provide feedback on contact changes over the whole-body surface. We demonstrate the feasibility of our approach by having the humanoid HRP-2 grasp large, unknown objects using only tactile feedback: kinesthetically taught grasping trajectories are reactively adapted to the size and stiffness of different test objects. Our paper contributes the first realization of a self-organizing tactile sensor-behavior mapping on a full-sized humanoid robot, enabling a position-controlled robot to handle objects compliantly.
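The abstract's core control idea, adapting a taught grasping trajectory to contact feedback, can be illustrated with a minimal sketch. This is not the authors' implementation: the threshold, gain, and function names are assumptions, and the per-cell contact directions stand in for the mapping that the paper's self-organized body representation would provide.

```python
# Hedged sketch (illustrative only, not the paper's code): retract a taught
# setpoint when skin cells report contact force above a threshold, so the
# closing motion adapts to an object's size and stiffness.

CONTACT_THRESHOLD = 0.5  # assumed normalized cell force indicating firm contact
GAIN = 0.1               # assumed retraction gain per unit of excess force

def adapt_setpoint(taught_q, contact_forces, contact_dirs):
    """Shift one setpoint of a taught trajectory away from excess contact.

    taught_q       -- commanded coordinates from the kinesthetically taught motion
    contact_forces -- normalized force per active skin cell
    contact_dirs   -- per-cell retraction direction in command space (this is
                      what a learned body self-representation would supply)
    """
    q = list(taught_q)
    for force, direction in zip(contact_forces, contact_dirs):
        excess = force - CONTACT_THRESHOLD
        if excess > 0:
            # Back off along the contact direction proportionally to excess
            # force: a stiff object halts the closing motion earlier than a
            # compliant one, yielding size/stiffness-adaptive grasping.
            q = [qi - GAIN * excess * di for qi, di in zip(q, direction)]
    return q

# Example: a two-coordinate closing motion meets firm contact on one cell.
adapted = adapt_setpoint([1.0, 1.0], [0.9], [[1.0, 0.0]])
print(adapted)  # first coordinate backed off, second unchanged
```

The proportional back-off is the simplest way to obtain compliant behavior from a position-controlled robot, which is the effect the abstract describes.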
