Journal
JOURNAL OF MANUFACTURING SCIENCE AND ENGINEERING-TRANSACTIONS OF THE ASME
Volume 144, Issue 5
Publisher
ASME
DOI: 10.1115/1.4053806
Keywords
robot; assembly; multimodal data; human-robot collaboration; brain robotics
In human-robot collaborative assembly, leveraging multimodal commands for intuitive robot control remains a challenge, from command translation to efficient collaborative operations. This article investigates multimodal data-driven robot control for human-robot collaborative assembly. Leveraging function blocks, a programming-free human-robot interface is designed to fuse multimodal human commands that accurately trigger the defined robot control modalities. Deep learning is explored to develop a command classification system for low-latency, high-accuracy robot control, in which a spatial-temporal graph convolutional network is developed for reliable and accurate translation of brainwave command phrases into robot commands. Multimodal data-driven high-level robot control during assembly is then facilitated by event-driven function blocks: the high-level commands serve as triggering events for the execution of fine robot manipulation algorithms and assembly feature-based collaborative assembly. Finally, a partial car engine assembly deployed to a robot team is chosen as a case study to demonstrate the effectiveness of the developed system.
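To illustrate the spatial-temporal graph convolution idea behind the brainwave command classifier described above, the sketch below shows one untrained ST-GCN-style layer in NumPy: EEG electrodes form graph nodes, a spatial graph convolution mixes features across neighboring electrodes, and a temporal convolution filters along time before pooling into command probabilities. The ring-shaped electrode graph, layer sizes, and random weights are illustrative assumptions for demonstration only, not the authors' actual network or data.

```python
import numpy as np

def normalize_adjacency(A):
    """Symmetric normalization of A + I: D^{-1/2} (A + I) D^{-1/2}."""
    A_hat = A + np.eye(A.shape[0])
    d = A_hat.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    return D_inv_sqrt @ A_hat @ D_inv_sqrt

def st_gcn_layer(X, A_norm, W_spatial, W_temporal):
    """One spatial-temporal graph convolution step.

    X:          (nodes, time, feat_in) EEG features per electrode per step.
    W_spatial:  (feat_in, feat_out) spatial mixing weights.
    W_temporal: (k, feat_out) per-feature temporal filter of width k.
    """
    # Spatial graph convolution: propagate features over the electrode graph.
    H = np.einsum('ij,jtf->itf', A_norm, X) @ W_spatial  # (nodes, time, feat_out)
    H = np.maximum(H, 0.0)  # ReLU
    # Temporal convolution along the time axis (valid padding).
    k = W_temporal.shape[0]
    T_out = H.shape[1] - k + 1
    out = np.stack(
        [(H[:, t:t + k, :] * W_temporal[None]).sum(axis=1) for t in range(T_out)],
        axis=1,
    )  # (nodes, T_out, feat_out)
    return out

def classify(X, A, params):
    """Map an EEG window to a probability over robot commands."""
    H = st_gcn_layer(X, normalize_adjacency(A), params['Ws'], params['Wt'])
    pooled = H.mean(axis=(0, 1))        # global average pool over nodes and time
    logits = pooled @ params['Wc']      # (n_commands,)
    e = np.exp(logits - logits.max())
    return e / e.sum()                  # softmax over command classes

# Toy demo: 8 electrodes on a ring graph, 32 time steps, 3 candidate commands.
rng = np.random.default_rng(0)
n_nodes, T, f_in, f_out, n_cmds = 8, 32, 4, 16, 3
A = np.zeros((n_nodes, n_nodes))
for i in range(n_nodes):  # hypothetical ring adjacency between electrodes
    A[i, (i + 1) % n_nodes] = A[(i + 1) % n_nodes, i] = 1.0
params = {
    'Ws': rng.standard_normal((f_in, f_out)) * 0.1,
    'Wt': rng.standard_normal((5, f_out)) * 0.1,
    'Wc': rng.standard_normal((f_out, n_cmds)) * 0.1,
}
X = rng.standard_normal((n_nodes, T, f_in))  # synthetic EEG feature window
probs = classify(X, A, params)
```

In a real system each output class would correspond to one brainwave command phrase, and the argmax of `probs` would be dispatched as the triggering event to the function-block-based robot controller.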