Article

Multimodal Data-Driven Robot Control for Human-Robot Collaborative Assembly

Publisher

ASME
DOI: 10.1115/1.4053806

Keywords

robot; assembly; multimodal data; human-robot collaboration; brain robotics


Abstract

In human-robot collaborative assembly, leveraging multimodal commands for intuitive robot control remains a challenge, from command translation to efficient collaborative operations. This article investigates multimodal data-driven robot control for human-robot collaborative assembly. Leveraging function blocks, a programming-free human-robot interface is designed to fuse multimodal human commands that accurately trigger defined robot control modalities. Deep learning is explored to develop a command classification system for low-latency, high-accuracy robot control, in which a spatial-temporal graph convolutional network is developed for reliable and accurate translation of brainwave command phrases into robot commands. Multimodal data-driven high-level robot control during assembly is then facilitated by event-driven function blocks: the high-level commands serve as triggering events for the execution of algorithms for fine robot manipulation and assembly-feature-based collaborative assembly. Finally, a partial car engine assembly deployed to a robot team is chosen as a case study to demonstrate the effectiveness of the developed system.
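To make the spatial-temporal graph convolution idea concrete, the following is a minimal NumPy sketch, not the authors' implementation: EEG electrodes form the graph nodes, a normalized adjacency mixes signals spatially at each time step, and a small temporal convolution mixes along time before a linear readout produces command logits. The electrode count, window length, adjacency, weights, and the command set ("pick", "place", "stop") are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

n_channels = 8    # EEG electrodes (graph nodes) -- assumed
n_timesteps = 32  # samples per brainwave command-phrase window -- assumed
n_features = 4    # feature maps per node
n_classes = 3     # hypothetical commands: "pick", "place", "stop"

# Spatial graph over electrodes: random symmetric adjacency with
# self-loops, normalized as D^-1/2 (A + I) D^-1/2 (standard GCN form).
A = (rng.random((n_channels, n_channels)) > 0.7).astype(float)
A = np.maximum(A, A.T) + np.eye(n_channels)
d_inv_sqrt = np.diag(1.0 / np.sqrt(A.sum(axis=1)))
A_hat = d_inv_sqrt @ A @ d_inv_sqrt

def st_gcn_block(x, W_spatial, w_temporal):
    """One spatial-temporal block: graph convolution over electrodes
    at every time step, then a 1-D convolution along the time axis.
    x: (n_timesteps, n_channels, n_features)."""
    # Spatial step: A_hat @ x_t @ W for each time step t.
    h = np.einsum("ij,tjf,fg->tig", A_hat, x, W_spatial)
    h = np.maximum(h, 0.0)  # ReLU
    # Temporal step: valid-mode convolution with a short kernel.
    k = len(w_temporal)
    return np.stack([
        sum(w_temporal[d] * h[t + d] for d in range(k))
        for t in range(h.shape[0] - k + 1)
    ])

# Random weights stand in for trained parameters.
W_spatial = rng.normal(size=(n_features, n_features))
w_temporal = np.array([0.25, 0.5, 0.25])
W_out = rng.normal(size=(n_features, n_classes))

x = rng.normal(size=(n_timesteps, n_channels, n_features))
h = st_gcn_block(x, W_spatial, w_temporal)

# Global average pooling + linear readout -> command logits; the
# argmax would be the high-level command event fed to the function
# blocks that drive robot execution.
logits = h.mean(axis=(0, 1)) @ W_out
command = int(np.argmax(logits))
print(h.shape, command)
```

In the architecture the abstract describes, the predicted class would act as a triggering event for an event-driven function block, which in turn dispatches the corresponding fine-manipulation algorithm on the robot.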

