Article

Driving singing behaviour in songbirds using a multi-modal, multi-agent virtual environment

Journal

SCIENTIFIC REPORTS
Volume 12, Issue 1, Pages: -

Publisher

NATURE PORTFOLIO
DOI: 10.1038/s41598-022-16456-0

Keywords

-


Interactive biorobotics offers unique experimental potential for studying the mechanisms of social communication. However, our ability to build expressive robots that exhibit complex behaviours is limited. This study presents a modular, audio-visual 2D virtual environment that allows multi-modal, multi-agent interaction for investigating social communication mechanisms. The system is built on event processing, which allows complex computation. The results show that the environment can elicit normal behavioural responses in songbirds, and a fully unsupervised song motif detector was developed to manipulate the virtual social environment. This virtual environment represents a first step in real-time automatic behaviour annotation and animal-computer interaction.
Interactive biorobotics provides unique experimental potential to study the mechanisms underlying social communication but is limited by our ability to build expressive robots that exhibit the complex behaviours of birds and small mammals. An alternative to physical robots is to use virtual environments. Here, we designed and built a modular, audio-visual 2D virtual environment that allows multi-modal, multi-agent interaction to study mechanisms underlying social communication. The strength of the system is an implementation based on event processing that allows for complex computation. We tested this system in songbirds, which provide an exceptionally powerful and tractable model system to study social communication. We show that pair-bonded zebra finches (Taeniopygia guttata) communicating through the virtual environment exhibit normal call timing behaviour, males sing female-directed song, and both males and females display high-intensity courtship behaviours to their mates. These results suggest that the environment provided is sufficiently natural to elicit these behavioural responses. Furthermore, as an example of complex behavioural annotation, we developed a fully unsupervised song motif detector and used it to manipulate the virtual social environment of male zebra finches based on the number of motifs sung. Our virtual environment represents a first step in real-time automatic behaviour annotation and animal-computer interaction using higher-level behaviours such as song. Our unsupervised acoustic analysis eliminates the need for annotated training data, thus reducing labour investment and experimenter bias.
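To make the closed-loop idea in the abstract concrete, the sketch below shows one way such a controller could be wired up: a hypothetical motif detector posts events onto a queue, and the controller changes the virtual social environment (here, hiding the virtual partner) once a target number of motifs has been detected. This is a minimal illustration, not the authors' implementation; all class, function, and parameter names (VirtualEnvironment, run_controller, motif_target, and the "motif" event string) are assumptions introduced for this example.

```python
# Minimal sketch of event-driven, closed-loop control of a virtual social
# environment. A separate (hypothetical) motif detector is assumed to push
# the string "motif" onto the event queue each time a song motif is detected.
import queue
import time


class VirtualEnvironment:
    """Stand-in for the audio-visual 2D environment; only tracks/logs state."""

    def __init__(self) -> None:
        self.partner_visible = True

    def hide_partner(self) -> None:
        self.partner_visible = False
        print("virtual partner hidden")

    def show_partner(self) -> None:
        self.partner_visible = True
        print("virtual partner shown")


def run_controller(events: "queue.Queue[str]",
                   env: VirtualEnvironment,
                   motif_target: int = 10,
                   timeout_s: float = 600.0) -> int:
    """Count motif events and hide the virtual partner once motif_target is reached.

    Returns the number of motifs counted before the timeout or target.
    """
    motif_count = 0
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline and motif_count < motif_target:
        try:
            # Events are produced elsewhere (e.g., by an acoustic motif detector).
            event = events.get(timeout=0.1)
        except queue.Empty:
            continue
        if event == "motif":
            motif_count += 1
    if motif_count >= motif_target:
        env.hide_partner()
    return motif_count


if __name__ == "__main__":
    # Toy usage: simulate ten detected motifs and run the controller.
    q: "queue.Queue[str]" = queue.Queue()
    for _ in range(10):
        q.put("motif")
    run_controller(q, VirtualEnvironment(), motif_target=10, timeout_s=5.0)
```

The same loop structure would accept any other detected behaviour as an event type, which is the appeal of the event-processing design described in the abstract: the controller logic stays decoupled from how each behaviour is detected.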

