Article

Fast animal pose estimation using deep neural networks

Journal

NATURE METHODS
Volume 16, Issue 1, Pages 117+

Publisher

NATURE RESEARCH
DOI: 10.1038/s41592-018-0234-5

Keywords

-

Funding

  1. NIH BRAIN Initiative Award [R01 NS104899-01, R01 MH115750]
  2. NSF BRAIN Initiative EAGER Award
  3. Nancy Lurie Marks Family Foundation
  4. NIH [R01 NS045193]
  5. HHMI Faculty Scholar Award
  6. NSF [GRFP DGE-1148900]
  7. Center for the Physics of Biological Function - National Science Foundation (NSF) [PHY-1734030]
  8. NATIONAL INSTITUTE OF MENTAL HEALTH [R01MH115750] Funding Source: NIH RePORTER
  9. NATIONAL INSTITUTE OF NEUROLOGICAL DISORDERS AND STROKE [R01NS045193, R01NS104899] Funding Source: NIH RePORTER

Abstract

The need for automated and efficient systems for tracking full animal pose has increased with the complexity of behavioral data and analyses. Here we introduce LEAP (LEAP estimates animal pose), a deep-learning-based method for predicting the positions of animal body parts. This framework consists of a graphical interface for labeling of body parts and training the network. LEAP offers fast prediction on new data, and training with as few as 100 frames results in 95% of peak performance. We validated LEAP using videos of freely behaving fruit flies and tracked 32 distinct points to describe the pose of the head, body, wings and legs, with an error rate of < 3% of body length. We recapitulated reported findings on insect gait dynamics and demonstrated LEAP's applicability for unsupervised behavioral classification. Finally, we extended the method to more challenging imaging situations and videos of freely moving mice.
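
To make the approach described in the abstract concrete, the sketch below illustrates the general scheme this class of pose estimators uses: a fully convolutional network regresses one confidence map per tracked body part, and each part's coordinate is read off as the peak of its map. This is an illustrative sketch only, not the authors' released implementation; the framework choice (PyTorch), layer sizes, and helper names are assumptions made here for clarity.

```python
# Illustrative sketch (not the authors' code): a tiny fully convolutional
# network that maps a grayscale frame to one confidence map per body part,
# plus a helper that reads out (x, y) peaks. Layer widths are placeholders;
# only the 32-part count mirrors the fly dataset described in the abstract.
import torch
import torch.nn as nn

N_PARTS = 32  # head, body, wings and legs, as in the fly experiments


class ConfidenceMapNet(nn.Module):
    def __init__(self, n_parts: int = N_PARTS):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(1, 32, 3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                                   # downsample /2
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                                   # downsample /4
        )
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(64, 32, 2, stride=2), nn.ReLU(),  # upsample x2
            nn.ConvTranspose2d(32, n_parts, 2, stride=2),        # one map per part
        )

    def forward(self, frames: torch.Tensor) -> torch.Tensor:
        # frames: (batch, 1, H, W) -> confidence maps: (batch, n_parts, H, W)
        return self.decoder(self.encoder(frames))


def maps_to_coords(conf_maps: torch.Tensor) -> torch.Tensor:
    """Convert confidence maps to integer (x, y) peak locations per part."""
    b, p, h, w = conf_maps.shape
    flat_idx = conf_maps.reshape(b, p, -1).argmax(dim=-1)      # (batch, parts)
    ys, xs = flat_idx // w, flat_idx % w
    return torch.stack([xs, ys], dim=-1)                       # (batch, parts, 2)


if __name__ == "__main__":
    net = ConfidenceMapNet()
    dummy_frames = torch.rand(2, 1, 192, 192)   # two synthetic 192x192 frames
    coords = maps_to_coords(net(dummy_frames))
    print(coords.shape)                          # torch.Size([2, 32, 2])
```

In the workflow the abstract describes, such a network would be trained on frames labeled through the graphical interface (on the order of 100 frames reportedly reaches ~95% of peak performance) and then run over full videos to produce per-frame pose estimates.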

