3.8 Proceedings Paper

Event-based Vision meets Deep Learning on Steering Prediction for Self-driving Cars

Publisher

IEEE
DOI: 10.1109/CVPR.2018.00568

Keywords

-

Funding

  1. Swiss National Center of Competence Research (NCCR) Robotics, through the Swiss National Science Foundation
  2. SNSF-ERC starting grant
  3. Ministerio de Economia, Industria y Competitividad (AEI/FEDER) of the Spanish Government [TEC2016-75981]

Abstract

Event cameras are bio-inspired vision sensors that naturally capture the dynamics of a scene, filtering out redundant information. This paper presents a deep neural network approach that unlocks the potential of event cameras on a challenging motion-estimation task: prediction of a vehicle's steering angle. To make the best out of this sensor-algorithm combination, we adapt state-of-the-art convolutional architectures to the output of event sensors and extensively evaluate the performance of our approach on a publicly available large-scale event-camera dataset (≈1000 km). We present qualitative and quantitative explanations of why event cameras allow robust steering prediction even in cases where traditional cameras fail, e.g., challenging illumination conditions and fast motion. Finally, we demonstrate the advantages of leveraging transfer learning from traditional to event-based vision, and show that our approach outperforms state-of-the-art algorithms based on standard cameras.
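As a rough illustration of the pipeline described in the abstract (not the authors' released code), the sketch below assumes the event stream is binned into a two-channel positive/negative polarity histogram frame and fed to an ImageNet-pretrained ResNet-18 whose input stem and final layer are swapped for event frames and scalar steering regression. All function names, the 180x240 sensor resolution, and the event tuple layout are illustrative assumptions.

```python
# Hypothetical sketch: event-frame construction + transfer-learned CNN regressor.
# Not the authors' implementation; names, shapes, and data layout are assumed.
import numpy as np
import torch
import torch.nn as nn
from torchvision.models import resnet18


def events_to_frame(events, height, width):
    """Accumulate events into a 2-channel histogram (positive / negative polarity).

    `events` is assumed to be an (N, 4) array of (x, y, timestamp, polarity),
    with polarity in {+1, -1}.
    """
    frame = np.zeros((2, height, width), dtype=np.float32)
    x = events[:, 0].astype(int)
    y = events[:, 1].astype(int)
    pol = (events[:, 3] > 0).astype(int)  # 1 -> positive channel, 0 -> negative
    np.add.at(frame, (pol, y, x), 1.0)    # per-pixel event counts
    return torch.from_numpy(frame)


class SteeringRegressor(nn.Module):
    """ImageNet-pretrained ResNet-18 adapted to 2-channel event frames,
    with a single scalar output interpreted as the steering angle."""

    def __init__(self):
        super().__init__()
        backbone = resnet18(weights="IMAGENET1K_V1")  # transfer learning from standard images
        # Replace the 3-channel RGB stem with a 2-channel stem for event histograms.
        backbone.conv1 = nn.Conv2d(2, 64, kernel_size=7, stride=2, padding=3, bias=False)
        backbone.fc = nn.Linear(backbone.fc.in_features, 1)  # regression head
        self.backbone = backbone

    def forward(self, x):
        return self.backbone(x).squeeze(-1)


if __name__ == "__main__":
    # Toy usage with random events on an assumed 180x240 (DAVIS-like) sensor.
    rng = np.random.default_rng(0)
    ev = np.stack([rng.integers(0, 240, 5000),      # x
                   rng.integers(0, 180, 5000),      # y
                   rng.random(5000),                # timestamp
                   rng.choice([-1, 1], 5000)],      # polarity
                  axis=1).astype(np.float32)
    frame = events_to_frame(ev, height=180, width=240).unsqueeze(0)
    model = SteeringRegressor()
    print(model(frame).shape)  # torch.Size([1])
```

In practice the regressor would be trained with a standard regression loss (e.g., mean squared error) against ground-truth steering angles; the pretrained backbone is what the abstract refers to as transfer learning from traditional to event-based vision.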

Authors

-

Reviews

Primary Rating

3.8
Not enough ratings

Secondary Ratings

Novelty
-
Significance
-
Scientific rigor
-

Recommendations

No data available