Article

Attention deep neural network for lane marking detection

Journal

KNOWLEDGE-BASED SYSTEMS
Volume 194

Publisher

ELSEVIER
DOI: 10.1016/j.knosys.2020.105584

Keywords

Deep learning; Autonomous drive; Lane markings detection; Self-attention; Semantic segmentation

Abstract

Vision-based deep learning algorithms for lane marking detection face many challenges in complex scenes, such as absent markings, shadows, and dazzling light. Two reasons are particularly significant: (1) the empirical receptive field of a deep neural network (DNN) is considerably smaller than its theoretical one; and (2) the importance of each channel in the DNN is not taken into account. To address both problems, we propose an attention module that combines self-attention and channel attention in parallel through a learnable coefficient (called AMSC). In addition, we apply AMSC to LargeFOV and propose an attention DNN for lane marking detection (the modified LargeFOV). Long-range dependencies amongst pixels and dependencies amongst channels are modelled synchronously to capture the global context and strengthen important features in the modified LargeFOV. In comparison with state-of-the-art methods that model pixel and channel dependencies, the proposed module offers an inherent parallel-computing advantage and requires fewer parameters and convolution operations. Tests on the CULane dataset show that the modified LargeFOV outperforms a recurrent neural network and DenseCRF by 3.7% and 5.6%, respectively, while computing at least 1.6x faster, and that AMSC is 10.4x faster than SCNN with minimal performance loss. On the TuSimple lane marking challenge dataset, the modified LargeFOV outperforms the LargeFOV baseline by 1.27% with negligible computational cost and runs 1.6x faster than SCNN-LargeFOV (SCNN applied to LargeFOV) with only a 0.1% performance loss.
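The abstract does not specify AMSC's exact layer design, but the following is a minimal PyTorch sketch of the described idea: a non-local self-attention branch and an SE-style channel-attention branch fused in parallel through a single learnable coefficient. The class name AMSCSketch, the 1x1 projection layers, the reduction ratio, and the fusion rule are illustrative assumptions, not the authors' implementation.

```python
# A minimal sketch (not the authors' code) of an attention module that fuses
# a self-attention branch and a channel-attention branch in parallel through
# a learnable coefficient, as the abstract describes for AMSC.
import torch
import torch.nn as nn
import torch.nn.functional as F


class AMSCSketch(nn.Module):
    def __init__(self, channels: int, reduction: int = 8):
        super().__init__()
        # Self-attention branch (non-local style): 1x1 query/key/value projections.
        self.query = nn.Conv2d(channels, channels // reduction, kernel_size=1)
        self.key = nn.Conv2d(channels, channels // reduction, kernel_size=1)
        self.value = nn.Conv2d(channels, channels, kernel_size=1)
        # Channel-attention branch (SE-style gating; an assumed design).
        self.fc1 = nn.Linear(channels, channels // reduction)
        self.fc2 = nn.Linear(channels // reduction, channels)
        # Single learnable coefficient blending the two parallel branches.
        self.alpha = nn.Parameter(torch.tensor(0.5))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, h, w = x.shape
        # Self-attention: model long-range dependencies amongst all pixels.
        q = self.query(x).flatten(2).transpose(1, 2)   # (b, hw, c/r)
        k = self.key(x).flatten(2)                     # (b, c/r, hw)
        attn = F.softmax(q @ k, dim=-1)                # (b, hw, hw) affinity
        v = self.value(x).flatten(2)                   # (b, c, hw)
        sa = (v @ attn.transpose(1, 2)).view(b, c, h, w)
        # Channel attention: weight each channel by its learned importance.
        s = x.mean(dim=(2, 3))                         # global average pooling
        gate = torch.sigmoid(self.fc2(F.relu(self.fc1(s))))
        ca = x * gate.view(b, c, 1, 1)
        # Parallel fusion of the two branches plus a residual connection.
        return x + self.alpha * sa + (1.0 - self.alpha) * ca


if __name__ == "__main__":
    feat = torch.randn(2, 64, 36, 100)   # e.g. a backbone feature map
    out = AMSCSketch(64)(feat)
    print(out.shape)                     # torch.Size([2, 64, 36, 100])
```

Note that the full (hw x hw) affinity matrix above is memory-heavy at high resolution, which is consistent with the abstract's emphasis on parameter counts and convolution operations; the sketch is only meant to make the parallel-fusion idea concrete.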
