Journal
IEEE TRANSACTIONS ON INTELLIGENT TRANSPORTATION SYSTEMS
Volume 23, Issue 7, Pages 6640-6653
Publisher
IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
DOI: 10.1109/TITS.2021.3059674
Keywords
Multi-sensor fusion; radar camera fusion; severe weather conditions; self-driving cars
Funding
- National Key Research and Development Program of China [2018YFB0105000]
- National Natural Science Foundation of China [U20A20333, 52072160, 51875255]
- Natural Science Foundation of Jiangsu Province [BK20180100]
- Key Research and Development Program of Jiangsu Province [BE20190102, BE2020083-3]
- Jiangsu Province's six talent peaks [TD-GDZB-022]
The study utilizes radar and camera fusion sensing methods, matching observed values through Mahalanobis distance and performing data fusion to enhance environmental perception performance in severe weather conditions.
Radar and camera information fusion is used to overcome the inherent shortcomings of any single sensor in severe weather. The fusion scheme adopts a framework with radar as the primary sensor and the camera as the auxiliary sensor. The Mahalanobis distance is used to match observed values in the target sequence, and data fusion is performed with a joint probability function method. The algorithm was tested on real-time environment perception using actual sensor data collected from a vehicle. The test results show that the radar-camera fusion algorithm outperforms single-sensor environmental perception in severe weather and can effectively reduce the missed-detection rate of autonomous-vehicle environment perception under such conditions. The fusion algorithm improves the robustness of the environment perception system and provides accurate environmental information to the decision-making and control systems of autonomous vehicles.
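The Mahalanobis-distance matching step described above can be sketched as follows. This is a minimal illustrative example, not the paper's implementation: the greedy nearest-neighbor association, the 2-D position observations, the covariance matrix, and the gate threshold of 3.0 are all assumptions made for the sketch.

```python
import numpy as np

def mahalanobis(x, y, cov):
    """Mahalanobis distance between observation vectors x and y
    under measurement covariance cov."""
    d = x - y
    return float(np.sqrt(d @ np.linalg.inv(cov) @ d))

def match_detections(radar_obs, camera_obs, cov, gate=3.0):
    """Greedy nearest-neighbor association of radar and camera
    detections (illustrative; the paper does not specify greedy
    matching). Pairs whose distance exceeds the gate stay unmatched.
    radar_obs / camera_obs: lists of 2-D position arrays."""
    matches = []
    used = set()
    for i, r in enumerate(radar_obs):
        best_j, best_d = None, gate
        for j, c in enumerate(camera_obs):
            if j in used:
                continue
            d = mahalanobis(r, c, cov)
            if d < best_d:
                best_j, best_d = j, d
        if best_j is not None:
            matches.append((i, best_j))
            used.add(best_j)
    return matches

# Example: two radar targets matched against two camera targets
# (hypothetical measurement noise and positions).
cov = np.diag([0.5, 0.5])
radar = [np.array([10.0, 2.0]), np.array([25.0, -1.0])]
camera = [np.array([24.8, -0.9]), np.array([10.2, 2.1])]
print(match_detections(radar, camera, cov))  # → [(0, 1), (1, 0)]
```

Matched pairs would then feed the joint-probability data-fusion stage; detections left unmatched by the gate fall back to single-sensor handling.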