Journal
IEEE TRANSACTIONS ON INTELLIGENT VEHICLES
Volume 3, Issue 4, Pages 522-533
Publisher
IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
DOI: 10.1109/TIV.2018.2873920
Keywords
ADAS; driving video; computer vision; environment perception; autonomous driving; weather and illumination; data mining; machine learning
Funding
- Toyota CSRC, USA
- Department of Transportation, USA, under UTC CrIS Project
Abstract
Safe autonomous driving requires many vision tasks, such as road segmentation, lane-mark detection, and vehicle recognition from frontal cameras. However, all of these tasks can suffer under drastic changes of weather and illumination. To make vision as robust a function in driving as it is for human drivers, this study models the spectrum of weather and illumination conditions visible in road environments. We apply big-data mining to naturalistic driving videos collected across four seasons to understand the influence of weather and illumination. Weather-sensitive regions are sampled as image features to describe the illumination models qualitatively and quantitatively. To determine how many distinct weather and illumination types exist for vision tasks, unsupervised clustering is performed on all video samples. Typical views for a spectrum of weather and illumination conditions are generated by K-means clustering of feature distributions, and we find a stable number of clusters. The learned models are then used to classify a driving view into one illumination type, guiding the road-perception modules in autonomous driving. We further explore the sparse coding of vehicle views under various weather and illumination conditions.
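The clustering-then-classification pipeline the abstract describes can be sketched in a few lines. The code below is not the authors' implementation; it is a minimal NumPy sketch in which the "weather-sensitive region features" are synthetic stand-ins (random vectors for hypothetical sunny, overcast, and night views), K-means groups them into illumination types, and a new view is classified by its nearest cluster center.

```python
import numpy as np

def kmeans(X, k, n_iter=100, seed=0):
    """Plain K-means: returns cluster labels, centers, and inertia."""
    rng = np.random.default_rng(seed)
    # Initialize centers from k randomly chosen samples.
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(n_iter):
        # Assign each sample to its nearest center (Euclidean distance).
        dists = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Recompute each center as the mean of its assigned samples.
        new_centers = np.array([
            X[labels == j].mean(axis=0) if np.any(labels == j) else centers[j]
            for j in range(k)
        ])
        if np.allclose(new_centers, centers):
            break
        centers = new_centers
    inertia = ((X - centers[labels]) ** 2).sum()  # within-cluster sum of squares
    return labels, centers, inertia

def classify_view(feature, centers):
    """Assign a new driving view's feature vector to an illumination cluster."""
    return int(np.linalg.norm(centers - feature, axis=1).argmin())

# Synthetic stand-ins for illumination features of weather-sensitive regions
# (e.g., brightness statistics of road and sky patches) -- not real data.
rng = np.random.default_rng(1)
sunny = rng.normal(0.8, 0.05, size=(50, 8))
overcast = rng.normal(0.5, 0.05, size=(50, 8))
night = rng.normal(0.1, 0.05, size=(50, 8))
X = np.vstack([sunny, overcast, night])

labels, centers, inertia = kmeans(X, k=3)
bright_cluster = classify_view(np.full(8, 0.8), centers)
dark_cluster = classify_view(np.full(8, 0.1), centers)
```

In practice, a stable cluster count such as the one the paper reports would be chosen by rerunning this for several values of `k` and inspecting how the inertia levels off.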