4.7 Article

Domain adaptation from daytime to nighttime: A situation-sensitive vehicle detection and traffic flow parameter estimation framework

Journal

Transportation Research Part C: Emerging Technologies

Publisher

PERGAMON-ELSEVIER SCIENCE LTD
DOI: 10.1016/j.trc.2020.102946

Keywords

Vehicle detection; Deep learning; Domain adaptation; Traffic flow parameter

Funding

  1. National Key Research and Development Program of China [2019YFB1600100]
  2. National Natural Science Foundation of China [61973045]
  3. Shaanxi Province Key Development Project [S2018-YF-ZDGY-0300]
  4. Fundamental Research Funds for the Central Universities [300102248403]
  5. Joint Laboratory of Internet of Vehicles - Ministry of Education and China Mobile [213024170015]
  6. Application of Basic Research Project for National Ministry of Transport [2015319812060]
  7. NVIDIA GPU Grant
  8. Amazon Web Services (AWS) Cloud Credits for Research Award

The article explores how to utilize daytime images to assist in nighttime vehicle detection, proposing a situation-sensitive method based on Faster R-CNN and domain adaptation. Experimental results using new datasets demonstrate the accuracy and effectiveness of the proposed method.
Vehicle detection in traffic surveillance images is an important approach to obtaining vehicle data and rich traffic flow parameters. Recently, deep learning based methods have been widely used for vehicle detection with high accuracy and efficiency. However, these methods require a large number of manually labeled ground truths (a bounding box for each vehicle in each image) to train Convolutional Neural Networks (CNNs). For modern urban surveillance cameras, many manually labeled ground truths already exist for daytime images, while few or far fewer labeled ground truths exist for nighttime images. In this paper, we focus on making maximum use of labeled daytime images (Source Domain) to help vehicle detection in unlabeled nighttime images (Target Domain). For this purpose, we propose a new situation-sensitive method based on Faster R-CNN with Domain Adaptation (DA) to improve vehicle detection at nighttime. Furthermore, a situation-sensitive traffic flow parameter estimation method is developed based on traffic flow theory. We collected a new dataset of 2,200 traffic images (1,200 daytime and 1,000 nighttime) containing 57,059 vehicles to evaluate the proposed method for vehicle detection. Another new dataset, with three 1,800-frame daytime videos and one 1,800-frame nighttime video containing about 260 K vehicles, was collected to evaluate and demonstrate the estimated traffic flow parameters in different situations. The experimental results show the accuracy and effectiveness of the proposed method.
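The abstract does not spell out how detections are converted into traffic flow parameters, so the sketch below is only a minimal illustration grounded in the fundamental relation q = k · v of traffic flow theory, which the paper's estimation method builds on. The function `estimate_flow_parameters`, its arguments (`counts_per_frame`, `line_crossings`, `fps`, `road_length_km`), and the use of a virtual detection line are illustrative assumptions, not the authors' implementation.

```python
# Hedged sketch: aggregating per-frame vehicle detections (e.g., from a
# Faster R-CNN detector) into macroscopic traffic flow parameters.
# All names and calibration inputs are illustrative assumptions, not the
# implementation described in the paper.

from dataclasses import dataclass
from typing import Sequence


@dataclass
class FlowEstimate:
    flow_veh_per_h: float       # q: vehicles crossing the detection line per hour
    density_veh_per_km: float   # k: average vehicles per kilometre of observed road
    speed_km_per_h: float       # v: space-mean speed implied by q = k * v


def estimate_flow_parameters(
    counts_per_frame: Sequence[int],  # vehicles detected in the observed region, per frame
    line_crossings: int,              # vehicles counted crossing a virtual detection line
    fps: float,                       # video frame rate, e.g. 30.0
    road_length_km: float,            # real-world length of the observed road segment
) -> FlowEstimate:
    """Aggregate per-frame detections into flow, density, and space-mean speed."""
    duration_h = len(counts_per_frame) / fps / 3600.0
    # Flow q: detection-line crossings per unit time.
    q = line_crossings / duration_h if duration_h > 0 else 0.0
    # Density k: time-averaged number of vehicles present, per unit road length.
    avg_present = sum(counts_per_frame) / max(len(counts_per_frame), 1)
    k = avg_present / road_length_km
    # Space-mean speed v from the fundamental relation of traffic flow theory, q = k * v.
    v = q / k if k > 0 else 0.0
    return FlowEstimate(q, k, v)
```

For one of the paper's 1,800-frame videos at, say, 30 fps (one minute of footage), the detector's per-frame counts and line crossings would be passed to this function per camera; the situation-sensitive handling of daytime versus nighttime detections described in the paper is not modeled in this sketch.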

