Journal
IEEE ACCESS
Volume 7, Pages 85688-85695
Publisher
IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
DOI: 10.1109/ACCESS.2019.2926092
Keywords
Weather station networks; ground-based cloud classification; hierarchical multimodal fusion; convolutional neural network
Funding
- National Natural Science Foundation of China [61501327, 61711530240]
- Natural Science Foundation of Tianjin [17JCZDJC30600]
- Fund of Tianjin Normal University [135202RC1703]
- Open Projects Program of National Laboratory of Pattern Recognition [201800002]
- Tianjin Higher Education Creative Team Funds Program
Abstract
Recently, multimodal information has been taken into consideration for ground-based cloud classification in weather station networks, but the intrinsic correlations between the multimodal information and the visual information cannot be mined sufficiently. We propose a novel approach called hierarchical multimodal fusion (HMF) for ground-based cloud classification in weather station networks, which fuses the deep multimodal features and the deep visual features at different levels, i.e., low-level fusion and high-level fusion. The low-level fusion directly fuses the heterogeneous features, focusing on modality-specific fusion. The high-level fusion integrates the output of the low-level fusion with the deep visual features and the deep multimodal features, and can therefore learn complex correlations among them owing to its deep fusion structure. We employ a single loss function to train the overall HMF framework so as to improve the discrimination of the cloud representations. Experimental results on the MGCD dataset indicate that our method outperforms competing methods, which verifies the effectiveness of the HMF for ground-based cloud classification.
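The two-stage fusion described in the abstract can be sketched as follows. This is a minimal NumPy illustration, not the authors' implementation: all dimensions, layer sizes, and weights are hypothetical stand-ins, and the CNN backbones that would produce the deep visual and multimodal features are replaced by random vectors.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical feature dimensions (not specified in the abstract).
D_VIS, D_MM, D_FUSE, N_CLASSES = 8, 4, 6, 7

def fc_relu(x, w, b):
    """One fully connected layer with ReLU activation."""
    return np.maximum(x @ w + b, 0.0)

# Random weights stand in for parameters that would be trained
# end-to-end under the single loss function mentioned in the abstract.
w_low = rng.normal(size=(D_VIS + D_MM, D_FUSE))
b_low = np.zeros(D_FUSE)
w_high = rng.normal(size=(D_FUSE + D_VIS + D_MM, N_CLASSES))
b_high = np.zeros(N_CLASSES)

def hmf_forward(visual, multimodal):
    # Low-level fusion: directly fuse the heterogeneous features
    # (here, by concatenation followed by a fully connected layer).
    low = fc_relu(np.concatenate([visual, multimodal]), w_low, b_low)
    # High-level fusion: integrate the low-level output with the deep
    # visual and deep multimodal features to produce class logits.
    return np.concatenate([low, visual, multimodal]) @ w_high + b_high

# Stand-ins for deep visual and deep multimodal feature vectors.
logits = hmf_forward(rng.normal(size=D_VIS), rng.normal(size=D_MM))
print(logits.shape)  # (7,)
```

Concatenation is only one plausible fusion operator; the key structural point is that the high-level stage sees both the fused representation and the original per-modality features, so correlations missed at the low level can still be captured.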