Article

When Visual Disparity Generation Meets Semantic Segmentation: A Mutual Encouragement Approach

Journal

IEEE Transactions on Intelligent Transportation Systems

Publisher

IEEE - Institute of Electrical and Electronics Engineers Inc.
DOI: 10.1109/TITS.2020.3027556

Keywords

Semantics; Task analysis; Image segmentation; Estimation; Image resolution; Visualization; Convolution; Scene parsing; stereo matching; semantic segmentation; mutual encouragement network (MENet)

Funding

  1. National Natural Science Foundation of China [61872187, 62072246, 61773215]
  2. Natural Science Foundation of Jiangsu Province [BK20180727]
  3. National Defense Pre-Research Foundation [41412010101, 41412010302]


In this paper, we propose a Mutual Encouragement Network (MENet) that performs semantic segmentation and depth estimation simultaneously and outperforms existing methods in our experiments.
Semantic segmentation and depth estimation play important roles in the field of autonomous driving. In recent years, advances in Convolutional Neural Networks (CNNs) have allowed both topics to flourish. However, the two tasks are almost always solved separately and are rarely addressed in a unified model. In this paper, we propose a Mutual Encouragement Network (MENet), which includes a semantic segmentation branch and a disparity regression branch and simultaneously generates a semantic map and a visual disparity map. In the cost volume construction phase, depth information is embedded in the semantic segmentation branch to improve contextual understanding. Similarly, semantic information is incorporated into the disparity regression branch to generate more accurate disparity. The two branches mutually promote each other during both training and inference. We evaluated our method on the popular KITTI dataset, and the experimental results show that it outperforms state-of-the-art methods on both visual disparity generation and semantic segmentation. In addition, extensive ablation studies demonstrate that the two tasks facilitate each other significantly under the proposed approach.


