3.8 Proceedings Paper

Unsupervised Deep Epipolar Flow for Stationary or Dynamic Scenes

Publisher

IEEE
DOI: 10.1109/CVPR.2019.01237


Funding

  1. Australian Centre for Robotic Vision
  2. Natural Science Foundation of China [61871325, 61420106007]
  3. Australian Research Council (ARC) [LE190100080, CE140100016, DP190102261]
  4. Data61 CSIRO

Abstract

Unsupervised deep learning for optical flow computation has achieved promising results. Most existing deep-net based methods rely on brightness constancy and a local smoothness constraint to train the networks, so their performance degrades in regions with repetitive textures or occlusions. In this paper, we propose Deep Epipolar Flow, an unsupervised optical flow method which incorporates global geometric constraints into network learning. In particular, we investigate multiple ways of enforcing the epipolar constraint in flow estimation. To alleviate a chicken-and-egg problem encountered in dynamic scenes, where multiple motions may be present, we propose a low-rank constraint as well as a union-of-subspaces constraint for training. Experimental results on various benchmark datasets show that our method achieves competitive performance compared with supervised methods and outperforms state-of-the-art unsupervised deep-learning methods.
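The epipolar constraint the abstract refers to requires each flow correspondence to satisfy x2ᵀ F x1 = 0 for the fundamental matrix F. A common differentiable surrogate for this residual is the Sampson distance, sketched below in NumPy. This is an illustrative assumption about how such a loss term could look, not the paper's exact formulation; the function names are hypothetical.

```python
import numpy as np

def sampson_distance(F, pts1, pts2):
    """Sampson approximation of the epipolar error x2^T F x1 = 0.

    F: 3x3 fundamental matrix; pts1, pts2: (N, 2) matched points
    (for flow, pts2 = pts1 + flow).
    """
    ones = np.ones((pts1.shape[0], 1))
    x1 = np.hstack([pts1, ones])          # homogeneous coordinates, (N, 3)
    x2 = np.hstack([pts2, ones])
    Fx1 = x1 @ F.T                        # row i is F @ x1_i
    Ftx2 = x2 @ F                         # row i is F^T @ x2_i
    num = np.sum(x2 * Fx1, axis=1) ** 2   # (x2^T F x1)^2 per point
    den = Fx1[:, 0]**2 + Fx1[:, 1]**2 + Ftx2[:, 0]**2 + Ftx2[:, 1]**2
    return num / (den + 1e-12)

def epipolar_loss(F, pts1, flow):
    """Mean Sampson error over all flow correspondences."""
    return sampson_distance(F, pts1, pts1 + flow).mean()
```

For a purely translational camera motion along the x-axis, F is the skew-symmetric matrix of the epipole (1, 0, 0), and any horizontal flow yields zero loss while vertical flow is penalized; in a dynamic scene, points on independently moving objects incur a nonzero residual, which is where the abstract's low-rank and union-of-subspaces constraints come in.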

