Article

Flow-Edge Guided Unsupervised Video Object Segmentation

Publisher

IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
DOI: 10.1109/TCSVT.2021.3057872

Keywords

Motion segmentation; Feature extraction; Object segmentation; Image edge detection; Task analysis; Image segmentation; Computer vision; Video segmentation; optical flow; attention mechanisms; deep learning

Funding

  1. National Key Research and Development Program of China [2018AAA0102200]
  2. National Natural Science Foundation of China [61976049, 61632007, 62072080, U20B2063]
  3. Fundamental Research Funds for the Central Universities [ZYGX2019Z015]
  4. Sichuan Science and Technology Program, China [2018GZDZX0032, 2019YFG0003, 2019ZDZX0008, 2019YFG0533, 2020YFS0057]

Abstract

This paper presents a novel model called Flow Edge-based Motion-Attentive Network (FEM-Net) for addressing the problem of unsupervised video object segmentation. Experimental results show that the proposed FEM-Net outperforms existing methods on two challenging public benchmarks.
Recently, deep learning techniques have achieved significant improvements in unsupervised video object segmentation (UVOS). However, many existing approaches cannot accurately distinguish foreground objects from the background because they rely on coarse temporal features (e.g., optical flow and multi-frame attention). In this paper, we present a novel model, termed Flow Edge-based Motion-Attentive Network (FEM-Net), to address the unsupervised video object segmentation problem. First, a motion-attentive encoder is used to jointly learn spatial and temporal features. Then, a Flow Edge Connect (FEC) module is designed to hallucinate edges in ambiguous or missing regions of the optical flow. During the segmentation stage, the complementary temporal feature, composed of the motion-attentive feature and the flow edge, is fed into a decoder to infer the salient foreground objects. Experimental results on two challenging public benchmarks (i.e., DAVIS-16 and FBMS) demonstrate that the proposed FEM-Net compares favorably against state-of-the-art methods.
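The listing gives no implementation details, but the pipeline described in the abstract (motion-attentive encoder → Flow Edge Connect → decoder) can be illustrated with a minimal PyTorch sketch. Everything below is a hypothetical placeholder rather than the authors' actual architecture: the module names (MotionAttentiveEncoder, FlowEdgeConnect, FEMNetSketch), the channel sizes, and the assumption that a coarse flow-edge map is available as an input are all illustrative assumptions; the real FEM-Net backbone, attention design, and FEC module are specified in the paper (DOI above).

```python
import torch
import torch.nn as nn


class MotionAttentiveEncoder(nn.Module):
    """Jointly encodes an RGB frame and its optical flow; the flow branch
    gates the appearance features with a soft spatial attention map
    (an illustrative assumption, not the paper's exact design)."""
    def __init__(self, in_ch=3, flow_ch=2, feat_ch=64):
        super().__init__()
        self.appearance = nn.Sequential(
            nn.Conv2d(in_ch, feat_ch, 3, padding=1), nn.ReLU(inplace=True))
        self.motion = nn.Sequential(
            nn.Conv2d(flow_ch, feat_ch, 3, padding=1), nn.ReLU(inplace=True))
        self.attn = nn.Conv2d(feat_ch, 1, 1)

    def forward(self, frame, flow):
        app = self.appearance(frame)
        mot = self.motion(flow)
        attn = torch.sigmoid(self.attn(mot))   # motion-attention map in [0, 1]
        return app + app * attn, mot           # motion-attentive feature, motion feature


class FlowEdgeConnect(nn.Module):
    """Refines a coarse flow-edge map to hallucinate edges in ambiguous or
    missing flow regions (placeholder: a small convolutional refiner)."""
    def __init__(self, feat_ch=64):
        super().__init__()
        self.refine = nn.Sequential(
            nn.Conv2d(1 + feat_ch, feat_ch, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(feat_ch, 1, 3, padding=1))

    def forward(self, coarse_edge, motion_feat):
        x = torch.cat([coarse_edge, motion_feat], dim=1)
        return torch.sigmoid(self.refine(x))   # completed flow-edge map


class FEMNetSketch(nn.Module):
    """End-to-end sketch: encoder -> FEC -> decoder producing a foreground mask."""
    def __init__(self, feat_ch=64):
        super().__init__()
        self.encoder = MotionAttentiveEncoder(feat_ch=feat_ch)
        self.fec = FlowEdgeConnect(feat_ch=feat_ch)
        self.decoder = nn.Sequential(
            nn.Conv2d(feat_ch + 1, feat_ch, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(feat_ch, 1, 1))

    def forward(self, frame, flow, coarse_edge):
        feat, mot = self.encoder(frame, flow)
        edge = self.fec(coarse_edge, mot)
        # Fuse the motion-attentive feature with the completed flow edge.
        mask = self.decoder(torch.cat([feat, edge], dim=1))
        return torch.sigmoid(mask), edge


# Usage on a single 256x256 frame with a 2-channel flow field and a coarse edge map.
model = FEMNetSketch()
frame = torch.randn(1, 3, 256, 256)
flow = torch.randn(1, 2, 256, 256)
coarse_edge = torch.rand(1, 1, 256, 256)
mask, edge = model(frame, flow, coarse_edge)
print(mask.shape, edge.shape)  # torch.Size([1, 1, 256, 256]) for both outputs
```

The sketch only mirrors the data flow stated in the abstract: spatial and temporal features are learned jointly, flow edges are completed by a dedicated module, and the two temporal cues are concatenated before decoding the salient foreground mask.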

