Article

PWStableNet: Learning Pixel-Wise Warping Maps for Video Stabilization

Journal

IEEE TRANSACTIONS ON IMAGE PROCESSING
Volume 29, Pages 3582-3595

Publisher

IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
DOI: 10.1109/TIP.2019.2963380

Keywords

Video stabilization; pixel-wise warping; cascade networks

Funding

  1. National Key Research and Development Program of China [2016YFC0201003]
  2. Technological Innovation Project for New Energy and Intelligent Networked Automobile Industry of Anhui Province
  3. Fundamental Research Funds for the Central Universities

Abstract

As videos captured by hand-held cameras are often perturbed by high-frequency jitter, stabilizing these videos is an essential task. Many video stabilization methods have been proposed to stabilize shaky videos. However, most estimate one global homography, or several homographies based on fixed meshes, to warp the shaky frames into their stabilized views. Due to the existence of parallax, such a single homography, or a few of them, cannot handle depth variation well. In contrast to these traditional methods, we propose a novel video stabilization network, called PWStableNet, which produces pixel-wise warping maps, i.e., potentially different warping for different pixels, and stabilizes each pixel to its stabilized view. To the best of our knowledge, this is the first deep-learning-based pixel-wise video stabilization. The proposed method is built upon a multi-stage cascade encoder-decoder architecture and learns pixel-wise warping maps from consecutive unstable frames. Inter-stage connections add the feature maps of an earlier stage to the corresponding feature maps of a later stage, which enables the later stage to learn the residual from the earlier stages. This cascade architecture produces more precise warping maps at later stages. To ensure correct learning of the pixel-wise warping maps, a well-designed loss function guides the training procedure of PWStableNet. The proposed method achieves performance comparable to traditional methods, with stronger robustness and much faster processing. Moreover, it outperforms typical CNN-based stabilization methods, especially on videos with strong parallax. Code will be provided at https://github.com/mindazhao/pix-pix-warping-video-stabilization.
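The abstract's central idea, warping each pixel by its own offset rather than applying one global homography to the whole frame, can be sketched in plain Python. This is an illustrative sketch only: the function name `warp_frame` and the nearest-neighbour sampling are assumptions, not the paper's actual (network-predicted, typically bilinear) implementation.

```python
def warp_frame(frame, warp_map):
    """Backward-warp a grayscale frame with a pixel-wise warping map.

    frame    -- 2-D list of intensities, indexed as frame[y][x]
    warp_map -- same shape; warp_map[y][x] = (dx, dy) is that pixel's
                own sampling offset, so different pixels may move
                differently (unlike a single homography)

    Each output pixel (x, y) samples the input at (x + dx, y + dy),
    rounded to the nearest neighbour and clamped to the frame border.
    """
    h, w = len(frame), len(frame[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            dx, dy = warp_map[y][x]
            sx = min(max(int(round(x + dx)), 0), w - 1)  # clamp to width
            sy = min(max(int(round(y + dy)), 0), h - 1)  # clamp to height
            out[y][x] = frame[sy][sx]
    return out
```

With a uniform map this reduces to a global translation; the point of a pixel-wise map is that `warp_map` can instead encode depth-dependent, per-pixel motion, which is what lets the method cope with parallax.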
