4.7 Article

DecoupledPoseNet: Cascade Decoupled Pose Learning for Unsupervised Camera Ego-Motion Estimation

Journal

IEEE TRANSACTIONS ON MULTIMEDIA
Volume 25, Pages 1636-1648

Publisher

IEEE (Institute of Electrical and Electronics Engineers)
DOI: 10.1109/TMM.2022.3144958

Keywords

Image sequences; camera ego-motion; decoupling structure; pose estimation; rigid-aware; unsupervised learning


In this paper, a new camera ego-motion estimation method is proposed, focusing on the coupling of rotation and translation. A cascade decoupling structure is designed to separately learn the rotation and translation of camera motion. Meanwhile, a rigid-aware unsupervised learning framework is introduced to handle rigid motion and deformations in dynamic scenarios through joint learning of optical flow, stereo disparity, and camera pose.
Although many impressive learning-based camera ego-motion estimation methods have been proposed recently, most of them improve the accuracy of camera pose estimation through various sequential learning schemes with loop-closure optimization, while neglecting the improvement of PoseNet itself. In this paper, we focus on the coupling of rotation and translation in ego-motion estimation, and design a cascade decoupling structure to separately learn the rotation and translation of the camera's relative motion between adjacent frames. Meanwhile, a rigid-aware unsupervised learning framework with an iterative pose refinement scheme is proposed for camera ego-motion estimation. It disambiguates rigid motion and deformations in dynamic scenarios by jointly learning optical flow, stereo disparity, and camera pose. Validated by evaluation experiments on publicly available datasets, our method is superior to state-of-the-art unsupervised methods and achieves results comparable to supervised ones.
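
To make the cascade idea concrete: instead of regressing a single 6-DoF pose vector, rotation is predicted first and translation is then predicted from features combined with that rotation estimate. The snippet below is a minimal, hypothetical PyTorch sketch of such a decoupled head; the class name CascadeDecoupledPoseHead, the feature dimension, the axis-angle parameterization, and the 0.01 output scaling are all our own assumptions for illustration, not the authors' released implementation.

```python
# Hypothetical sketch (not the authors' code): a cascade decoupled pose head
# in which rotation is regressed first and translation is regressed second,
# conditioned on the predicted rotation.
import torch
import torch.nn as nn


class CascadeDecoupledPoseHead(nn.Module):
    def __init__(self, feat_dim: int = 256):
        super().__init__()
        # Stage 1: rotation branch (axis-angle, 3 DoF).
        self.rot_head = nn.Sequential(
            nn.Linear(feat_dim, 128), nn.ReLU(inplace=True), nn.Linear(128, 3)
        )
        # Stage 2: translation branch (3 DoF), fed with the same features
        # concatenated with the predicted rotation -- the "cascade" step.
        self.trans_head = nn.Sequential(
            nn.Linear(feat_dim + 3, 128), nn.ReLU(inplace=True), nn.Linear(128, 3)
        )

    def forward(self, feats: torch.Tensor):
        # feats: (B, feat_dim) pooled features from a pair of adjacent frames.
        rot = 0.01 * self.rot_head(feats)  # small-angle scaling for stability
        trans = 0.01 * self.trans_head(torch.cat([feats, rot], dim=1))
        return rot, trans


if __name__ == "__main__":
    head = CascadeDecoupledPoseHead(feat_dim=256)
    fake_feats = torch.randn(4, 256)  # stand-in for an image-pair encoder output
    rotation, translation = head(fake_feats)
    print(rotation.shape, translation.shape)  # torch.Size([4, 3]) torch.Size([4, 3])
```

Conditioning the translation branch on the rotation output is only one way to realize the cascade; the paper's actual decoupling structure and its iterative pose refinement may be organized differently.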
