Article

Fast Color Blending for Seamless Image Stitching

Journal

IEEE Geoscience and Remote Sensing Letters
Volume 16, Issue 7, Pages 1115-1119

Publisher

IEEE (Institute of Electrical and Electronics Engineers)
DOI: 10.1109/LGRS.2019.2893210

Keywords

Color blending; image stitching; superpixels; unmanned aerial vehicle (UAV) aerial images

Funding

  1. Key Project of the National Natural Science Foundation of China [61731009]
  2. National Natural Science Foundation of China [61871185]
  3. Chenguang Program through the Shanghai Education Development Foundation
  4. Shanghai Municipal Education Commission [17CG25]

Abstract

In this letter, we propose a fast and robust method for stitching overlapping images captured by an unmanned aerial vehicle (UAV). First, we apply the shape-preserving half-projective method to precisely and stably align a pair of partially overlapping input images. Then, an optimal stitching line is searched to remove ghosting caused by moving objects in the overlapping area. We subsequently propose a color blending method to eliminate the color inconsistencies in the prealigned image. Based on the color differences of the pixels on the optimal stitching seam, we use a weighted value coordinate interpolation algorithm to compute accurate color changes for all the pixels in the target image. The computed color changes are then added to the target image to remove the color inconsistency. Furthermore, we introduce superpixel segmentation to divide the target image into a reduced number of superpixels and assign each superpixel the same color change value. This superpixel-level operation greatly reduces the computational complexity. Experiments show that our method achieves effective and efficient stitching results.
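To illustrate the superpixel-level color propagation described in the abstract, the following is a minimal Python sketch. It assumes 8-bit RGB images and a boolean seam mask, and it substitutes simple inverse-distance weighting for the letter's weighted value coordinate interpolation; the function superpixel_color_correction and all parameter values are illustrative, not the authors' implementation.

import numpy as np
from skimage.segmentation import slic

def superpixel_color_correction(target, reference, seam_mask, n_segments=600):
    # Color differences on the seam: reference minus target, per seam pixel.
    seam_rc = np.argwhere(seam_mask)                      # (K, 2) row/col coordinates
    seam_diff = (reference[seam_mask].astype(np.float64)
                 - target[seam_mask].astype(np.float64))  # (K, 3) RGB differences

    # Segment the target into superpixels; one color change per superpixel.
    labels = slic(target, n_segments=n_segments, compactness=10, start_label=0)
    corrected = target.astype(np.float64)

    for lab in np.unique(labels):
        region = labels == lab
        centroid = np.argwhere(region).mean(axis=0)       # superpixel centre (row, col)
        # Inverse-distance weights to the seam pixels (stand-in for the
        # weighted coordinate interpolation used in the letter).
        dist = np.linalg.norm(seam_rc - centroid, axis=1) + 1e-6
        weights = 1.0 / dist
        weights /= weights.sum()
        change = weights @ seam_diff                       # (3,) weighted color change
        corrected[region] += change                        # one change for the whole superpixel

    return np.clip(corrected, 0, 255).astype(np.uint8)

Assigning a single color change per superpixel is what keeps the cost low: the interpolation is evaluated once per superpixel rather than once per pixel, which matches the complexity reduction claimed in the abstract.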
