Article

Multi-Epoch and Multi-Imagery (MEMI) Photogrammetric Workflow for Enhanced Change Detection Using Time-Lapse Cameras

Journal

Remote Sensing
Volume 13, Issue 8, Article 1460

Publisher

MDPI
DOI: 10.3390/rs13081460

Keywords

time-lapse photogrammetry; multi-view stereo; 3D point clouds; change detection; rockslope monitoring; Multi-Epoch and Multi-Imagery (MEMI)

Funding

  1. PROMONTEC Project - Ministry of Science, Innovation and Universities (MICINN-FEDER) [CGL2017-84720-R]
  2. APIF grant - University of Barcelona
  3. European Union's Horizon 2020 research and innovation programme under a Marie Sklodowska-Curie fellowship [705215]
  4. DFG [EL 926/3-1]

Abstract

Photogrammetric models have become a standard tool for the study of surfaces, structures and natural elements. As an alternative to Light Detection and Ranging (LiDAR), photogrammetry allows 3D point clouds to be obtained at a much lower cost. This paper presents an enhanced workflow for image-based 3D reconstruction of high-resolution models designed to work with fixed time-lapse camera systems, based on Multi-Epoch and Multi-Imagery (MEMI) to exploit redundancy. This workflow is part of a fully automatic working setup that covers all steps, from capturing the images to obtaining clusters from change detection. The workflow obtains photogrammetric models of higher quality than the classic Structure from Motion (SfM) time-lapse photogrammetry workflow, reducing the error by up to a factor of 2 compared with the previous approach and allowing for an M3C2 standard deviation of 1.5 cm. In terms of absolute accuracy, using LiDAR data as a reference, the proposed method is 20% more accurate than models obtained with the classic workflow. The automation of the method, together with the improved quality of the 3D reconstructed models, enables accurate 4D photogrammetric analysis in near real time.
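
The change-detection stage summarized in the abstract can be illustrated with a minimal sketch: an M3C2-style signed distance computed between two epochs' point clouds at a set of core points, followed by density-based clustering of the points whose change exceeds a detection threshold. This is not the authors' code; the function names, radii and thresholds (e.g. a 1.5 cm threshold echoing the reported M3C2 standard deviation) are illustrative assumptions, and the sketch uses NumPy, SciPy and scikit-learn rather than the photogrammetric software used in the paper.

```python
# Minimal sketch of M3C2-style change detection between two epochs,
# followed by clustering of significant changes. All parameters are
# hypothetical defaults, not values from the paper.
import numpy as np
from scipy.spatial import cKDTree
from sklearn.cluster import DBSCAN


def m3c2_like_distances(ref_cloud, cmp_cloud, core_points,
                        normal_radius=0.10, cyl_radius=0.05, cyl_half_depth=0.50):
    """Simplified M3C2-style signed distances at a set of core points.

    For each core point, a local normal is estimated from the reference cloud,
    and the mean positions of both clouds inside a cylinder oriented along that
    normal are compared.
    """
    ref_tree = cKDTree(ref_cloud)
    cmp_tree = cKDTree(cmp_cloud)
    distances = np.full(len(core_points), np.nan)

    for i, p in enumerate(core_points):
        # Local normal: direction of least variance of reference neighbours (PCA via SVD).
        idx = ref_tree.query_ball_point(p, normal_radius)
        if len(idx) < 5:
            continue
        neigh = ref_cloud[idx] - ref_cloud[idx].mean(axis=0)
        _, _, vt = np.linalg.svd(neigh, full_matrices=False)
        normal = vt[-1]

        def mean_offset(cloud, tree):
            # Mean signed offset along the normal of points inside the search cylinder.
            cand = cloud[tree.query_ball_point(p, np.hypot(cyl_radius, cyl_half_depth))]
            rel = cand - p
            along = rel @ normal
            radial = np.linalg.norm(rel - np.outer(along, normal), axis=1)
            mask = (radial <= cyl_radius) & (np.abs(along) <= cyl_half_depth)
            return along[mask].mean() if mask.any() else np.nan

        distances[i] = mean_offset(cmp_cloud, cmp_tree) - mean_offset(ref_cloud, ref_tree)
    return distances


def cluster_changes(core_points, distances, threshold=0.015, eps=0.10, min_samples=10):
    """Group core points whose |distance| exceeds a detection threshold (~1.5 cm here)."""
    mask = np.abs(distances) > threshold
    labels = np.full(len(core_points), -1)
    if mask.any():
        labels[mask] = DBSCAN(eps=eps, min_samples=min_samples).fit_predict(core_points[mask])
    return labels
```

A typical call would be `labels = cluster_changes(core, m3c2_like_distances(ref, cmp, core))`, after which each non-negative label marks one candidate change cluster (for example, a detached block on the monitored rock slope).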

