Article; Proceedings Paper

Multi-Modal and Multi-Temporal Data Fusion: Outcome of the 2012 GRSS Data Fusion Contest

Publisher

IEEE (Institute of Electrical and Electronics Engineers), Inc.
DOI: 10.1109/JSTARS.2013.2245860

Keywords

Data fusion; LiDAR; multi-modal; multi-temporal; optical; SAR; urban; VHR imagery

Funding

  1. IEEE Geoscience and Remote Sensing Society (GRSS)
  2. DigitalGlobe, Inc.
  3. National Air and Space Intelligence Center (NASIC) under the Advanced Technical Exploitation Program (ATEP) [FA8604-09-D-7976]

Abstract

The 2012 Data Fusion Contest organized by the Data Fusion Technical Committee (DFTC) of the IEEE Geoscience and Remote Sensing Society (GRSS) aimed at investigating the potential of very high spatial resolution (VHR) multi-modal/multi-temporal image fusion. Three different types of data sets, including spaceborne multi-spectral, spaceborne synthetic aperture radar (SAR), and airborne light detection and ranging (LiDAR) data collected over the downtown San Francisco area, were distributed during the Contest. This paper highlights the three awarded research contributions, which investigate (i) a new metric to assess urban density (UD) from multi-spectral and LiDAR data, (ii) simulation-based techniques for jointly using SAR and LiDAR data for image interpretation and change detection, and (iii) radiosity methods to improve surface reflectance retrievals from optical data in complex illumination environments. In particular, they demonstrate the usefulness of LiDAR data when fused with optical or SAR data. We believe these investigations will stimulate further research in the related areas.
