Journal
ISPRS JOURNAL OF PHOTOGRAMMETRY AND REMOTE SENSING
Volume 98, Pages 119-132
Publisher
ELSEVIER
DOI: 10.1016/j.isprsjprs.2014.10.002
Keywords
Geospatial object detection; Geographic image classification; Very-high-resolution (VHR); Remote sensing images; Part-based model; Collection of part detectors (COPD)
Funding
- National Natural Science Foundation of China [91120005, 61473231, 61401357, 61333017]
- Ministry of Education of China [20136102110037]
- China Postdoctoral Science Foundation [2014M552491]
Abstract
The rapid development of remote sensing technology has enabled the acquisition of remote sensing images with ever-higher spatial resolution, but automatically understanding image content remains a major challenge. In this paper, we develop a practical and rotation-invariant framework for multi-class geospatial object detection and geographic image classification based on a collection of part detectors (COPD). The COPD is composed of a set of representative and discriminative part detectors, where each part detector is a linear support vector machine (SVM) classifier used to detect objects or recurring spatial patterns within a certain range of orientation. Specifically, when performing multi-class geospatial object detection, we learn a set of seed-based part detectors, each corresponding to a particular viewpoint of an object class, so that together they provide rotation-invariant detection of multi-class objects. When performing geographic image classification, we utilize a large number of pre-trained part detectors to discover distinctive visual parts in images and use them as attributes to represent the images. Comprehensive evaluations on two remote sensing image databases and comparisons with state-of-the-art approaches demonstrate the effectiveness and superiority of the developed framework. (C) 2014 International Society for Photogrammetry and Remote Sensing, Inc. (ISPRS). Published by Elsevier B.V. All rights reserved.
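The two uses of the part detectors described above can be sketched in a few lines. Each part detector is a linear SVM, so its response on a patch feature vector x is simply w·x + b. Taking the maximum response over the orientation-specific detectors of a class yields a rotation-insensitive detection score, while max-pooling every detector's responses over an image's patches yields an attribute vector for classification. The sketch below is illustrative only, assuming generic precomputed patch descriptors (e.g. HOG-like features); the function names and the max-pooling choice are assumptions, not the paper's exact pipeline.

```python
import numpy as np

def part_responses(patches, detectors, biases):
    """Score every patch with every linear part detector (w . x + b).

    patches:   (n_patches, d) precomputed patch features for one image
    detectors: (k, d) linear SVM weight vectors, one per part detector
    biases:    (k,) SVM bias terms
    returns:   (n_patches, k) response matrix
    """
    return patches @ detectors.T + biases

def detect_score(patches, class_detectors, class_biases):
    """Rotation-invariant detection score for one object class.

    Each detector covers one orientation range of the class; taking
    the max over all patches and all orientation-specific detectors
    makes the score insensitive to object rotation.
    """
    return part_responses(patches, class_detectors, class_biases).max()

def attribute_vector(patches, detectors, biases):
    """Attribute representation for image classification.

    Max-pool each detector's responses over the image's patches,
    yielding a k-dimensional vector that records how strongly each
    visual part appears anywhere in the image.
    """
    return part_responses(patches, detectors, biases).max(axis=0)
```

In practice the attribute vectors would then be fed to a standard image-level classifier; the key design point is that the same pool of linear part detectors serves both detection (per-class max) and classification (per-detector max-pooling).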