Journal
ISPRS JOURNAL OF PHOTOGRAMMETRY AND REMOTE SENSING
Volume 169, Pages 421-435
Publisher
ELSEVIER
DOI: 10.1016/j.isprsjprs.2020.06.006
Keywords
Self-attention; Transformer; Time series classification; Multitemporal Earth observation; Crop type mapping; Vegetation monitoring; Deep learning
Funding
- German Federal Ministry for Economic Affairs and Energy (BMWi) [50EE1908]
Abstract
The amount of available Earth observation data has increased dramatically in recent years. Efficiently making use of the entire body of information is a current challenge in remote sensing; it demands lightweight, problem-agnostic models that do not require region- or problem-specific expert knowledge. End-to-end trained deep learning models can make use of raw sensory data by learning feature extraction and classification in one step, solely from data. Still, many methods proposed in remote sensing research require implicit feature extraction through data preprocessing or explicit design of features. In this work, we compare recent deep learning models on crop type classification on raw and preprocessed Sentinel-2 data. We concentrate on the common neural network architectures for time series, i.e., 1D convolutions, recurrence, and the novel self-attention architecture. Our central findings are that data preprocessing still increased the overall classification performance for all models, while the choice of model was less crucial. Self-attention and recurrent neural networks, by their architecture, outperformed convolutional neural networks on raw satellite time series. We explore this through a feature importance analysis based on gradient backpropagation that exploits the differentiable nature of deep learning models. Further, we qualitatively show how self-attention scores focus selectively on a few classification-relevant observations.
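As a sketch of the mechanism the abstract refers to, scaled dot-product self-attention over a satellite time series can be written in a few lines of numpy. All shapes, random projections, and values below are illustrative assumptions, not the paper's actual model; the point is only that the attention matrix assigns each time step a distribution over the other observations, which is what allows the scores to "focus selectively" on a few classification-relevant dates.

```python
import numpy as np

def softmax(z, axis=-1):
    # numerically stable softmax along the given axis
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

rng = np.random.default_rng(0)
T, d = 12, 8                       # time steps, feature dimension (illustrative)
x = rng.normal(size=(T, d))        # one multitemporal observation sequence

# query/key/value projections, randomly initialised for this sketch
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))
Q, K, V = x @ Wq, x @ Wk, x @ Wv

# (T, T) attention scores: row t is a distribution over which
# observations time step t attends to
scores = softmax(Q @ K.T / np.sqrt(d))
out = scores @ V                   # observations re-weighted by attention

print(scores.sum(axis=1))          # each row sums to 1
```

In a trained crop-classification model, rows of `scores` for the classification query would concentrate on the few acquisition dates that discriminate between crop types (e.g. around phenological transitions), which is the behaviour the paper inspects qualitatively.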