Article

A modeling approach for large spatial datasets

Journal

JOURNAL OF THE KOREAN STATISTICAL SOCIETY
Volume 37, Issue 1, Pages 3-10

Publisher

SPRINGER HEIDELBERG
DOI: 10.1016/j.jkss.2007.09.001

Keywords

-

Abstract

For Gaussian spatial processes observed at n irregularly sited locations, exact computation of the likelihood generally requires O(n^3) operations and O(n^2) memory. If the covariance function of the process can be written as a nugget effect plus a term of moderate rank, then both the number of computations and the memory requirements can be greatly reduced. However, while such models can capture the larger-scale structure of spatial processes, they have trouble describing their local behavior accurately. If the nugget effect is replaced by a covariance function with compact support, one obtains a much better model for the local behavior of the process and can still carry out exact likelihood calculations on quite large datasets. This approach is applied to compute the maximum likelihood estimate for 13,216 observations from a single orbit of TOMS (Total Ozone Mapping Spectrometer) measurements of total column ozone. It is also applied to obtain likelihood-based estimates using over one million observations from 83 orbits. Replacing the nugget effect by a compactly supported covariance function leads to huge increases in the likelihood, but there is still clear evidence of model misfit. (c) 2008 The Korean Statistical Society. Published by Elsevier Ltd. All rights reserved.
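The computational gain described in the abstract comes from the structure of a nugget-plus-moderate-rank covariance. As a rough illustration (not the paper's own code), the sketch below evaluates the zero-mean Gaussian log-likelihood for a covariance of the form Sigma = tau2 * I_n + U U^T using the Sherman-Morrison-Woodbury identity and the matrix determinant lemma, so that a rank-r term costs O(n r^2) operations instead of O(n^3). The function name, the basis matrix U, and the nugget variance tau2 are illustrative assumptions, not quantities taken from the paper.

import numpy as np

def loglik_lowrank_plus_nugget(z, U, tau2):
    """Zero-mean Gaussian log-likelihood for Sigma = tau2 * I_n + U @ U.T.

    Illustrative sketch: uses the Sherman-Morrison-Woodbury identity for the
    quadratic form and the matrix determinant lemma for the log-determinant,
    so the cost is O(n r^2) for an n x r matrix U rather than O(n^3).
    """
    n, r = U.shape
    # Capacitance matrix C = I_r + U^T U / tau2  (r x r)
    C = np.eye(r) + U.T @ U / tau2
    L = np.linalg.cholesky(C)

    # Quadratic form z^T Sigma^{-1} z, with
    # Sigma^{-1} = I/tau2 - U C^{-1} U^T / tau2^2  (Woodbury identity)
    w = np.linalg.solve(L, U.T @ z)            # w = L^{-1} U^T z
    quad = (z @ z) / tau2 - (w @ w) / tau2**2

    # log det Sigma = n log tau2 + log det C    (matrix determinant lemma)
    logdet = n * np.log(tau2) + 2.0 * np.sum(np.log(np.diag(L)))

    return -0.5 * (n * np.log(2.0 * np.pi) + logdet + quad)

If the nugget effect is replaced by a compactly supported covariance function, as the abstract proposes, the diagonal term tau2 * I_n above would become a sparse matrix; the same identities could then be applied with sparse Cholesky factorizations in place of the diagonal solves, which is what would keep exact likelihood computation feasible for datasets of the size considered here.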
