Article

The stability of spectroscopic instruments: a unified Allan variance computation scheme

Journal

ASTRONOMY & ASTROPHYSICS
Volume 479, Issue 3, Pages 915-926

Publisher

EDP SCIENCES S A
DOI: 10.1051/0004-6361:20079188

Keywords

methods : data analysis; methods : statistical; instrumentation : spectrographs


Context. The Allan variance is a standard technique for characterising the stability of spectroscopic instruments used in astronomical observations. The period for switching between source and reference measurements is often derived from the Allan minimum time. However, various methods are applied to compute the Allan variance spectrum and to use its characteristics in the setup of astronomical observations.

Aims. We propose a new approach for computing the Allan variance of spectrometer data that combines the advantages of the two existing methods in a unified scheme. Using the Allan variance spectrum, we derive the optimum strategy for symmetric observing schemes, minimising the total uncertainty of the data resulting from radiometric and drift noise.

Methods. The unified Allan variance computation scheme is designed to trace total-power and spectroscopic fluctuations within the same framework. The method includes an explicit error estimate both for the individual Allan variance spectra and for the derived stability time. A new definition of the instrument stability time allows us to characterise the instrument even for a fluctuation spectrum shallower than 1/f, as measured for the total-power fluctuations in high-electron-mobility transistors.

Results. A first analysis of test measurements for the HIFI instrument shows that gain fluctuations are the main cause of instrumental instabilities, leading to large differences between the stability times relevant for measurements aiming at an accurate determination of the continuum level and those relevant for purely spectroscopic measurements. Fast switching loops are needed for a reliable determination of the continuum level, whereas most spectroscopic measurements can be set up so that baseline residuals due to spectroscopic drifts remain below the radiometric noise. We find that binning spectrometer channels has a non-linear impact on the resulting noise and the Allan time, deviating from the description in existing theoretical treatments.
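The classical (total-power) Allan variance that underlies schemes like the one described above can be sketched in a few lines: the signal is averaged in contiguous bins of length T, and the variance of the differences between adjacent bin averages is halved. The following is an illustrative sketch, not the authors' unified scheme (which additionally traces spectroscopic fluctuations and carries explicit error estimates); the function name and parameters are chosen here for illustration.

```python
import numpy as np

def allan_variance(data, rate, taus):
    """Classical Allan variance of a regularly sampled signal.

    data: 1-D array of samples, assumed evenly spaced.
    rate: sampling rate in Hz.
    taus: iterable of averaging times T (seconds) to evaluate.
    Returns an array of Allan variances, NaN where T is unusable.
    """
    data = np.asarray(data, dtype=float)
    avars = []
    for tau in taus:
        m = int(round(tau * rate))          # samples per averaging bin
        if m < 1 or 2 * m > len(data):      # need at least two full bins
            avars.append(np.nan)
            continue
        n_bins = len(data) // m
        # Average the signal in contiguous, non-overlapping bins of length T.
        means = data[: n_bins * m].reshape(n_bins, m).mean(axis=1)
        # Allan variance: half the mean squared difference of adjacent bins.
        diffs = np.diff(means)
        avars.append(0.5 * np.mean(diffs ** 2))
    return np.array(avars)
```

For pure radiometric (white) noise this spectrum falls as 1/T, while drift noise makes it rise again at long T; the Allan minimum time between the two regimes is what observing schemes conventionally use to set the source/reference switching period.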
