Abstract

This article evaluates the performance of a radiometer concept that would use existing 32-GHz (Ka-band) Deep Space Network (DSN) cryogenic receivers to measure tropospheric water vapor and associated radio path-delay fluctuations on 100-s time scales. The atmospheric signal would be derived from noise temperature fluctuations detected at the output of the cryogenic high-electron-mobility transistor (HEMT) low-noise amplifier (LNA) using a power meter in an unused passband of the receiver. Three errors are examined: (1) retrieval, (2) instrument stability, and (3) antenna side-lobe contamination. Retrieval errors estimated from recent Cassini tropospheric calibration data indicate that, in principle, a single-frequency 32-GHz measurement could just meet Gravitational Wave Experiment 100-s Allan standard deviation (ASD) requirements of 4.5 × 10⁻¹⁵ s/s on most cloud-free days at Goldstone. However, new gain stability measurements of the DSN HEMT LNAs are reported that translate to a 100-s ASD of 8.7 × 10⁻¹⁵ s/s. Antenna side lobes contribute up to 7.5 × 10⁻¹⁵ s/s of additional error for elevation angles above 45 deg and up to 22.5 × 10⁻¹⁵ s/s for an elevation of 20 deg, based on data collected last year at DSS 13. The root-sum-square (rss) of these and other errors yields a net error of about 10 × 10⁻¹⁵ s/s to 27 × 10⁻¹⁵ s/s of ASD at 100 s. These errors are comparable to the uncalibrated atmosphere at Goldstone while viewing zenith on dry winter days, in which case the radiometric data would be of limited value. On more humid days, the retrieved path delay may be helpful. For example, from the Cassini passes examined as part of this study, 6 of 11 passes exceeded 30 × 10⁻¹⁵ s/s in the uncalibrated line-of-sight ASD at 100 s.
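As a minimal illustration (not part of the article itself), the root-sum-square error combination and the Allan-deviation metric used above can be sketched in Python. Only the three itemized error terms are combined here; the "other errors" mentioned in the abstract are not itemized, so the result is an assumption-laden lower bound near the quoted 10–27 × 10⁻¹⁵ s/s range. The non-overlapping Allan deviation estimator shown is one common variant, not necessarily the one used in the study.

```python
import numpy as np

def rss(terms):
    """Root-sum-square combination of independent error terms."""
    return float(np.sqrt(np.sum(np.square(terms))))

def allan_deviation(y, m):
    """Non-overlapping Allan deviation of fractional-frequency data y
    at an averaging time of m samples."""
    n = len(y) // m
    ybar = y[: n * m].reshape(n, m).mean(axis=1)  # block averages over m samples
    d = np.diff(ybar)                             # first differences of adjacent blocks
    return float(np.sqrt(0.5 * np.mean(d ** 2)))

# Error terms quoted in the abstract, in units of 1e-15 s/s,
# for the high-elevation (>45 deg) case:
terms = [4.5, 8.7, 7.5]   # retrieval, HEMT gain stability, antenna side lobes
net = rss(terms)          # ~12.3e-15 s/s, within the quoted 10-27e-15 range
```

Swapping the 7.5 side-lobe term for the 22.5 value at 20-deg elevation moves the rss toward the upper end of the quoted range.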

Keywords

water vapor radiometer, WVR, atmosphere, path delay, beam waveguide, efficiency, HEMT stability, Allan variance, Allan standard deviation

Details

Volume: 42-149
Published: May 15, 2002
Pages: 1–12
File Size: 234.3 KB