Abstract

The Integrate-and-Dump Filter (IDF) is used as a matched filter for detecting signals in additive white Gaussian noise. This article evaluates the performance of a digital implementation of the integrate-and-dump filter. The case considered is one in which the symbol times are known and the sampling clock runs free at a constant rate, i.e., the sampling clock is not phase-locked to the symbol clock. Degradations in the output signal-to-noise ratio of the digital implementation due to sampling rate, sampling offset, and the finite bandwidth imposed by the anti-aliasing low-pass prefilter are computed and compared with those of the analog counterpart. It is shown that the digital IDF performs within 0.6 dB of the ideal analog IDF whenever the prefilter bandwidth exceeds four times the symbol rate and sampling is performed at the Nyquist rate. The loss can be reduced to 0.3 dB by doubling the sampling rate, with 0.2 dB of the loss resulting from the finite bandwidth and 0.1 dB from the digital IDF itself.
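The basic operation described above can be sketched in a few lines: a digital IDF sums the samples falling within each symbol interval and "dumps" one decision statistic per symbol. The sketch below is a minimal illustration of that idea, not the article's implementation; the symbol values, sampling rate, and noise level are assumptions chosen for demonstration, and the prefilter and sampling-offset effects analyzed in the article are not modeled.

```python
import numpy as np

def integrate_and_dump(samples, samples_per_symbol):
    """Sum (integrate) each block of samples, then emit ("dump") one
    decision statistic per symbol. Hypothetical illustration only."""
    n_symbols = len(samples) // samples_per_symbol
    blocks = samples[: n_symbols * samples_per_symbol]
    blocks = blocks.reshape(n_symbols, samples_per_symbol)
    return blocks.sum(axis=1)

# Example: detect +/-1 (BPSK) symbols in additive white Gaussian noise.
rng = np.random.default_rng(0)
symbols = rng.choice([-1.0, 1.0], size=1000)
sps = 8                                # assumed samples per symbol
signal = np.repeat(symbols, sps)       # rectangular pulses
noisy = signal + rng.normal(0.0, 1.0, size=signal.size)

# Integrating over a symbol grows the mean as sps but the noise standard
# deviation only as sqrt(sps), which is the matched-filter SNR gain.
decisions = np.sign(integrate_and_dump(noisy, sps))
ber = np.mean(decisions != symbols)    # bit-error rate after detection
```

Because the summed statistic has mean proportional to the number of samples but noise standard deviation proportional only to its square root, the detected bit-error rate is far lower than per-sample thresholding would give.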

Details

Volume
42-91
Published
November 15, 1987
Pages
158–173
File Size
732.4 KB