Filter distortion effects on telemetry signal-to-noise ratio

Abstract

The effect of filtering on the Signal-to-Noise Ratio (SNR) of a coherently demodulated band-limited signal is determined in the presence of worst-case amplitude ripple. The problem is formulated mathematically as an optimization problem in the L2-Hilbert space. The form of the worst-case amplitude ripple is specified, and the degradation in the SNR is derived in a closed-form expression. It is shown that when the maximum passband amplitude ripple is 2 delta (peak to peak), the SNR is degraded by at most a factor of (1 - delta squared), even when the ripple is unknown or uncompensated. For example, an SNR loss of less than 0.01 dB due to amplitude ripple can be assured by keeping the amplitude ripple under 0.42 dB.
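The bound above can be checked numerically. The sketch below assumes delta is the half-ripple in linear amplitude, so the passband gain varies between 1 - delta and 1 + delta and the peak-to-peak ripple in dB is 20*log10((1 + delta)/(1 - delta)); the function name and structure are illustrative, not from the paper.

```python
import math

def snr_loss_db(ripple_pp_db: float) -> float:
    """Worst-case SNR loss (dB) implied by the (1 - delta**2) bound.

    ripple_pp_db: peak-to-peak passband amplitude ripple in dB.
    Assumes gain varies between 1 - delta and 1 + delta, so
    ripple_pp_db = 20*log10((1 + delta) / (1 - delta)).
    """
    r = 10 ** (ripple_pp_db / 20)   # linear ratio of max to min gain
    delta = (r - 1) / (r + 1)       # half-ripple in linear amplitude
    return -10 * math.log10(1 - delta ** 2)

# A 0.42 dB peak-to-peak ripple gives a loss well under 0.01 dB,
# consistent with the example quoted in the abstract.
print(snr_loss_db(0.42))
```

Under this reading of delta, a 0.42 dB ripple corresponds to delta of roughly 0.024 and a worst-case SNR loss of only a few thousandths of a dB, comfortably inside the 0.01 dB budget stated above.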