Fisher information as a performance metric for locally optimum processing
For a known weak signal in additive white noise, the asymptotic performance
of a locally optimum processor (LOP) is shown to be given by the Fisher
information (FI) of a standardized even probability density function (PDF) of
noise in three cases: (i) the maximum signal-to-noise ratio (SNR) gain for a
periodic signal; (ii) the optimal asymptotic relative efficiency (ARE) for
signal detection; (iii) the best cross-correlation gain (CG) for signal
transmission. The minimum FI is unity, attained by the Gaussian PDF, whereas
the FI is strictly larger than unity for any non-Gaussian PDF. In the sense
of a realizable LOP, it is found that the dichotomous noise PDF possesses an
infinite FI, so that known weak signals are perfectly processed by the
corresponding LOP. The significance of the FI is that it provides an upper
bound on the performance of locally optimum processing.

Comment: 8 pages, 1 figure
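As an illustrative sketch (not from the paper itself), the Fisher information of a standardized PDF, I(f) = ∫ f'(x)²/f(x) dx, can be checked numerically. Using a unit-variance Gaussian and a unit-variance Laplace density (scale b = 1/√2, for which I(f) = 1/b² = 2), the computation below reproduces the claim that the Gaussian attains the minimum FI of unity while a non-Gaussian PDF exceeds it; the function and grid parameters here are assumptions for the sketch, not quantities defined in the abstract.

```python
import numpy as np

def fisher_info(pdf, dpdf, lo=-20.0, hi=20.0, n=200001):
    """Fisher information I(f) = integral of f'(x)^2 / f(x) dx,
    approximated by the trapezoidal rule on a fine grid."""
    x = np.linspace(lo, hi, n)
    f = pdf(x)
    # guard against division by (numerically) zero density in the tails
    integrand = np.where(f > 1e-300, dpdf(x) ** 2 / np.maximum(f, 1e-300), 0.0)
    return float(np.sum((integrand[1:] + integrand[:-1]) * np.diff(x) / 2.0))

# standardized (zero-mean, unit-variance) Gaussian: score -x, so I(f) = E[x^2] = 1
g = lambda x: np.exp(-x ** 2 / 2) / np.sqrt(2 * np.pi)
dg = lambda x: -x * g(x)

# standardized Laplace: scale b = 1/sqrt(2) gives unit variance, I(f) = 1/b^2 = 2
b = 1 / np.sqrt(2)
lap = lambda x: np.exp(-np.abs(x) / b) / (2 * b)
dlap = lambda x: -np.sign(x) / b * lap(x)

print(fisher_info(g, dg))      # ≈ 1.0, the Gaussian minimum
print(fisher_info(lap, dlap))  # ≈ 2.0, strictly larger than unity
```

The same routine can be pointed at any other standardized even density to check how far its FI, and hence the attainable SNR gain, ARE, or CG of a LOP, exceeds the Gaussian value of one.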