    Information geometry metric for random signal detection in large random sensing systems

    Assume that an N-dimensional noisy measurement vector is available via an N × R linear random sensing operation applied to an R-dimensional Gaussian signal of interest, denoted by s. The problem addressed here is the study of the minimal Bayes' error probability for the detection of s as N → ∞ with N/R → β ∈ (1, ∞). When the exact derivation of this probability is intractable, statistical similarity metrics rooted in information geometry are useful to characterize the exponential rate of the error probability. More precisely, the Chernoff information is asymptotically given by the minimum over s ∈ (0, 1) of the s-divergence. In many applications, the s-divergence is hard to evaluate. Worse, due to the asymmetry of the s-divergence for the considered detection problem, the Bhattacharyya divergence (s = 1/2) cannot circumvent this problem. As a consequence, the derivation of the optimal value of s requires a costly numerical optimization strategy. In this work, we propose two contributions. The first is a closed-form expression of the asymptotic normalized s-divergence. The second is an analytic expression for the optimal value of s.
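
    As a rough illustration of the numerical strategy the abstract refers to (and of what a closed-form optimal s would replace), the sketch below evaluates the s-divergence between the two zero-mean Gaussian hypothesis distributions in closed form and searches for the optimal s over (0, 1). The sensing matrix, the dimensions, the unit signal covariance and the noise variance are hypothetical placeholders, standard Chernoff-coefficient conventions are used, and the paper's asymptotic closed-form expression and analytic optimal s are not reproduced here.

        # Illustrative sketch, not the paper's method: brute-force search for the
        # optimal s of the s-divergence in a Gaussian detection problem,
        # H0: noise only  vs  H1: randomly sensed Gaussian signal plus noise.
        import numpy as np
        from scipy.optimize import minimize_scalar

        rng = np.random.default_rng(0)
        N, R, sigma2 = 200, 100, 1.0                  # N/R = beta = 2, inside (1, inf); hypothetical values
        H = rng.standard_normal((N, R)) / np.sqrt(N)  # hypothetical random sensing operator

        Sigma0 = sigma2 * np.eye(N)                   # H0: noise only
        Sigma1 = H @ H.T + sigma2 * np.eye(N)         # H1: unit-covariance Gaussian signal passed through H, plus noise

        def skew_divergence(s, S0, S1):
            # D_s = -log integral p0(x)^s p1(x)^(1-s) dx for zero-mean Gaussians
            # p0 = N(0, S0) and p1 = N(0, S1), computed via log-determinants.
            _, ld_mix = np.linalg.slogdet(s * S1 + (1.0 - s) * S0)
            _, ld0 = np.linalg.slogdet(S0)
            _, ld1 = np.linalg.slogdet(S1)
            return 0.5 * (ld_mix - (1.0 - s) * ld0 - s * ld1)

        # The costly step that an analytic optimal s removes: a 1-D numerical search
        # over s in (0, 1). The tightest Chernoff bound minimizes the similarity
        # coefficient exp(-D_s), i.e. maximizes D_s; the value at the optimum s*
        # is the Chernoff information (error-probability exponent).
        res = minimize_scalar(lambda s: -skew_divergence(s, Sigma0, Sigma1),
                              bounds=(1e-6, 1.0 - 1e-6), method="bounded")
        s_star, chernoff_info = res.x, -res.fun
        print(f"optimal s* ~ {s_star:.4f}   Chernoff information ~ {chernoff_info:.4f}")
        print(f"Bhattacharyya divergence (s = 1/2) ~ {skew_divergence(0.5, Sigma0, Sigma1):.4f}")

    Because the two hypothesis distributions are not symmetric, the optimum s* generally differs from 1/2, which is why the Bhattacharyya choice s = 1/2 does not yield the Chernoff information in this problem.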