6 research outputs found

    On the Equivalence of f-Divergence Balls and Density Bands in Robust Detection

    The paper deals with minimax optimal statistical tests for two composite hypotheses, where each hypothesis is defined by a non-parametric uncertainty set of feasible distributions. It is shown that for every pair of uncertainty sets of the f-divergence ball type, a pair of uncertainty sets of the density band type can be constructed that is equivalent in the sense that it admits the same pair of least favorable distributions. This result implies that robust tests under f-divergence ball uncertainty, which are typically only minimax optimal for the single sample case, are also fixed sample size minimax optimal with respect to the equivalent density band uncertainty sets.
    Comment: 5 pages, 1 figure, accepted for publication in the Proceedings of the IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP) 201
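As a minimal illustration of the machinery this abstract builds on, the sketch below computes a generic f-divergence between two discrete distributions; the function names and example distributions are hypothetical, and choosing f(t) = t log t recovers the Kullback-Leibler divergence.

```python
import math

def f_divergence(p, q, f):
    """D_f(P || Q) = sum_i q_i * f(p_i / q_i) for discrete
    distributions P, Q with strictly positive q_i."""
    return sum(qi * f(pi / qi) for pi, qi in zip(p, q))

# f(t) = t * log(t) turns the generic formula into the KL divergence.
def kl_generator(t):
    return t * math.log(t) if t > 0 else 0.0

p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]
print(f_divergence(p, q, kl_generator))  # small positive number
print(f_divergence(p, p, kl_generator))  # 0.0 for identical distributions
```

An f-divergence ball of radius eps around a nominal distribution Q is then simply the set of all P with D_f(P || Q) <= eps, which is the kind of uncertainty set the paper relates to density bands.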

    Robust Kullback-Leibler Divergence and Its Applications in Universal Hypothesis Testing and Deviation Detection

    The Kullback-Leibler (KL) divergence is one of the most fundamental metrics in information theory and statistics and provides various operational interpretations in the context of mathematical communication theory and statistical hypothesis testing. The KL divergence for discrete distributions has the desired continuity property, which leads to some fundamental results in universal hypothesis testing. With continuous observations, however, the KL divergence is only lower semi-continuous; difficulties arise when tackling universal hypothesis testing with continuous observations due to this lack of continuity. This dissertation proposes a robust version of the KL divergence for continuous alphabets. Specifically, the KL divergence defined from a distribution to the Lévy ball centered at the other distribution is found to be continuous. This robust version of the KL divergence allows one to generalize the results in universal hypothesis testing for discrete alphabets to continuous observations. The optimal decision rule is developed, and its robustness is provably established for universal hypothesis testing. Another application of the robust KL divergence is in deviation detection: the problem of detecting deviation from a nominal distribution using a sequence of independent and identically distributed observations. An asymptotically optimal detector is then developed for deviation detection, where the Lévy metric becomes a very natural distance measure for deviation from the nominal distribution. Lastly, the dissertation considers the following variation of a distributed detection problem: a sensor may overhear other sensors' transmissions and may thus choose to refine its output in the hope of achieving better detection performance. While this is shown to be possible for the fixed sample size test, asymptotically (in the number of samples) there is no performance gain, as measured by the KL divergence achievable at the fusion center, provided that the observations are conditionally independent. For conditionally dependent observations, however, asymptotic detection performance may indeed be improved when overhearing is utilized.
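The Lévy ball central to this abstract is defined through the Lévy metric between cumulative distribution functions. A minimal numerical sketch (function names, the grid-based check, and the uniform-shift example are all assumptions for illustration, not the dissertation's construction) approximates that metric by bisection:

```python
def levy_metric(F, G, grid, tol=1e-6):
    """Bisection approximation of the Levy metric between two CDFs:
    L(F, G) = inf{eps > 0 : F(x - eps) - eps <= G(x) <= F(x + eps) + eps
    for all x}. The condition over all x is only checked on a finite
    grid, which must be fine enough for the answer to be trustworthy."""
    def feasible(eps):
        return all(F(x - eps) - eps <= G(x) <= F(x + eps) + eps for x in grid)
    lo, hi = 0.0, 1.0  # the Levy metric never exceeds 1
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if feasible(mid):
            hi = mid  # feasibility is monotone in eps, so bisect
        else:
            lo = mid
    return hi

# Uniform[0, 1] CDF and the same CDF shifted right by 0.1.
F = lambda x: min(max(x, 0.0), 1.0)
G = lambda x: F(x - 0.1)
grid = [i / 1000 for i in range(-200, 1400)]
print(levy_metric(F, G, grid))  # close to 0.05
```

A Lévy ball of radius eps around a nominal CDF is then the set of CDFs within Lévy distance eps, and the robust KL divergence of the abstract is taken from a distribution to such a ball.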

    Design and Analysis of Optimal and Minimax Robust Sequential Hypothesis Tests

    In this dissertation, a framework for the design and analysis of optimal and minimax robust sequential hypothesis tests is developed. It provides a coherent theory as well as algorithms for implementing optimal and minimax robust sequential tests in practice. After introducing some fundamental concepts of sequential analysis and optimal stopping theory, the optimal sequential test for stochastic processes with Markovian representations is derived. This is done by formulating the sequential testing problem as an optimal stopping problem whose cost function is a weighted sum of the expected run-length and the error probabilities of the test. Based on this formulation, a cost-minimizing testing policy can be obtained by solving a nonlinear integral equation. It is then shown that the partial generalized derivatives of the optimal cost function are, up to a constant scaling factor, identical to the error probabilities of the cost-minimizing test. This relation is used to recast the design of optimal sequential tests under constraints on the error probabilities as the problem of solving an integral equation under constraints on the partial derivatives of its solution function. Finally, it is shown that the latter problem can be solved by means of standard linear programming techniques without the need to calculate the partial derivatives explicitly. Numerical examples illustrate this procedure. The second half of the dissertation is concerned with the design of minimax robust sequential hypothesis tests. First, the minimax principle and a general model for distributional uncertainties are introduced. Subsequently, sufficient conditions are derived for distributions to be least favorable with respect to the expected run-length and error probabilities of a sequential test. Combining the results on optimal sequential tests and least favorable distributions yields a sufficient condition for a sequential test to be minimax optimal under general distributional uncertainties. The cost function of the minimax optimal test is further identified as a convex statistical similarity measure, and the least favorable distributions as the distributions that are most similar with respect to this measure. To obtain more specific results, the density band model is introduced as an example of a nonparametric uncertainty model. The corresponding least favorable distributions are stated in implicit form, based on which a simple algorithm for their numerical calculation is derived. Finally, the minimax robust sequential test under density band uncertainties is discussed and shown to admit the characteristic minimax property of a maximally flat performance profile over its state space. A numerical example of a minimax optimal sequential test completes the dissertation.
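The sequential tests discussed in this abstract generalize Wald's classical sequential probability ratio test (SPRT). The sketch below is a minimal, non-robust SPRT for a known simple-vs-simple pair (the function names and the Gaussian mean-shift example are illustrative assumptions, not the dissertation's minimax procedure):

```python
import math

def sprt(samples, llr, lower, upper):
    """Wald's sequential probability ratio test: accumulate per-sample
    log-likelihood ratios and stop as soon as the running sum leaves the
    continuation region (log lower, log upper)."""
    log_a, log_b = math.log(lower), math.log(upper)
    s = 0.0
    for n, x in enumerate(samples, start=1):
        s += llr(x)
        if s >= log_b:
            return "H1", n  # accept the alternative
        if s <= log_a:
            return "H0", n  # accept the null
    return "undecided", len(samples)

# Gaussian mean shift, H0: N(0, 1) vs H1: N(1, 1); the per-sample
# log-likelihood ratio reduces to x - 0.5.
decision, n = sprt([1.0] * 100, lambda x: x - 0.5, 0.01, 100.0)
print(decision, n)  # "H1" after 10 samples
```

The expected run-length and error probabilities traded off by the cost function above are exactly the quantities this stopping rule controls through its two thresholds; the robust variants in the dissertation replace the fixed likelihood ratio with one built from least favorable distributions.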

    Robust Hypothesis Testing With α-divergence Distance
