
    Common Information and Decentralized Inference with Dependent Observations

    Wyner's common information was originally defined for a pair of dependent discrete random variables. This thesis generalizes its definition in two directions: the number of dependent variables can be arbitrary, and so can their alphabets. New properties are established for the generalized Wyner's common information of multiple dependent variables. More importantly, a lossy source coding interpretation of Wyner's common information is developed using the Gray-Wyner network. It is established that the common information equals the smallest common message rate when the total rate is arbitrarily close to the rate distortion function with joint decoding, provided the distortions lie within a certain distortion region. The thesis also explores the application of Wyner's common information to inference problems. A central question is under what conditions Wyner's common information captures the entire information about the inference object. Under a simple Bayesian model, it is established that for infinitely exchangeable random variables the common information is asymptotically equal to the information of the inference object. For finitely exchangeable random variables, connections between common information and inference performance metrics are also established. The problem of decentralized inference is generally intractable with conditionally dependent observations. A promising approach is to utilize a hierarchical conditional independence model. Using this model, we identify a more general condition under which the distributed detection problem becomes tractable, thereby broadening the classes of distributed detection problems with dependent observations that can be readily solved. We then develop the sufficiency principle for data reduction in decentralized inference.
For parallel networks, the hierarchical conditional independence model is used to obtain conditions under which local sufficiency implies global sufficiency. For tandem networks, the notion of conditional sufficiency is introduced and the related theory and tools are developed. Connections between the sufficiency principle and distributed source coding problems are also explored. Furthermore, we examine the impact of quantization on decentralized data reduction. The conditions under which sufficiency-based data reduction with quantization constraints is optimal are identified. These include the case where the data at decentralized nodes are conditionally independent, as well as a class of problems with conditionally dependent observations that admit a conditional independence structure through the hierarchical conditional independence model.
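
    For reference, the quantity the abstract generalizes can be sketched under standard notation (the abstract itself does not spell out the definitions, and the regularity conditions for general alphabets are those developed in the thesis, not reproduced here):

```latex
% Pairwise Wyner common information: minimize over an auxiliary W
% that renders X and Y conditionally independent (Markov chain X - W - Y).
C(X;Y) = \min_{P_{W\mid XY} \,:\, X - W - Y} I(X,Y;W)

% Natural generalization to n dependent variables: W must render
% X_1, \dots, X_n mutually conditionally independent given W.
C(X_1,\dots,X_n) = \min_{P_{W\mid X_1\cdots X_n} \,:\, X_1,\dots,X_n \ \text{cond.\ indep.\ given}\ W} I(X_1,\dots,X_n;W)
```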

    Decentralized detection

    Cover title: "To appear in Advances in Statistical Signal Processing, Vol. 2: Signal Detection, H.V. Poor and J.B. Thomas, Editors." Includes bibliographical references (p. 40-43). Research supported by the ONR (N00014-84-K-0519, NR 649-003) and the ARO (DAAL03-86-K-0171). Author: John N. Tsitsiklis.

    Optimal Inference for Distributed Detection

    In distributed detection, there is no automatic way of generating optimal decision strategies for non-affine decision functions. Consequently, in a detection problem based on a non-affine decision function, establishing the optimality of a given decision strategy, such as a generalized likelihood ratio test, is often difficult or even impossible. In this thesis we develop a novel detection network optimization technique that can be used to determine necessary and sufficient conditions for optimality in distributed detection problems whose underlying objective function is monotonic and convex in the probabilistic decision strategies. Our approach leverages basic concepts of optimization and statistical inference, which are provided in appendices in sufficient detail. These concepts are combined to form the basis of an optimal inference technique for signal detection. We prove a central theorem that characterizes optimality in a variety of distributed detection architectures. We discuss three applications of this result in distributed signal detection: interactive distributed detection, the optimal tandem fusion architecture, and distributed detection by acyclic graph networks. In the conclusion we indicate several future research directions, including possible generalizations of our optimization method and new research problems arising from each of the three applications considered.
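
    The basic decentralized setting these abstracts study can be illustrated with a minimal sketch: each sensor applies a one-bit likelihood ratio test for a Gaussian mean shift and a fusion center applies a majority rule. This is a far simpler parallel architecture than the tandem and acyclic networks treated in the thesis, and every parameter and function name below is an illustrative assumption, not the thesis's method.

```python
import math

def local_lrt(x, mu0=0.0, mu1=1.0, sigma=1.0, threshold=1.0):
    """One-bit local decision via a likelihood ratio test for
    H1: N(mu1, sigma^2) versus H0: N(mu0, sigma^2)."""
    # Log-likelihood ratio for a Gaussian mean shift.
    llr = ((x - mu0) ** 2 - (x - mu1) ** 2) / (2 * sigma ** 2)
    return 1 if llr > math.log(threshold) else 0

def fuse_majority(bits):
    """Fusion center: declare H1 if a majority of sensors report H1."""
    return 1 if sum(bits) > len(bits) / 2 else 0

# Hypothetical observations at three sensors.
observations = [1.2, 0.9, -0.3]
decisions = [local_lrt(x) for x in observations]
print(decisions, fuse_majority(decisions))  # → [1, 1, 0] 1
```

    Note that threshold tests of this kind are optimal locally only under conditional independence of the observations; relaxing that assumption is precisely what the hierarchical conditional independence model above addresses.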

    Bayesian Design of Tandem Networks for Distributed Detection With Multi-Bit Sensor Decisions
