
    Information bounds and quickest change detection in decentralized decision systems

    The quickest change detection problem is studied in decentralized decision systems, where a set of sensors receive independent observations and send summary messages to a fusion center, which makes the final decision. In the system where the sensors do not have access to their past observations, the previously conjectured asymptotic optimality of a procedure with a monotone likelihood ratio quantizer (MLRQ) is proved. In the case of additive Gaussian sensor noise, if the signal-to-noise ratios (SNR) at some sensors are sufficiently high, this procedure can perform as well as the optimal centralized procedure that has access to all the sensor observations. Even if all SNRs are low, its detection delay is at most π/2 - 1 ≈ 57% larger than that of the optimal centralized procedure. Next, in the system where the sensors have full access to their past observations, the first asymptotically optimal procedure in the literature is developed. Surprisingly, this procedure has the same asymptotic performance as the optimal centralized procedure, although it may perform poorly in some practical situations because of slow asymptotic convergence. Finally, it is shown that neither past message information nor feedback from the fusion center improves the asymptotic performance in the simplest model.
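
    The procedure described above pairs one-bit monotone likelihood ratio quantizers at the sensors with a change-detection statistic at the fusion center. The following is a minimal illustrative sketch of that generic architecture only, not the paper's procedure or its optimality analysis: under an assumed Gaussian mean-shift model the likelihood ratio is monotone in the observation, so an MLRQ reduces to a threshold test, and the fusion center runs a CUSUM recursion on the received bits. All parameter values below are assumptions.

```python
# Minimal sketch (not the paper's procedure): one-bit likelihood-ratio
# quantizers at the sensors feeding a CUSUM recursion at the fusion center.
# The Gaussian mean-shift model and every threshold below are assumptions.
import numpy as np
from math import erf, sqrt, log

rng = np.random.default_rng(0)
mu0, mu1, sigma = 0.0, 1.0, 1.0        # pre-/post-change means and noise std (assumed)
n_sensors, t_change, horizon = 3, 200, 400
quant_thresh = (mu0 + mu1) / 2         # LR is monotone in x, so an MLRQ is a threshold on x
cusum_thresh = 8.0                     # fusion-center alarm threshold (assumed)

def tail(mu):                          # P(X > quant_thresh) for X ~ N(mu, sigma^2)
    return 0.5 * (1 - erf((quant_thresh - mu) / (sigma * sqrt(2))))

p0, p1 = tail(mu0), tail(mu1)          # Bernoulli models induced on the one-bit messages

def bit_llr(bit):                      # log-likelihood ratio of one received bit
    return log(p1 / p0) if bit else log((1 - p1) / (1 - p0))

W = 0.0                                # CUSUM statistic at the fusion center
for t in range(horizon):
    mean = mu1 if t >= t_change else mu0
    x = rng.normal(mean, sigma, n_sensors)   # independent sensor observations
    bits = x > quant_thresh                  # one-bit summary messages
    W = max(0.0, W + sum(bit_llr(b) for b in bits))
    if W > cusum_thresh:
        print(f"alarm at t = {t} (change occurred at t = {t_change})")
        break
```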

    Hypothesis Testing in Feedforward Networks with Broadcast Failures

    Consider a countably infinite set of nodes, which sequentially make decisions between two given hypotheses. Each node takes a measurement of the underlying truth, observes the decisions from some immediate predecessors, and makes a decision between the given hypotheses. We consider two classes of broadcast failures: 1) each node broadcasts a decision to the other nodes, subject to random erasure in the form of a binary erasure channel; 2) each node broadcasts a randomly flipped decision to the other nodes in the form of a binary symmetric channel. We are interested in whether there exists a decision strategy consisting of a sequence of likelihood ratio tests such that the node decisions converge in probability to the underlying truth. In both cases, we show that if each node only learns from a bounded number of immediate predecessors, then there does not exist a decision strategy such that the decisions converge in probability to the underlying truth. However, in case 1, we show that if each node learns from an unboundedly growing number of predecessors, then the decisions converge in probability to the underlying truth, even when the erasure probabilities converge to 1. We also derive the convergence rate of the error probability. In case 2, we show that if each node learns from all of its previous predecessors, then the decisions converge in probability to the underlying truth when the flipping probabilities of the binary symmetric channels are bounded away from 1/2. In the case where the flipping probabilities converge to 1/2, we derive a necessary condition on the convergence rate of the flipping probabilities such that the decisions still converge to the underlying truth. We also explicitly characterize the relationship between the convergence rate of the error probability and the convergence rate of the flipping probabilities.
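
    As a loose illustration of the erasure setting (case 1), the sketch below simulates a chain of nodes that each fuse the log-likelihood ratio of a private Gaussian measurement with whichever earlier decisions survive a binary erasure channel, treating every received decision as a vote of assumed reliability q. This is not the decision strategy analyzed above; the measurement model, the reliability q, and the erasure probability are all illustrative assumptions.

```python
# Illustrative simulation of the erasure model (case 1), not the paper's
# strategy: every node sees all earlier decisions through a binary erasure
# channel and adds them, as votes of assumed reliability q, to its private
# log-likelihood ratio. All model parameters are assumptions.
import numpy as np

rng = np.random.default_rng(1)
truth = 1                        # underlying hypothesis: H1 (vs. H0)
mu, sigma = 0.5, 1.0             # measurement ~ N(+mu) under H1, N(-mu) under H0 (assumed)
erase_p = 0.8                    # erasure probability of the broadcast channel (assumed)
q = 0.7                          # assumed reliability of a received decision
vote_llr = np.log(q / (1 - q))   # LLR contribution of one surviving vote

decisions = []
for i in range(2000):
    x = rng.normal(mu if truth else -mu, sigma)
    private_llr = 2 * mu * x / sigma**2                          # LLR of the private measurement
    received = [d for d in decisions if rng.random() > erase_p]  # decisions that survive the BEC
    social_llr = vote_llr * sum(1 if d else -1 for d in received)
    decisions.append(private_llr + social_llr > 0)               # likelihood-ratio-style threshold

late_error = np.mean([d != truth for d in decisions[-200:]])
print(f"error frequency over the last 200 nodes: {late_error:.3f}")
```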

    Maximum-Likelihood Sequence Detector for Dynamic Mode High Density Probe Storage

    There is an increasing need for high density data storage devices, driven by the increased demand of consumer electronics. In this work, we consider a data storage system that operates by encoding information as topographic profiles on a polymer medium. A cantilever probe with a sharp tip (a few nm in radius) is used to create and sense the presence of topographic profiles, resulting in a density of a few Tb per in². The prevalent mode of using the cantilever probe is the static mode, which is harsh on the probe and the media. In this article, the high quality factor dynamic mode operation, which is less harsh on the media and the probe, is analyzed. The read operation is modeled as a communication channel that incorporates system memory due to inter-symbol interference and the cantilever state. We demonstrate an appropriate level of abstraction of this complex nanoscale system that obviates the need for an involved physical model. Next, a solution to the maximum likelihood sequence detection problem based on the Viterbi algorithm is devised. Experimental and simulation results demonstrate that the performance of this detector is several orders of magnitude better than that of other existing schemes.
    Comment: This paper is published in IEEE Transactions on Communications.
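
    The detector above is a maximum-likelihood sequence detector realized with the Viterbi algorithm over a channel with memory. The sketch below shows the generic technique on a toy two-tap intersymbol-interference channel with additive Gaussian noise; the tap values, memory length, and noise model are assumptions and do not represent the cantilever read channel developed in the paper.

```python
# Generic Viterbi sketch for maximum-likelihood sequence detection over an
# assumed two-tap ISI channel with Gaussian noise (not the paper's channel model).
import numpy as np

rng = np.random.default_rng(2)
h = np.array([1.0, 0.6])                  # assumed channel taps: current bit, previous bit
sigma = 0.4                               # assumed noise standard deviation
bits = rng.integers(0, 2, 50)             # transmitted bits
prev = np.concatenate(([0], bits[:-1]))
y = h[0] * bits + h[1] * prev + rng.normal(0, sigma, bits.size)   # noisy read-back samples

# Viterbi over a two-state trellis; the state is the previously written bit.
cost = np.array([0.0, np.inf])            # the bit before the first sample is taken to be 0
back = []
for r in y:
    new_cost = np.full(2, np.inf)
    new_back = np.zeros(2, dtype=int)
    for s_prev in (0, 1):                 # previous bit
        for b in (0, 1):                  # candidate current bit (the next state)
            branch = (r - (h[0] * b + h[1] * s_prev)) ** 2   # squared-error branch metric
            if cost[s_prev] + branch < new_cost[b]:
                new_cost[b], new_back[b] = cost[s_prev] + branch, s_prev
    cost = new_cost
    back.append(new_back)

# Traceback: the state at time t equals the bit detected at time t.
state = int(np.argmin(cost))
detected = [state]
for t in range(len(y) - 1, 0, -1):
    state = int(back[t][state])
    detected.append(state)
detected.reverse()
print("bit errors:", int(np.sum(np.array(detected) != bits)))
```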

    On optimal quantization rules for some problems in sequential decentralized detection

    We consider the design of systems for sequential decentralized detection, a problem that entails several interdependent choices: the choice of a stopping rule (specifying the sample size), a global decision function (a choice between two competing hypotheses), and a set of quantization rules (the local decisions on the basis of which the global decision is made). This paper addresses the open problem of whether, in the Bayesian formulation of sequential decentralized detection, optimal local decision functions can be found within the class of stationary rules. We develop an asymptotic approximation to the optimal cost of stationary quantization rules and exploit this approximation to show that stationary quantizers are not optimal in a broad class of settings. We also consider the class of blockwise stationary quantizers, and show that asymptotically optimal quantizers are likelihood-based threshold rules.
    Comment: Published in IEEE Transactions on Information Theory, 54(7):3285-3295, 2008.
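
    The three interdependent choices listed above can be illustrated with a minimal sketch: a one-bit likelihood-based threshold quantizer at a single sensor, and Wald's sequential probability ratio test on the resulting bit stream at the fusion center, which supplies both the stopping rule and the global decision. The Gaussian model, error targets, and thresholds are assumptions; this shows only the generic architecture, not the paper's optimal or blockwise stationary rules.

```python
# Minimal sketch of the generic architecture only: a one-bit likelihood-based
# threshold quantizer at one sensor and Wald's SPRT at the fusion center.
# The Gaussian model, error targets, and thresholds are assumptions.
import numpy as np
from math import erf, sqrt, log

rng = np.random.default_rng(3)
mu0, mu1, sigma = 0.0, 1.0, 1.0
true_hyp = 1                            # hypothesis actually in force in this run
A, B = log(99), log(1 / 99)             # Wald thresholds for roughly 1% error probabilities
tau = (mu0 + mu1) / 2                   # likelihood-ratio threshold (monotone in x here)

def tail(mu):                           # P(X > tau) under N(mu, sigma^2)
    return 0.5 * (1 - erf((tau - mu) / (sigma * sqrt(2))))

p0, p1 = tail(mu0), tail(mu1)           # Bernoulli models induced on the quantized bit

llr, n = 0.0, 0
while B < llr < A:                      # stopping rule: continue while between thresholds
    x = rng.normal(mu1 if true_hyp else mu0, sigma)
    bit = x > tau                       # local quantization rule: one-bit message
    llr += log(p1 / p0) if bit else log((1 - p1) / (1 - p0))
    n += 1
print("decide H1" if llr >= A else "decide H0", f"after {n} messages")
```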

    Hierarchical Bayesian Detection Algorithm for Early-Universe Relics in the Cosmic Microwave Background

    A number of theoretically well-motivated additions to the standard cosmological model predict weak signatures in the form of spatially localized sources embedded in the cosmic microwave background (CMB) fluctuations. We present a hierarchical Bayesian statistical formalism and a complete data analysis pipeline for testing such scenarios. We derive an accurate approximation to the full posterior probability distribution over the parameters defining any theory that predicts sources embedded in the CMB, and perform an extensive set of tests in order to establish its validity. The approximation is implemented using a modular algorithm, designed to avoid a posteriori selection effects, which combines a candidate-detection stage with a full Bayesian model-selection and parameter-estimation analysis. We apply this pipeline to theories that predict cosmic textures and bubble collisions, extending previous analyses by using: (1) adaptive-resolution techniques, allowing us to probe features of arbitrary size, and (2) optimal filters, which provide the best possible sensitivity for detecting candidate signatures. We conclude that the WMAP 7-year data do not favor the addition of either cosmic textures or bubble collisions to the standard cosmological model, and place robust constraints on the predicted number of such sources. The expected numbers of bubble collisions and cosmic textures on the CMB sky within our detection thresholds are constrained to be fewer than 4.0 and 5.2 at 95% confidence, respectively.
    Comment: 34 pages, 18 figures. v3: corrected very minor typos to match published version.
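
    As a toy illustration of the candidate-detection stage alone, the sketch below matched-filters a flat white-noise map against a circularly symmetric Gaussian template and keeps pixels above a significance cut. The source profile, noise model, injected amplitude, and 4-sigma threshold are assumptions; the pipeline described above additionally uses optimal filters, adaptive resolution, and a full Bayesian model-selection analysis.

```python
# Toy sketch of the candidate-detection stage only: matched filtering of a
# flat white-noise map with a Gaussian template, then a significance cut.
# The source profile, noise model, injected amplitude, and 4-sigma threshold
# are assumptions; this is not the paper's optimal-filter pipeline.
import numpy as np
from scipy.signal import fftconvolve

rng = np.random.default_rng(4)
n, src_sigma = 256, 3.0
ymap = rng.normal(0.0, 1.0, (n, n))                   # white-noise background (assumed)
yy, xx = np.mgrid[-12:13, -12:13]
template = np.exp(-(xx**2 + yy**2) / (2 * src_sigma**2))
template /= np.sqrt((template**2).sum())              # unit-norm template

cy, cx = 100, 150                                     # inject one faint source for illustration
ymap[cy - 12:cy + 13, cx - 12:cx + 13] += 6.0 * template

# For white noise, the matched filter is correlation with the (flipped) template.
filtered = fftconvolve(ymap, template[::-1, ::-1], mode="same")
snr = filtered / filtered.std()
candidates = np.argwhere(snr > 4.0)                   # candidate pixels above 4 sigma
print("pixels above the 4-sigma threshold:", len(candidates))
```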