
    Mismatched Quantum Filtering and Entropic Information

    Quantum filtering is a signal processing technique that estimates the posterior state of a quantum system under continuous measurements and has become a standard tool in quantum information processing, with applications in quantum state preparation, quantum metrology, and quantum control. If the filter assumes a nominal model that differs from reality, however, the estimation accuracy is bound to suffer. Here I derive identities that relate the excess error caused by quantum filter mismatch to the relative entropy between the true and nominal observation probability measures, with one identity for Gaussian measurements, such as optical homodyne detection, and another for Poissonian measurements, such as photon counting. These identities generalize recent seminal results in classical information theory and provide new operational meanings to relative entropy, mutual information, and channel capacity in the context of quantum experiments.
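    For orientation, a classical identity of this kind (due to Verdú, for the scalar Gaussian channel) states that the relative entropy between the true and nominal input distributions equals half the integral, over signal-to-noise ratio, of the excess mean-square error incurred by the mismatched estimator. The Python sketch below checks that classical relation numerically for Gaussian priors; the chosen variances and function names are illustrative assumptions, and the sketch does not implement the paper's quantum identities.

    import numpy as np
    from scipy.integrate import quad

    # Scalar Gaussian channel Y = sqrt(snr)*X + N, with N ~ N(0, 1).
    # True prior P = N(0, p_var); nominal (mismatched) prior Q = N(0, q_var).
    p_var, q_var = 1.0, 2.0  # illustrative values

    def mse_matched(snr):
        # MMSE of the correct estimator built from the true prior P.
        return p_var / (1.0 + snr * p_var)

    def mse_mismatched(snr):
        # Mean-square error, under the true prior P, of the linear
        # estimator x_hat = a*y built from the nominal prior Q.
        a = np.sqrt(snr) * q_var / (1.0 + snr * q_var)
        return p_var - 2.0 * a * np.sqrt(snr) * p_var + a**2 * (snr * p_var + 1.0)

    # Half the excess-error integral over all signal-to-noise ratios ...
    half_excess, _ = quad(lambda s: 0.5 * (mse_mismatched(s) - mse_matched(s)),
                          0.0, np.inf)

    # ... should equal the relative entropy D(P || Q) between the two priors.
    kl = 0.5 * (p_var / q_var - 1.0 + np.log(q_var / p_var))

    print(f"half excess-MSE integral: {half_excess:.6f}")
    print(f"relative entropy D(P||Q): {kl:.6f}")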

    Generalized Bregman Divergence and Gradient of Mutual Information for Vector Poisson Channels

    We investigate connections between information-theoretic and estimation-theoretic quantities in vector Poisson channel models. In particular, we generalize the gradient of mutual information with respect to key system parameters from the scalar to the vector Poisson channel model. We also propose, as another contribution, a generalization of the classical Bregman divergence that encapsulates, under a unifying framework, the gradient-of-mutual-information results for scalar and vector Poisson and Gaussian channel models. The so-called generalized Bregman divergence is also shown to exhibit various properties akin to those of the classical version. The vector Poisson channel model is drawing considerable attention in view of its application in various domains: for example, the gradient of mutual information can be used in conjunction with gradient-descent methods to design compressive-sensing projections in emerging X-ray and document-classification applications.
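    As a point of reference, the classical Bregman divergence that the paper generalizes is D_phi(x, y) = phi(x) - phi(y) - <grad phi(y), x - y> for a convex generating function phi. The sketch below is a minimal Python illustration of that classical object, not of the paper's generalized construction; the generating functions and names are choices of the sketch. Squared Euclidean distance and the generalized Kullback-Leibler divergence, which underlie the Gaussian and Poisson cases respectively, are recovered as special instances.

    import numpy as np

    def bregman(phi, grad_phi, x, y):
        # Classical Bregman divergence induced by a convex function phi.
        return phi(x) - phi(y) - np.dot(grad_phi(y), x - y)

    # phi(x) = ||x||^2 recovers the squared Euclidean distance (Gaussian case).
    sq = lambda x: np.dot(x, x)
    sq_grad = lambda x: 2.0 * x

    # phi(x) = sum_i (x_i log x_i - x_i) recovers the generalized
    # Kullback-Leibler (I-)divergence that underlies Poisson models.
    negent = lambda x: np.sum(x * np.log(x) - x)
    negent_grad = lambda x: np.log(x)

    x = np.array([0.2, 0.5, 0.3])
    y = np.array([0.3, 0.4, 0.3])

    print(bregman(sq, sq_grad, x, y))          # equals ||x - y||^2
    print(bregman(negent, negent_grad, x, y))  # equals sum x_i*log(x_i/y_i) - x_i + y_i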

    Information theoretic approach for assessing image fidelity in photon-counting arrays

    The method of photon-counting integral imaging has been introduced recently for three-dimensional object sensing, visualization, recognition and classification of scenes under photon-starved conditions. This paper presents an information-theoretic model for the photon-counting imaging (PCI) method, thereby providing a rigorous foundation for the merits of PCI in terms of image fidelity. This, in turn, can facilitate our understanding of the demonstrated success of photon-counting integral imaging in compressive imaging and classification. The mutual information between the source and photon-counted images is derived in a Markov random field setting and normalized by the source image’s entropy, yielding a fidelity metric between zero and unity, which respectively correspond to complete loss of information and full preservation of information. Calculations suggest that the PCI fidelity metric increases with spatial correlation in the source image, from which we infer that the PCI method is particularly effective for source images with high spatial correlation; the metric also increases as photon-number uncertainty is reduced. As an application of the theory, an image-classification problem is considered, showing a congruous relationship between the fidelity metric and the classifier’s performance.
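    A rough numerical illustration of a fidelity metric of this form, the mutual information between the source and the photon-counted image normalized by the source entropy, can be put together with plug-in histogram estimates. In the sketch below, the synthetic source image, the Poisson photon-counting model, and the entropy estimator are all assumptions of the illustration, not the paper's Markov random field derivation.

    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic spatially correlated source: smoothed noise quantized to 8 levels.
    raw = rng.normal(size=(64, 64))
    kernel = np.ones((5, 5)) / 25.0
    smooth = np.real(np.fft.ifft2(np.fft.fft2(raw) * np.fft.fft2(kernel, raw.shape)))
    source = np.digitize(smooth, np.quantile(smooth, np.linspace(0, 1, 9)[1:-1]))

    # Photon counting: each pixel's count is Poisson with mean proportional
    # to its normalized intensity.
    mean_photons = 2.0
    intensity = (source + 1) / (source.max() + 1)
    counts = rng.poisson(mean_photons * intensity)

    def entropy(labels):
        p = np.bincount(labels.ravel()).astype(float)
        p = p[p > 0] / p.sum()
        return -(p * np.log2(p)).sum()

    def mutual_information(a, b):
        joint = np.histogram2d(a.ravel(), b.ravel(),
                               bins=(a.max() + 1, b.max() + 1))[0]
        joint /= joint.sum()
        pa, pb = joint.sum(1, keepdims=True), joint.sum(0, keepdims=True)
        nz = joint > 0
        return (joint[nz] * np.log2(joint[nz] / (pa @ pb)[nz])).sum()

    fidelity = mutual_information(source, counts) / entropy(source)
    print(f"normalized-MI fidelity: {fidelity:.3f}")  # lies between 0 and 1

    Raising mean_photons (less photon-number uncertainty) pushes the estimated fidelity toward one, in line with the trend noted in the abstract; capturing the spatial-correlation effect would require the paper's Markov random field treatment rather than this pixel-wise estimate.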