    Informational Divergence Approximations to Product Distributions

    The minimum rate needed to accurately approximate a product distribution based on an unnormalized informational divergence is shown to be a mutual information. This result subsumes results of Wyner on common information and Han-Verdú on resolvability. The result also extends to cases where the source distribution is unknown but the entropy is known.
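    As a point of reference, the soft-covering formulation behind this kind of result can be written in a standard form; the notation below is assumed for illustration and is not quoted from the paper.

        % Sketch of the standard soft-covering setup (notation assumed, not taken from the paper).
        % A rate-R codebook of i.i.d. P_X codewords induces the channel-output distribution
        \[
          P_{Z^n}(z^n) \;=\; 2^{-nR} \sum_{m=1}^{2^{nR}} \prod_{i=1}^{n} P_{Z|X}\bigl(z_i \mid x_i(m)\bigr).
        \]
        % In this notation, the abstract's claim reads: the unnormalized divergence to the target
        % product distribution vanishes whenever the rate exceeds the mutual information,
        \[
          D\!\left(P_{Z^n} \,\middle\|\, Q_Z^{\otimes n}\right) \longrightarrow 0
          \quad \text{for } R > I(X;Z),
          \qquad Q_Z(z) = \sum_{x} P_X(x)\, P_{Z|X}(z \mid x),
        \]
        % and I(X;Z) is the minimum such rate.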

    Measuring Shared Information and Coordinated Activity in Neuronal Networks

    Most nervous systems encode information about stimuli in the responding activity of large neuronal networks. This activity often manifests itself as dynamically coordinated sequences of action potentials. Since multiple electrode recordings are now a standard tool in neuroscience research, it is important to have a measure of such network-wide behavioral coordination and information sharing, applicable to multiple neural spike train data. We propose a new statistic, informational coherence, which measures how much better one unit can be predicted by knowing the dynamical state of another. We argue that informational coherence is a measure of association and shared information which is superior to traditional pairwise measures of synchronization and correlation. To find the dynamical states, we use a recently introduced algorithm which reconstructs effective state spaces from stochastic time series. We then extend the pairwise measure to a multivariate analysis of the network by estimating the network multi-information. We illustrate our method by testing it on a detailed model of the transition from gamma to beta rhythms. Comment: 8 pages, 6 figures.
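    For illustration, a plug-in estimate of how much one unit's dynamical state tells us about another's can be computed as a normalized mutual information between discrete state labels; the function names and the normalization below are assumptions made for this sketch, not the statistic as defined in the paper.

        import numpy as np
        from collections import Counter

        def entropy(labels):
            """Plug-in Shannon entropy (bits) of a sequence of discrete state labels."""
            counts = np.array(list(Counter(labels).values()), dtype=float)
            p = counts / counts.sum()
            return float(-np.sum(p * np.log2(p)))

        def mutual_information(x, y):
            """Plug-in mutual information between two aligned label sequences."""
            return entropy(x) + entropy(y) - entropy(list(zip(x, y)))

        def coherence(x, y):
            """Illustrative normalization: MI divided by the smaller marginal entropy.
            (An assumption for this sketch, not necessarily the paper's definition.)"""
            denom = min(entropy(x), entropy(y))
            return mutual_information(x, y) / denom if denom > 0 else 0.0

        # Toy example: state labels of two units, e.g. from a state-space reconstruction.
        a = [0, 1, 1, 2, 0, 1, 2, 2, 0, 1]
        b = [0, 1, 1, 2, 0, 1, 2, 1, 0, 1]
        print(coherence(a, b))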

    Resolvability on Continuous Alphabets

    We characterize the resolvability region for a large class of point-to-point channels with continuous alphabets. In our direct result, we not only prove the existence of good resolvability codebooks, but also adapt an approach based on the Chernoff-Hoeffding bound to the continuous case, showing that the probability of drawing an unsuitable codebook is doubly exponentially small. For the converse part, we show that our previous elementary result carries over to the continuous case easily under a mild continuity assumption. Comment: v2: Corrected inaccuracies in the proof of the direct part; statement of Theorem 3 slightly adapted, other results unchanged. v3: Extended version of the camera-ready version submitted to ISIT 201.
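    The Chernoff-Hoeffding bound invoked here is the classical concentration inequality; in one standard form (notation assumed for illustration):

        % Classical Chernoff-Hoeffding bound: for i.i.d. X_1, ..., X_N taking values in [0,1] with mean \mu,
        \[
          \Pr\!\left[ \frac{1}{N} \sum_{i=1}^{N} X_i \ge \mu + \epsilon \right]
          \;\le\; e^{-N\, D(\mu + \epsilon \,\|\, \mu)}
          \;\le\; e^{-2 N \epsilon^2},
        \]
        % where D(a || b) is the binary relative entropy. When N is of the order of the codebook size,
        % i.e. exponential in the blocklength, a bound of this type makes the probability of drawing
        % an unsuitable codebook doubly exponentially small, as stated in the abstract.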

    Measurement uncertainty relations for position and momentum: Relative entropy formulation

    Heisenberg's uncertainty principle has recently led to general measurement uncertainty relations for quantum systems: incompatible observables can be measured jointly or in sequence only with some unavoidable approximation, which can be quantified in various ways. The relative entropy is the natural theoretical quantifier of the information loss when a 'true' probability distribution is replaced by an approximating one. In this paper, we provide a lower bound for the amount of information that is lost by replacing the distributions of the sharp position and momentum observables, as they could be obtained with two separate experiments, by the marginals of any smeared joint measurement. The bound is obtained by introducing an entropic error function, and optimizing it over a suitable class of covariant approximate joint measurements. We fully exploit two cases of target observables: (1) n-dimensional position and momentum vectors; (2) two components of position and momentum along different directions. In (1), we connect the quantum bound to the dimension n; in (2), going from parallel to orthogonal directions, we show the transition from highly incompatible observables to compatible ones. For simplicity, we develop the theory only for Gaussian states and measurements. Comment: 33 pages.
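    For reference, the relative entropy (Kullback-Leibler divergence) used as the quantifier of information loss has the standard form below; the paper's entropic error function builds on such divergences but is not reproduced here.

        % Relative entropy of a 'true' density p with respect to an approximating density q:
        \[
          D(p \,\|\, q) \;=\; \int p(x) \, \log \frac{p(x)}{q(x)} \, \mathrm{d}x \;\ge\; 0,
        \]
        % with equality if and only if p = q almost everywhere. Replacing the sharp position and
        % momentum distributions by the marginals of a smeared joint measurement incurs a loss
        % quantified by divergences of this kind.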

    Information processing and the second law of thermodynamics: an inclusive, Hamiltonian approach

    We obtain generalizations of the Kelvin-Planck, Clausius, and Carnot statements of the second law of thermodynamics for situations involving information processing. To this end, we consider an information reservoir (representing, e.g., a memory device) alongside the heat and work reservoirs that appear in traditional thermodynamic analyses. We derive our results within an inclusive framework in which all participating elements -- the system or device of interest, together with the heat, work, and information reservoirs -- are modeled explicitly by a time-independent, classical Hamiltonian. We place particular emphasis on the limits and assumptions under which cyclic motion of the device of interest emerges from its interactions with the work, heat, and information reservoirs. Comment: 14 pages, 4 figures.
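    A commonly quoted Kelvin-Planck-type form of such a generalization is sketched below; this is a hedged illustration in standard notation, and the paper's precise statement and assumptions may differ.

        % Sketch: generalized Kelvin-Planck bound in the presence of an information reservoir (memory).
        % For a cyclic process of the device in contact with a single heat bath at temperature T,
        \[
          W_{\mathrm{extracted}} \;\le\; k_B T \, \Delta H_{\mathrm{mem}},
        \]
        % where \Delta H_{\mathrm{mem}} is the increase in the Shannon entropy of the information
        % reservoir. The ordinary Kelvin-Planck statement (no work extracted from a single bath in a
        % cycle) is recovered when \Delta H_{\mathrm{mem}} = 0.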