
    Shared Information -- New Insights and Problems in Decomposing Information in Complex Systems

    How can the information that a set X_1, ..., X_n of random variables contains about another random variable S be decomposed? To what extent do different subgroups provide the same, i.e. shared or redundant, information, carry unique information, or interact for the emergence of synergistic information? Recently, Williams and Beer proposed such a decomposition based on natural properties for shared information. While these properties fix the structure of the decomposition, they do not uniquely specify the values of the different terms. Therefore, we investigate additional properties such as strong symmetry and left monotonicity. We find that strong symmetry is incompatible with the properties proposed by Williams and Beer. Although left monotonicity is a very natural property for an information measure, it is not fulfilled by any of the proposed measures. We also study a geometric framework for information decompositions and ask whether it is possible to represent shared information by a family of posterior distributions. Finally, we draw connections to the notions of shared knowledge and common knowledge in game theory. While many people believe that independent variables cannot share information, we show that in game theory independent agents can have shared knowledge, but not common knowledge. We conclude that intuition and heuristic arguments do not suffice when arguing about information.
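
    As a concrete anchor for the decomposition question, here is a minimal Python sketch (a toy illustration, not any of the measures discussed in the paper) computing the mutual information terms for the binary XOR example: the pair (X1, X2) carries one bit about S while each variable alone carries none, so all of the information is synergistic.

        # Toy illustration (not one of the measures discussed in the paper): mutual
        # information terms for the XOR example, the canonical purely synergistic case.
        import itertools
        from collections import defaultdict
        from math import log2

        def mutual_information(joint):
            """I(A;B) in bits for a joint distribution given as {(a, b): probability}."""
            pa, pb = defaultdict(float), defaultdict(float)
            for (a, b), p in joint.items():
                pa[a] += p
                pb[b] += p
            return sum(p * log2(p / (pa[a] * pb[b]))
                       for (a, b), p in joint.items() if p > 0)

        def aggregate(items):
            """Sum probabilities of repeated (a, b) keys into a joint distribution."""
            joint = defaultdict(float)
            for key, p in items:
                joint[key] += p
            return joint

        # Uniform X1, X2 in {0, 1} and S = X1 XOR X2.
        triples = [(x1, x2, x1 ^ x2) for x1, x2 in itertools.product((0, 1), repeat=2)]
        p = 1.0 / len(triples)

        i_pair = mutual_information(aggregate((((x1, x2), s), p) for x1, x2, s in triples))
        i_x1   = mutual_information(aggregate(((x1, s), p) for x1, x2, s in triples))
        i_x2   = mutual_information(aggregate(((x2, s), p) for x1, x2, s in triples))
        print(i_pair, i_x1, i_x2)   # 1.0 0.0 0.0 -- the bit about S is carried only jointly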

    Determining Principal Component Cardinality through the Principle of Minimum Description Length

    PCA (Principal Component Analysis) and its variants are ubiquitous techniques for matrix dimension reduction and reduced-dimension latent-factor extraction. One significant challenge in using PCA is the choice of the number of principal components. The information-theoretic MDL (Minimum Description Length) principle gives objective compression-based criteria for model selection, but it is difficult to analytically apply its modern definition - NML (Normalized Maximum Likelihood) - to the problem of PCA. This work shows a general reduction of NML problems to lower-dimension problems. Applying this reduction, it bounds the NML of PCA in terms of the NML of linear regression, which are known.
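
    The paper's NML bound is not reproduced here; as a rough illustration of what MDL-style selection of the component count looks like, the sketch below scores each candidate k with a crude two-part code length (a Gaussian residual fit cost plus a BIC-like parameter penalty). This is an assumed stand-in for demonstration, not the authors' criterion.

        # Rough two-part-MDL-style choice of the number of principal components
        # (an illustrative stand-in, NOT the paper's NML bound).
        import numpy as np

        def description_length(X, k):
            """Crude code length: Gaussian residual cost + parameter cost for k components."""
            n, d = X.shape
            Xc = X - X.mean(axis=0)
            U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
            resid_var = (s[k:] ** 2).sum() / (n * d) + 1e-12   # variance left outside the top-k subspace
            data_bits = 0.5 * n * d * np.log2(resid_var)       # residual fit cost (up to constants)
            param_bits = 0.5 * k * (d + n) * np.log2(n)        # BIC-like penalty per retained component
            return data_bits + param_bits

        rng = np.random.default_rng(0)
        # Synthetic rank-3 data plus small isotropic noise.
        X = rng.normal(size=(500, 3)) @ rng.normal(size=(3, 20)) + 0.1 * rng.normal(size=(500, 20))
        best_k = min(range(1, 10), key=lambda k: description_length(X, k))
        print(best_k)   # expected to land close to the true rank (3)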

    Predicting sentence translation quality using extrinsic and language independent features

    We develop a top-performing model for automatic, accurate, and language-independent prediction of sentence-level statistical machine translation (SMT) quality with or without looking at the translation outputs. We derive various feature functions measuring the closeness of a given test sentence to the training data and the difficulty of translating the sentence. We describe \texttt{mono} feature functions that are based on statistics of only one side of the parallel training corpora and \texttt{duo} feature functions that incorporate statistics involving both the source and target sides of the training data. Overall, we describe novel, language-independent, and SMT-system-extrinsic features for predicting SMT performance, which also rank high during feature ranking evaluations. We experiment with different learning settings, with or without looking at the translations, which help differentiate the contribution of different feature sets. We apply partial least squares and feature subset selection, both of which improve the results, and we present a ranking of the top features selected for each learning setting, providing an exhaustive analysis of the extrinsic features used. We show that by just looking at the test source sentences and not using the translation outputs at all, we can achieve better performance than a baseline system using SMT-model-dependent features that generated the translations. Furthermore, our prediction system is able to achieve the 22nd best performance overall according to the official results of the Quality Estimation Task (QET) challenge when also looking at the translation outputs. Our representation and features achieve the top performance in QET among the models using the SVR learning model.
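
    For orientation, here is a minimal sketch of the general setup described (extrinsic features computed from the test sentence and the training corpus, fed to an SVR regressor). The two toy features and the data are hypothetical stand-ins, not the paper's \texttt{mono}/\texttt{duo} feature functions.

        # Minimal sketch of the general setup (SMT-extrinsic features + SVR regression);
        # the two toy features below are illustrative stand-ins, not the paper's features.
        import numpy as np
        from sklearn.svm import SVR

        def ngram_coverage(sentence, train_ngrams, n=2):
            """Fraction of the sentence's n-grams seen in the training corpus (a 'mono'-style feature)."""
            toks = sentence.split()
            grams = [tuple(toks[i:i + n]) for i in range(len(toks) - n + 1)]
            return sum(g in train_ngrams for g in grams) / max(len(grams), 1)

        def features(sentence, train_ngrams):
            return [ngram_coverage(sentence, train_ngrams), len(sentence.split())]

        # Hypothetical data: source sentences paired with a quality score to regress on.
        train_ngrams = {("the", "cat"), ("cat", "sat"), ("on", "the")}
        sentences = ["the cat sat on the mat", "colourless green ideas sleep furiously"]
        scores = [0.2, 0.8]

        X = np.array([features(s, train_ngrams) for s in sentences])
        model = SVR(kernel="rbf").fit(X, scores)
        print(model.predict(X))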

    Direct entropy determination and application to artificial spin ice

    From thermodynamic origins, the concept of entropy has expanded to a range of statistical measures of uncertainty, which may still be thermodynamically significant. However, laboratory measurements of entropy continue to rely on direct measurements of heat. New technologies that can map out myriads of microscopic degrees of freedom suggest direct determination of configurational entropy by counting in systems where it is thermodynamically inaccessible, such as granular and colloidal materials, proteins and lithographically fabricated nanometre-scale arrays. Here, we demonstrate a conditional-probability technique to calculate entropy densities of translation-invariant states on lattices using limited configuration data on small clusters, and apply it to arrays of interacting nanometre-scale magnetic islands (artificial spin ice). Models for statistically disordered systems can be assessed by applying the method to relative entropy densities. For artificial spin ice, this analysis shows that nearest-neighbour correlations drive longer-range ones.
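
    The cluster method itself is not reproduced here; as a simplified stand-in, the sketch below shows the closely related block-entropy-difference estimate of entropy density on a 1D binary chain, i.e. the conditional entropy H(X_n | X_1, ..., X_{n-1}) computed from limited configuration statistics.

        # Sketch of a conditional-entropy (block-entropy-difference) estimate of entropy
        # density for a 1D binary chain; a simplified stand-in for the cluster method above.
        import numpy as np
        from collections import Counter

        def block_entropy(config, n):
            """Shannon entropy (bits) of length-n blocks observed along the chain."""
            blocks = Counter(tuple(config[i:i + n]) for i in range(len(config) - n + 1))
            p = np.array(list(blocks.values())) / sum(blocks.values())
            return -(p * np.log2(p)).sum()

        def entropy_density(config, n):
            """h_n = H(n-block) - H((n-1)-block) = H(X_n | X_1..X_{n-1}), an upper bound on the true density."""
            return block_entropy(config, n) - block_entropy(config, n - 1)

        rng = np.random.default_rng(1)
        chain = rng.integers(0, 2, size=100_000)   # i.i.d. fair coin: true density is 1 bit/site
        print(entropy_density(chain, 4))           # close to 1.0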

    Optimal measurement of visual motion across spatial and temporal scales

    Sensory systems use limited resources to mediate the perception of a great variety of objects and events. Here a normative framework is presented for exploring how the problem of efficient allocation of resources can be solved in visual perception. Starting with a basic property of every measurement, captured by Gabor's uncertainty relation about the location and frequency content of signals, prescriptions are developed for optimal allocation of sensors for reliable perception of visual motion. This study reveals that a large-scale characteristic of human vision (the spatiotemporal contrast sensitivity function) is similar to the optimal prescription, and it suggests that some previously puzzling phenomena of visual sensitivity, adaptation, and perceptual organization have simple principled explanations. (In press in Favorskaya MN and Jain LC (Eds), Computer Vision in Advanced Control Systems using Conventional and Intelligent Paradigms, Intelligent Systems Reference Library, Springer-Verlag, Berlin.)
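
    For reference, the measurement constraint the chapter starts from can be stated in the common standard-deviation convention (the chapter itself may use a different normalization; the symbols below are the usual ones, not necessarily the author's):

        \Delta t \,\Delta f \;\ge\; \frac{1}{4\pi},
        \qquad
        (\Delta t)^2 = \frac{\int (t-\bar{t})^2\,|g(t)|^2\,dt}{\int |g(t)|^2\,dt},
        \quad
        (\Delta f)^2 = \frac{\int (f-\bar{f})^2\,|\hat{g}(f)|^2\,df}{\int |\hat{g}(f)|^2\,df},

    with equality exactly for Gabor's elementary signals (Gaussian-windowed sinusoids), which is why no sensor can simultaneously localize a signal arbitrarily well in time and in frequency.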

    Quantum Communication

    Quantum communication, and indeed quantum information in general, has changed the way we think about quantum physics. In 1984 and 1991, the first protocol for quantum cryptography and the first application of quantum non-locality, respectively, attracted a diverse field of researchers in theoretical and experimental physics, mathematics and computer science. Since then we have seen a fundamental shift in how we understand information when it is encoded in quantum systems. We review the current state of research and future directions in this new field of science with special emphasis on quantum key distribution and quantum networks.
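
    The 1984 protocol referred to is BB84 (Bennett and Brassard) and the 1991 non-locality-based one is E91 (Ekert). As a toy illustration of one step of quantum key distribution, the sketch below simulates BB84 basis sifting classically; it is idealized (no noise, no eavesdropper) and is not drawn from the review.

        # Classical toy simulation of BB84 basis sifting (idealized: no noise, no eavesdropper).
        import numpy as np

        rng = np.random.default_rng(42)
        n = 32

        alice_bits = rng.integers(0, 2, n)      # raw key bits Alice encodes
        alice_bases = rng.integers(0, 2, n)     # 0 = rectilinear (+), 1 = diagonal (x)
        bob_bases = rng.integers(0, 2, n)       # Bob measures in randomly chosen bases

        # When bases match, Bob's outcome equals Alice's bit; otherwise it is a fair coin.
        bob_bits = np.where(alice_bases == bob_bases,
                            alice_bits,
                            rng.integers(0, 2, n))

        # Public discussion: keep only positions where the bases agreed (~half of them).
        keep = alice_bases == bob_bases
        sifted_alice, sifted_bob = alice_bits[keep], bob_bits[keep]

        assert np.array_equal(sifted_alice, sifted_bob)   # identical sifted keys in the noiseless case
        print(len(sifted_alice), "sifted bits out of", n)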

    Interrupting peptidoglycan deacetylation during Bdellovibrio predator-prey interaction prevents ultimate destruction of prey wall, liberating bacterial-ghosts

    The peptidoglycan wall, located in the periplasm between the inner and outer membranes of the cell envelope in Gram-negative bacteria, maintains cell shape and endows osmotic robustness. Predatory Bdellovibrio bacteria invade the periplasm of other bacterial prey cells, usually crossing the peptidoglycan layer, forming transient structures called bdelloplasts within which the predators replicate. Prey peptidoglycan remains intact for several hours, but is modified and then degraded by the escaping predators. Here we show that predation is altered by deleting two Bdellovibrio N-acetylglucosamine (GlcNAc) deacetylases, one of which we show to have a unique two-domain structure with a novel regulatory “plug”. Deleting the deacetylases limits peptidoglycan degradation, and rounded prey cell “ghosts” persist after mutant-predator exit. Mutant predators can, unusually, replicate in the periplasmic region between the peptidoglycan wall and the outer membrane rather than between the wall and the inner membrane, yet still obtain nutrients from the prey cytoplasm. Deleting two further genes encoding DacB/PBP4 family proteins, known to decrosslink and round prey peptidoglycan, results in a quadruple-mutant Bdellovibrio which leaves prey-shaped ghosts upon predation. The resultant bacterial ghosts contain cytoplasmic membrane within bacteria-shaped peptidoglycan surrounded by outer membrane material, which could have promise as “bacterial skeletons” for housing artificial chromosomes.

    Adaptation and Selective Information Transmission in the Cricket Auditory Neuron AN2

    Sensory systems adapt their neural code to changes in the sensory environment, often on multiple time scales. Here, we report a new form of adaptation in a first-order auditory interneuron (AN2) of crickets. We characterize the response of the AN2 neuron to amplitude-modulated sound stimuli and find that adaptation shifts the stimulus–response curves toward higher stimulus intensities, with a time constant of 1.5 s for adaptation and recovery. The spike responses were thus reduced for low-intensity sounds. We then address the question of whether adaptation leads to an improvement of the signal's representation and compare the experimental results with the predictions of two competing hypotheses: infomax, which predicts that information conveyed about the entire signal range should be maximized, and selective coding, which predicts that “foreground” signals should be enhanced while “background” signals should be selectively suppressed. We test how adaptation changes the input–response curve when presenting signals with two or three peaks in their amplitude distributions, for which selective coding and infomax predict conflicting changes. By means of Bayesian data analysis, we quantify the shifts of the measured response curves and also find a slight reduction of their slopes. These decreases in slope are smaller, and the absolute response thresholds are higher, than those predicted by infomax. Most remarkably, and in contrast to the infomax principle, adaptation actually reduces the amount of encoded information when considering the whole range of input signals. The response curve changes are also not consistent with the selective coding hypothesis, because the amount of information conveyed about the loudest part of the signal does not increase as predicted but remains nearly constant. Less information is transmitted about signals with lower intensity.
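
    To make the kind of comparison described above concrete, the sketch below illustrates (with made-up numbers, not the AN2 data or the paper's Bayesian analysis) how shifting a saturating response curve changes a plug-in estimate of the mutual information between a two-peaked intensity distribution and a noisy response.

        # Toy illustration: how shifting a sigmoid stimulus-response curve changes the
        # mutual information between a two-peaked stimulus distribution and a noisy response.
        # All numbers are made up; this is not the AN2 data or the paper's analysis.
        import numpy as np

        rng = np.random.default_rng(0)

        def mutual_information(x, y, bins=20):
            """Plug-in MI estimate (bits) from a 2D histogram of samples."""
            joint, _, _ = np.histogram2d(x, y, bins=bins)
            p = joint / joint.sum()
            px, py = p.sum(axis=1, keepdims=True), p.sum(axis=0, keepdims=True)
            nz = p > 0
            return (p[nz] * np.log2(p[nz] / (px @ py)[nz])).sum()

        # "Background" and "foreground" peaks of the sound-intensity distribution (dB-like units).
        stim = np.concatenate([rng.normal(50, 3, 50_000), rng.normal(70, 3, 50_000)])

        def response(stim, threshold, slope=5.0, noise=0.05):
            r = 1.0 / (1.0 + np.exp(-(stim - threshold) / slope))   # saturating response curve
            return r + rng.normal(0, noise, stim.shape)             # additive response noise

        for threshold in (55, 65, 75):   # adaptation modeled as a rightward threshold shift
            print(threshold, round(mutual_information(stim, response(stim, threshold)), 3))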

    Task-Specific Codes for Face Recognition: How they Shape the Neural Representation of Features for Detection and Individuation

    The variety of ways in which faces are categorized makes face recognition challenging for both synthetic and biological vision systems. Here we focus on two face processing tasks, detection and individuation, and explore whether differences in task demands lead to differences both in the features most effective for automatic recognition and in the featural codes recruited by neural processing. Our study appeals to a computational framework characterizing the features representing object categories as sets of overlapping image fragments. Within this framework, we assess the extent to which task-relevant information differs across image fragments. Based on objective differences we find among task-specific representations, we test the sensitivity of the human visual system to these different face descriptions independently of one another. Both behavior and functional magnetic resonance imaging reveal effects elicited by objective task-specific levels of information. Behaviorally, recognition performance with image fragments improves with increasing task-specific information carried by different face fragments. Neurally, this sensitivity to the two tasks manifests as differential localization of neural responses across the ventral visual pathway. Fragments diagnostic for detection evoke larger neural responses than non-diagnostic ones in the right posterior fusiform gyrus and bilaterally in the inferior occipital gyrus. In contrast, fragments diagnostic for individuation evoke larger responses than non-diagnostic ones in the anterior inferior temporal gyrus. Finally, for individuation only, pattern analysis reveals sensitivity to task-specific information within the right "fusiform face area". Our results demonstrate that: 1) information diagnostic for face detection and individuation is roughly separable; 2) the human visual system is independently sensitive to both types of information; and 3) neural responses differ according to the type of task-relevant information considered. More generally, these findings provide evidence for the computational utility and the neural validity of fragment-based visual representation and recognition.
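
    For readers unfamiliar with fragment-based representations, the sketch below shows the basic diagnosticity computation such frameworks rely on: ranking candidate fragments by the mutual information between a binary "fragment detected" indicator and the task label. The two detection indicators and the labels are made up, and this is a generic illustration rather than the study's pipeline.

        # Toy sketch of fragment diagnosticity scoring (informative-fragments style);
        # detection indicators and labels are fabricated for illustration only.
        import numpy as np

        def binary_mutual_information(detected, labels):
            """MI (bits) between a binary 'fragment present' indicator and a binary task label."""
            mi = 0.0
            for d in (0, 1):
                for c in (0, 1):
                    p_dc = np.mean((detected == d) & (labels == c))
                    p_d, p_c = np.mean(detected == d), np.mean(labels == c)
                    if p_dc > 0:
                        mi += p_dc * np.log2(p_dc / (p_d * p_c))
            return mi

        rng = np.random.default_rng(3)
        labels = rng.integers(0, 2, 1000)   # e.g. face vs. non-face (a detection-style label)

        # Hypothetical detection indicators for two candidate fragments:
        eye_like = np.where(rng.random(1000) < 0.9, labels, 1 - labels)   # tracks the label well
        texture  = rng.integers(0, 2, 1000)                               # uninformative
        for name, det in [("eye-like", eye_like), ("texture", texture)]:
            print(name, round(binary_mutual_information(det, labels), 3))  # diagnostic fragments score higher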