    Shannon Information and Kolmogorov Complexity

    We compare the elementary theories of Shannon information and Kolmogorov complexity, the extent to which they have a common purpose, and where they are fundamentally different. We discuss and relate the basic notions of both theories: Shannon entropy versus Kolmogorov complexity, the relation of both to universal coding, Shannon mutual information versus Kolmogorov (`algorithmic') mutual information, probabilistic sufficient statistic versus algorithmic sufficient statistic (related to lossy compression in the Shannon theory versus meaningful information in the Kolmogorov theory), and rate distortion theory versus Kolmogorov's structure function. Part of the material has appeared in print before, scattered through various publications, but this is the first comprehensive systematic comparison. The last-mentioned relations are new.
    Comment: Survey, LaTeX, 54 pages, 3 figures. Submitted to IEEE Trans Information Theory.
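
    As a minimal numerical sketch of the contrast the survey draws (not taken from the paper), the snippet below compares the empirical Shannon entropy of a string with the length of its zlib-compressed encoding, a standard computable stand-in for the uncomputable Kolmogorov complexity; the strings and parameters are illustrative only.

```python
import math
import random
import zlib
from collections import Counter

def shannon_entropy_bits(s: str) -> float:
    """Empirical Shannon entropy H = -sum_x p(x) log2 p(x), in bits per symbol."""
    counts = Counter(s)
    n = len(s)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def compressed_bits(s: str) -> int:
    """Length of a zlib-compressed encoding, in bits: a crude, computable
    upper-bound proxy for the (uncomputable) Kolmogorov complexity K(s)."""
    return 8 * len(zlib.compress(s.encode("utf-8"), 9))

random.seed(0)
regular = "01" * 5000                                        # highly structured string
noisy = "".join(random.choice("01") for _ in range(10000))   # pseudo-random string

for name, s in [("regular", regular), ("noisy", noisy)]:
    print(f"{name}: H = {shannon_entropy_bits(s):.3f} bits/symbol, "
          f"compressed size = {compressed_bits(s)} bits")
```

    Both strings have roughly 1 bit per symbol of empirical entropy, but the regular one compresses to a tiny fraction of the noisy one's size: per-symbol Shannon entropy of the observed frequencies misses the structure that a short description, in the Kolmogorov sense, can exploit.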

    Entropy, complexity, and spatial information

    We pose the central problem of defining a measure of complexity, specifically for spatial systems in general and for city systems in particular. The measures we adopt are based on Shannon's (in Bell Syst Tech J 27:379-423, 623-656, 1948) definition of information. We introduce this measure and argue that increasing information is equivalent to increasing complexity, and we show that for spatial distributions, this involves a trade-off between the density of the distribution and the number of events that characterize it; as cities get bigger and are characterized by more events (more places or locations), information increases, all other things being equal. But sometimes the distribution changes at a faster rate than the number of events, and thus information can decrease even if a city grows. We develop these ideas using various information measures. We first demonstrate their applicability to various distributions of population in London over the last 100 years, then to a wider region of London which is divided into bands of zones at increasing distances from the core, and finally to the evolution of the street system that characterizes the built-up area of London from 1786 to the present day. We conclude by arguing that we need to relate these measures to other measures of complexity, to choose a wider array of examples, and to extend the analysis to two-dimensional spatial systems.
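
    As a rough illustration of the density-versus-events trade-off described above (using made-up zone counts, not the London data), the sketch below computes the Shannon information of a population spread over zones:

```python
import math

def spatial_entropy(populations):
    """Shannon entropy H = -sum_i p_i log2 p_i over zone shares p_i."""
    total = sum(populations)
    shares = [p / total for p in populations if p > 0]
    return -sum(p * math.log2(p) for p in shares)

# Hypothetical zone populations (illustrative numbers only).
even_small   = [100, 100, 100, 100]                 # 4 zones, evenly spread
even_large   = [100] * 8                            # 8 zones, still evenly spread
concentrated = [730, 10, 10, 10, 10, 10, 10, 10]    # 8 zones, growth piled into the core

for name, zones in [("4 even zones", even_small),
                    ("8 even zones", even_large),
                    ("8 concentrated zones", concentrated)]:
    print(f"{name}: H = {spatial_entropy(zones):.3f} bits "
          f"(max possible {math.log2(len(zones)):.3f})")
```

    The second and third cases both have more zones (events) and more total population than the first, yet the entropy moves in opposite directions depending on how concentrated the distribution is; this is the sense in which information can fall even as a city grows.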

    Quantum Information Complexity and Amortized Communication

    We define a new notion of information cost for quantum protocols, and a corresponding notion of quantum information complexity for bipartite quantum channels, and then investigate the properties of such quantities. These are the fully quantum generalizations of the analogous quantities for bipartite classical functions that have found many applications recently, in particular for proving communication complexity lower bounds. Our definition is strongly tied to the quantum state redistribution task. Previous attempts have been made to define such a quantity for quantum protocols, with particular applications in mind; our notion differs from these in many respects. First, it directly provides a lower bound on the quantum communication cost, independent of the number of rounds of the underlying protocol. Second, we provide an operational interpretation for quantum information complexity: we show that it is exactly equal to the amortized quantum communication complexity of a bipartite channel on a given state. This generalizes a result of Braverman and Rao to quantum protocols, and even strengthens the classical result in a bounded-round scenario. Also, this provides an analogue of the Schumacher source compression theorem for interactive quantum protocols, and answers a question raised by Braverman. We also discuss some potential applications to quantum communication complexity lower bounds by specializing our definition for classical functions and inputs. Building on work of Jain, Radhakrishnan and Sen, we provide new evidence suggesting that the bounded-round quantum communication complexity of the disjointness function is \Omega(n/M + M) for M-message protocols. This would match the best known upper bound.
    Comment: v1, 38 pages, 1 figure.
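
    Written out in notation that is assumed here rather than taken from the paper, the two headline claims of the abstract read roughly as follows:

```latex
% Notation (QIC, QCC, DISJ_n, T, rho) is assumed here for illustration only.
% Operational interpretation stated in the abstract: the quantum information
% complexity of a bipartite channel T on input state rho equals its amortized
% quantum communication complexity.
\[
  \mathrm{QIC}(T,\rho) \;=\; \lim_{n\to\infty} \tfrac{1}{n}\,
  \mathrm{QCC}\bigl(T^{\otimes n}, \rho^{\otimes n}\bigr).
\]
% Suggested bounded-round lower bound for set disjointness with M-message protocols:
\[
  \mathrm{QCC}_{M}(\mathrm{DISJ}_n) \;=\; \Omega\!\Bigl(\frac{n}{M} + M\Bigr).
\]
```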

    Information theory, complexity and neural networks

    Some of the main results in the mathematical evaluation of neural networks as information processing systems are discussed. The basic operation of feedback and feed-forward neural networks is described. Their memory capacity and computing power are considered. The concept of learning by example as it applies to neural networks is examined.
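
    As a small illustration of the kind of feedback network whose memory capacity such analyses quantify, the sketch below stores random patterns in a Hopfield-style network with the textbook Hebbian rule and checks recall from a corrupted probe; the sizes and the storage rule are standard choices, not taken from the paper.

```python
import numpy as np

def hopfield_train(patterns):
    """Hebbian outer-product rule: W = (1/N) * sum_mu x_mu x_mu^T, zero diagonal."""
    n = patterns.shape[1]
    W = patterns.T @ patterns / n
    np.fill_diagonal(W, 0.0)
    return W

def hopfield_recall(W, state, steps=20):
    """Synchronous sign updates until a fixed point (or the step limit) is reached."""
    for _ in range(steps):
        new_state = np.sign(W @ state)
        new_state[new_state == 0] = 1
        if np.array_equal(new_state, state):
            break
        state = new_state
    return state

rng = np.random.default_rng(0)
n_units = 200
patterns = rng.choice([-1.0, 1.0], size=(10, n_units))   # 10 random patterns to store
W = hopfield_train(patterns)

# Corrupt 10% of one stored pattern and check that the network recovers it.
probe = patterns[0].copy()
flip = rng.choice(n_units, size=n_units // 10, replace=False)
probe[flip] *= -1
recalled = hopfield_recall(W, probe)
print(f"fraction of bits recovered: {float(np.mean(recalled == patterns[0])):.2f}")
```

    With 10 patterns in 200 units the load is well below the classical ~0.14N capacity estimate for such networks, so recall from a 10%-corrupted probe typically succeeds; storing many more patterns than that makes recall degrade sharply.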