    Kolmogorov Complexity of Categories

    Kolmogorov complexity theory measures the algorithmic information content of a string, defined as the length of the shortest program that describes the string. We present a programming language that can be used to describe categories, functors, and natural transformations. With this in hand, we define the informational content of these categorical structures as the length of the shortest program that describes them. Some basic consequences of our definition are presented, including the fact that equivalent categories have equal Kolmogorov complexity. We also prove several theorems about what can and cannot be described by our programming language. (16 pages)
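    For readers who want the underlying definition spelled out, the classical notion the abstract builds on is, in one standard formulation (with a fixed universal machine $U$; the paper's programming language plays the analogous role for categorical structures):

        K_U(x) = \min \{\, |p| : U(p) = x \,\}

    By the invariance theorem, changing the universal machine changes $K_U(x)$ by at most an additive constant independent of $x$.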

    Zipf's law and L. Levin's probability distributions

    Zipf's law in its basic incarnation is an empirical probability distribution governing the frequency of usage of words in a language. As Terence Tao recently remarked, it still lacks a convincing and satisfactory mathematical explanation. In this paper I suggest that, at least in certain situations, Zipf's law can be explained as a special case of the a priori distribution introduced and studied by L. Levin. The Zipf ranking by diminishing probability then appears as the ordering determined by growing Kolmogorov complexity. One argument justifying this assertion appeals to a recent interpretation by Yu. Manin and M. Marcolli of asymptotic bounds for error-correcting codes in terms of phase transitions. In the respective partition function, the Kolmogorov complexity of a code plays the role of its energy. This version contains minor corrections and additions. (19 pages)
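    To make the connection concrete, Levin's a priori (universal) probability and the coding theorem can be written, in standard notation (mine, not necessarily the paper's), as

        \mathbf{m}(x) = \sum_{p \,:\, U(p) = x} 2^{-|p|}, \qquad \mathbf{m}(x) = 2^{-K(x) + O(1)},

    so ordering objects by decreasing $\mathbf{m}$ agrees, up to an additive constant in the exponent, with ordering by increasing prefix complexity $K$; a Zipf-type ranking $f(r) \propto 1/r$ then corresponds roughly to $K(x_r) \approx \log_2 r$.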

    LT2C2: A language of thought with Turing-computable Kolmogorov complexity

    In this paper, we present a theoretical effort to connect the theory of program size to psychology by implementing a concrete language of thought with Turing-computable Kolmogorov complexity (LT2C2) satisfying the following requirements: 1) to be simple enough that the complexity of any given finite binary sequence can be computed; 2) to be based on tangible operations of human reasoning (printing, repeating, ...); 3) to be sufficiently powerful to generate all possible sequences, but not so powerful as to identify regularities that would be invisible to humans. We first formalize LT2C2, giving its syntax and semantics and defining an adequate notion of program size. Our setting leads to a Kolmogorov complexity function relative to LT2C2 which is computable in polynomial time, and it also induces a prediction algorithm in the spirit of Solomonoff's inductive inference theory. We then prove the efficacy of this language by investigating regularities in strings produced by participants attempting to generate random strings. Participants had a profound understanding of randomness and hence avoided typical misconceptions such as exaggerating the number of alternations. We reasoned that the remaining regularities would express the algorithmic nature of human thought, revealed in the form of specific patterns. Kolmogorov complexity relative to LT2C2 passed the three tests examined here: 1) human sequences were less complex than control PRNG sequences; 2) human sequences were not stationary, showing decreasing values of complexity resulting from fatigue; 3) each individual showed traces of algorithmic stability, since fits to partial data were more effective than average fits at predicting subsequent data. This work extends previous efforts to bring notions of Kolmogorov complexity theory and algorithmic information theory to psychology by explicitly proposing a language which may describe the patterns of human thought.
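    As a loose illustration of the flavor of such a language (not the paper's actual LT2C2, whose syntax, operations and size measure are specified in the text), here is a toy description-length upper bound for binary strings built from three invented operations: print, repeat and concatenation.

        # Toy sketch only: shortest-description length for binary strings in a
        # made-up mini-language with "print", "repeat" and concatenation. The
        # operation set and the unit costs are illustrative assumptions.
        from functools import lru_cache

        @lru_cache(maxsize=None)
        def description_length(s: str) -> int:
            """Upper bound on the shortest description of s in the toy language."""
            n = len(s)
            best = 1 + n  # fallback: one "print" opcode plus the literal symbols
            # repeat(block, k): describe the block once, plus a repeat count
            for blen in range(1, n // 2 + 1):
                if n % blen == 0 and s == s[:blen] * (n // blen):
                    k = n // blen
                    best = min(best, 1 + description_length(s[:blen]) + k.bit_length())
            # concatenation of two independently described pieces
            for cut in range(1, n):
                best = min(best,
                           description_length(s[:cut]) + description_length(s[cut:]))
            return best

        print(description_length("01" * 16))            # regular: short description
        print(description_length("0110100110010110"))   # less regular: longer one

    Under such a measure, human-produced "random" strings can then be compared against PRNG output, which is the shape of the paper's three tests.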

    The structure of a logarithmic advice class

    The complexity class Full-P/log, corresponding to a form of logarithmic advice for polynomial time, is studied. In order to understand the inner structure of this class, we characterize Full-P/log in terms of Turing reducibility to a special family of sparse sets. Other characterizations of Full-P/log, relating it to sets with small information content, were already known; these used tally sets whose words follow a given regular pattern, and tally sets that are regular in a resource-bounded Kolmogorov complexity sense. We obtain here relationships between the equivalence classes of the mentioned tally and sparse sets under various reducibilities, which provide new knowledge about the logarithmic advice class. Another way to measure the amount of information encoded in a language in a nonuniform class is to study the relative complexity of computing advice functions for this language. We prove bounds on the complexity of advice functions for Full-P/log and for other subclasses of it. As a consequence, Full-P/log is located in the Extended Low Hierarchy.
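    For orientation, logarithmic advice is usually defined as follows (a standard formulation; the paper's exact variant should be taken from the text itself):

        L \in \mathrm{P}/\log \;\iff\; \exists\, \text{poly-time } M,\; \exists\, h \text{ with } |h(n)| = O(\log n):\; \forall x \,\big( x \in L \iff M(x, h(|x|)) \text{ accepts} \big)

        L \in \text{Full-P}/\log \;\iff\; \text{as above, but the advice must also work for all shorter inputs:}\; \forall n\; \forall x, |x| \le n \,\big( x \in L \iff M(x, h(n)) \text{ accepts} \big)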

    The use of ideas of Information Theory for studying "language" and intelligence in ants

    In this review we integrate the results of a long-term experimental study of ant "language" and intelligence which was based throughout on fundamental ideas of Information Theory, such as Shannon entropy, Kolmogorov complexity, and Shannon's equation connecting the length of a message $l$ and its frequency $p$, i.e. $l = -\log p$, for rational communication systems. This approach, new to the study of biological communication systems, enabled us to obtain the following important results on ants' communication and intelligence: i) to reveal "distant homing" in ants, that is, their ability to transfer information about remote events; ii) to estimate the rate of information transmission; iii) to reveal that ants are able to grasp regularities and to use them for "compression" of information; iv) to reveal that ants are able to transfer to each other information about the number of objects; v) to discover that ants can add and subtract small numbers. The results obtained show that Information Theory is not only a wonderful mathematical theory, but that many of its results may be considered laws of Nature.
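    A minimal numerical sketch of the cited relation, with invented message frequencies (the experimentally measured frequencies are in the review itself):

        import math

        # Shannon's relation for rational communication systems: a message of
        # frequency p gets an ideal length of about -log2(p) bits. The
        # frequencies below are made up purely for illustration.
        frequencies = {"turn left": 0.5, "turn right": 0.25,
                       "go straight": 0.125, "go back": 0.125}

        for message, p in frequencies.items():
            print(f"{message:12s} p = {p:.3f}  ideal length = {-math.log2(p):.1f} bits")

        entropy = -sum(p * math.log2(p) for p in frequencies.values())
        print(f"entropy = {entropy:.2f} bits per message")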

    Kolmogorov Complexity in perspective. Part II: Classification, Information Processing and Duality

    We survey diverse approaches to the notion of information, from Shannon entropy to Kolmogorov complexity. Two of the main applications of Kolmogorov complexity are presented: randomness and classification. The survey is divided into two parts published in the same volume. Part II is dedicated to the relation between logic and information systems, within the scope of Kolmogorov's algorithmic information theory. We present a recent application of Kolmogorov complexity: classification using compression, an idea with provocative implementations by authors such as Bennett, Vitanyi and Cilibrasi. This stresses how Kolmogorov complexity, besides being a foundation of randomness, is also related to classification. Another approach to classification is also considered: the so-called "Google classification". It uses another original and attractive idea, connected both to classification using compression and, conceptually, to Kolmogorov complexity. We present and unify these different approaches to classification in terms of bottom-up versus top-down operational modes, pointing out their fundamental principles and the underlying duality. We look at the way these two dual modes are used in different approaches to information systems, particularly the relational model for databases introduced by Codd in the 1970s. This allows us to point out diverse forms of a fundamental duality. These operational modes are also reinterpreted in the context of the comprehension schema of the axiomatic set theory ZF. This leads us to develop how Kolmogorov complexity is linked to intensionality, abstraction, classification and information systems. (43 pages)
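    The compression-based classification mentioned above is commonly realized via the normalized compression distance of Cilibrasi and Vitanyi; here is a minimal sketch, with an off-the-shelf compressor standing in for the (uncomputable) Kolmogorov complexity:

        import zlib

        def c(b: bytes) -> int:
            # Compressed size in bytes: a crude, computable proxy for K(b).
            return len(zlib.compress(b, 9))

        def ncd(x: bytes, y: bytes) -> float:
            # Normalized Compression Distance: small when x and y share
            # structure, close to 1 when they are unrelated.
            cx, cy, cxy = c(x), c(y), c(x + y)
            return (cxy - min(cx, cy)) / max(cx, cy)

        a = b"the quick brown fox jumps over the lazy dog " * 20
        b = b"the quick brown fox leaps over the lazy cat " * 20
        r = bytes([(i * 97 + 13) % 256 for i in range(900)])  # unrelated noise
        print(ncd(a, b))  # relatively small: shared structure
        print(ncd(a, r))  # closer to 1: little shared structure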

    The Equivalence of Sampling and Searching

    In a sampling problem, we are given an input x and asked to sample approximately from a probability distribution D_x. In a search problem, we are given an input x and asked to find a member of a nonempty set A_x with high probability. (An example is finding a Nash equilibrium.) In this paper, we use tools from Kolmogorov complexity and algorithmic information theory to show that sampling and search problems are essentially equivalent. More precisely, for any sampling problem S, there exists a search problem R_S such that, if C is any "reasonable" complexity class, then R_S is in the search version of C if and only if S is in the sampling version. As one application, we show that SampP=SampBQP if and only if FBPP=FBQP: in other words, classical computers can efficiently sample the output distribution of every quantum circuit, if and only if they can efficiently solve every search problem that quantum computers can solve. A second application is that, assuming a plausible conjecture, there exists a search problem R that can be solved using a simple linear-optics experiment, but that cannot be solved efficiently by a classical computer unless the polynomial hierarchy collapses. That application will be described in a forthcoming paper with Alex Arkhipov on the computational complexity of linear optics. (16 pages)
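    In symbols, the central equivalence as the abstract states it (notation mine): for every sampling problem $S$ there is a search problem $R_S$ such that

        R_S \in \mathrm{F}\mathcal{C} \;\iff\; S \in \mathrm{Samp}\,\mathcal{C} \quad \text{for every ``reasonable'' class } \mathcal{C},

    of which $\mathrm{SampP} = \mathrm{SampBQP} \iff \mathrm{FBPP} = \mathrm{FBQP}$ is the instance highlighted above.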

    Unexpected Power of Random Strings
