
    An Introduction to Algorithmic Information Theory: Its History and Some Examples

    The goal of this paper is to provide a simple introduction to Algorithmic Information Theory (AIT) that highlights some of the main ideas without presenting too many details. More technical treatments of these ideas can be found in References [1]–[4], listed at the end of the paper. The main ideas of Algorithmic Information Theory are presented using English as the underlying programming language. The presentation illustrates that the same arguments can be expressed in any other reasonable language, and that the main results have a robust universality across all reasonable languages. This paper grew out of a short course on AIT that Gregory Chaitin presented in June 1994 at the University of Maine. I helped with the course and observed some of the topics that proved most difficult for students. Based on these observations, I presented a series of lectures at the 1995 Summer School on Algorithmic Information Theory held in Mangalia, Romania. The text of those lectures, and others from that workshop, can be found in the Journal of Universal Computer Science [8]. All the material presented here is based on the work of Gregory Chaitin.

    Approaching the Rate-Distortion Limit with Spatial Coupling, Belief propagation and Decimation

    We investigate an encoding scheme for lossy compression of a binary symmetric source based on simple spatially coupled Low-Density Generator-Matrix codes. The degree of the check nodes is regular, and that of the code bits is Poisson distributed with an average depending on the compression rate. The performance of a low-complexity Belief Propagation Guided Decimation algorithm is excellent: the algorithmic rate-distortion curve approaches the optimal curve of the ensemble as the width of the coupling window grows. Moreover, as the check degree grows, both curves approach the ultimate Shannon rate-distortion limit. The Belief Propagation Guided Decimation encoder is based on the posterior measure of a binary symmetric test channel. This measure can be interpreted as a random Gibbs measure at a "temperature" directly related to the noise level of the test channel. We investigate the links between the algorithmic performance of the Belief Propagation Guided Decimation encoder and the phase diagram of this Gibbs measure. The phase diagram is investigated by means of the cavity method of spin-glass theory, which predicts a number of phase-transition thresholds. In particular, the dynamical and condensation "phase-transition temperatures" (equivalently, test-channel noise thresholds) are computed. We observe that: (i) the dynamical temperature of the spatially coupled construction saturates towards the condensation temperature; (ii) for large degrees, the condensation temperature approaches the temperature (i.e., noise level) corresponding to the information-theoretic Shannon test-channel noise parameter of rate-distortion theory. This provides heuristic insight into the excellent performance of the Belief Propagation Guided Decimation algorithm. The paper also contains an introduction to the cavity method.
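    As a numerical anchor for the Shannon limit mentioned in this abstract: for a binary symmetric source with Hamming distortion, the rate-distortion function is R(D) = 1 - h2(D) for 0 ≤ D ≤ 1/2, where h2 is the binary entropy function. A minimal sketch of this formula (the function names and sample values are illustrative, not taken from the paper):

```python
import math

def h2(p):
    """Binary entropy function in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def rate_distortion_bss(D):
    """Shannon rate-distortion function R(D) = 1 - h2(D) of a binary
    symmetric source under Hamming distortion, for 0 <= D <= 1/2.
    Beyond D = 1/2 no bits are needed, so R(D) = 0 there."""
    if D >= 0.5:
        return 0.0
    return 1.0 - h2(D)

# Lossless limit: distortion 0 requires the full 1 bit per source bit.
print(rate_distortion_bss(0.0))   # 1.0
# Allowing 25% bit-flip distortion drops the required rate below 0.19.
print(rate_distortion_bss(0.25))
```

    Any encoder for this source, including the Belief Propagation Guided Decimation scheme studied in the paper, is judged by how closely its achieved (rate, distortion) pairs approach this curve.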

    An Algorithmic Approach to Information and Meaning

    I will survey some matters of relevance to a philosophical discussion of information, taking into account developments in algorithmic information theory (AIT). I will propose that meaning is deep in the sense of Bennett's logical depth, and that algorithmic probability may provide the stability needed for a robust algorithmic definition of meaning, one that takes into consideration the interpretation and the recipient's own knowledge encoded in the story attached to a message. (Comment: preprint; reviewed version closer to the version accepted by the journal.)

    Sobre la ecuación del calor discreta y la teoría de complejidad de Kolmogorov (On the discrete heat equation and Kolmogorov complexity theory)

    In this thesis we study the heat equation on graphs from the perspective of information theory. To this end, we introduce the discrete heat equation using the probabilistic approach of random walks on graphs. We then present a basic introduction to information theory from both a probabilistic and an algorithmic viewpoint. Here we define the concepts of Shannon entropy, Kolmogorov complexity, and mutual information, and we use codes to give an interpretation of them. As an application, we show how random walks on graphs allow us to gain information about different graph parameters. Moreover, we use the heat diffusion process on a graph as a computational mechanism to approximate the Fourier expansion of a function defined on a finite abelian group.
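    The random-walk view of the discrete heat equation described in this abstract can be illustrated numerically. The sketch below is an assumption-laden toy, not the thesis's construction: the graph (a 4-cycle), the use of a lazy walk, and the convergence check are all chosen for illustration. Heat placed at one vertex diffuses under repeated application of the walk matrix, and its Shannon entropy rises toward that of the stationary distribution.

```python
import numpy as np

# Toy undirected graph: a 4-cycle (illustrative assumption).
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)

deg = A.sum(axis=1)
W = A / deg[:, None]             # simple random-walk matrix D^{-1} A
P = 0.5 * np.eye(4) + 0.5 * W    # lazy walk: avoids oscillation on this bipartite graph

def shannon_entropy(p):
    """Shannon entropy in bits of a probability vector."""
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

# Discrete heat equation u_{t+1} = P u_t, all heat initially at vertex 0.
u = np.array([1.0, 0.0, 0.0, 0.0])
for _ in range(100):
    u = P @ u

# Heat spreads toward the stationary distribution, which is uniform here
# because every vertex has degree 2; entropy approaches log2(4) = 2 bits.
print(u, shannon_entropy(u))
```

    The lazy step 0.5·I + 0.5·W is a standard device: on a bipartite graph the plain walk alternates between the two vertex classes and never settles, while the lazy version converges.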