45 research outputs found

    Some Inequalities for the Relative Entropy and Applications

Some new inequalities for the relative entropy and applications are given.

    Kraft's number and ideal word packing

    N. M. Dragomir, S. S. Dragomir, C. E. M. Pearce and J. Sund

    Shannon Information and Kolmogorov Complexity

    We compare the elementary theories of Shannon information and Kolmogorov complexity, the extent to which they have a common purpose, and where they are fundamentally different. We discuss and relate the basic notions of both theories: Shannon entropy versus Kolmogorov complexity, the relation of both to universal coding, Shannon mutual information versus Kolmogorov ('algorithmic') mutual information, probabilistic sufficient statistic versus algorithmic sufficient statistic (related to lossy compression in the Shannon theory versus meaningful information in the Kolmogorov theory), and rate-distortion theory versus Kolmogorov's structure function. Part of the material has appeared in print before, scattered through various publications, but this is the first comprehensive systematic comparison. The last-mentioned relations are new. Comment: Survey, LaTeX, 54 pages, 3 figures. Submitted to IEEE Trans. Information Theory.
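The contrast in the abstract above can be made concrete in a few lines: Shannon entropy is a computable quantity of a probability distribution, while Kolmogorov complexity is uncomputable but can be bounded from above by any lossless compressor. A minimal sketch (the choice of `zlib` as the compressor and the example strings are illustrative assumptions, not from the survey):

```python
import math
import zlib

def shannon_entropy(probabilities):
    """Shannon entropy H(P) = -sum p * log2(p), in bits per symbol."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# Entropy depends only on the distribution, not on any particular outcome.
fair = shannon_entropy([0.5, 0.5])    # 1 bit per symbol
biased = shannon_entropy([0.9, 0.1])  # less than 1 bit per symbol

# Kolmogorov complexity of an individual string is uncomputable, but the
# length of any lossless compression is a computable upper bound on it
# (up to an additive constant depending on the compressor).
regular = b"01" * 500
compressed_len = len(zlib.compress(regular))  # much shorter than the input
```

A highly regular individual string compresses well regardless of what distribution it is imagined to come from, which is exactly the per-object (rather than per-distribution) viewpoint that Kolmogorov complexity formalizes.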

    Lower bounds on the redundancy in computations from random oracles via betting strategies with restricted wagers

    The Kučera–Gács theorem is a landmark result in algorithmic randomness asserting that every real is computable from a Martin-Löf random real. If the computation of the first n bits of a sequence requires n + h(n) bits of the random oracle, then h is the redundancy of the computation. Kučera implicitly achieved redundancy n log n, while Gács used a more elaborate coding procedure which achieves redundancy √n · log n. A similar bound is implicit in the later proof by Merkle and Mihailović. In this paper we obtain optimal strict lower bounds on the redundancy in computations from Martin-Löf random oracles. We show that any nondecreasing computable function g such that ∑_n 2^(−g(n)) = ∞ is not a general upper bound on the redundancy in computations from Martin-Löf random oracles. In fact, there exists a real X such that the redundancy g of any computation of X from a Martin-Löf random oracle satisfies ∑_n 2^(−g(n)) < ∞. Moreover, the class of such reals is comeager and includes a Δ⁰₂ real as well as all weakly 2-generic reals. On the other hand, it has been recently shown that any real is computable from a Martin-Löf random oracle with redundancy g, provided that g is a computable nondecreasing function such that ∑_n 2^(−g(n)) < ∞. Hence our lower bound is optimal, and excludes many slow-growing functions such as log n from bounding the redundancy in computations from random oracles for a large class of reals. Our results are obtained as an application of a theory of effective betting strategies with restricted wagers which we develop.
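The dividing line in the abstract above is the convergence of ∑_n 2^(−g(n)). A quick numerical sketch of why g(n) = log n falls on the divergent side while g(n) = 2 log n falls on the convergent side (the partial-sum helper and cutoffs here are illustrative, not from the paper):

```python
import math

def partial_sum(g, n_max):
    """Partial sum of sum_n 2^(-g(n)) for n = 1 .. n_max."""
    return sum(2.0 ** (-g(n)) for n in range(1, n_max + 1))

# g(n) = log2(n) gives 2^(-g(n)) = 1/n: the harmonic series, which diverges.
# By the paper's lower bound, such a g cannot bound the redundancy for a
# large (comeager) class of reals.
slow = partial_sum(lambda n: math.log2(n), 100_000)

# g(n) = 2*log2(n) gives 1/n^2, which converges (to pi^2/6), so by the
# matching upper bound such a g suffices as a redundancy bound for every real.
fast = partial_sum(lambda n: 2 * math.log2(n), 100_000)
```

The partial sums make the dichotomy visible: the first grows without bound (roughly like ln n), while the second stabilizes near π²/6 ≈ 1.645.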

    Algorithmic statistics: forty years later

    Algorithmic statistics has two different (and almost orthogonal) motivations. From the philosophical point of view, it tries to formalize how statistics works and why some statistical models are better than others. Once this notion of a "good model" is introduced, a natural question arises: is it possible that for some piece of data there is no good model? If so, how often do such bad ("non-stochastic") data appear "in real life"? Another, more technical motivation comes from algorithmic information theory. In this theory a notion of complexity of a finite object (= the amount of information in this object) is introduced; it assigns to every object a number, called its algorithmic complexity (or Kolmogorov complexity). Algorithmic statistics provides a more fine-grained classification: for each finite object a curve is defined that characterizes its behavior. It turns out that several different definitions give (approximately) the same curve. In this survey we try to provide an exposition of the main results in the field (including full proofs for the most important ones), as well as some historical comments. We assume that the reader is familiar with the main notions of algorithmic information (Kolmogorov complexity) theory. Comment: Missing proofs added.
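The "curve" mentioned in the abstract above arises from two-part descriptions: a string is described by naming a model (a finite set containing it) plus an index into that set, and the trade-off between model complexity and index length is what algorithmic statistics studies. A toy sketch for binary strings modeled by the set "n-bit strings with exactly k ones" (the crude model-cost estimate and the specific numbers are assumptions for illustration; real Kolmogorov complexity is uncomputable):

```python
import math

def two_part_length(n, k):
    """Toy two-part description length, in bits, for an n-bit string with
    k ones: the cost of naming the model (writing down n and k), plus a
    log2 |model| index identifying the string within the model class."""
    model_cost = 2 * math.log2(n + 1)        # crude cost of encoding n and k
    index_cost = math.log2(math.comb(n, k))  # log of the class size C(n, k)
    return model_cost + index_cost

# For a 'typical' string (k about n/2) the index term dominates: the string
# carries almost n bits of information and the simple model explains it well.
typical = two_part_length(1000, 500)

# For a highly regular string (k = 0, the all-zeros string) the model class
# is a singleton and the index cost vanishes entirely.
regular = two_part_length(1000, 0)
```

Varying the model class and plotting the best achievable total length against the allowed model complexity traces out (a toy analogue of) the curve the survey associates with each finite object.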

    Kolmogorov complexity

    In this dissertation, new results on Kolmogorov complexity are discussed. Its first part focuses on the study of Kolmogorov complexity without time bounds. Here we are concerned with the concept of non-monotonic randomness, i.e., randomness characterized by martingales that may bet in non-monotonic order. In this context we introduce a number of randomness classes and then separate them from one another. We also present a systematic overview of various notions of traceability and characterize them via (auto-)complexity notions. Traceabilities are a family of notions expressing that a set is almost computable. The second part of this document deals with time-bounded Kolmogorov complexity. First we investigate the difference between two ways of describing a word: the complexity of describing it precisely enough that it can be distinguished from other words, and the complexity of describing it precisely enough that the word can actually be generated from the description. This distinction is irrelevant for time-unbounded Kolmogorov complexity; as soon as we introduce time bounds, however, it becomes essential. Next we introduce the notion of depth and prove a dichotomy result about it that is structurally reminiscent of Kummer's well-known gap theorem. Finally, we consider the important notion of Solovay functions. These are computable upper bounds of Kolmogorov complexity that are infinitely often tight. We use them to characterize Martin-Löf randomness in a certain context, and to give a characterization of jump traceability.