
    Average-Case Hardness of Proving Tautologies and Theorems

    We consolidate two widely believed conjectures about tautologies (no optimal proof system exists, and most tautologies require superpolynomial-size proofs in any system) into a p-isomorphism-invariant condition satisfied by all paddable coNP-complete languages or by none. The condition is: for any Turing machine (TM) M accepting the language, P-uniform input families on which M requires superpolynomial time exist (equivalent to the first conjecture) and appear with positive upper density in an enumeration of input families (which implies the second). In that case, no such language is easy on average (in AvgP) for any distribution assigning non-negligible weight to the hard families. The hardness of proving tautologies and of proving theorems is likely related. Motivated by the fact that arithmetic sentences encoding "string x is Kolmogorov random" are true but unprovable with positive density in a finitely axiomatized theory T (Calude and JĂŒrgensen), we conjecture that any propositional proof system requires superpolynomial-size proofs for a dense set of P-uniform families of tautologies encoding "there is no T-proof of size ≀ t showing that string x is Kolmogorov random". This implies the above condition. The conjecture suggests that there is no optimal proof system because undecidable theories help prove tautologies and do so more efficiently as axioms are added, and that constructing hard tautologies seems difficult because it is impossible to construct Kolmogorov random strings. Similar conjectures that computational blind spots are manifestations of noncomputability would resolve other open problems.
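
    The phrase "positive upper density" in the condition above can be made precise with the standard notion of upper density. As a sketch in notation not taken from the abstract, write F_1, F_2, ... for the assumed enumeration of P-uniform input families and H for the set of indices of families on which M needs superpolynomial time; the requirement is

        \overline{d}(H) \;=\; \limsup_{n \to \infty} \frac{|\{\, i \le n : F_i \in H \,\}|}{n} \;>\; 0.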

    Information Distance: New Developments

    In pattern recognition, learning, and data mining one obtains information from information-carrying objects. This involves an objective definition of the information in a single object, the information to go from one object to another object in a pair of objects, the information to go from one object to any other object in a multiple of objects, and the shared information between objects. This is called "information distance." We survey a selection of new developments in information distance. Comment: 4 pages, LaTeX; Series of Publications C, Report C-2011-45, Department of Computer Science, University of Helsinki, pp. 71-7
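
    For context (this definition is standard in the area rather than quoted from the abstract): the information distance between two strings x and y is the length of a shortest program that converts either string into the other, which up to a logarithmic additive term equals

        E(x, y) = \max\{\, K(x \mid y),\ K(y \mid x) \,\}

    where K denotes (conditional) Kolmogorov complexity.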

    Is Consciousness Computable? Quantifying Integrated Information Using Algorithmic Information Theory

    In this article we review Tononi's (2008) theory of consciousness as integrated information. We argue that previous formalizations of integrated information (e.g. Griffith, 2014) depend on information loss. Since lossy integration would necessitate continuous damage to existing memories, we propose it is more natural to frame consciousness as a lossless integrative process and provide a formalization of this idea using algorithmic information theory. We prove that complete lossless integration requires noncomputable functions. This result implies that if unitary consciousness exists, it cannot be modelled computationally. Comment: Maguire, P., Moser, P., Maguire, R. & Griffith, V. (2014). Is consciousness computable? Quantifying integrated information using algorithmic information theory. In P. Bello, M. Guarini, M. McShane, & B. Scassellati (Eds.), Proceedings of the 36th Annual Conference of the Cognitive Science Society. Austin, TX: Cognitive Science Society.

    Noncomputability, Unpredictability, Undecidability, and Unsolvability in Economic and Finance Theories

    We outline, briefly, the role that issues of the nexus between noncomputability and unpredictability, on the one hand, and between undecidability and unsolvability, on the other hand, have played in Computable Economics (CE). The mathematical underpinnings of CE are provided by (classical) recursion theory, varieties of computable and constructive analysis, and aspects of combinatorial optimization. The inspiration for this outline was provided by Professor Graça's thought-provoking recent article.

    Lower bounds on the redundancy in computations from random oracles via betting strategies with restricted wagers

    The Kučera–Gács theorem is a landmark result in algorithmic randomness asserting that every real is computable from a Martin-Löf random real. If the computation of the first n bits of a sequence requires n + h(n) bits of the random oracle, then h is the redundancy of the computation. Kučera implicitly achieved redundancy n log n, while Gács used a more elaborate coding procedure which achieves redundancy √n · log n. A similar bound is implicit in the later proof by Merkle and Mihailović. In this paper we obtain optimal strict lower bounds on the redundancy in computations from Martin-Löf random oracles. We show that any nondecreasing computable function g such that ∑_n 2^{−g(n)} = ∞ is not a general upper bound on the redundancy in computations from Martin-Löf random oracles. In fact, there exists a real X such that the redundancy g of any computation of X from a Martin-Löf random oracle satisfies ∑_n 2^{−g(n)} < ∞. Moreover, the class of such reals is comeager and includes a Δ^0_2 real as well as all weakly 2-generic reals. On the other hand, it has been recently shown that any real is computable from a Martin-Löf random oracle with redundancy g, provided that g is a computable nondecreasing function such that ∑_n 2^{−g(n)} < ∞. Hence our lower bound is optimal, and excludes many slow-growing functions such as log n from bounding the redundancy in computations from random oracles for a large class of reals. Our results are obtained as an application of a theory of effective betting strategies with restricted wagers which we develop.
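
    Read together with the cited upper bound, the abstract describes a dichotomy for computable nondecreasing redundancy bounds g; the schematic restatement below is a paraphrase rather than the paper's wording:

        \sum_{n} 2^{-g(n)} < \infty \ \Rightarrow\ \text{every real is computable from a Martin-Löf random oracle with redundancy } g,
        \sum_{n} 2^{-g(n)} = \infty \ \Rightarrow\ \text{some real admits no computation from a Martin-Löf random oracle with redundancy } g.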

    Shannon Information and Kolmogorov Complexity

    We compare the elementary theories of Shannon information and Kolmogorov complexity, the extent to which they have a common purpose, and where they are fundamentally different. We discuss and relate the basic notions of both theories: Shannon entropy versus Kolmogorov complexity, the relation of both to universal coding, Shannon mutual information versus Kolmogorov (`algorithmic') mutual information, probabilistic sufficient statistic versus algorithmic sufficient statistic (related to lossy compression in the Shannon theory versus meaningful information in the Kolmogorov theory), and rate distortion theory versus Kolmogorov's structure function. Part of the material has appeared in print before, scattered through various publications, but this is the first comprehensive systematic comparison. The last mentioned relations are new. Comment: Survey, LaTeX, 54 pages, 3 figures. Submitted to IEEE Trans. Information Theory
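
    The comparison at the heart of this survey can be made concrete with a toy experiment (my illustration, not from the paper): the empirical Shannon entropy of a string's byte frequencies versus its compressed length, the latter being a machine-dependent upper bound on Kolmogorov complexity, which is itself noncomputable.

        import math
        import random
        import zlib
        from collections import Counter

        def empirical_entropy_bits(s: bytes) -> float:
            """Total bits under an i.i.d. model of the observed byte frequencies (Shannon side)."""
            counts = Counter(s)
            n = len(s)
            return -sum(c * math.log2(c / n) for c in counts.values())

        def compressed_length_bits(s: bytes) -> int:
            """Compressed size in bits: an upper bound on Kolmogorov complexity, up to a constant."""
            return 8 * len(zlib.compress(s, 9))

        random.seed(0)
        structured = bytes(range(256)) * 40                              # highly regular input
        noisy = bytes(random.getrandbits(8) for _ in range(256 * 40))    # random-looking input
        for name, s in [("structured", structured), ("random", noisy)]:
            print(name, round(empirical_entropy_bits(s)), compressed_length_bits(s))

    Both inputs have essentially maximal byte-frequency entropy, but only the structured one compresses to a small fraction of its length, which is exactly the kind of divergence between the probabilistic and the algorithmic notion that the survey examines.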

    Noncomputability Arising In Dynamical Triangulation Model Of Four-Dimensional Quantum Gravity

    Computations in Dynamical Triangulation Models of Four-Dimensional Quantum Gravity involve weighted averaging over sets of all distinct triangulations of compact four-dimensional manifolds. In order to be able to perform such computations one needs an algorithm which, for any given N and any given compact four-dimensional manifold M, constructs all possible triangulations of M with ≀ N simplices. Our first result is that such an algorithm does not exist. Then we discuss recursion-theoretic limitations of any algorithm designed to perform approximate calculations of sums over all possible triangulations of a compact four-dimensional manifold. Comment: 8 pages, LaTeX, PUPT-132

    The similarity metric

    A new class of distances appropriate for measuring similarity relations between sequences, say one type of similarity per distance, is studied. We propose a new "normalized information distance", based on the noncomputable notion of Kolmogorov complexity, and show that it is in this class and that it minorizes every computable distance in the class (that is, it is universal in that it discovers all computable similarities). We demonstrate that it is a metric and call it the similarity metric. This theory forms the foundation for a new practical tool. To demonstrate generality and robustness we give two distinctive applications in widely divergent areas using standard compression programs like gzip and GenCompress. First, we compare whole mitochondrial genomes and infer their evolutionary history. This results in the first completely automatically computed whole mitochondrial phylogeny tree. Secondly, we fully automatically compute the language tree of 52 different languages. Comment: 13 pages, LaTeX, 5 figures. Part of this work appeared in Proc. 14th ACM-SIAM Symp. Discrete Algorithms, 2003. This is the final, corrected version, to appear in IEEE Trans. Inform. Theory
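
    As a rough illustration of how such compressor-based comparisons work in practice (a sketch with my own naming, not the paper's code; the paper's experiments use gzip and GenCompress), the normalized information distance is commonly approximated by the normalized compression distance NCD(x, y) = (C(xy) − min{C(x), C(y)}) / max{C(x), C(y)}, where C is the compressed length under a real compressor:

        import gzip
        import random

        def clen(data: bytes) -> int:
            """Compressed length in bytes; a stand-in for the noncomputable Kolmogorov complexity."""
            return len(gzip.compress(data, compresslevel=9))

        def ncd(x: bytes, y: bytes) -> float:
            """Normalized compression distance: near 0 for similar inputs, near 1 for unrelated ones."""
            cx, cy, cxy = clen(x), clen(y), clen(x + y)
            return (cxy - min(cx, cy)) / max(cx, cy)

        random.seed(1)
        s1 = b"the quick brown fox jumps over the lazy dog " * 50
        s2 = b"the quick brown fox leaps over the lazy dog " * 50
        s3 = bytes(random.getrandbits(8) for _ in range(len(s1)))  # unrelated, random-looking input
        print(round(ncd(s1, s2), 3), round(ncd(s1, s3), 3))  # the first value should be clearly smaller

    The same pattern, with a stronger compressor in place of gzip, underlies the genome and language-tree experiments described in the abstract.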