
    Information complexity is computable

    The information complexity of a function f is the minimum amount of information Alice and Bob need to exchange to compute the function f. In this paper we provide an algorithm for approximating the information complexity of an arbitrary function f to within any additive error α > 0, thus resolving an open question as to whether information complexity is computable. In the process, we give the first explicit upper bound on the rate of convergence of the information complexity of f when restricted to b-bit protocols to the (unrestricted) information complexity of f. Comment: 30 pages
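
    For orientation, one standard way to formalize the quantity in question (stated here as a hedged sketch, not quoted from the paper): for a fixed distribution μ on the inputs (X, Y) held by Alice and Bob, the information complexity of f is the least amount of information the protocol transcript Π reveals to each party about the other's input, over all protocols that compute f:

        IC_\mu(f) = \inf_{\pi\ \text{computes}\ f} \big[ I(X;\Pi \mid Y) + I(Y;\Pi \mid X) \big]

    The b-bit restriction in the abstract corresponds to taking this infimum only over protocols communicating at most b bits, and the convergence result bounds how quickly that restricted quantity approaches the unrestricted information complexity as b grows.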

    Parameterized Uniform Complexity in Numerics: from Smooth to Analytic, from NP-hard to Polytime

    The synthesis of classical Computational Complexity Theory with Recursive Analysis provides a quantitative foundation for reliable numerics. Here the operators of maximization, integration, and solving ordinary differential equations are known to map (even high-order differentiable) polynomial-time computable functions to instances that are 'hard' for the classical complexity classes NP, #P, and CH; but, restricted to analytic functions, they map polynomial-time computable ones to polynomial-time computable ones -- non-uniformly! We investigate the uniform parameterized complexity of the above operators in the setting of Weihrauch's TTE and its second-order extension due to Kawamura & Cook (2010). That is, we explore which (both continuous and discrete, first- and second-order) information and parameters on some given f are sufficient to obtain similar data on Max(f) and int(f), and within what running time, in terms of these parameters and the guaranteed output precision 2^(-n). It turns out that Gevrey's hierarchy of functions, climbing from analytic to smooth, corresponds to the computational complexity of maximization growing from polytime to NP-hard. Proof techniques involve mainly the Theory of (discrete) Computation, Hard Analysis, and Information-Based Complexity.
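
    As a hedged illustration of the operators mentioned above (the standard formulations in this line of work, not quoted from the paper), for a function f : [0,1] → R one typically considers

        \mathrm{Max}(f)(x) = \max_{0 \le t \le x} f(t), \qquad \mathrm{int}(f)(x) = \int_0^x f(t)\,dt

    and a Gevrey level ℓ ≥ 1 constrains the derivatives roughly as |f^{(k)}(x)| ≤ C·A^k·(k!)^ℓ: level ℓ = 1 recovers the analytic case, and increasing ℓ interpolates towards merely smooth functions, which is the axis along which the complexity of maximization is said to grow from polytime to NP-hard.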

    Finite-State Complexity and the Size of Transducers

    Finite-state complexity is a variant of algorithmic information theory obtained by replacing Turing machines with finite transducers. We consider the state-size of transducers needed for minimal descriptions of arbitrary strings and, as our main result, we show that the state-size hierarchy with respect to a standard encoding is infinite. We also consider hierarchies yielded by more general computable encodings. Comment: In Proceedings DCFS 2010, arXiv:1008.127
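
    A hedged sketch of the underlying measure (in the style of the Calude–Salomaa–Roblot definition; the notation is illustrative, not quoted from the paper): fix a computable encoding S assigning code words σ to finite transducers T^S_σ; the finite-state complexity of a string x is then the length of a shortest two-part description, transducer plus input, producing x:

        C_S(x) = \min \{\, |\sigma| + |p| : T^S_\sigma(p) = x \,\}

    The state-size question considered above asks how many states the transducers realizing such minimal descriptions must have, and the main result says that this hierarchy does not collapse at any finite level.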

    The "paradox" of computability and a recursive relative version of the Busy Beaver function

    In this article, we show that uncomputability is a relative property not only of oracle Turing machines, but also of subrecursive classes. We define the concept of a Turing submachine, and a recursive relative version of the Busy Beaver function, which we call the Busy Beaver Plus function. We then prove that the computable Busy Beaver Plus function defined on any Turing submachine is not computable by any program running on this submachine. We thereby demonstrate the existence of a "paradox" of computability à la Skolem: a function is computable when "seen from outside" the subsystem, but uncomputable when "seen from within" the same subsystem. Finally, we raise the possibility of defining universal submachines and a hierarchy of negative Turing degrees. Comment: 10 pages. 0 figures. Supported by the National Council for Scientific and Technological Development (CNPq), Brazil. Book chapter published in Information and Complexity, Mark Burgin and Cristian S. Calude (Editors), World Scientific Publishing, 2016, ISBN 978-981-3109-02-5, available at http://www.worldscientific.com/worldscibooks/10.1142/10017. arXiv admin note: substantial text overlap with arXiv:1612.0522
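
    For context, the classical Busy Beaver function can be written, in one common formulation, as

        BB(n) = \max \{\, \mathrm{steps}(M) : M \text{ an } n\text{-state Turing machine that halts on blank input} \,\}

    and it dominates every computable function, which is what makes it uncomputable. The "Plus" variant described above relativizes this idea to a Turing submachine: roughly, the maximum is taken only over the submachine's own programs, so the resulting function is computable from outside the submachine yet not computable by any program running inside it. This paraphrase is an interpretation of the abstract, not a verbatim definition from the paper.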

    Effective complexity of stationary process realizations

    The concept of the effective complexity of an object as the minimal description length of its regularities was initiated by Gell-Mann and Lloyd. The regularities are modeled by means of ensembles, that is, probability distributions on finite binary strings. In a previous paper we proposed a definition of effective complexity in precise terms of algorithmic information theory. Here we investigate the effective complexity of binary strings generated by stationary, in general not computable, processes. We show that, under not too strong conditions, long typical process realizations are effectively simple. Our results become most transparent in the context of coarse effective complexity, which is a modification of the original notion of effective complexity that uses fewer parameters in its definition. A similar modification of the related concept of sophistication has been suggested by Antunes and Fortnow. Comment: 14 pages, no figures
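
    A hedged sketch of the notion under study (in the spirit of the precise Gell-Mann–Lloyd-style definition used in this line of work; the symbol and the tolerance parameter are illustrative): the effective complexity of a string x is the shortest description of an ensemble E for which x is a typical member and whose total information K(E) + H(E) exceeds K(x) by at most a tolerance Δ,

        EC_\Delta(x) = \min \{\, K(E) : x \text{ typical for } E,\ K(E) + H(E) \le K(x) + \Delta \,\}

    where K is prefix Kolmogorov complexity and H is the Shannon entropy of the ensemble. The "coarse" variant mentioned above is the modification that gets by with fewer such parameters, and it is in that setting that the results on stationary process realizations are stated most transparently.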

    The similarity metric

    A new class of distances appropriate for measuring similarity relations between sequences, say one type of similarity per distance, is studied. We propose a new "normalized information distance", based on the noncomputable notion of Kolmogorov complexity, and show that it is in this class and that it minorizes every computable distance in the class (that is, it is universal in that it discovers all computable similarities). We demonstrate that it is a metric and call it the similarity metric. This theory forms the foundation for a new practical tool. To illustrate generality and robustness, we give two distinctive applications in widely divergent areas using standard compression programs like gzip and GenCompress. First, we compare whole mitochondrial genomes and infer their evolutionary history. This yields the first completely automatically computed whole mitochondrial phylogeny tree. Second, we fully automatically compute the language tree of 52 different languages. Comment: 13 pages, LaTeX, 5 figures. Part of this work appeared in Proc. 14th ACM-SIAM Symp. Discrete Algorithms, 2003. This is the final, corrected, version to appear in IEEE Trans Inform. T
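
    As an illustration of how such compression-based comparisons are carried out in practice, below is a minimal sketch of the normalized compression distance, with Python's gzip module standing in for the compressor; the helper names and the exact formula variant are illustrative assumptions, not code from the paper.

        import gzip

        def compressed_len(data: bytes) -> int:
            # Length of the gzip output serves as a computable stand-in for
            # the (noncomputable) Kolmogorov complexity used in the theory.
            return len(gzip.compress(data, compresslevel=9))

        def ncd(x: bytes, y: bytes) -> float:
            # Normalized compression distance: a practical approximation of the
            # normalized information distance described in the abstract.
            cx, cy, cxy = compressed_len(x), compressed_len(y), compressed_len(x + y)
            return (cxy - min(cx, cy)) / max(cx, cy)

        if __name__ == "__main__":
            a = b"ACGTACGT" * 100
            b = b"ACGTACGT" * 95 + b"TTTTTTTT" * 5
            print(f"NCD(a, b) = {ncd(a, b):.3f}")  # related strings: closer to 0
            print(f"NCD(a, a) = {ncd(a, a):.3f}")  # identical strings: near 0, not exactly 0

    Smaller values indicate more shared structure; identical inputs do not reach exactly 0 because real compressors carry overhead, which is precisely the gap between practical compressors and the idealized Kolmogorov-complexity formulation the paper analyzes.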