
    Algorithmic statistics revisited

    The mission of statistics is to provide adequate statistical hypotheses (models) for observed data. But what is an "adequate" model? To answer this question, one needs the notions of algorithmic information theory. It turns out that for every data string x one can naturally define a "stochasticity profile", a curve that represents the trade-off between the complexity of a model and its adequacy. This curve has four equivalent definitions in terms of (1) randomness deficiency, (2) minimal description length, (3) position in the lists of simple strings, and (4) Kolmogorov complexity with decompression time bounded by the busy beaver function. We present a survey of the corresponding definitions and results relating them to each other.
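
    As a sketch in LaTeX of the first of these definitions, using the standard algorithmic-statistics notation (C for plain Kolmogorov complexity; the abstract itself fixes no symbols): a model is a finite set A of strings containing x, its adequacy is measured by the randomness deficiency of x in A, and the profile records the best deficiency achievable at each model-complexity level.

        % randomness deficiency of x in a finite model A (a finite set of strings with x \in A)
        d(x \mid A) = \log_2 |A| - C(x \mid A)
        % stochasticity profile: best deficiency over models of complexity at most \alpha
        h_x(\alpha) = \min \{\, d(x \mid A) : x \in A,\ C(A) \le \alpha \,\}

    A string is commonly called stochastic when h_x(\alpha) is already small for small \alpha; by the equivalence results surveyed in the paper, the other three definitions trace out essentially the same curve.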

    Algorithmic Information, Plane Kakeya Sets, and Conditional Dimension

    We formulate the conditional Kolmogorov complexity of x given y at precision r, where x and y are points in Euclidean spaces and r is a natural number. We demonstrate the utility of this notion in two ways.
    1. We prove a point-to-set principle that enables one to use the (relativized, constructive) dimension of a single point in a set E in a Euclidean space to establish a lower bound on the (classical) Hausdorff dimension of E. We then use this principle, together with conditional Kolmogorov complexity in Euclidean spaces, to give a new proof of the known, two-dimensional case of the Kakeya conjecture. This theorem of geometric measure theory, proved by Davies in 1971, says that every plane set containing a unit line segment in every direction has Hausdorff dimension 2.
    2. We use conditional Kolmogorov complexity in Euclidean spaces to develop the lower and upper conditional dimensions dim(x|y) and Dim(x|y) of x given y, where x and y are points in Euclidean spaces. Intuitively, these are the lower and upper asymptotic algorithmic information densities of x conditioned on the information in y. We prove that these conditional dimensions are robust and that they have the correct information-theoretic relationships with the well-studied dimensions dim(x) and Dim(x) and the mutual dimensions mdim(x:y) and Mdim(x:y).
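
    A sketch in LaTeX of the two conditional dimensions, assuming the usual notation of this line of work, where K_r(x|y) denotes the conditional Kolmogorov complexity of x given y at precision r (the abstract defines the notion but fixes no symbols):

        % lower and upper conditional dimensions of a point x given a point y,
        % as normalized limits of information density at precision r
        \dim(x \mid y) = \liminf_{r \to \infty} \frac{K_r(x \mid y)}{r}
        \qquad
        \mathrm{Dim}(x \mid y) = \limsup_{r \to \infty} \frac{K_r(x \mid y)}{r}

    Read this way, dim(x|y) and Dim(x|y) are exactly the lower and upper asymptotic densities of algorithmic information in x relative to y, matching the intuition stated above; the unconditional dim(x) and Dim(x) arise by dropping the condition y.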

    An Operational Characterization of Mutual Information in Algorithmic Information Theory

    We show that the mutual information, in the sense of Kolmogorov complexity, of any pair of strings x and y is equal, up to logarithmic precision, to the length of the longest shared secret key that two parties, one having x and the complexity profile of the pair and the other having y and the complexity profile of the pair, can establish via a probabilistic protocol with interaction on a public channel. For l > 2, the longest shared secret that can be established from a tuple of strings (x_1, ..., x_l) by l parties, each one having one component of the tuple and the complexity profile of the tuple, is equal, up to logarithmic precision, to the complexity of the tuple minus the minimum communication necessary for distributing the tuple to all parties. We establish the communication complexity of secret key agreement protocols that produce a secret key of maximal length, for protocols with public randomness. We also show that if the communication complexity drops below the established threshold, then only very short secret keys can be obtained.
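
    A sketch in LaTeX of the quantities involved, stated up to the O(log n) additive precision the abstract allows for n-bit strings; the symmetric expression for mutual information is the standard one, while the names keylen and CO (for the minimum communication needed to distribute the tuple to all parties) are assumed notation, not taken from the abstract:

        % algorithmic mutual information of two strings (C = Kolmogorov complexity)
        I(x : y) = C(x) + C(y) - C(x, y)
        % two-party result: longest shared secret key agreed over a public channel,
        % with both equalities below holding up to O(\log n) additive terms
        \text{keylen}(x, y) = I(x : y)
        % l-party result: complexity of the tuple minus the minimum communication
        % needed to distribute the whole tuple to all l parties (written CO here)
        \text{keylen}(x_1, \dots, x_l) = C(x_1, \dots, x_l) - \mathrm{CO}(x_1, \dots, x_l)

    The paper's communication-complexity results then concern how many bits of public discussion are needed to actually achieve these maximal key lengths, and show that below the established threshold only very short keys are possible.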