The Dimensions of Individual Strings and Sequences
A constructive version of Hausdorff dimension is developed using constructive
supergales, which are betting strategies that generalize the constructive
supermartingales used in the theory of individual random sequences. This
constructive dimension is used to assign every individual (infinite, binary)
sequence S a dimension, which is a real number dim(S) in the interval [0,1].
Sequences that are random (in the sense of Martin-L\"of) have dimension 1, while
sequences that are decidable, \Sigma^0_1, or \Pi^0_1 have dimension 0. It is
shown that for every \Delta^0_2-computable real number \alpha in [0,1] there is
a \Delta^0_2 sequence S such that \dim(S) = \alpha.
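As a minimal sketch of the machinery behind this (standard in Lutz's formulation; the function name d and the success criterion below are notation not spelled out in this abstract): for s \geq 0, an s-supergale is a function d : \{0,1\}^* \to [0,\infty) satisfying
\[ d(w) \geq 2^{-s}\,\bigl[\, d(w0) + d(w1) \,\bigr] \]
for every string w, and dim(S) is the infimum of those s for which some constructive s-supergale attains unbounded values along the prefixes of S.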
A discrete version of constructive dimension is also developed using
termgales, which are supergale-like functions that bet on the terminations of
(finite, binary) strings as well as on their successive bits. This discrete
dimension is used to assign each individual string w a dimension, which is a
nonnegative real number dim(w). The dimension of a sequence is shown to be the
limit infimum of the dimensions of its prefixes.
The Kolmogorov complexity of a string is proven to be the product of its
length and its dimension. This gives a new characterization of algorithmic
information and a new proof of Mayordomo's recent theorem stating that the
dimension of a sequence is the limit infimum of the average Kolmogorov
complexity of its first n bits.
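In symbols (writing S \upharpoonright n for the first n bits of S and K for Kolmogorov complexity; this notation is ours, not the listing's), the two limit-infimum characterizations just described read
\[ \dim(S) = \liminf_{n\to\infty} \dim(S \upharpoonright n) = \liminf_{n\to\infty} \frac{K(S \upharpoonright n)}{n}. \]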
Every sequence that is random relative to any computable sequence of
coin-toss biases that converge to a real number \beta in (0,1) is shown to have
dimension \mathcal{H}(\beta), the binary entropy of \beta.
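For reference, the binary entropy appearing here is the Shannon entropy of a coin with bias \beta:
\[ \mathcal{H}(\beta) = \beta \log_2 \frac{1}{\beta} + (1-\beta) \log_2 \frac{1}{1-\beta}. \]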
Mutual Dimension
We define the lower and upper mutual dimensions mdim(x:y) and Mdim(x:y)
between any two points x and y in Euclidean space. Intuitively, these are
the lower and upper densities of the algorithmic information shared by x and
y. We show that these quantities satisfy the main desiderata for a
satisfactory measure of mutual algorithmic information. Our main theorem, the
data processing inequality for mutual dimension, says that if f is a
computable Lipschitz function between Euclidean spaces, then the inequalities
mdim(f(x):y) \leq mdim(x:y) and Mdim(f(x):y) \leq Mdim(x:y) hold for all
points x and y. We use this inequality, and related inequalities that we
prove in like fashion, to establish conditions under which various classes of
computable functions on Euclidean space preserve or otherwise transform
mutual dimensions between points.
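A hedged sketch of how these densities are typically formalized (the precision-r mutual information I_r(x:y) is defined in the paper itself and is not reproduced in this listing, so the display below is only the intuitive reading of "lower and upper densities of shared algorithmic information"):
\[ \mathrm{mdim}(x:y) = \liminf_{r\to\infty} \frac{I_r(x:y)}{r}, \qquad \mathrm{Mdim}(x:y) = \limsup_{r\to\infty} \frac{I_r(x:y)}{r}. \]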
Finite-State Dimension and Real Arithmetic
We use entropy rates and Schur concavity to prove that, for every integer k
>= 2, every nonzero rational number q, and every real number alpha, the base-k
expansions of alpha, q+alpha, and q*alpha all have the same finite-state
dimension and the same finite-state strong dimension. This extends, and gives a
new proof of, Wall's 1949 theorem stating that the sum or product of a nonzero
rational number and a Borel normal number is always Borel normal.
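Stated symbolically (reusing the \dimfs and \Dimfs notation of the Copeland-Erd\"os entry below, and writing \dimfs(\alpha) for the finite-state dimension of the base-k expansion of \alpha), the theorem asserts that for every integer k \geq 2, every nonzero rational q, and every real \alpha,
\[ \dimfs(\alpha) = \dimfs(q+\alpha) = \dimfs(q\alpha), \qquad \Dimfs(\alpha) = \Dimfs(q+\alpha) = \Dimfs(q\alpha). \]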
Dimensions of Copeland-Erdos Sequences
The base-k {\em Copeland-Erd\"os sequence} given by an infinite set A of
positive integers is the infinite sequence \CE_k(A) formed by concatenating
the base-k representations of the elements of A in numerical order. This
paper concerns the following four quantities.
The {\em finite-state dimension} \dimfs(\CE_k(A)), a finite-state version
of classical Hausdorff dimension introduced in 2001.
The {\em finite-state strong dimension} \Dimfs(\CE_k(A)), a finite-state
version of classical packing dimension introduced in 2004. This is a dual of
\dimfs(\CE_k(A)) satisfying \Dimfs(\CE_k(A)) \geq \dimfs(\CE_k(A)).
The {\em zeta-dimension} \Dimzeta(A), a kind of discrete fractal dimension
discovered many times over the past few decades.
The {\em lower zeta-dimension} \dimzeta(A), a dual of \Dimzeta(A)
satisfying \dimzeta(A)\leq \Dimzeta(A).
We prove the following.
\dimfs(\CE_k(A))\geq \dimzeta(A). This extends the 1946 proof by Copeland
and Erd\"os that the sequence \CE_k(\mathrm{PRIMES}) is Borel normal.
\Dimfs(\CE_k(A))\geq \Dimzeta(A).
These bounds are tight in the strong sense that these four quantities can
have (simultaneously) any four values in [0,1] satisfying the four
above-mentioned inequalities.
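For orientation, zeta-dimension is commonly characterized as the exponent of convergence of the set's zeta function (a standard characterization; the paper's own presentation may differ),
\[ \Dimzeta(A) = \inf\{ s \geq 0 : \textstyle\sum_{n \in A} n^{-s} < \infty \}, \]
and the four above-mentioned inequalities are
\[ \dimzeta(A) \leq \Dimzeta(A), \quad \dimfs(\CE_k(A)) \leq \Dimfs(\CE_k(A)), \quad \dimzeta(A) \leq \dimfs(\CE_k(A)), \quad \Dimzeta(A) \leq \Dimfs(\CE_k(A)). \]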
Algorithmic Information, Plane Kakeya Sets, and Conditional Dimension
We formulate the conditional Kolmogorov complexity of x given y at precision r, where x and y are points in Euclidean spaces and r is a natural number. We demonstrate the utility of this notion in two ways.
1. We prove a point-to-set principle that enables one to use the (relativized, constructive) dimension of a single point in a set E in a Euclidean space to establish a lower bound on the (classical) Hausdorff dimension of E. We then use this principle, together with conditional Kolmogorov complexity in Euclidean spaces, to give a new proof of the known, two-dimensional case of the Kakeya conjecture. This theorem of geometric measure theory, proved by Davies in 1971, says that every plane set containing a unit line segment in every direction has Hausdorff dimension 2.
2. We use conditional Kolmogorov complexity in Euclidean spaces to develop the lower and upper conditional dimensions dim(x|y) and Dim(x|y) of x given y, where x and y are points in Euclidean spaces. Intuitively these are the lower and upper asymptotic algorithmic information densities of x conditioned on the information in y. We prove that these conditional dimensions are robust and that they have the correct information-theoretic relationships with the well-studied dimensions dim(x) and Dim(x) and the mutual dimensions mdim(x:y) and Mdim(x:y).
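A minimal sketch of how these conditional dimensions arise from the conditional complexity introduced in the paper (assuming the natural normalization by precision; the exact definitions are in the paper): writing K_r(x|y) for the conditional Kolmogorov complexity of x given y at precision r,
\[ \dim(x \mid y) = \liminf_{r\to\infty} \frac{K_r(x \mid y)}{r}, \qquad \mathrm{Dim}(x \mid y) = \limsup_{r\to\infty} \frac{K_r(x \mid y)}{r}. \]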