Inequalities for space-bounded Kolmogorov complexity
There is a parallelism between Shannon information theory and algorithmic
information theory. In particular, the same linear inequalities are true for
Shannon entropies of tuples of random variables and Kolmogorov complexities of
tuples of strings (Hammer et al., 1997), as well as for sizes of subgroups and
projections of sets (Chan, Yeung, Romashchenko, Shen, Vereshchagin,
1998--2002). This parallelism started with the Kolmogorov-Levin formula (1968)
for the complexity of pairs of strings with logarithmic precision. Longpré
(1986) proved a version of this formula for space-bounded complexities.
In this paper we prove an improved version of Longpré's result with a
tighter space bound, using Sipser's trick (1980). Then, using this space bound,
we show that every linear inequality that is true for complexities or
entropies is also true for space-bounded Kolmogorov complexities with a
polynomial space overhead.
Comment: Extended version, with full proofs added; some corrections are made.
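For reference, the Kolmogorov-Levin formula mentioned above is the chain
rule for Kolmogorov complexity; in a standard formulation (our notation,
not quoted from the paper), for all strings x and y,
\[ C(x, y) = C(x) + C(y \mid x) + O(\log C(x, y)), \]
where C denotes plain Kolmogorov complexity. The space-bounded results
discussed in the paper concern analogues of this identity when the
decompressor is restricted to bounded space.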
Inequalities for entropies and dimensions
We show that linear inequalities for entropies have a natural geometric
interpretation in terms of Hausdorff and packing dimensions, using the
point-to-set principle and known results about inequalities for complexities,
entropies and the sizes of subgroups.
Comment: 11 pages. Accepted by CiE 2023 (Computability in Europe) conference.
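A standard instance of such a linear inequality (our illustration, not a
result of the paper) is subadditivity, with its Kolmogorov-complexity
counterpart from the Hammer et al. parallelism:
\[ H(X, Y) \le H(X) + H(Y), \qquad C(x, y) \le C(x) + C(y) + O(\log n) \]
for random variables X, Y and strings x, y of length at most n. The
point-to-set principle is what allows inequalities of this kind to be
translated into statements about Hausdorff and packing dimensions.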
Conditional Information Inequalities for Entropic and Almost Entropic Points
We study conditional linear information inequalities, i.e., linear
inequalities for Shannon entropy that hold for distributions whose entropies
meet some linear constraints. We prove that some conditional information
inequalities cannot be extended to any unconditional linear inequalities. Some
of these conditional inequalities hold for almost entropic points, while others
do not. We also discuss some counterparts of conditional information
inequalities for Kolmogorov complexity.
Comment: Submitted to the IEEE Transactions on Information Theory.
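Schematically (our notation), a conditional linear information inequality
for jointly distributed variables X_1, ..., X_n has the form
\[ \text{if } \sum_{S} \alpha^{(j)}_{S} H(X_S) = 0 \ \text{for } j = 1, \dots, k, \ \text{then } \sum_{S} \beta_{S} H(X_S) \ge 0, \]
where S ranges over subsets of {1, ..., n} and H(X_S) is the entropy of
the corresponding sub-tuple. The question is whether such an implication
is always a specialization of some unconditional linear inequality; the
paper shows that for some conditional inequalities it is not.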
Random Noise Increases Kolmogorov Complexity and Hausdorff Dimension
Consider a bit string x of length n and Kolmogorov complexity αn (for some α < 1). It is always possible to increase the complexity of x by changing a small fraction of bits in x [2]. What happens to the complexity of x when we randomly change each bit independently with some probability τ? We prove that a linear increase in complexity happens with high probability, but this increase is smaller than in the case of the arbitrary change considered in [2]. The amount of the increase depends on x (strings of the same complexity could behave differently). We give exact lower and upper bounds for this increase (with o(n) precision). The same technique is used to prove results about the (effective Hausdorff) dimension of infinite sequences. We show that random change increases the dimension with probability 1, and we provide an optimal lower bound for the dimension of the changed sequence. We also improve a result from [5] and show that for every sequence ω of dimension α there exists a strongly α-random sequence ω′ such that the Besicovitch distance between ω and ω′ is 0. The proofs use combinatorial and probabilistic reformulations of complexity statements and a technique that goes back to Ahlswede, Gács and Körner [1].
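In schematic form (our notation; the exact bounds are the content of the
paper): write N_τ(x) for the string obtained from x by flipping each bit
independently with probability τ. The main result is a pair of matching
bounds of the shape
\[ C(N_\tau(x)) \ge C(x) + c \cdot n - o(n) \quad \text{with high probability}, \]
where the optimal coefficient c > 0 depends on x and τ, and not only on
the complexity αn of x, as the abstract notes.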