The dimension of ergodic random sequences
Let \mu be a computable ergodic shift-invariant measure over the Cantor
space. Providing a constructive proof of the Shannon-McMillan-Breiman theorem,
V'yugin proved that if a sequence x is Martin-L\"of random w.r.t. \mu then the
strong effective dimension Dim(x) of x equals the entropy of \mu. Whether its
effective dimension dim(x) also equals the entropy was left as an open
question. In this paper we settle this problem, providing a positive answer. A
key step in the proof consists in extending recent results on Birkhoff's
ergodic theorem for Martin-L\"of random sequences.
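As a concrete finite illustration of the entropy-dimension connection (a sketch of our own, not the paper's construction): for a sequence sampled from a computable i.i.d. measure, the standard plug-in block-entropy estimator recovers the entropy of \mu, which by the result above is the effective dimension of \mu-random sequences. All names below are our own choices.

```python
import math
import random

def block_entropy_rate(bits, k):
    """Plug-in estimate of the entropy rate (bits per symbol) from k-block frequencies."""
    counts = {}
    n = len(bits) - k + 1
    for i in range(n):
        w = tuple(bits[i:i + k])
        counts[w] = counts.get(w, 0) + 1
    h_k = -sum((c / n) * math.log2(c / n) for c in counts.values())
    return h_k / k

random.seed(0)
p = 0.25  # Bernoulli(p); its entropy is H(p) = -p log2 p - (1-p) log2(1-p), about 0.811
x = [1 if random.random() < p else 0 for _ in range(200_000)]
print(block_entropy_rate(x, 10))  # close to H(p)
```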
Predictive PAC Learning and Process Decompositions
We informally call a stochastic process learnable if it admits a
generalization error approaching zero in probability for any concept class with
finite VC-dimension (IID processes are the simplest example). A mixture of
learnable processes need not be learnable itself, and certainly its
generalization error need not decay at the same rate. In this paper, we argue
that it is natural in predictive PAC to condition not on the past observations
but on the mixture component of the sample path. This definition not only
matches what a realistic learner might demand, but also allows us to sidestep
several otherwise grave problems in learning from dependent data. In
particular, we give a novel PAC generalization bound for mixtures of learnable
processes with a generalization error that is not worse than that of each
mixture component. We also provide a characterization of mixtures of absolutely
regular (β-mixing) processes, of independent probability-theoretic
interest. Comment: 9 pages, accepted in NIPS 201
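The IID baseline in this definition can be seen in a small simulation (an illustration of the learnability notion, not the paper's bound; the threshold class, noise rate, and names are our own choices). Threshold classifiers on the line have VC dimension 1, so the gap between the empirical risk minimizer's training error and its error on fresh data shrinks as the sample grows:

```python
import random

def erm_threshold(xs, ys):
    """Empirical risk minimizer over threshold classifiers h_t(x) = [x > t]."""
    pts = sorted(zip(xs, ys))
    errs = sum(1 for _, y in pts if y == 0)  # t = -inf: everything predicted 1
    best_err, best_t = errs, float("-inf")
    for x, y in pts:
        errs += 1 if y == 1 else -1  # raising t to x flips x's prediction to 0
        if errs < best_err:
            best_err, best_t = errs, x
    return best_t

def error(t, xs, ys):
    """Misclassification rate of h_t on a labeled sample."""
    return sum((x > t) != bool(y) for x, y in zip(xs, ys)) / len(xs)

def sample(n, rng, t_true=0.5, flip=0.1):
    """IID sample: uniform x, true-threshold labels, each flipped with prob `flip`."""
    xs = [rng.random() for _ in range(n)]
    ys = [int((x >= t_true) ^ (rng.random() < flip)) for x in xs]
    return xs, ys

rng = random.Random(1)
for n in (100, 1000, 10000):
    xs, ys = sample(n, rng)
    t = erm_threshold(xs, ys)
    test_xs, test_ys = sample(50_000, rng)
    print(n, round(error(t, test_xs, test_ys) - error(t, xs, ys), 3))
```

A mixture of such processes (e.g. a random draw of `t_true` fixed once and shared across the whole path) is exactly the situation where conditioning on the mixture component, as the abstract proposes, recovers per-component rates.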
Entropy of convolutions on the circle
Given ergodic p-invariant measures {\mu_i} on the 1-torus T=R/Z, we give a
sharp condition on their entropies, guaranteeing that the entropy of the
convolution \mu_1 * \cdots * \mu_n converges to \log p. We also prove a variant of this result
for joinings of full entropy on \T^\N. In conjunction with a method of Host,
this yields the following. Denote \sig_q(x) = qx\pmod{1}. Then for every
p-invariant ergodic \mu with positive entropy,
\frac{1}{N}\sum_{n=0}^{N-1}\sig_{c_n}\mu converges weak^* to Lebesgue measure
as N \goesto \infty, under a certain mild combinatorial condition on {c_k}.
(For instance, the condition is satisfied if p=10 and c_k=2^k+6^k or
c_k=2^{2^k}.) This extends a result of Johnson and Rudolph, who considered the
sequence c_k = q^k when p and q are multiplicatively independent.
We also obtain the following corollary concerning Hausdorff dimension of sum
sets: For any sequence {S_i} of p-invariant closed subsets of T, if \sum
\dim_H(S_i) / |\log\dim_H(S_i)| = \infty, then \dim_H(S_1 + \cdots + S_n)
\goesto 1. Comment: 34 pages, published version.
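The circle result depends on the \times p-invariance, but the underlying mechanism — convolution pushes entropy toward its maximum — has a simple finite analogue (an illustration of our own, not the paper's argument): on the cyclic group Z/pZ, repeatedly convolving a measure whose support generates the group drives the entropy to \log p, the entropy of the uniform (Haar) measure.

```python
import math

def convolve(mu, nu):
    """Convolution of two probability vectors on the cyclic group Z/pZ."""
    p = len(mu)
    out = [0.0] * p
    for i, a in enumerate(mu):
        for j, b in enumerate(nu):
            out[(i + j) % p] += a * b
    return out

def entropy(mu):
    """Shannon entropy (in nats) of a probability vector."""
    return -sum(x * math.log(x) for x in mu if x > 0)

p = 10
mu = [0.5, 0.5] + [0.0] * (p - 2)  # supported on {0, 1}, which generates Z/10Z
nu = mu
for _ in range(200):
    nu = convolve(nu, mu)
print(entropy(nu), math.log(p))  # the two values are nearly equal
```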