10,894 research outputs found
Limit theorems for von Mises statistics of a measure preserving transformation
For a measure preserving transformation $T$ of a probability space
$(X,\mathcal{F},\mu)$ we investigate almost sure and distributional convergence
of random variables of the form $\frac{1}{C_n}\sum_{0\le i_1,\dots,i_d<n}
f(T^{i_1}x,\dots,T^{i_d}x)$, $n=1,2,\dots$, where $f$ (called the \emph{kernel})
is a function from $X^d$ to $\mathbb{R}$ and $C_1,C_2,\dots$ are appropriate normalizing
constants. We observe that the above random variables are well defined and
belong to $L_2(\mu)$ provided that the kernel is chosen from the projective
tensor product $L_{2d}(\mu)\otimes_\pi\cdots\otimes_\pi L_{2d}(\mu)$ ($d$ factors). We establish a form of the individual ergodic theorem for such
sequences. Next, we give a martingale approximation argument to derive a
central limit theorem in the non-degenerate case (in the sense of the classical
Hoeffding decomposition). Furthermore, for $d=2$ and a wide class of
canonical kernels $f$ we also show that the convergence holds in distribution
towards a quadratic form $\sum_{m=1}^{\infty}\lambda_m\eta_m^2$ in independent
standard Gaussian variables $\eta_1,\eta_2,\dots$ Our results on the
distributional convergence use a $T$-invariant filtration as a prerequisite
and are derived from uni- and multivariate martingale approximations.
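To make the object concrete, here is a minimal numerical sketch of a bivariate ($d=2$) von Mises statistic evaluated along the orbit of a measure preserving transformation. The irrational rotation map, the toy kernel, and the normalization $C_n=n^2$ are illustrative choices, not taken from the paper.

```python
import numpy as np

ALPHA = np.sqrt(2.0) - 1.0  # irrational rotation angle

def orbit(x0, n):
    """Orbit x, Tx, ..., T^{n-1}x of T(x) = x + ALPHA mod 1, a measure
    preserving (indeed ergodic) map of [0, 1] with Lebesgue measure."""
    return (x0 + ALPHA * np.arange(n)) % 1.0

def von_mises_statistic(x0, n, kernel):
    """Bivariate von Mises statistic V_n = n^{-2} sum_{i,j<n} f(T^i x, T^j x)."""
    xs = orbit(x0, n)
    u, w = np.meshgrid(xs, xs, indexing="ij")
    return float(kernel(u, w).mean())

x0 = 0.123456789
for n in (100, 400, 1600):
    print(n, von_mises_statistic(x0, n, lambda u, w: u * w))
```

For the product kernel above the statistic factorizes as the square of an ordinary Birkhoff average, so the printed values approach $(\int_0^1 u\,du)^2 = 1/4$, illustrating the almost sure convergence asserted by the individual ergodic theorem for such sequences.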
Stochastic expansions using continuous dictionaries: L\'{e}vy adaptive regression kernels
This article describes a new class of prior distributions for nonparametric
function estimation. The unknown function is modeled as a limit of weighted
sums of kernels or generator functions indexed by continuous parameters that
control local and global features such as their translation, dilation,
modulation and shape. L\'{e}vy random fields and their stochastic integrals are
employed to induce prior distributions for the unknown functions or,
equivalently, for the number of kernels and for the parameters governing their
features. Scaling, shape, and other features of the generating functions are
location-specific to allow quite different function properties in different
parts of the space, as with wavelet bases and other methods employing
overcomplete dictionaries. We provide conditions under which the stochastic
expansions converge in specified Besov or Sobolev norms. Under a Gaussian error
model, this may be viewed as a sparse regression problem, with regularization
induced via the L\'{e}vy random field prior distribution. Posterior inference
for the unknown functions is based on a reversible jump Markov chain Monte
Carlo algorithm. We compare the L\'{e}vy Adaptive Regression Kernel (LARK)
method to wavelet-based methods using some of the standard test functions, and
illustrate its flexibility and adaptability in nonstationary applications.
Comment: Published at http://dx.doi.org/10.1214/11-AOS889 in the Annals of
Statistics (http://www.imstat.org/aos/) by the Institute of Mathematical
Statistics (http://www.imstat.org).
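As a rough illustration of the kind of prior the abstract describes, the sketch below draws one random function from a truncated compound-Poisson kernel expansion $f(x)=\sum_j \beta_j\,g((x-\chi_j)/\lambda_j)$. The Gaussian generator, the hyperparameters, and the weight distribution are hypothetical stand-ins; the paper's actual construction uses L\'{e}vy random fields and fits the posterior by reversible jump MCMC, neither of which is implemented here.

```python
import numpy as np

def sample_kernel_expansion(x, rate=20.0, rng=None):
    """One draw from a compound-Poisson kernel expansion
    f(x) = sum_j beta_j * g((x - chi_j) / lambda_j).

    All distributional choices below are illustrative stand-ins,
    not the LARK model from the paper.
    """
    rng = rng or np.random.default_rng()
    J = rng.poisson(rate)                              # number of kernels
    chi = rng.uniform(x.min(), x.max(), size=J)        # translations
    lam = rng.lognormal(mean=-2.0, sigma=0.5, size=J)  # dilations
    beta = rng.laplace(scale=1.0, size=J)              # signed weights
    g = lambda u: np.exp(-0.5 * u ** 2)                # generator function
    f = np.zeros_like(x)
    for b, c, l in zip(beta, chi, lam):
        f += b * g((x - c) / l)
    return f

x = np.linspace(0.0, 1.0, 500)
draws = [sample_kernel_expansion(x, rng=np.random.default_rng(s)) for s in range(3)]
```

Because the translations and dilations are location-specific, each draw can be smooth in one region and sharply peaked in another, which is the overcomplete-dictionary behavior the abstract emphasizes.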
On the uniform convergence of random series in Skorohod space and representations of c\`{a}dl\`{a}g infinitely divisible processes
Let $X_1,X_2,\dots$ be independent random elements in the Skorohod space
$D([0,1];E)$ of c\`{a}dl\`{a}g functions taking values in a separable Banach space $E$. Let
$S_n=\sum_{j=1}^{n}X_j$. We show that if $S_n$ converges in finite dimensional
distributions to a c\`{a}dl\`{a}g process, then $S_n+y_n$ converges a.s.
pathwise uniformly over $[0,1]$, for some $y_n\in D([0,1];E)$. This result
extends the It\^{o}-Nisio theorem to the space $D([0,1];E)$, which is
surprisingly lacking in the literature even for $E=\mathbb{R}$. The main difficulties of
dealing with $D([0,1];E)$ in this context are its nonseparability under the
uniform norm and the discontinuity of addition under Skorohod's $J_1$-topology.
We use this result to prove the uniform convergence of various series
representations of c\`{a}dl\`{a}g infinitely divisible processes. As a
consequence, we obtain explicit representations of the jump process, and of
related path functionals, in a general non-Markovian setting. Finally, we
illustrate our results on an example of stable processes. To this aim we obtain
new criteria for such processes to have c\`{a}dl\`{a}g modifications, which may
also be of independent interest.
Comment: Published at http://dx.doi.org/10.1214/12-AOP783 in the Annals of
Probability (http://www.imstat.org/aop/) by the Institute of Mathematical
Statistics (http://www.imstat.org).
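For a concrete instance of such a series representation, the sketch below evaluates partial sums of the classical LePage series for a symmetric $\alpha$-stable L\'{e}vy motion on $[0,1]$; the uniform pathwise convergence of these partial sums is the kind of statement the results above cover. The truncation level and parameters are arbitrary illustration choices, and the normalizing constant is omitted.

```python
import numpy as np

def lepage_partial_sum(t, n_terms=2000, alpha=1.2, rng=None):
    """Partial sum of the LePage series for a symmetric alpha-stable
    Levy motion: X(t) ~ sum_j Gamma_j^{-1/alpha} * eps_j * 1{U_j <= t},
    with Gamma_j Poisson arrival times, U_j uniform jump locations and
    eps_j random signs (normalizing constant omitted).
    """
    rng = rng or np.random.default_rng()
    gamma = np.cumsum(rng.exponential(size=n_terms))  # Poisson arrival times
    u = rng.uniform(size=n_terms)                     # jump locations in [0, 1]
    eps = rng.choice([-1.0, 1.0], size=n_terms)       # symmetric random signs
    jumps = eps * gamma ** (-1.0 / alpha)             # jump sizes Gamma_j^{-1/alpha}
    # Sum the indicator terms 1{U_j <= t} for every t at once.
    return (jumps[None, :] * (u[None, :] <= t[:, None])).sum(axis=1)

t = np.linspace(0.0, 1.0, 401)
path = lepage_partial_sum(t, rng=np.random.default_rng(1))  # cadlag step path
```

Each partial sum is a c\`{a}dl\`{a}g step function, and the jump at location $U_j$ has size $\Gamma_j^{-1/\alpha}\varepsilon_j$, which makes the jump process of the limit explicit in the spirit of the representations discussed above.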
…