
    Computational limits to nonparametric estimation for ergodic processes

    A new negative result for nonparametric estimation of binary ergodic processes is shown. The problem of estimating the process distribution to any prescribed degree of accuracy is studied, and it is shown that for any countable class of estimators there is a zero-entropy binary ergodic process whose distribution is not consistently estimated by any estimator in the class. This result differs from earlier negative results for universal forecasting schemes for ergodic processes.
    Comment: submitted to IEEE Trans.
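    To fix ideas, here is a minimal sketch (mine, not the paper's) of the kind of estimator at issue: the empirical distribution of length-k blocks of a binary sample, the natural nonparametric estimate of a stationary process's finite-dimensional marginals. The function name and the choice of k are illustrative only.

```python
from collections import Counter

def empirical_block_distribution(sample, k):
    """Empirical distribution of length-k blocks in a binary sequence.

    This is the generic nonparametric estimate of a stationary process's
    k-dimensional marginal; the paper's negative result concerns the
    impossibility of making any countable family of estimators of this
    general kind consistent for every zero-entropy binary ergodic process.
    """
    blocks = [tuple(sample[i:i + k]) for i in range(len(sample) - k + 1)]
    counts = Counter(blocks)
    total = len(blocks)
    return {block: c / total for block, c in counts.items()}

# Example: blocks of length 2 from a short binary sample.
print(empirical_block_distribution([0, 1, 1, 0, 1, 1, 0, 1], 2))
```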

    Empirical processes, typical sequences and coordinated actions in standard Borel spaces

    This paper proposes a new notion of typical sequences on a wide class of abstract alphabets (so-called standard Borel spaces), which is based on approximations of memoryless sources by empirical distributions uniformly over a class of measurable "test functions." In the finite-alphabet case, we can take all uniformly bounded functions and recover the usual notion of strong typicality (or typicality under the total variation distance). For a general alphabet, however, this function class turns out to be too large, and must be restricted. With this in mind, we define typicality with respect to any Glivenko-Cantelli function class (i.e., a function class that admits a Uniform Law of Large Numbers) and demonstrate its power by giving simple derivations of the fundamental limits on the achievable rates in several source coding scenarios, in which the relevant operational criteria pertain to reproducing empirical averages of a general-alphabet stationary memoryless source with respect to a suitable function class.
    Comment: 14 pages, 3 PDF figures; accepted to IEEE Transactions on Information Theory
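    A hedged sketch of the definition in play (my notation, not the paper's): a sample is typical for a source P with respect to a function class F at tolerance eps if every test function's empirical average lies within eps of its expectation under P. For a finite class of bounded functions this is directly checkable:

```python
import numpy as np

def is_typical(sample, test_functions, expectations, eps):
    """Check (F, eps)-typicality: for every test function f in the class,
    the empirical average of f over the sample must lie within eps of the
    true expectation E_P[f]. For a Glivenko-Cantelli class, the law of
    large numbers holds uniformly over the class, so i.i.d. samples from
    P are typical with probability approaching one.
    """
    sample = np.asarray(sample)
    for f, mean_f in zip(test_functions, expectations):
        if abs(np.mean(f(sample)) - mean_f) > eps:
            return False
    return True

# Example: a standard normal source with two bounded test functions.
rng = np.random.default_rng(0)
x = rng.standard_normal(10_000)
fs = [np.tanh, lambda t: np.cos(t)]
# E[tanh(Z)] = 0 by symmetry; E[cos(Z)] = exp(-1/2) for Z ~ N(0, 1).
means = [0.0, np.exp(-0.5)]
print(is_typical(x, fs, means, eps=0.05))
```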

    Fundamental Limitations of Disturbance Attenuation in the Presence of Side Information

    In this paper, we study fundamental limitations of disturbance attenuation in feedback systems, under the assumption that the controller has a finite-horizon preview of the disturbance. In contrast with prior work, we extend Bode's integral equation to the case where the preview is made available to the controller via a general, finite-capacity communication system. Under asymptotic stationarity assumptions, our results show that the new fundamental limitation differs from Bode's only by a constant, which quantifies the information rate through the communication system. In the absence of asymptotic stationarity, we derive a universal lower bound which uses Shannon's entropy rate as a measure of performance. By means of a case study, we show that our main bounds may be achieved.
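    A schematic rendering (my notation, not the paper's exact statement) of the shape of the result: the classical discrete-time Bode sensitivity integral equals the sum of the logarithms of the open-loop unstable eigenvalues, and the abstract's claim is that preview supplied through a finite-capacity communication system shifts this limit by a constant equal to the information rate R:

```latex
% Classical discrete-time Bode sensitivity integral:
\frac{1}{2\pi}\int_{-\pi}^{\pi}\log\left|S(e^{j\omega})\right|\,d\omega
  = \sum_{i}\log\left|\lambda_i^{u}\right|
% (sum over the open-loop unstable eigenvalues \lambda_i^{u}).

% Schematic form with preview provided at information rate R
% (per the abstract; the paper gives the precise statement):
\frac{1}{2\pi}\int_{-\pi}^{\pi}\log\left|S(e^{j\omega})\right|\,d\omega
  \;\ge\; \sum_{i}\log\left|\lambda_i^{u}\right| - R
```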

    Universal Coding and Prediction on Martin-L\"of Random Points

    We perform an effectivization of classical results concerning universal coding and prediction for stationary ergodic processes over an arbitrary finite alphabet. That is, we lift the well-known almost sure statements to statements about Martin-L\"of random sequences. Most of this work is quite mechanical, but along the way we complete a result of Ryabko from 2008 by showing that each universal probability measure in the sense of universal coding induces a universal predictor in the prequential sense. Surprisingly, the effectivization of this implication holds true provided the universal measure does not assign too low conditional probabilities to individual symbols. As an example, we show that the Prediction by Partial Matching (PPM) measure satisfies this requirement. In the almost sure setting, the requirement is superfluous.
    Comment: 12 pages
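    As a concrete, simplified illustration of the lower-bound requirement on conditional probabilities, here is the Krichevsky-Trofimov sequential predictor, a standard universal measure whose conditional probabilities are bounded below by 1/(2n + |A|); PPM builds on similarly smoothed counts with variable-order contexts. This sketch is mine, not the paper's construction.

```python
def kt_conditional(counts, symbol, alphabet_size):
    """Krichevsky-Trofimov conditional probability of the next symbol.

    P(symbol | past) = (n_symbol + 1/2) / (n + alphabet_size/2),
    where n_symbol is the count of `symbol` so far and n the total count.
    Note the lower bound: any symbol gets probability at least
    1/(2n + alphabet_size), so conditional probabilities never become
    excessively small -- the kind of property the paper requires of a
    universal measure for the coding-to-prediction implication to
    effectivize.
    """
    n = sum(counts.values())
    return (counts.get(symbol, 0) + 0.5) / (n + alphabet_size / 2)

# Sequential prediction of a binary string.
counts = {}
for x in [0, 1, 1, 0, 1, 1, 1]:
    p = kt_conditional(counts, x, alphabet_size=2)
    print(f"P({x} | past) = {p:.3f}")
    counts[x] = counts.get(x, 0) + 1
```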

    Bayesian Entropy Estimation for Countable Discrete Distributions

    We consider the problem of estimating Shannon's entropy H from discrete data, in cases where the number of possible symbols is unknown or even countably infinite. The Pitman-Yor process, a generalization of the Dirichlet process, provides a tractable prior distribution over the space of countably infinite discrete distributions, and has found major applications in Bayesian non-parametric statistics and machine learning. Here we show that it also provides a natural family of priors for Bayesian entropy estimation, due to the fact that moments of the induced posterior distribution over H can be computed analytically. We derive formulas for the posterior mean (Bayes' least squares estimate) and variance under Dirichlet and Pitman-Yor process priors. Moreover, we show that a fixed Dirichlet or Pitman-Yor process prior implies a narrow prior distribution over H, meaning the prior strongly determines the entropy estimate in the under-sampled regime. We derive a family of continuous mixing measures such that the resulting mixture of Pitman-Yor processes produces an approximately flat prior over H. We show that the resulting Pitman-Yor Mixture (PYM) entropy estimator is consistent for a large class of distributions. We explore the theoretical properties of the resulting estimator, and show that it performs well both in simulation and in application to real data.
    Comment: 38 pages, LaTeX. Revised and resubmitted to JMLR
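    The analytic tractability claimed for the Dirichlet case can be made concrete. Under a symmetric Dirichlet(alpha) prior on a finite-alphabet distribution, the posterior given counts n_i is Dirichlet(n_i + alpha), and the posterior mean of the Shannon entropy has a closed form in digamma functions (a classical identity due to Wolpert and Wolf; the paper extends such moment computations to Pitman-Yor priors). A minimal sketch:

```python
import numpy as np
from scipy.special import digamma

def dirichlet_posterior_mean_entropy(counts, alpha):
    """Posterior mean of Shannon entropy (in nats) under a symmetric
    Dirichlet(alpha) prior, given observed symbol counts.

    With a_i = n_i + alpha and A = sum_i a_i, the posterior is
    Dirichlet(a), and
        E[H | counts] = digamma(A + 1) - sum_i (a_i / A) * digamma(a_i + 1).
    """
    a = np.asarray(counts, dtype=float) + alpha
    A = a.sum()
    return digamma(A + 1.0) - np.sum((a / A) * digamma(a + 1.0))

# Example: counts over 3 symbols; a strong prior pulls the estimate
# toward the (high) entropy of the uniform distribution.
counts = [8, 3, 1]
print(dirichlet_posterior_mean_entropy(counts, alpha=0.5))
print(dirichlet_posterior_mean_entropy(counts, alpha=10.0))
```

    The narrow-prior phenomenon the abstract describes is visible here: as alpha grows, the estimate is dominated by the prior rather than the counts, which is what motivates the paper's mixture construction for an approximately flat prior over H.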