
    Sample path properties of the stochastic flows

    We consider a stochastic flow driven by a finite-dimensional Brownian motion. We show that almost every realization of such a flow exhibits strong statistical properties, such as exponential convergence of an initial measure to the equilibrium state and the central limit theorem. The proof uses new estimates of the mixing rates of the multi-point motion.
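    For orientation, here is the generic shape of such a quenched statement (our notation, stated as background rather than the paper's precise result): writing $\varphi_t$ for the random flow maps, $\mu$ for the equilibrium state, and $g$ for a smooth observable, exponential convergence asserts that for an initial measure $\nu$,

    $$\left|\int g\,d\big((\varphi_t)_*\nu\big)-\int g\,d\mu\right|\le C e^{-ct},$$

    and the central limit theorem asserts that $t^{-1/2}\int_0^t\big(g(\varphi_s x)-\mu(g)\big)\,ds$ converges in law to a centered Gaussian.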

    Concentration of Measure Inequalities in Information Theory, Communications and Coding (Second Edition)

    During the last two decades, concentration inequalities have been the subject of exciting developments in various areas, including convex geometry, functional analysis, statistical physics, high-dimensional statistics, pure and applied probability theory, information theory, theoretical computer science, and learning theory. This monograph focuses on some of the key modern mathematical tools used to derive concentration inequalities, on their links to information theory, and on their various applications to communications and coding. In addition to being a survey, the monograph also includes several recent results derived by the authors.

    The first part introduces classical concentration inequalities for martingales, as well as some recent refinements and extensions. The power and versatility of the martingale approach is exemplified in the context of codes defined on graphs and iterative decoding algorithms, as well as codes for wireless communication.

    The second part introduces the entropy method, an information-theoretic technique for deriving concentration inequalities. The basic ingredients of the entropy method are discussed first in the context of logarithmic Sobolev inequalities, which underlie the so-called functional approach to concentration of measure, and then from a complementary information-theoretic viewpoint based on transportation-cost inequalities and probability in metric spaces. Some representative results on concentration for dependent random variables are briefly summarized, with emphasis on their connections to the entropy method. Finally, several applications of the entropy method to problems in communications and coding are discussed, including strong converses, empirical distributions of good channel codes, and an information-theoretic converse for concentration of measure.

    Comment: Foundations and Trends in Communications and Information Theory, vol. 10, no. 1-2, pp. 1-248, 2013. The second edition was published in October 2014. ISBN of the printed book: 978-1-60198-906-
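    As a concrete instance of the martingale inequalities treated in the first part, recall the classical Azuma-Hoeffding bound (a standard result, quoted here for orientation): if $\{X_k\}_{k=0}^{n}$ is a martingale with bounded differences $|X_k-X_{k-1}|\le d_k$, then for every $t>0$,

    $$\Pr\big(|X_n-X_0|\ge t\big)\le 2\exp\!\left(-\frac{t^2}{2\sum_{k=1}^{n}d_k^2}\right).$$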

    Local limit theorem in deterministic systems

    We show that for every ergodic and aperiodic probability preserving system, there exists a $\mathbb{Z}$-valued, square integrable function $f$ such that the partial sums process of the time series $\left\{f\circ T^i\right\}_{i=0}^{\infty}$ satisfies the lattice local limit theorem.

    Comment: 17 pages
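    For comparison, the classical i.i.d. lattice local limit theorem (Gnedenko's form, quoted as background rather than as the paper's result): if $S_n$ is a sum of $n$ i.i.d. integer-valued variables with mean $\mu$, variance $\sigma^2\in(0,\infty)$, and lattice span $1$, then

    $$\sup_{k\in\mathbb{Z}}\left|\sigma\sqrt{n}\,\Pr(S_n=k)-\frac{1}{\sqrt{2\pi}}\,e^{-(k-n\mu)^2/(2n\sigma^2)}\right|\xrightarrow[n\to\infty]{}0.$$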

    Gaussian limit for determinantal random point fields

    We prove that, under fairly general conditions, a properly rescaled determinantal random point field converges to a generalized Gaussian random process.

    Comment: This is the revised version accepted for publication in the Annals of Probability. The results of Theorems 1 and 2 are extended, and minor misprints are corrected.
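    In the standard linear-statistics formulation (our paraphrase of the flavor of such results): for a determinantal random point field with points $\{x_i\}$ and a suitable test function $f$, write $S_f=\sum_i f(x_i)$; then, under appropriate growth of the variance,

    $$\frac{S_f-\mathbb{E}S_f}{\sqrt{\operatorname{Var}S_f}}\xrightarrow{d}\mathcal{N}(0,1).$$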

    Towards zero variance estimators for rare event probabilities

    Improving importance sampling estimators for rare event probabilities requires sharp approximations of conditional densities. This is achieved for events $E_n:=\left(f(X_1)+\dots+f(X_n)\in A_n\right)$ where the summands are i.i.d. and $E_n$ is a large or moderate deviation event. The approximation of the conditional density of the real random variables $X_i$, for $1\le i\le k_n$, with respect to $E_n$ on long runs, when $k_n/n\to 1$, is handled. The maximal value of $k_n$ compatible with a given accuracy is discussed; algorithms and simulated results are presented.
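    To illustrate the mechanism these estimators refine, here is a minimal importance-sampling sketch using exponential tilting for Gaussian summands. This is a toy example under our own assumptions (the Gaussian model, the tilt parameter, and all names are ours, not the paper's algorithm):

    ```python
    import math
    import numpy as np

    def rare_event_is(n=100, a=2.0, num_samples=10_000, seed=0):
        """Estimate P(X_1 + ... + X_n >= n*a) for i.i.d. standard normal X_i
        by exponential tilting: sample under N(a, 1) and reweight."""
        rng = np.random.default_rng(seed)
        # Draw each path under the tilted law Q, where X_i ~ N(a, 1),
        # so the "rare" event becomes typical.
        s = rng.normal(loc=a, scale=1.0, size=(num_samples, n)).sum(axis=1)
        # Likelihood ratio dP/dQ = exp(-a*S_n + n*a^2/2) for this Gaussian tilt.
        weights = np.exp(-a * s + n * a**2 / 2)
        # Importance-sampling estimator of P(S_n >= n*a).
        return np.mean(weights * (s >= n * a))

    if __name__ == "__main__":
        n, a = 100, 2.0
        est = rare_event_is(n=n, a=a)
        # Exact value for this Gaussian toy model: S_n ~ N(0, n), so
        # P(S_n >= n*a) = P(Z >= a*sqrt(n)) = erfc(a*sqrt(n)/sqrt(2))/2.
        exact = 0.5 * math.erfc(a * math.sqrt(n) / math.sqrt(2))
        print(f"IS estimate: {est:.3e}   exact: {exact:.3e}")
    ```

    The tilt $\theta=a$ centers the sampling distribution on the event, which is what drives the variance reduction; the zero-variance ideal alluded to in the title corresponds to sampling exactly from the conditional law given $E_n$.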