
    Strong path convergence from Loewner driving function convergence

    We show that, under mild assumptions on the limiting curve, a sequence of simple chordal planar curves converges uniformly whenever certain Loewner driving functions converge. We extend this result to random curves. The random version applies in particular to random lattice paths that have chordal $\mathrm{SLE}_\kappa$ as a scaling limit, with $\kappa < 8$ (non-space-filling). Existing $\mathrm{SLE}_\kappa$ convergence proofs often begin by showing that the Loewner driving functions of these paths (viewed from $\infty$) converge to Brownian motion. Unfortunately, this is not sufficient, and additional arguments are required to complete the proofs. We show that driving function convergence is sufficient if it can be established for both parametrization directions and a generic observation point. Comment: Published at http://dx.doi.org/10.1214/10-AOP627 in the Annals of Probability (http://www.imstat.org/aop/) by the Institute of Mathematical Statistics (http://www.imstat.org)
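
    For orientation, the "Loewner driving function" above is the standard object from the chordal Loewner equation; the following sketch restates that textbook definition and is not quoted from the paper:

    % Standard definition, restated here for context (not from the paper).
    % Chordal Loewner evolution in the upper half-plane H: g_t maps the
    % complement of the curve (up to time t) in H conformally back onto H.
    \[
      \partial_t g_t(z) \;=\; \frac{2}{g_t(z) - W_t}, \qquad g_0(z) = z .
    \]
    % The real-valued process W_t is the Loewner driving function of the curve;
    % chordal SLE_kappa corresponds to W_t = \sqrt{\kappa}\, B_t with B_t a
    % standard one-dimensional Brownian motion.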

    Modularity of Convergence and Strong Convergence in Infinitary Rewriting

    Properties of Term Rewriting Systems are called modular iff they are preserved under (and reflected by) disjoint union, i.e. when combining two Term Rewriting Systems with disjoint signatures. Convergence is the property of Infinitary Term Rewriting Systems that all reduction sequences converge to a limit. Strong Convergence requires in addition that redex positions in a reduction sequence move arbitrarily deep. In this paper it is shown that both Convergence and Strong Convergence are modular properties of non-collapsing Infinitary Term Rewriting Systems, provided (for Convergence) that the term metrics are granular. This generalises known modularity results beyond metric infinitary term rewriting.
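
    As a rough reading aid (standard infinitary-rewriting definitions restated from memory, not quoted from the paper), consider a transfinite reduction sequence whose i-th step contracts a redex at depth d_i:

    % Standard definitions, restated from memory (not from the paper).
    % (Weak) convergence: the terms approach a limit in the term metric at
    % every limit ordinal lambda of the sequence:
    \[
      \lim_{i \to \lambda} t_i = t_\lambda \quad \text{in the term metric.}
    \]
    % Strong convergence additionally requires the contracted redexes to move
    % arbitrarily deep towards every limit ordinal:
    \[
      \lim_{i \to \lambda} d_i = \infty .
    \]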

    On Convergence Properties of Shannon Entropy

    Convergence properties of Shannon entropy are studied. In the differential setting, it is shown that weak convergence of probability measures, or convergence in distribution, is not enough for convergence of the associated differential entropies. A general result for the desired differential entropy convergence is provided, taking into account both compactly and non-compactly supported densities. Convergence of differential entropy is also characterized in terms of the Kullback-Leibler discriminant for densities with fairly general supports, and it is shown that convergence in variation of probability measures guarantees such convergence under an appropriate boundedness condition on the densities involved. Results for the discrete setting are also provided, allowing for infinitely supported probability measures, by taking advantage of the equivalence between weak convergence and convergence in variation in this setting. Comment: Submitted to IEEE Transactions on Information Theory
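
    For context, the quantities involved are the usual differential entropy and Kullback-Leibler discriminant; the following restates those standard definitions together with the abstract's main caveat, and is not taken from the paper:

    % Standard definitions, restated for context (not from the paper).
    % Differential entropy of a density f:
    \[
      h(f) \;=\; -\int f(x)\,\log f(x)\,dx ,
    \]
    % Kullback-Leibler discriminant (relative entropy) between densities f and g:
    \[
      D(f \,\|\, g) \;=\; \int f(x)\,\log\frac{f(x)}{g(x)}\,dx .
    \]
    % Caveat from the abstract: f_n -> f in distribution alone need not give
    % h(f_n) -> h(f); convergence in variation plus a suitable boundedness
    % condition on the densities does.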

    Rainwater-Simons-type convergence theorems for generalized convergence methods

    We extend the well-known Rainwater-Simons convergence theorem to various generalized convergence methods such as strong matrix summability, statistical convergence and almost convergence. In fact, we prove these theorems not only for boundaries but for the more general notion of (I)-generating sets introduced by Fonf and Lindenstrauss. Comment: 10 pages, version 2, references added, one remark added, revised version accepted for publication in Acta et Commentationes Universitatis Tartuensis de Mathematica
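
    For orientation, one common formulation of the Rainwater-Simons theorem being generalized reads roughly as follows (a standard statement recalled from memory, not quoted from the paper):

    % Standard statement, recalled from memory (not from the paper).
    % B is a boundary of the dual unit ball B_{X^*} of a Banach space X, i.e.
    % every x in X attains its norm at some functional in B. Then:
    \[
      (x_n) \subset X \ \text{bounded}, \quad b(x_n) \to b(x) \ \ \forall\, b \in B
      \quad\Longrightarrow\quad x_n \rightharpoonup x \ \text{(weak convergence in } X\text{)}.
    \]
    % The paper replaces pointwise convergence on B by generalized convergence
    % methods (strong matrix summability, statistical convergence, almost
    % convergence) and boundaries by (I)-generating sets.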