
    On the method of typical bounded differences

    Concentration inequalities are fundamental tools in probabilistic combinatorics and theoretical computer science for proving that random functions are near their means. Of particular importance is the case where f(X) is a function of independent random variables X=(X_1, ..., X_n). Here the well-known bounded differences inequality (also called McDiarmid's or Hoeffding-Azuma inequality) establishes sharp concentration if the function f does not depend too much on any of the variables. One attractive feature is that it relies on a very simple Lipschitz condition (L): it suffices to show that |f(X)-f(X')| \leq c_k whenever X,X' differ only in X_k. While this is easy to check, the main disadvantage is that it considers worst-case changes c_k, which often makes the resulting bounds too weak to be useful. In this paper we prove a variant of the bounded differences inequality which can be used to establish concentration of functions f(X) where (i) the typical changes are small although (ii) the worst-case changes might be very large. One key aspect of this inequality is that it relies on a simple condition that (a) is easy to check and (b) coincides with heuristic considerations as to why concentration should hold. Indeed, given an event \Gamma that holds with very high probability, we essentially relax the Lipschitz condition (L) to situations where \Gamma occurs. The point is that the resulting typical changes c_k are often much smaller than the worst-case ones. To illustrate its application we consider the reverse H-free process, where H is 2-balanced. We prove that the final number of edges in this process is concentrated, and also determine its likely value up to constant factors. This answers a question of Bollob\'as and Erd\H{o}s. Comment: 25 pages.
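    For reference, the classical bounded differences inequality under the Lipschitz condition (L) mentioned in the abstract can be stated as follows:

    ```latex
    % McDiarmid's bounded differences inequality:
    % if |f(x) - f(x')| \le c_k whenever x and x' differ only in coordinate k,
    % then for all t > 0,
    \Pr\bigl(|f(X) - \mathbb{E} f(X)| \ge t\bigr)
      \;\le\; 2\exp\!\left(-\frac{2t^2}{\sum_{k=1}^{n} c_k^2}\right).
    ```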

    The Janson inequalities for general up-sets

    Janson, and Janson, Luczak and Rucinski, proved several inequalities for the lower tail of the distribution of the number of events that hold, when all the events are up-sets (increasing events) of a special form: each event is the intersection of some subset of a single set of independent events (i.e., a principal up-set). We show that these inequalities in fact hold for arbitrary up-sets, by modifying existing proofs to use only positive correlation, avoiding the need to assume positive correlation conditioned on one of the events. Comment: 5 pages. Added weighted variant.

    The lower tail: Poisson approximation revisited

    The well-known "Janson's inequality" gives Poisson-like upper bounds for the lower tail probability \Pr(X \le (1-\eps)\E X) when X is the sum of dependent indicator random variables of a special form. We show that, for large deviations, this inequality is optimal whenever X is approximately Poisson, i.e., when the dependencies are weak. We also present correlation-based approaches that, in certain symmetric applications, yield related conclusions when X is no longer close to Poisson. As an illustration we consider, e.g., subgraph counts in random graphs, and obtain new lower tail estimates, extending earlier work (for the special case \eps=1) of Janson, Luczak and Rucinski. Comment: 21 pages.
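    One common form of the lower-tail inequality referred to above (as stated, e.g., in standard random-graphs references) reads:

    ```latex
    % X = \sum_i I_i with indicator random variables I_i, \mu = \mathbb{E}X, and
    % \bar\Delta = \mu + \sum_{i \ne j,\ i \sim j} \mathbb{E}[I_i I_j],
    % where i \sim j means that I_i and I_j are dependent. Then for 0 \le \eps \le 1,
    \Pr\bigl(X \le (1-\eps)\mu\bigr)
      \;\le\; \exp\!\left(-\frac{\eps^2 \mu^2}{2\bar\Delta}\right).
    ```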

    Preferential attachment without vertex growth: emergence of the giant component

    We study the following preferential attachment variant of the classical Erdos-Renyi random graph process. Starting with an empty graph on n vertices, new edges are added one-by-one, and each time an edge is chosen with probability roughly proportional to the product of the current degrees of its endpoints (note that the vertex set is fixed). We determine the asymptotic size of the giant component in the supercritical phase, confirming a conjecture of Pittel from 2010. Our proof uses a simple method: we condition on the vertex degrees (of a multigraph variant), and use known results for the configuration model. Comment: 20 pages.
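    A minimal simulation sketch of such a process, assuming a "(degree + 1)" weighting for each endpoint so that the empty starting graph is well-defined (the abstract only says "roughly proportional to the product of the current degrees", so this offset and the independent sampling of the two endpoints are illustrative choices, not the paper's exact rule):

    ```python
    import random

    def pa_fixed_vertices(n, m, seed=None):
        """Sketch: fixed vertex set {0,...,n-1}; m edges added one by one.
        Each endpoint is drawn with probability proportional to (degree + 1),
        so the pair (u, v) is roughly proportional to the product of degrees.
        Repeated pairs are allowed, matching the multigraph variant."""
        rng = random.Random(seed)
        deg = [0] * n
        edges = []
        for _ in range(m):
            weights = [d + 1 for d in deg]
            u = rng.choices(range(n), weights=weights)[0]
            v = u
            while v == u:  # resample until the endpoints are distinct
                v = rng.choices(range(n), weights=weights)[0]
            deg[u] += 1
            deg[v] += 1
            edges.append((u, v))
        return edges, deg
    ```

    The degree sequence produced this way can then be fed into configuration-model results, mirroring the proof strategy described in the abstract.
    
    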

    Sesqui-type branching processes

    We consider branching processes consisting of particles (individuals) of two types (type L and type S) in which only particles of type L have offspring, proving estimates for the survival probability and (the tail of) the distribution of the total number of particles. Such processes are in some sense closer to single- than to multi-type branching processes. Nonetheless, the second, barren, type complicates the analysis significantly. The results proved here (about point and survival probabilities) are a key ingredient in the analysis of bounded-size Achlioptas processes in a recent paper by the last two authors. Comment: 23 pages. References updated.
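    The structure of such a process can be sketched in a short Monte Carlo simulation. The Poisson offspring laws below are illustrative assumptions (the paper's actual offspring distributions are not specified in the abstract); the key structural feature, that only type-L particles reproduce while type-S particles are barren, is taken from the abstract:

    ```python
    import math
    import random

    def total_progeny(lam_L, lam_S, rng, cap=10**5):
        """One run of a two-type ("sesqui-type") branching process sketch:
        each type-L particle has Poisson(lam_L) type-L children and
        Poisson(lam_S) type-S children; type-S particles are barren.
        Returns the total number of particles ever born, capped at `cap`."""
        def poisson(lam):
            # Knuth's multiplication method; adequate for small lam
            threshold, k, p = math.exp(-lam), 0, 1.0
            while True:
                p *= rng.random()
                if p <= threshold:
                    return k
                k += 1

        active_L = 1  # start from a single type-L ancestor
        total = 1     # particles born so far (both types)
        while active_L > 0 and total < cap:
            active_L -= 1
            born_L = poisson(lam_L)  # reproducing children
            born_S = poisson(lam_S)  # barren children
            active_L += born_L
            total += born_L + born_S
        return total
    ```

    Averaging the indicator of hitting `cap` over many runs gives a crude estimate of the survival probability when lam_L > 1 (supercritical); for lam_L < 1 the process dies out almost surely.
    
    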

    Model-based testing for space-time interaction using point processes: An application to psychiatric hospital admissions in an urban area

    Spatio-temporal interaction is inherent to cases of infectious diseases and occurrences of earthquakes, whereas the spread of other events, such as cancer or crime, is less evident. Statistical significance tests of space-time clustering usually assess the correlation between the spatial and temporal (transformed) distances of the events. Although appealing through simplicity, these classical tests neither adjust for the underlying population nor account for a distance decay of interaction. We propose to use the framework of an endemic-epidemic point process model to jointly estimate a background event rate explained by seasonal and areal characteristics, as well as a superposed epidemic component representing the hypothesis of interest. We illustrate this new model-based test for space-time interaction by analysing psychiatric inpatient admissions in Zurich, Switzerland (2007-2012). Several socio-economic factors were found to be associated with the admission rate, but there was no evidence of general clustering of the cases. Comment: 21 pages including 4 figures and 5 tables; methods are implemented in the R package surveillance (https://CRAN.R-project.org/package=surveillance).
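    Schematically, the conditional intensity of such an endemic-epidemic point process decomposes into the two components described above (the notation here is illustrative, not the paper's exact parametrisation):

    ```latex
    % Conditional intensity at location s and time t:
    % an endemic background h (seasonal and areal characteristics)
    % plus an epidemic sum over past events j, with temporal decay g
    % and spatial distance decay f.
    \lambda(s, t) \;=\; h(s, t)
      \;+\; \sum_{j:\; t_j < t} \eta_j \, g(t - t_j)\, f\bigl(\lVert s - s_j \rVert\bigr)
    ```

    Testing for space-time interaction then amounts to assessing whether the epidemic component (the sum) is needed at all.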