
    The Boolean Model in the Shannon Regime: Three Thresholds and Related Asymptotics

    Consider a family of Boolean models, indexed by integers $n \ge 1$, where the $n$-th model features a Poisson point process in $\mathbb{R}^n$ of intensity $e^{n \rho_n}$ with $\rho_n \to \rho$ as $n \to \infty$, and balls of independent and identically distributed radii distributed like $\bar X_n \sqrt{n}$, with $\bar X_n$ satisfying a large deviations principle. It is shown that there exist three deterministic thresholds: $\tau_d$ the degree threshold; $\tau_p$ the percolation threshold; and $\tau_v$ the volume fraction threshold; such that asymptotically as $n$ tends to infinity, in a sense made precise in the paper: (i) for $\rho < \tau_d$, almost every point is isolated, namely its ball intersects no other ball; (ii) for $\tau_d < \rho < \tau_p$, almost every ball intersects an infinite number of balls and nevertheless there is no percolation; (iii) for $\tau_p < \rho < \tau_v$, the volume fraction is 0 and nevertheless percolation occurs; (iv) for $\tau_d < \rho < \tau_v$, almost every ball intersects an infinite number of balls and nevertheless the volume fraction is 0; (v) for $\rho > \tau_v$, the whole space is covered. The analysis of this asymptotic regime is motivated by related problems in information theory, and may be of interest in other applications of stochastic geometry.
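    Restating items (i)-(v) above as a phase diagram in the intensity parameter $\rho$ (item (iv) is the overlap of regimes (ii) and (iii)):

```latex
\[
\begin{array}{ll}
\rho < \tau_d: & \text{almost every point is isolated} \\
\tau_d < \rho < \tau_p: & \text{infinite degrees, yet no percolation} \\
\tau_p < \rho < \tau_v: & \text{percolation, yet volume fraction } 0 \\
\rho > \tau_v: & \text{the whole space is covered}
\end{array}
\]
```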

    The densest subgraph problem in sparse random graphs

    We determine the asymptotic behavior of the maximum subgraph density of large random graphs with a prescribed degree sequence. The result applies in particular to the Erdős-Rényi model, where it settles a conjecture of Hajek [IEEE Trans. Inform. Theory 36 (1990) 1398-1414]. Our proof consists in extending the notion of balanced loads from finite graphs to their local weak limits, using unimodularity. This is a new illustration of the objective method described by Aldous and Steele [In Probability on Discrete Structures (2004) 1-72 Springer].
    Comment: Published at http://dx.doi.org/10.1214/14-AAP1091 in the Annals of Applied Probability (http://www.imstat.org/aap/) by the Institute of Mathematical Statistics (http://www.imstat.org)
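    To make the quantity being studied concrete: the maximum subgraph density of a graph is the maximum, over non-empty vertex subsets $S$, of the number of edges internal to $S$ divided by $|S|$. The sketch below computes it by brute force on a tiny example; this is only an illustration of the definition, not the paper's method (which works with local weak limits of sparse random graphs).

```python
from itertools import combinations

def max_subgraph_density(vertices, edges):
    """Brute-force maximum subgraph density: max over non-empty
    vertex subsets S of |E(S)| / |S|.  Exponential time, so only
    suitable for tiny illustrative graphs."""
    vertices = list(vertices)
    best = 0.0
    for k in range(1, len(vertices) + 1):
        for subset in combinations(vertices, k):
            s = set(subset)
            internal = sum(1 for u, v in edges if u in s and v in s)
            best = max(best, internal / len(s))
    return best

# A triangle with a pendant vertex: the triangle {0, 1, 2} has
# density 3/3 = 1, which is also the maximum here.
print(max_subgraph_density(range(4), [(0, 1), (1, 2), (0, 2), (2, 3)]))  # 1.0
```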

    Information-Theoretic Capacity and Error Exponents of Stationary Point Processes under Random Additive Displacements

    This paper studies the Shannon regime for the random displacement of stationary point processes. Let each point of some initial stationary point process in $\mathbb{R}^n$ give rise to one daughter point, the location of which is obtained by adding a random vector to the coordinates of the mother point, with all displacement vectors independently and identically distributed for all points. The decoding problem is then the following one: the whole mother point process is known as well as the coordinates of some daughter point; the displacements are only known through their law; can one find the mother of this daughter point? The Shannon regime is that where the dimension $n$ tends to infinity and where the logarithm of the intensity of the point process is proportional to $n$. We show that this problem exhibits a sharp threshold: if the sum of the proportionality factor and of the differential entropy rate of the noise is positive, then the probability of finding the right mother point tends to 0 with $n$ for all point processes and decoding strategies. If this sum is negative, there exist mother point processes, for instance Poisson, and decoding strategies, for instance maximum likelihood, for which the probability of finding the right mother tends to 1 with $n$. We then use large deviations theory to show that in the latter case, if the entropy spectrum of the noise satisfies a large deviation principle, then the error probability goes exponentially fast to 0 with an exponent that is given in closed form in terms of the rate function of the noise entropy spectrum. This is done for two classes of mother point processes: Poisson and Matérn. The practical interest to information theory comes from the explicit connection that we also establish between this problem and the estimation of error exponents in Shannon's additive noise channel with power constraints on the codewords.
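    The decoding problem described above can be illustrated numerically. The sketch below makes several simplifying assumptions not taken from the abstract: the mother points are sampled uniformly in a box (a stand-in for a Poisson sample), the displacement law is isotropic Gaussian, and the dimension and intensity are small fixed values. Under Gaussian noise, maximum-likelihood decoding reduces to nearest-neighbor search.

```python
import numpy as np

rng = np.random.default_rng(0)

n = 8             # dimension (the Shannon regime sends n to infinity)
num_mothers = 50  # number of mother points -- illustrative choice
sigma = 0.05      # displacement noise scale -- illustrative choice

# Mother points scattered uniformly in a box, mimicking a Poisson sample.
mothers = rng.uniform(-1.0, 1.0, size=(num_mothers, n))

# Pick a mother and displace it by i.i.d. Gaussian noise to get the daughter.
true_index = int(rng.integers(num_mothers))
daughter = mothers[true_index] + sigma * rng.standard_normal(n)

# Maximum-likelihood decoding under isotropic Gaussian noise reduces to
# nearest-neighbor search: choose the mother closest to the daughter.
decoded = int(np.argmin(np.linalg.norm(mothers - daughter, axis=1)))
# With this small noise scale, decoding succeeds with high probability.
print(decoded, true_index)
```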

    On Marton's inner bound for broadcast channels

    Marton's inner bound is the best known achievable region for a general discrete memoryless broadcast channel. To compute Marton's inner bound one has to solve an optimization problem over a set of joint distributions on the input and auxiliary random variables. The optimizers turn out to be structured in many cases. Finding properties of optimizers not only results in efficient evaluation of the region, but it may also help one to prove factorization of Marton's inner bound (and thus its optimality). The first part of this paper formulates this factorization approach explicitly and states some conjectures and results along this line. The second part of this paper focuses primarily on the structure of the optimizers. This section is inspired by a new binary inequality that recently resulted in a very simple characterization of the sum-rate of Marton's inner bound for binary input broadcast channels. This prompted us to investigate whether this inequality can be extended to larger cardinality input alphabets. We show that several of the results for the binary input case do carry over to higher cardinality alphabets, and we present a collection of results that help restrict the search space of probability distributions to evaluate the boundary of Marton's inner bound in the general case. We also prove a new inequality for the binary skew-symmetric broadcast channel that yields a very simple characterization of the entire Marton inner bound for this channel.
    Comment: Submitted to ISIT 201
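    For reference, the optimization the abstract refers to ranges over the rate pairs in Marton's inner bound. In its simplest form, stated here without the common auxiliary variable $W$ that appears in the general bound, the region is the set of $(R_1, R_2)$ satisfying, for some joint distribution $p(u, v)$ and deterministic map $x = f(u, v)$:

```latex
\begin{align*}
R_1 &\le I(U; Y_1), \\
R_2 &\le I(V; Y_2), \\
R_1 + R_2 &\le I(U; Y_1) + I(V; Y_2) - I(U; V).
\end{align*}
```

    Evaluating the region thus requires searching over the auxiliary distributions $p(u, v)$, which is why structural properties of the optimizers matter for computation.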