
    A large deviation approach to super-critical bootstrap percolation on the random graph G_{n,p}

    We consider the Erdős–Rényi random graph G_{n,p} and analyze a simple irreversible epidemic process on the graph, known in the literature as bootstrap percolation. We give a quantitative version of some results by Janson et al. (2012), providing a fine asymptotic analysis of the final size A_n^* of active nodes under a suitable super-critical regime. More specifically, we establish large deviation principles for the sequence of random variables \{(n - A_n^*)/f(n)\}_{n\geq 1} with explicit rate functions, allowing the scaling function f to vary in the widest possible range. Comment: 44 pages
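    For readers unfamiliar with the terminology, the following sketches what a large deviation principle (LDP) asserts, using a generic speed a_n and rate function I; the paper's explicit rate functions are not reproduced here:

```latex
% Schematic form of an LDP for X_n = (n - A_n^*)/f(n): informally,
%   P(X_n \approx x) \asymp e^{-a_n I(x)},
% made precise by matching bounds over open and closed sets B:
\liminf_{n\to\infty} \frac{1}{a_n}\log \mathbb{P}\bigl(X_n \in B^{\circ}\bigr)
  \;\ge\; -\inf_{x\in B^{\circ}} I(x),
\qquad
\limsup_{n\to\infty} \frac{1}{a_n}\log \mathbb{P}\bigl(X_n \in \bar{B}\bigr)
  \;\le\; -\inf_{x\in \bar{B}} I(x).
```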

    How Much Can The Internet Be Greened?

    The power consumption of the Internet is becoming a key issue, and several projects are studying how to reduce its energy consumption. In this paper, we provide a first evaluation of the amount of redundant resources (nodes and links) that can be powered off in a network topology to reduce power consumption. We first formulate a theoretical evaluation that exploits random graph theory to estimate the fraction of devices that can be removed from the topology while still guaranteeing connectivity. We then compare the theoretical results with simulation results on realistic Internet topologies. The results, although preliminary, show that large energy savings can be achieved by carefully turning off nodes and links, e.g., during off-peak time. We also show that the non-cooperative design of the current Internet severely limits the possible energy savings, suggesting that a cooperative approach deserves further investigation.
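    A minimal, self-contained sketch of the idea (not the paper's model or topologies): build an Erdős–Rényi random graph and greedily power off random nodes as long as the surviving nodes stay connected, giving a crude estimate of the removable fraction. Parameters and the greedy order are illustrative choices.

```python
import random
from collections import deque

def er_graph(n, p, rng):
    """Erdos-Renyi G(n, p) as an adjacency dict."""
    adj = {v: set() for v in range(n)}
    for u in range(n):
        for v in range(u + 1, n):
            if rng.random() < p:
                adj[u].add(v)
                adj[v].add(u)
    return adj

def is_connected(adj, alive):
    """BFS restricted to the surviving node set (empty set counts as connected)."""
    alive = set(alive)
    if not alive:
        return True
    start = next(iter(alive))
    seen, q = {start}, deque([start])
    while q:
        u = q.popleft()
        for v in adj[u]:
            if v in alive and v not in seen:
                seen.add(v)
                q.append(v)
    return seen == alive

def removable_fraction(adj, rng):
    """Greedily power off random nodes, keeping the remaining graph connected."""
    alive = set(adj)
    order = list(adj)
    rng.shuffle(order)
    removed = 0
    for v in order:
        alive.discard(v)
        if is_connected(adj, alive):
            removed += 1
        else:
            alive.add(v)  # removal would disconnect the rest: put the node back
    return removed / len(adj)

rng = random.Random(42)
g = er_graph(60, 0.15, rng)
frac = removable_fraction(g, rng)
print(f"removable fraction: {frac:.2f}")
```

    This one-pass greedy order gives only a lower bound on what an optimal power-off strategy could achieve, which is exactly the kind of gap the paper's random-graph analysis quantifies.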

    Fully decentralized computation of aggregates over data streams

    In several emerging applications, data is collected in massive streams at several distributed points of observation. A basic and challenging task is to allow every node to monitor a neighbourhood of interest by issuing continuous aggregate queries on the streams observed in its vicinity. This class of algorithms is fully decentralized and diffusive in nature: collecting all data at a few central nodes of the network is infeasible in networks of low-capability devices or in the presence of massive data sets. The main difficulty in designing diffusive algorithms is coping with duplicate detections. These arise both from the observation of the same event at several nodes of the network and from the receipt of the same aggregated information along multiple paths of diffusion. In this paper, we consider fully decentralized algorithms that locally answer continuous aggregate queries on the number of distinct events, the total number of events, and the second frequency moment in the scenario outlined above. The proposed algorithms use sublinear space at every node, either in the worst case or on realistic distributions. We also propose strategies that minimize the communication needed to update the aggregates when new events are observed. We experimentally evaluate the efficiency and accuracy of our algorithms in realistic simulated scenarios.
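    A minimal sketch of one standard duplicate-insensitive technique, the k-minimum-values (KMV) sketch for distinct counting; the paper's actual algorithms and space bounds are not reproduced here. Because merging two sketches is a set union of their retained hash values, observing the same event at several nodes, or receiving the same aggregate along multiple diffusion paths, never double-counts.

```python
import hashlib

class KMVSketch:
    """k-minimum-values sketch for counting distinct events.

    Duplicate-insensitive: adding the same event twice, or merging
    sketches built from overlapping streams, never double-counts.
    """

    def __init__(self, k=64):
        self.k = k
        self.mins = set()  # the k smallest hash values seen, each in (0, 1)

    @staticmethod
    def _hash(event):
        h = hashlib.sha256(str(event).encode()).digest()
        return int.from_bytes(h[:8], "big") / 2**64

    def add(self, event):
        self.mins.add(self._hash(event))
        if len(self.mins) > self.k:
            self.mins.discard(max(self.mins))

    def merge(self, other):
        """Sketch of the union of two streams = k smallest of the combined min-sets."""
        out = KMVSketch(self.k)
        out.mins = set(sorted(self.mins | other.mins)[: self.k])
        return out

    def estimate(self):
        if len(self.mins) < self.k:
            return len(self.mins)  # fewer than k distinct events seen: exact count
        return int((self.k - 1) / max(self.mins))

a = KMVSketch()
for e in range(30):
    a.add(e)
b = KMVSketch()
for e in range(20, 50):
    b.add(e)
merged = a.merge(b)
print(merged.estimate())  # overlap 20..29 is counted once -> 50
```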

    MIMIC: a Multi Input Micro-Influencers Classifier

    Micro-influencers are effective elements in the marketing strategies of companies and institutions because of their capability to create a hyper-engaged audience around a specific topic of interest. In recent years, many scientific approaches and commercial tools have tackled the task of detecting this type of social media user. These strategies adopt solutions ranging from rule-based machine learning models to deep neural networks and graph analysis on text, images and account information. This work compares the existing solutions and proposes an ensemble method that generalizes them across different input data and social media platforms. The deployed solution combines deep learning models on unstructured data with statistical machine learning models on structured data. We retrieve both social media account information and multimedia posts from Twitter and Instagram. These data are mapped into feature vectors for an eXtreme Gradient Boosting (XGBoost) classifier. Sixty different topics were analyzed to build a rule-based gold-standard dataset and to compare the performance of our approach against baseline classifiers. We demonstrate the effectiveness of our work by comparing the accuracy, precision, recall, and F1 score of our model under different configurations and architectures. We obtained an accuracy of 0.98 with our best-performing model.
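    A hypothetical illustration of the feature-fusion step described above: structured account statistics and a deep-model embedding of the posts are concatenated into one vector, which would then be fed to a gradient-boosted classifier (XGBoost in the paper). The field names, dimensions, and values here are invented for illustration.

```python
# Invented field names; real pipelines would use platform-specific features.
STRUCTURED_FIELDS = ["followers", "following", "posts_per_week", "avg_likes"]

def fuse_features(account, post_embedding):
    """Concatenate structured account stats with an unstructured-data embedding."""
    structured = [float(account[f]) for f in STRUCTURED_FIELDS]
    return structured + list(post_embedding)

account = {"followers": 12_400, "following": 310, "posts_per_week": 5, "avg_likes": 890}
embedding = [0.12, -0.53, 0.88, 0.07]  # stand-in for a deep model's output
x = fuse_features(account, embedding)
print(len(x))  # 8 features: 4 structured + 4 from the embedding
```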

    Large deviations of the interference in the Ginibre network model

    Under different assumptions on the distribution of the fading random variables, we derive large deviation estimates for the tail of the interference in a wireless network model whose nodes are placed, over a bounded region of the plane, according to the β-Ginibre process, 0 < β ≤ 1. The family of β-Ginibre processes is formed by determinantal point processes, with different degrees of repulsiveness, which converge in law to a homogeneous Poisson process as β → 0. In this sense the Poisson network model may be considered the limiting uncorrelated case of the β-Ginibre network model. Our results indicate the existence of two different regimes. When the fading random variables are bounded or Weibull with super-exponential tails, large values of the interference typically originate from the sum of several equivalent interfering contributions due to nodes in the vicinity of the receiver. In this case, the tail of the interference has, on the log-scale, the same asymptotic behavior for any value of 0 < β ≤ 1, but it differs (again on a log-scale) from the asymptotic behavior of the tail of the interference in the Poisson network model. When the fading random variables are exponential or subexponential, instead, large values of the interference typically originate from a single dominating interferer node and, on the log-scale, the asymptotic behavior of the tail of the interference is essentially insensitive to the distribution of the nodes. As a consequence, on the log-scale, the asymptotic behavior of the tail of the interference in any β-Ginibre network model, 0 < β ≤ 1, is the same as in the Poisson network model.
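    For context, the interference in such models is usually written in the following generic shot-noise form; the symbols below are the standard textbook ones and need not match the paper's exact notation:

```latex
% \Phi : point process of interferer locations (here, a beta-Ginibre process)
% h_x  : fading random variable attached to node x
% \ell : path-loss function; y : location of the receiver
I(y) \;=\; \sum_{x \in \Phi} h_x \,\ell\bigl(\lVert x - y \rVert\bigr),
% and the large deviation estimates concern the tail
% \mathbb{P}\bigl(I(y) > t\bigr) \quad \text{as } t \to \infty.
```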

    Modeling LEAST RECENTLY USED caches with Shot Noise request processes

    In this paper we analyze Least Recently Used (LRU) caches operating under the Shot Noise Model (SNM) for requests. The SNM was recently proposed in [33] to better capture the main characteristics of today's Video-on-Demand (VoD) traffic. We investigate the validity of Che's approximation through an asymptotic analysis of the cache eviction time. In particular, we provide a law of large numbers, a large deviation principle and a central limit theorem for the cache eviction time as the cache size grows large. Finally, we derive upper and lower bounds for the "hit" probability in tandem networks of caches under Che's approximation.
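    For orientation, a sketch of Che's approximation in its classical form, under the independent reference model (IRM) with Poisson request rates rather than the shot-noise model analyzed in the paper: the characteristic time T_C solves sum_n (1 - exp(-λ_n T_C)) = C, and content n is then hit with probability 1 - exp(-λ_n T_C). It is the concentration of the eviction time around T_C that the paper's asymptotic analysis examines. The Zipf exponent and sizes below are illustrative.

```python
import math

def che_characteristic_time(lam, cache_size, tol=1e-9):
    """Solve sum_n (1 - e^{-lam_n t}) = cache_size for t by bisection."""
    occupancy = lambda t: sum(1.0 - math.exp(-l * t) for l in lam)
    lo, hi = 0.0, 1.0
    while occupancy(hi) < cache_size:  # grow the bracket until it contains the root
        hi *= 2.0
    while hi - lo > tol * hi:
        mid = (lo + hi) / 2.0
        if occupancy(mid) < cache_size:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2.0

def che_hit_probability(lam, cache_size):
    """Request-averaged hit probability under Che's approximation."""
    t_c = che_characteristic_time(lam, cache_size)
    per_content = [1.0 - math.exp(-l * t_c) for l in lam]
    total = sum(lam)
    return sum(l * p for l, p in zip(lam, per_content)) / total

# Zipf(0.8) popularity over 1000 contents, cache holding 100 of them
N, alpha, C = 1000, 0.8, 100
lam = [1.0 / (n + 1) ** alpha for n in range(N)]
print(f"overall hit probability: {che_hit_probability(lam, C):.3f}")
```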