3 research outputs found

    On the Distribution of Random Geometric Graphs

    Get PDF
    Random geometric graphs (RGGs) are commonly used to model networked systems that depend on the underlying spatial embedding. We concern ourselves with the probability distribution of an RGG, which is crucial for studying its random topology, properties (e.g., connectedness), or Shannon entropy as a measure of the graph's topological uncertainty (or information content). Moreover, the distribution is also relevant for determining average network performance or designing protocols. However, a major impediment in deducing the graph distribution is that it requires the joint probability distribution of the n(n-1)/2 distances between n nodes randomly distributed in a bounded domain. As no such result exists in the literature, we make progress by obtaining the joint distribution of the distances between three nodes confined in a disk in \mathbb{R}^2. This enables the calculation of the probability distribution and entropy of a three-node graph. For arbitrary n, we derive a series of upper bounds on the graph entropy; in particular, the bound involving the entropy of a three-node graph is tighter than the existing bound which assumes distances are independent. Finally, we provide numerical results on graph connectedness and the tightness of the derived entropy bounds. Comment: submitted to the IEEE International Symposium on Information Theory 201
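    As a rough illustration of the quantity this abstract targets, the following Python sketch estimates the distribution and Shannon entropy of a three-node RGG by Monte Carlo, with nodes placed uniformly in a unit disk. The connection range r0, the trial count, and the helper names are illustrative assumptions; the paper obtains the distribution analytically rather than by simulation.

```python
# Monte Carlo sketch: distribution and entropy of a three-node random
# geometric graph (RGG) with nodes placed uniformly in a unit disk.
# r0 and the number of trials are illustrative assumptions; the paper
# derives the graph distribution analytically, this only estimates it.
import numpy as np

def sample_points_in_disk(n, rng):
    """Draw n points uniformly in the unit disk via rejection sampling."""
    pts = []
    while len(pts) < n:
        p = rng.uniform(-1.0, 1.0, size=2)
        if p @ p <= 1.0:
            pts.append(p)
    return np.array(pts)

def graph_label(points, r0):
    """Encode the 3-node graph as one bit per pair: 1 if distance <= r0."""
    a, b, c = points
    edges = (np.linalg.norm(a - b) <= r0,
             np.linalg.norm(a - c) <= r0,
             np.linalg.norm(b - c) <= r0)
    return sum(int(bit) << i for i, bit in enumerate(edges))

def empirical_graph_entropy(r0=0.5, trials=100_000, seed=0):
    rng = np.random.default_rng(seed)
    counts = np.zeros(8)                      # 2^3 possible graphs on 3 labeled nodes
    for _ in range(trials):
        counts[graph_label(sample_points_in_disk(3, rng), r0)] += 1
    p = counts / trials
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())     # Shannon entropy in bits

print(f"estimated 3-node RGG entropy: {empirical_graph_entropy():.3f} bits")
```

    The simulation simply counts the eight possible graphs on three labeled nodes; the tighter analytical bound claimed in the abstract instead exploits the joint distribution of the three pairwise distances rather than treating them as independent.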

    Quantifying Link Stability in Ad Hoc Wireless Networks Subject to Ornstein-Uhlenbeck Mobility

    Full text link
    The performance of mobile ad hoc networks in general, and of the routing algorithm in particular, can be heavily affected by the intrinsic dynamic nature of the underlying topology. In this paper, we build a new analytical/numerical framework that characterizes nodes' mobility and the evolution of links between them. This formulation is based on a stationary Markov chain representation of link connectivity. The existence of a link between two nodes depends on their distance, which is governed by the mobility model. In our analysis, nodes move randomly according to an Ornstein-Uhlenbeck process with one tuning parameter that yields different levels of randomness in the mobility pattern. Finally, we propose an entropy-rate-based metric that quantifies link uncertainty and evaluates its stability. Numerical results show that the proposed approach accurately reflects the random mobility in the network and fully captures the link dynamics. It may thus be considered a valuable performance metric for evaluating link stability and connectivity in these networks. Comment: 6 pages, 4 figures, Submitted to IEEE International Conference on Communications 201
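    To make the entropy-rate metric concrete, here is a minimal Python sketch (not the paper's framework) that simulates two nodes with Ornstein-Uhlenbeck mobility, records the link's on/off state against an assumed connection range r0, fits a two-state Markov chain to that sequence, and computes its entropy rate. All parameter values (theta, sigma, dt, r0) and helper names are assumptions for illustration only.

```python
# Sketch: two nodes with Ornstein-Uhlenbeck mobility in 2D, a link that is
# "up" whenever their distance is below r0, and the entropy rate of the
# resulting two-state Markov chain as a link-stability indicator.
# Parameter values are illustrative, not those used in the paper.
import numpy as np

def ou_step(x, mu, theta, sigma, dt, rng):
    """Exact one-step update of an Ornstein-Uhlenbeck process."""
    decay = np.exp(-theta * dt)
    noise_std = sigma * np.sqrt((1.0 - decay**2) / (2.0 * theta))
    return mu + (x - mu) * decay + noise_std * rng.standard_normal(x.shape)

def link_entropy_rate(theta=0.5, sigma=1.0, r0=1.0, dt=0.1, steps=100_000, seed=1):
    rng = np.random.default_rng(seed)
    mu_a, mu_b = np.zeros(2), np.array([1.5, 0.0])  # home positions of the two nodes
    a, b = mu_a.copy(), mu_b.copy()
    states = np.empty(steps, dtype=int)
    for t in range(steps):
        a = ou_step(a, mu_a, theta, sigma, dt, rng)
        b = ou_step(b, mu_b, theta, sigma, dt, rng)
        states[t] = int(np.linalg.norm(a - b) <= r0)  # 1 = link up, 0 = link down

    # Empirical 2x2 transition matrix of the link on/off chain.
    P = np.zeros((2, 2))
    for s, s_next in zip(states[:-1], states[1:]):
        P[s, s_next] += 1
    P /= P.sum(axis=1, keepdims=True)

    # Stationary distribution from the empirical state frequencies.
    pi = np.bincount(states, minlength=2) / steps

    # Entropy rate H = -sum_i pi_i sum_j P_ij log2 P_ij (with 0 log 0 := 0).
    with np.errstate(divide="ignore", invalid="ignore"):
        logP = np.where(P > 0, np.log2(P), 0.0)
    return float(-(pi[:, None] * P * logP).sum())

print(f"estimated link entropy rate: {link_entropy_rate():.3f} bits/step")
```

    A low entropy rate indicates a link that rarely changes state (stable), while a rate near one bit per step indicates a link that toggles almost unpredictably; this is the intuition behind using the entropy rate as a stability metric.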

    On the distribution of random geometric graphs

    No full text
    Random geometric graphs (RGGs) are commonly used to model networked systems that depend on the underlying spatial embedding. We concern ourselves with the probability distribution of an RGG, which is crucial for studying its random topology, properties (e.g., connectedness), or Shannon entropy as a measure of the graph's topological uncertainty (or information content). Moreover, the distribution is also relevant for determining average network performance or designing protocols. However, a major impediment in deducing the graph distribution is that it requires the joint probability distribution of the n(n-1)/2 distances between n nodes randomly distributed in a bounded domain. As no such result exists in the literature, we make progress by obtaining the joint distribution of the distances between three nodes confined in a disk in \mathbb{R}^2. This enables the calculation of the probability distribution and entropy of a three-node graph. For arbitrary n, we derive a series of upper bounds on the graph entropy; in particular, the bound involving the entropy of a three-node graph is tighter than the existing bound which assumes distances are independent. Finally, we provide numerical results on graph connectedness and the tightness of the derived entropy bounds.