4 research outputs found

    Distributed Medium Access Control in Wireless Networks

    No full text

    Delay performance of backlog-based random access

    No full text
    Backlog-based CSMA strategies provide a popular mechanism for distributed medium access control in wireless networks. When suitably designed, such strategies offer the striking capability to match the optimal throughput performance of centralized scheduling algorithms in a wide range of scenarios. Unfortunately, however, the activation rules used in these schemes tend to yield excessive backlogs and delays. More aggressive activation rates can potentially improve the delay performance, but may not allow provable maximum-stability guarantees. In order to gain a fundamental understanding of how the shape of the activation function affects the queueing behavior, we focus on a single-node scenario, thus separating out the impact of the network topology. We demonstrate that three qualitatively different regimes can arise, depending on how rapidly the activation function increases with the backlog. Simulation experiments are conducted to validate the analytical findings.
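    As a rough illustration of the mechanism described in this abstract, the Python sketch below simulates a single-node queue with a backlog-dependent activation probability. The specific model (Bernoulli arrivals, one packet served per active slot, the activation function f(q) = q**alpha and the parameter values) is an illustrative assumption, not the activation rule analysed in the paper; varying alpha merely shows how a faster-growing activation function tends to reduce the mean backlog.

```python
import random

def simulate(lam=0.3, alpha=1.0, slots=200_000, seed=1):
    """Toy single-node queue with a backlog-dependent activation function.

    Per slot: a packet arrives w.p. lam; an inactive node with backlog q
    activates w.p. f(q) / (1 + f(q)) with f(q) = q**alpha (illustrative
    choice); an active node transmits one packet and then releases the
    medium w.p. 1 / (1 + f(q)), or immediately once its buffer empties.
    """
    rng = random.Random(seed)
    q, active, backlog_sum = 0, False, 0
    for _ in range(slots):
        if rng.random() < lam:                    # Bernoulli arrival
            q += 1
        f = float(q) ** alpha if q > 0 else 0.0
        if active:
            if q > 0:
                q -= 1                            # transmit one packet
            if q == 0 or rng.random() < 1.0 / (1.0 + f):
                active = False                    # back off / release the medium
        elif q > 0 and rng.random() < f / (1.0 + f):
            active = True                         # activation grows with the backlog
        backlog_sum += q
    return backlog_sum / slots

for a in (0.5, 1.0, 2.0):                         # slower vs. faster growing activation
    print(f"alpha={a}: mean backlog ~ {simulate(alpha=a):.2f}")
```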

    Spatial mean-field limits for ultra-dense random-access networks

    No full text
    Random-access algorithms such as the CSMA protocol provide a popular mechanism for distributed medium access control in wireless networks. In saturated-buffer scenarios the joint activity process in such random-access networks has a product-form stationary distribution which provides useful throughput estimates for persistent traffic flows. However, these results do not capture the relevant performance metrics in unsaturated-buffer scenarios, which in particular arise in an IoT context with highly intermittent traffic sources. Mean-field analysis has emerged as a powerful approach to obtain tractable performance estimates in such situations, and is not only mathematically convenient, but also relevant as wireless networks grow larger and denser with the emergence of IoT applications. A crucial requirement for the classical mean-field framework to apply however is that the node population can be partitioned into a finite number of classes of statistically indistinguishable nodes. The latter condition is a severe restriction since nodes typically have different locations and hence experience different interference constraints. Motivated by the above observations, we develop in the present paper a novel mean-field methodology which does not rely on any exchangeability property. Since the spatio-temporal evolution of the network can no longer be described through a finite-dimensional population process, we adopt a measure-valued state description, and prove that the latter converges to a deterministic limit as the network grows large and dense. The limit process is characterized in terms of a system of partial-differential equations, which exhibit a striking local-global interaction and time-scale separation property. Specifically, the queueing dynamics at any given node are only affected by the global network state through a single parsimonious quantity. The latter quantity corresponds to the fraction of time that no activity occurs within the interference range of that particular node in case of a certain static spatial activation measure. Extensive simulation experiments demonstrate that the solution of the partial-differential equations yields remarkably accurate approximations for the queue length distributions and delay metrics, even when the number of nodes is fairly moderate.
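    To give a concrete feel for the unsaturated spatial setting and for the "fraction of time with no activity within the interference range" quantity highlighted in this abstract, the sketch below simulates a toy slotted random-access network on the unit torus. The node placement, hard-core interference rule, access dynamics and all parameter values are simplifying assumptions for illustration only; they are not the measure-valued model or the partial-differential-equation limit derived in the paper.

```python
import math, random

def simulate(n=100, radius=0.15, lam=0.02, slots=20_000, seed=7):
    """Toy unsaturated random-access network on the unit torus (illustrative only)."""
    rng = random.Random(seed)
    pos = [(rng.random(), rng.random()) for _ in range(n)]

    def interferes(i, j):
        # torus (wrap-around) distance between nodes i and j
        dx = min(abs(pos[i][0] - pos[j][0]), 1 - abs(pos[i][0] - pos[j][0]))
        dy = min(abs(pos[i][1] - pos[j][1]), 1 - abs(pos[i][1] - pos[j][1]))
        return math.hypot(dx, dy) <= radius

    nbrs = [[j for j in range(n) if j != i and interferes(i, j)] for i in range(n)]
    queue = [0] * n
    active = [False] * n
    idle_range = [0] * n      # slots in which nothing is active around node i
    queue_sum = [0] * n

    for _ in range(slots):
        for i in range(n):
            if rng.random() < lam:                 # intermittent Bernoulli arrivals
                queue[i] += 1
        # sweep nodes in random order: a backlogged node activates only if no
        # interfering neighbour is active; an active node serves one packet
        # and then releases the medium
        for i in rng.sample(range(n), n):
            if active[i]:
                queue[i] -= 1
                active[i] = False
            elif queue[i] > 0 and not any(active[j] for j in nbrs[i]):
                active[i] = True
        for i in range(n):
            if not active[i] and not any(active[j] for j in nbrs[i]):
                idle_range[i] += 1                 # interference range of i is idle
            queue_sum[i] += queue[i]

    mean_q = sum(queue_sum) / (n * slots)
    mean_idle = sum(idle_range) / (n * slots)
    print(f"mean queue length ~ {mean_q:.2f}, "
          f"mean fraction of slots with an idle interference range ~ {mean_idle:.2f}")

simulate()
```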