
    Disruptive events in high-density cellular networks

    Stochastic geometry models are used to study wireless networks, particularly cellular phone networks, but most of the research focuses on the typical user, often ignoring atypical events, which can be highly disruptive and of interest to network operators. We examine atypical events in which an unexpectedly large proportion of users is disconnected or connected, by proposing a hybrid approach based on ray launching simulation and point process theory. This work is motivated by recent results [12] using large deviations theory applied to the signal-to-interference ratio. This theory provides a tool for the stochastic analysis of atypical but disruptive events, particularly when the density of transmitters is high. For a section of a European city, we introduce a new stochastic model of a single network cell that uses ray launching data generated with the open-source RaLaNS package, giving deterministic path loss values. We collect statistics on the fraction of (dis)connected users in the uplink and observe that the probability of an unexpectedly large proportion of disconnected users decreases exponentially as the transmitter density increases. This observation implies that denser networks become more stable, in the sense that the probability of the fraction of (dis)connected users deviating from its mean is exponentially small. We also empirically obtain and illustrate the density of users for network configurations in the disruptive event, which highlights that such bottleneck behaviour stems not only from too many users at the cell boundary, but also from the near-far effect of many users in the immediate vicinity of the base station. We discuss the implications of these findings and outline possible future research directions.
    Comment: 8 pages, 11 figures
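    To make the scaling claim concrete, the following minimal Monte Carlo sketch illustrates the kind of statistic the abstract describes: the fraction of uplink users in a single cell whose SIR falls below a threshold, and how rarely that fraction deviates from its mean as the user (transmitter) density grows. It is not the paper's hybrid ray-launching model: a simple power-law path loss stands in for the deterministic ray-launching values, the cell is an idealised disk, and every numerical parameter (cell radius, path-loss exponent, SIR threshold, interference-suppression factor, deviation size) is an illustrative assumption.

```python
# Minimal Monte Carlo sketch (illustrative assumptions, not the paper's model):
# Poisson-distributed users in a disk cell, power-law path loss standing in for
# ray-launching data, uplink SIR at a central base station, and the empirical
# probability that the fraction of disconnected users deviates from its mean.
import numpy as np

rng = np.random.default_rng(0)

RADIUS = 250.0   # cell radius in metres (assumed)
ALPHA = 3.5      # path-loss exponent, stand-in for deterministic ray-launching data (assumed)
TAU = 0.05       # SIR threshold below which a user counts as disconnected (assumed)
KAPPA = 0.02     # interference-suppression factor between uplink users (assumed)
EPS = 0.2        # deviation from the mean that counts as a disruptive event (assumed)
TRIALS = 10_000
AREA = np.pi * RADIUS ** 2

def disconnected_fraction(lam):
    """One network realisation: fraction of users whose uplink SIR is below TAU."""
    n = rng.poisson(lam * AREA)
    if n == 0:
        return 0.0
    r = RADIUS * np.sqrt(rng.random(n))           # uniform user positions in the disk
    power = (1.0 + r) ** (-ALPHA)                 # received power at the base station
    interference = KAPPA * (power.sum() - power)  # everyone else's (suppressed) signal
    sir = power / (interference + 1e-12)
    return np.mean(sir < TAU)

for lam in (2e-4, 4e-4, 8e-4):                    # user densities per square metre (assumed)
    fracs = np.array([disconnected_fraction(lam) for _ in range(TRIALS)])
    p_dev = max(np.mean(np.abs(fracs - fracs.mean()) >= EPS), 1.0 / TRIALS)
    # Under a large-deviation principle, p_dev ~ exp(-lam * AREA * I(EPS)), so the
    # rate estimate below should settle to a positive constant as lam grows.
    print(f"density {lam:.0e}: mean fraction {fracs.mean():.2f}, "
          f"P(|deviation| >= {EPS}) ~ {p_dev:.1e}, "
          f"rate {-np.log(p_dev) / (lam * AREA):.3f}")
```

    The deviation probability quickly drops below what a fixed number of trials can resolve, which is exactly the exponential concentration the abstract reports; estimating the tail accurately at high densities would require rare-event techniques rather than plain Monte Carlo.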

    Role of Interference and Computational Complexity in Modern Wireless Networks: Analysis, Optimization, and Design

    Owing to the popularity of smartphones, the recent widespread adoption of wireless broadband has resulted in tremendous growth in the volume of mobile data traffic, and this growth is projected to continue unabated. In order to meet the needs of future systems, several novel technologies have been proposed, including cooperative communications, cloud radio access networks (RANs) and very densely deployed small-cell networks. For these novel networks, both interference and the limited availability of computational resources play a very important role. Therefore, the accurate modeling and analysis of interference and computation is essential to the understanding of these networks and an enabler of more efficient design.

    This dissertation focuses on four aspects of modern wireless networks: (1) modeling and analysis of interference in single-hop wireless networks; (2) characterizing the tradeoffs between the communication performance of wireless transmissions and the computational load on the systems used to process them; (3) the optimization of wireless multiple-access networks using cost functions based on the analytical findings in this dissertation; and (4) the analysis and optimization of multi-hop networks, which may optionally employ forms of cooperative communication.

    The study of interference in single-hop wireless networks proceeds by assuming that the random locations of the interferers are drawn from a point process, possibly constrained to a finite area. Both the information-bearing and interfering signals propagate over channels that are subject to path loss, shadowing, and fading. A flexible model for fading, based on the Nakagami distribution, is used, though specific examples are provided for Rayleigh fading. The analysis is broken down into multiple steps, successively averaging the performance metrics over the fading, the shadowing, and the locations of the interferers, in order to distinguish the effects of these mechanisms, which operate over different time scales. The analysis is extended to accommodate diversity reception, which is important for understanding cooperative systems that combine transmissions originating from different locations. Furthermore, the role of spatial correlation is considered, which provides insight into how the performance in one location is related to the performance in another.

    While it is now generally understood how to communicate close to the fundamental limits implied by information theory, operating close to those bounds is costly in terms of the computational complexity required to receive the signal. This dissertation provides a framework for understanding the tradeoff between communication performance and the complexity required to operate close to the performance bounds, and it allows the data-processing resources a network requires under a given performance constraint to be estimated accurately. The framework is applied to Cloud-RAN, a new cellular architecture that moves the bulk of the signal processing away from the base stations (BSs) and towards a centralized computing cloud. The analysis developed in this part of the dissertation helps to illuminate the benefits of pooling computing assets when decoding multiple uplink signals in the cloud. Building upon these results, new approaches for wireless resource allocation are proposed which, unlike previous approaches, are aware of the computing limitations of the network.

    By leveraging the accurate expressions that characterize performance in the presence of interference and fading, a methodology is described for optimizing wireless multiple-access networks. The focus is on frequency hopping (FH) systems, which are already widely used in military systems and are becoming more common in commercial systems. The optimization determines the best combination of modulation parameters (such as the modulation index for continuous-phase frequency-shift keying), number of hopping channels, and code rate. In addition, it accounts for adjacent-channel interference (ACI) and determines how much of the signal spectrum should lie within the operating band of each channel and how much can be allowed to splatter into adjacent channels.

    The last part of this dissertation considers networks that involve multi-hop communications. Building on the analytical framework developed in earlier parts of the dissertation, the performance of such networks is analyzed in the presence of interference and fading, and a novel paradigm for the rapid performance assessment of routing protocols is introduced. Such networks may involve cooperative communications, and the particular cooperative protocol studied here allows the same packet to be transmitted simultaneously by multiple transmitters and diversity-combined at the receiver. The dynamics of how the cooperative protocol evolves over time are described through an absorbing Markov chain, and the analysis efficiently captures the interference that arises as packets are periodically injected into the network by a common source, as well as the temporal correlation among these packets and their interdependence.
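    As a concrete companion to the single-hop interference analysis summarised above, the sketch below estimates an outage probability by plain Monte Carlo under stated assumptions: interferers drawn from a Poisson point process on a finite disk, log-normal shadowing, and Nakagami-m fading modelled as a Gamma-distributed power gain (m = 1 recovers Rayleigh). The dissertation derives such quantities by averaging analytically over fading, shadowing, and interferer locations in stages; here everything is simply simulated, and all parameter values are illustrative rather than taken from the dissertation.

```python
# Monte Carlo sketch of a single-hop link with Poisson-distributed interferers on a
# finite disk, log-normal shadowing, and Nakagami-m fading (Gamma power gains).
# All parameter values are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)

M = 2.0            # Nakagami shape parameter m (m = 1 gives Rayleigh fading)
ALPHA = 3.8        # path-loss exponent (assumed)
SIGMA_DB = 6.0     # shadowing standard deviation in dB (assumed)
LAM = 2e-5         # interferer density per square metre (assumed)
RADIUS = 500.0     # radius of the finite network region (assumed)
D0 = 60.0          # desired transmitter-receiver distance in metres (assumed)
SNR_DB = 20.0      # mean SNR of the desired link without fading/shadowing (assumed)
BETA_DB = 3.0      # SINR outage threshold (assumed)

def outage_probability(trials=20_000):
    beta = 10 ** (BETA_DB / 10)
    noise = D0 ** (-ALPHA) / 10 ** (SNR_DB / 10)
    sigma = SIGMA_DB * np.log(10) / 10            # dB -> natural-log scale
    outages = 0
    for _ in range(trials):
        # Desired signal: Nakagami-m fading (Gamma(m, 1/m) power gain) and shadowing.
        s = rng.gamma(M, 1 / M) * rng.lognormal(0.0, sigma) * D0 ** (-ALPHA)
        # Interference: Poisson number of interferers, uniformly placed on the disk.
        k = rng.poisson(LAM * np.pi * RADIUS ** 2)
        if k:
            r = np.maximum(RADIUS * np.sqrt(rng.random(k)), 1.0)
            i = np.sum(rng.gamma(M, 1 / M, k) * rng.lognormal(0.0, sigma, k)
                       * r ** (-ALPHA))
        else:
            i = 0.0
        outages += (s / (noise + i)) < beta
    return outages / trials

print(f"estimated outage probability: {outage_probability():.3f}")
```

    Averaging over fading alone (with locations and shadowing held fixed) would reproduce the conditional outage probability that the staged analysis treats first; the fully averaged value printed here corresponds to the final averaging over interferer locations.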
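    The multi-hop part of the abstract describes an absorbing Markov chain model of the cooperative protocol. The toy example below shows the general technique on a simplified three-hop route, not the dissertation's actual protocol: transient states track which node currently holds the packet, absorbing states represent delivery or a dropped packet, and the fundamental matrix N = (I - Q)^(-1) gives the expected number of time slots to absorption together with the absorption probabilities. The per-hop success and drop probabilities are hypothetical placeholders for outage results such as those computed in the interference sketch.

```python
# Toy absorbing-Markov-chain analysis of a simplified 3-hop relaying route.
# States 0..2: packet held by node i (transient); state 3: delivered; state 4: dropped.
import numpy as np

p_hop = [0.9, 0.8, 0.7]   # per-hop success probabilities (assumed placeholders)
p_drop = 0.05             # probability the packet is dropped in any slot (assumed)

n_t = len(p_hop)
P = np.zeros((n_t + 2, n_t + 2))
for i, p in enumerate(p_hop):
    nxt = i + 1 if i + 1 < n_t else n_t      # last hop succeeds into "delivered"
    P[i, nxt] = (1 - p_drop) * p             # hop succeeds
    P[i, i] = (1 - p_drop) * (1 - p)         # hop fails, retry next slot
    P[i, n_t + 1] = p_drop                   # packet dropped
P[n_t, n_t] = 1.0                            # delivered is absorbing
P[n_t + 1, n_t + 1] = 1.0                    # dropped is absorbing

Q = P[:n_t, :n_t]                            # transient-to-transient block
R = P[:n_t, n_t:]                            # transient-to-absorbing block
N = np.linalg.inv(np.eye(n_t) - Q)           # fundamental matrix

expected_slots = N.sum(axis=1)[0]            # expected slots starting from the source
absorption = (N @ R)[0]                      # absorption probabilities from the source
print(f"expected slots to absorption: {expected_slots:.2f}")
print(f"P(delivered) = {absorption[0]:.3f}, P(dropped) = {absorption[1]:.3f}")
```

    The dissertation's chain is richer (states capture which set of cooperating transmitters holds the packet, and transition probabilities come from the interference analysis), but the fundamental-matrix machinery used to extract delay and delivery statistics is the same.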