An Accurate Approximation of Resource Request Distributions in Millimeter Wave 3GPP New Radio Systems
The recently standardized millimeter wave-based 3GPP New Radio technology is expected to become an enabler for both enhanced Mobile Broadband (eMBB) and ultra-reliable low latency communication (URLLC) services specified for future 5G systems. One of the first steps in the mathematical modeling of such systems is the characterization of the session resource request probability mass function (pmf) as a function of the channel conditions, cell size, application demands, user location, and system parameters, including the modulation and coding schemes employed at the air interface. Unfortunately, this pmf cannot be expressed via elementary functions. In this paper, we develop an accurate approximation of the sought pmf. First, we show that the Normal distribution provides a fairly accurate approximation to the cumulative distribution function (CDF) of the signal-to-noise ratio for communication systems operating in the millimeter-wave frequency band, which in turn allows evaluating the resource request pmf via the error function. We also investigate the impact of shadow fading on the resource request pmf.
Comment: The 19th International Conference on Next Generation Wired/Wireless Networks and Systems (NEW2AN 2019)
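To make the approximation concrete, here is a minimal Python sketch of the construction the abstract describes: the SNR in dB (with shadow fading) is treated as Normal, and the resource request pmf is obtained by integrating the Normal CDF between MCS switching thresholds via the error function. The thresholds and per-region resource demands below are illustrative placeholders, not values from the paper.

```python
from math import erf, sqrt

def normal_cdf(x, mu, sigma):
    # CDF of Normal(mu, sigma^2), expressed via the error function
    return 0.5 * (1.0 + erf((x - mu) / (sigma * sqrt(2.0))))

# Hypothetical SNR thresholds (dB) separating MCS regions, and the
# number of resource units a fixed-rate session requests in each region.
SNR_EDGES = [float("-inf"), 0.0, 5.0, 10.0, 15.0, float("inf")]
UNITS_NEEDED = [8, 6, 4, 2, 1]  # worse channel -> more resources

def resource_request_pmf(mu_snr_db, sigma_snr_db):
    # Approximate pmf of the session resource request, assuming the
    # SNR in dB is Normal(mu, sigma^2) as the paper argues.
    pmf = {}
    for i, units in enumerate(UNITS_NEEDED):
        p = (normal_cdf(SNR_EDGES[i + 1], mu_snr_db, sigma_snr_db)
             - normal_cdf(SNR_EDGES[i], mu_snr_db, sigma_snr_db))
        pmf[units] = pmf.get(units, 0.0) + p
    return pmf

print(resource_request_pmf(mu_snr_db=7.0, sigma_snr_db=4.0))
```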
Fast Cell Discovery in mm-wave 5G Networks with Context Information
The exploitation of mm-wave bands is one of the key enablers for 5G mobile
radio networks. However, the introduction of mm-wave technologies in cellular
networks is not straightforward due to harsh propagation conditions that limit
the mm-wave access availability. Mm-wave technologies require high-gain antenna
systems to compensate for high path loss and limited power. As a consequence,
directional transmissions must be used for cell discovery and synchronization
processes: this can lead to a non-negligible access delay caused by the
exploration of the cell area with multiple transmissions along different
directions.
Integrating mm-wave technologies with conventional wireless access networks to speed up the cell search process requires new 5G network architectural solutions. Such architectures introduce a functional
split between C-plane and U-plane, thereby guaranteeing the availability of a
reliable signaling channel through conventional wireless technologies that
provides the opportunity to collect useful context information from the network
edge.
In this article, we leverage the context information related to user
positions to improve the directional cell discovery process. We investigate
fundamental trade-offs of this process and the effects of the context
information accuracy on the overall system performance. We also cope with
obstacle obstructions in the cell area and propose an approach based on a
geo-located context database where information gathered over time is stored to
guide future searches. Analytic models and numerical results are provided to
validate the proposed strategies.
Comment: 14 pages, submitted to IEEE Transactions on Mobile Computing
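As a rough illustration of the trade-off the article studies, the following Python sketch compares the mean number of probed directions under an exhaustive directional sweep against a search that starts from a context-suggested beam and expands outward. The beam count, probe order, and Gaussian position-error model are assumptions for illustration, not the article's analytic model.

```python
import numpy as np

rng = np.random.default_rng(0)
N_BEAMS = 32          # directions the BS must explore in the worst case
POS_ERROR_STD = 2.0   # context position error, in units of beam widths

def mean_discovery_delay(context_aided, trials=100_000):
    # Mean number of directions probed before the user's beam is found.
    if context_aided:
        # Offset between the context-suggested beam and the true one.
        err = np.rint(rng.normal(0.0, POS_ERROR_STD, trials)).astype(int)
        # Probe order: suggested, +1, -1, +2, -2, ...
        delay = np.where(err == 0, 1,
                         np.where(err > 0, 2 * err, 2 * (-err) + 1))
        delay = np.minimum(delay, N_BEAMS)
    else:
        # Exhaustive sweep: the true beam is uniform in the sweep order.
        delay = rng.integers(1, N_BEAMS + 1, size=trials)
    return delay.mean()

print("exhaustive sweep:", mean_discovery_delay(False))
print("context-aided:   ", mean_discovery_delay(True))
```

Under these assumptions the context-aided search needs only a handful of probes on average, while the exhaustive sweep needs about half the beam count.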
End-to-End Simulation of 5G mmWave Networks
Due to its potential for multi-gigabit and low latency wireless links,
millimeter wave (mmWave) technology is expected to play a central role in 5th
generation cellular systems. While there has been considerable progress in
understanding the mmWave physical layer, innovations will be required at all
layers of the protocol stack, in both the access and the core network.
Discrete-event network simulation is essential for end-to-end, cross-layer
research and development. This paper provides a tutorial on a recently
developed full-stack mmWave module integrated into the widely used open-source ns-3 simulator. The module includes a number of detailed statistical channel
models as well as the ability to incorporate real measurements or ray-tracing
data. The Physical (PHY) and Medium Access Control (MAC) layers are modular and
highly customizable, making it easy to integrate algorithms or compare
Orthogonal Frequency Division Multiplexing (OFDM) numerologies, for example.
The module is interfaced with the core network of the ns-3 Long Term Evolution
(LTE) module for full-stack simulations of end-to-end connectivity, and
advanced architectural features, such as dual-connectivity, are also available.
To facilitate the understanding of the module, and verify its correct
functioning, we provide several examples that show the performance of the
custom mmWave stack as well as custom congestion control algorithms designed
specifically for efficient utilization of the mmWave channel.Comment: 25 pages, 16 figures, submitted to IEEE Communications Surveys and
Tutorials (revised Jan. 2018
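As a small numeric companion (not ns-3 code), the following Python snippet computes the 3GPP NR OFDM numerology parameters that a simulator like this lets one compare; μ = 2 or 3 are typical mmWave configurations.

```python
# 3GPP NR numerology mu: subcarrier spacing 15 * 2^mu kHz and
# 14-symbol slots of duration 1 / 2^mu ms (normal cyclic prefix).
for mu in range(5):
    scs_khz = 15 * 2 ** mu
    slot_ms = 1.0 / 2 ** mu
    symbol_us = slot_ms * 1000.0 / 14  # avg symbol incl. cyclic prefix
    print(f"mu={mu}: SCS={scs_khz:3d} kHz, slot={slot_ms:.4f} ms, "
          f"symbol~{symbol_us:.2f} us")
```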
Resource Queuing System with Preemptive Priority for Performance Analysis of 5G NR Systems
One way to enable the smooth coexistence of ultra-reliable low latency communication (URLLC) and enhanced Mobile Broadband (eMBB) services at the air interface of the prospective 5G New Radio (NR) technology is to utilize a preemptive priority service discipline. In this paper, we provide an approximate analysis of a queuing system with random resource requirements, two types of customers, and a preemptive priority service procedure. The distinctive feature of the system – the random resource requirements – makes it possible to capture the essentials of the 5G NR radio interface but inherently increases the complexity of the analysis. We present the main performance metrics of interest, including the session drop probability and system resource utilization, and assess their accuracy by comparison with computer simulations. The developed model is not inherently limited to URLLC and eMBB coexistence and can be utilized in the performance evaluation of 5G NR systems with a priority-based service discipline at the air interface, e.g., in the context of network slicing. Among other conclusions, we explicitly show that both the session drop and interruption probabilities of low-priority traffic heavily depend not only on the intensity of high-priority traffic but also on the stochastic characteristics of the resource request distribution.
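A Monte Carlo model of this kind of system can serve as the "computer simulations" baseline the paper compares against. The Python sketch below simulates two Poisson session classes sharing a pool of resource units, where each session draws a random resource requirement and high-priority arrivals may preempt low-priority sessions; the rates, pool size, and request distribution are illustrative assumptions, not the paper's parameters.

```python
import heapq
import random

random.seed(1)
R = 50                           # resource units at the air interface
LAM = {"hi": 0.5, "lo": 1.0}     # arrival rates per class
MU = 0.2                         # service rate (exponential holding time)
REQ = [1, 2, 3, 4]               # possible resource requests (uniform)

def simulate(t_end=200_000.0):
    free, now, sid = R, 0.0, 0
    active = {}                  # session id -> (class, demand)
    events = []                  # (time, kind, payload) min-heap
    for cls in ("hi", "lo"):
        heapq.heappush(events, (random.expovariate(LAM[cls]), "arr", cls))
    arrived = {"hi": 0, "lo": 0}
    dropped = {"hi": 0, "lo": 0}
    preempted, busy_area = 0, 0.0
    while events:
        t, kind, payload = heapq.heappop(events)
        if t > t_end:
            break
        busy_area += (R - free) * (t - now)   # occupied-resource area
        now = t
        if kind == "dep":
            if payload in active:             # may have been preempted
                free += active.pop(payload)[1]
            continue
        cls = payload
        arrived[cls] += 1
        heapq.heappush(events,
                       (now + random.expovariate(LAM[cls]), "arr", cls))
        d = random.choice(REQ)
        if cls == "hi" and free < d:
            # Preempt low-priority sessions only if that frees enough.
            lows = [s for s, (c, _) in active.items() if c == "lo"]
            if free + sum(active[s][1] for s in lows) >= d:
                while free < d:
                    free += active.pop(lows.pop())[1]
                    preempted += 1
        if free >= d:
            free -= d
            active[sid] = (cls, d)
            heapq.heappush(events,
                           (now + random.expovariate(MU), "dep", sid))
            sid += 1
        else:
            dropped[cls] += 1
    for cls in ("hi", "lo"):
        print(f"{cls}: session drop probability = "
              f"{dropped[cls] / arrived[cls]:.4f}")
    admitted_lo = max(arrived["lo"] - dropped["lo"], 1)
    print(f"lo: interruption probability = {preempted / admitted_lo:.4f}")
    print(f"mean resource utilization = {busy_area / (now * R):.4f}")

simulate()
```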
On the Temporal Effects of Mobile Blockers in Urban Millimeter-Wave Cellular Scenarios
Millimeter-wave (mmWave) propagation is known to be severely affected by the
blockage of the line-of-sight (LoS) path. In contrast to microwave systems, at
shorter mmWave wavelengths such blockage can be caused by human bodies, whose mobility within the environment makes the wireless channel alternate between the blocked and non-blocked LoS states. Following the recent 3GPP requirements on
modeling the dynamic blockage as well as the temporal consistency of the
channel at mmWave frequencies, this paper develops a new model for predicting the state of a user in the presence of mobile blockers in representative 3GPP scenarios: the urban micro cell (UMi) street canyon and the park/stadium/square. It is demonstrated that the blockage effects produce an alternating renewal process with exponentially distributed non-blocked intervals and blocked durations that follow a general distribution. The following metrics are derived: (i) the mean and the fraction of time spent in
blocked/non-blocked state, (ii) the residual blocked/non-blocked time, and
(iii) the time-dependent conditional probability of having blockage/no blockage
at time t1 given that there was blockage/no blockage at time t0. The latter is
a function of the arrival rate (intensity), width, and height of moving
blockers, distance to the mmWave access point (AP), as well as the heights of
the AP and the user device. The proposed model can be used for system-level
characterization of mmWave cellular communication systems. For example, the
optimal height and the maximum coverage radius of the mmWave APs are derived,
while satisfying the required mean data rate constraint. The system-level
simulations corroborate that the use of the proposed method considerably
reduces the modeling complexity.
Comment: Accepted, IEEE Transactions on Vehicular Technology
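For the special case in which both the blocked and non-blocked durations are exponential (the paper's model allows generally distributed blocked durations), metric (iii) reduces to the familiar two-state Markov form, as the following Python sketch with illustrative rates shows.

```python
# Two-state special case of the alternating renewal blockage model;
# alpha and beta are illustrative assumptions, not the paper's values.
import numpy as np

alpha = 0.2   # rate of leaving the non-blocked state (blocker arrivals)
beta = 1.0    # rate of leaving the blocked state (mean block = 1 s)

# Long-run fraction of time blocked (renewal-reward argument):
pi_blocked = (1.0 / beta) / (1.0 / beta + 1.0 / alpha)
print(f"fraction of time blocked: {pi_blocked:.4f}")

def p_blocked_given_blocked(t):
    """P(blocked at time t | blocked at time 0), two-state chain."""
    return pi_blocked + (1.0 - pi_blocked) * np.exp(-(alpha + beta) * t)

for t in (0.0, 0.5, 2.0, 10.0):
    print(f"t={t:5.1f} s: P = {p_blocked_given_blocked(t):.4f}")
```

As t grows, the conditional probability decays from 1 to the stationary fraction pi_blocked, which is the temporal-consistency behavior the paper models.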
Improving next-generation wireless network performance and reliability with deep learning
A rudimentary question is often raised: can machine learning in general, or deep learning in particular, add to the well-established field of wireless communications, which has been evolving for close to a century? While the use of deep learning based methods is likely to help build intelligent wireless solutions, this use becomes particularly challenging for the lower layers of the wireless communication stack. The introduction of the fifth generation of wireless communications (5G) has triggered the demand for "network intelligence" to support its promises of very high data rates and extremely low latency. Consequently, 5G wireless operators are faced with the challenges of network complexity, diversification of services, and personalized user experience. Industry standards have created enablers (such as the network data analytics function), but these enablers focus on post-mortem analysis at higher stack layers and operate with a periodicity on the time scale of seconds (or larger). The goal of this dissertation is to show a solution for these challenges and how a data-driven approach using deep learning can add to the field of wireless communications. In particular, I propose intelligent predictive and prescriptive abilities to boost reliability and eliminate performance bottlenecks in 5G cellular networks and beyond, present contributions that justify the value of deep learning in wireless communications across several different layers, and offer in-depth analysis and comparisons with baselines and industry standards.
First, to improve multi-antenna network reliability against wireless impairments with power control and interference coordination for both packetized voice and beamformed data bearers, I propose a joint beamforming, power control, and interference coordination algorithm based on deep reinforcement learning. This algorithm uses a string of bits and logic operations to enable simultaneous actions to be performed by the reinforcement learning agent; consequently, a joint reward function is also proposed. I compare the performance of the proposed algorithm with a brute-force approach and show that similar performance is achievable but with a faster run time as the number of transmit antennas increases.
Second, to enhance the performance of coordinated multipoint, I propose the use of deep learning binary classification to learn a surrogate function that triggers a second transmission stream, instead of depending on the popular signal to interference plus noise measurement quantity. This surrogate function improves the users' sum rate by focusing on the pre-logarithmic terms in the sum-rate formula, which have a larger impact on this rate.
Third, the performance of band switching can be improved without the need for full channel estimation. Using deep learning to classify the quality of two frequency bands prior to granting a band switch leads to a significant improvement in users' throughput. This is due to the elimination of the industry-standard measurement gap requirement: a period of silence during which no data is sent to the users so that they can measure the frequency bands before switching.
In summary, this dissertation proposes a group of algorithms for downlink wireless network performance and reliability. My results show that introducing user coordinates enhances the accuracy of the predictions made with deep learning. Also, the choice of the signal to interference plus noise ratio as the optimization objective may not always be the best one for improving user throughput rates. Further, exploiting the spatial correlation of channels in different frequency bands can improve certain network procedures without perfect knowledge of the per-band channel state information. Hence, an understanding of these results helps develop novel solutions that enhance these wireless networks at a much smaller time scale than today's industry standards.
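As a toy illustration of the third contribution, the sketch below trains a binary classifier to predict whether a target frequency band is good enough using only features available on the serving band, which is what removes the measurement gap. The synthetic data, features, and small multilayer perceptron are assumptions for illustration, not the dissertation's dataset or architecture.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 5000
# Features observable on the serving band: its SNR (dB) and the
# user's distance to the site (m).
snr_serving = rng.normal(10, 5, n)
distance = rng.uniform(10, 300, n)
X = np.column_stack([snr_serving, distance])
# Spatially correlated bands: the target band tends to be good when
# the serving band is strong and the user is close (plus noise).
score = 0.4 * snr_serving - 0.02 * distance + rng.normal(0, 2, n)
y = (score > 0).astype(int)   # 1 = band switch is worthwhile

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=500,
                    random_state=0).fit(X_tr, y_tr)
print(f"band-quality prediction accuracy: {clf.score(X_te, y_te):.3f}")
```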
A Tutorial on Mathematical Modeling of 5G/6G Millimeter Wave and Terahertz Cellular Systems
Millimeter wave (mmWave) and terahertz (THz) radio access technologies (RAT) are expected to become a critical part of the future cellular ecosystem, providing an abundant amount of bandwidth in areas with high traffic demands. However, the extremely directional antenna radiation patterns that must be utilized at both the transmit and receive sides of a link to overcome severe path losses, the dynamic blockage of propagation paths by large static and small dynamic objects, and the macro- and micromobility of user equipment (UE) make provisioning of reliable service over THz/mmWave RATs an extremely complex task. This challenge is further complicated by the type of applications envisioned for these systems, which inherently require guaranteed bitrates at the air interface. This tutorial aims to introduce a versatile mathematical methodology for assessing performance reliability improvement algorithms for mmWave and THz systems. Our methodology accounts for both radio interface specifics and the service process of sessions at mmWave/THz base stations (BS), and is capable of evaluating the performance of systems with multiconnectivity operation, resource reservation mechanisms, and priorities between multiple traffic types with different service requirements. The framework is logically separated into two parts: (i) a parameterization part that abstracts the specifics of deployment and radio mechanisms, and (ii) a queuing part that accounts for the details of the service process at mmWave/THz BSs. The modular, decoupled structure of the framework allows for further extensions to advanced service mechanisms in prospective mmWave/THz cellular deployments while keeping the complexity manageable, thus making it attractive for system analysts.
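The decoupling described above can be illustrated with a compact Python sketch: a parameterization step abstracts deployment specifics into per-class offered loads and resource demands, and a queuing step turns them into blocking probabilities. The queuing step here uses the classical Kaufman-Roberts recursion for a multi-rate loss system; the numbers are illustrative assumptions rather than values from the tutorial.

```python
import numpy as np

C = 100  # resource units at the BS (e.g., PRBs)

# -- Parameterization part (abstracted): each class is a resource
#    demand with an offered load in erlangs, e.g. derived from the
#    SNR/geometry machinery the tutorial describes.
demands = [1, 2, 4, 8]           # units requested per session
loads = [20.0, 10.0, 5.0, 2.0]   # offered load per class, erlangs

# -- Queuing part: Kaufman-Roberts occupancy recursion.
q = np.zeros(C + 1)
q[0] = 1.0
for x in range(1, C + 1):
    q[x] = sum(rho * b * q[x - b]
               for rho, b in zip(loads, demands) if x >= b) / x
q /= q.sum()

for rho, b in zip(loads, demands):
    blocking = q[C - b + 1:].sum()   # fewer than b units free
    print(f"demand {b}: blocking probability = {blocking:.4f}")
```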