Millimeter Wave Cellular Networks: A MAC Layer Perspective
The millimeter wave (mmWave) frequency band is seen as a key enabler of
multi-gigabit wireless access in future cellular networks. In order to overcome
the propagation challenges, mmWave systems use a large number of antenna
elements both at the base station and at the user equipment, which lead to high
directivity gains, fully-directional communications, and possible noise-limited
operations. The fundamental differences between mmWave networks and traditional
ones challenge the classical design constraints, objectives, and available
degrees of freedom. This paper addresses the implications that highly
directional communication has on the design of an efficient medium access
control (MAC) layer. The paper discusses key MAC layer issues, such as
synchronization, random access, handover, channelization, interference
management, scheduling, and association. The paper provides an integrated view
on MAC layer issues for cellular networks, identifies new challenges and
tradeoffs, and provides novel insights and solution approaches.
Comment: 21 pages, 9 figures, 2 tables; to appear in IEEE Transactions on Communications.
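The directivity gains from large antenna arrays mentioned above can be illustrated with a small sketch. The following computes the normalized power pattern of an idealized uniform linear array (isotropic elements, half-wavelength spacing, no mutual coupling); it is a generic textbook model, not a method from the paper, and all parameter values are illustrative.

```python
import cmath
import math

def ula_pattern_db(n_elements, theta_deg, steer_deg, spacing_wl=0.5):
    """Normalized power pattern (dB) of an ideal uniform linear array
    steered to steer_deg; 0 dB at the peak of the steered beam."""
    psi = 2 * math.pi * spacing_wl * (
        math.sin(math.radians(theta_deg)) - math.sin(math.radians(steer_deg)))
    # Array factor: coherent sum of per-element phase shifts
    af = sum(cmath.exp(1j * n * psi) for n in range(n_elements))
    return 20 * math.log10(abs(af) / n_elements + 1e-12)

# A 64-element array: full gain on the steered beam, strong off-beam rejection
print(round(ula_pattern_db(64, 30, 30), 1))   # 0.0  (peak of the beam)
print(ula_pattern_db(64, 45, 30) < -20)       # True (off-beam suppression)
```

The sharp roll-off away from the steered direction is what makes the fully-directional, possibly noise-limited operation described in the abstract plausible: off-beam interferers are attenuated by tens of dB.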
LTE and Millimeter Waves for V2I Communications: an End-to-End Performance Comparison
The Long Term Evolution (LTE) standard enables, besides cellular
connectivity, basic automotive services to promote road safety through
vehicle-to-infrastructure (V2I) communications. Nevertheless, stakeholders and
research institutions, driven by the ambitious technological advances expected
from fully autonomous and intelligent transportation systems, have recently
investigated new radio technologies as a means to support vehicular
applications. In particular, the millimeter wave (mmWave) spectrum holds great
promise because of the large available bandwidth that may provide the required
link capacity. Communications at high frequencies, however, suffer from severe
propagation and absorption loss, which may cause communication disconnections
especially considering high mobility scenarios. It is therefore important to
validate, through simulations, the actual feasibility of establishing V2I
communications in the above-6 GHz bands. Following this rationale, in this
paper we provide the first comparative end-to-end evaluation of the performance
of the LTE and mmWave technologies in a vehicular scenario. The simulation
framework includes detailed measurement-based channel models as well as the
full details of the MAC, RLC, and transport protocols. Our results show that,
although LTE still represents a promising access solution that guarantees
robust and fair connections, mmWave can satisfy the extreme throughput demands
foreseen for most emerging automotive applications.
Comment: 7 pages, 5 figures, 2 tables. Accepted to VTC-Spring 2019, Workshop on High Mobility Wireless Communications (HMWC).
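The severe propagation loss at high frequencies noted above already shows up in the free-space term alone. A minimal sketch using the Friis free-space path-loss formula (the 2 GHz and 28 GHz carrier frequencies are illustrative assumptions, not values from the paper):

```python
import math

def fspl_db(distance_m, freq_hz):
    """Friis free-space path loss in dB: 20*log10(4*pi*d*f/c)."""
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / 3e8)

# Extra free-space loss of a 28 GHz mmWave link vs. a 2 GHz LTE-like link
# at the same distance; depends only on the frequency ratio:
gap_db = fspl_db(100, 28e9) - fspl_db(100, 2e9)
print(round(gap_db, 1))  # 22.9
```

Roughly 23 dB of additional loss before blockage and absorption are even considered, which is why mmWave V2I links rely on the beamforming gain of large arrays and why feasibility must be validated end to end, as the paper does.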
Low-latency Networking: Where Latency Lurks and How to Tame It
While the current generation of mobile and fixed communication networks has
been standardized for mobile broadband services, the next generation is driven
by the vision of the Internet of Things and mission-critical communication
services requiring latencies on the order of milliseconds or below.
However, these new stringent requirements have a large technical impact on the
design of all layers of the communication protocol stack. The cross layer
interactions are complex due to the multiple design principles and technologies
that contribute to the layers' design and fundamental performance limitations.
We will be able to develop low-latency networks only if we address the problem
of these complex interactions from the new vantage point of sub-millisecond
latency. In this article, we propose a holistic analysis and classification of
the main design principles and enabling technologies that will make it possible
to deploy low-latency wireless communication networks. We argue that these
design principles and enabling technologies must be carefully orchestrated to
meet the stringent requirements and to manage the inherent trade-offs between
low latency and traditional performance metrics. We also review currently
ongoing standardization activities in prominent standards associations, and
discuss open problems for future research.
RIS-Assisted Coverage Enhancement in Millimeter-Wave Cellular Networks
The use of millimeter-wave (mmWave) bandwidth is a key enabler of the high
data rates targeted in fifth-generation (5G) cellular systems. However,
mmWave signals suffer from significant path loss and high sensitivity to
blockages, limiting their adoption to small-scale deployments.
To enhance the coverage of mmWave communication in 5G and beyond, it is
promising to deploy a large number of reconfigurable intelligent surfaces
(RISs) that passively reflect mmWave signals towards desired directions. With
this motivation, in this work we study the coverage of an RIS-assisted
large-scale mmWave cellular network using stochastic geometry, and derive the
peak reflection power expression of an RIS and the downlink
signal-to-interference ratio (SIR) coverage expression in closed form. These
analytic results clarify the effectiveness of deploying RISs for mmWave SIR
coverage enhancement, while unveiling the major role of the density ratio
between active base stations (BSs) and passive RISs. Furthermore, the results
show that deploying passive reflectors is as effective for mmWave coverage
enhancement as equipping BSs with more active antennas. Simulation results
confirm the tightness of the closed-form expressions, corroborating our major
findings based on the derived expressions.
Comment: Accepted in IEEE Access.
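The stochastic-geometry setting used in this paper can be illustrated with a minimal Monte Carlo baseline: base stations drawn as a Poisson point process, nearest-BS association, Rayleigh fading, and power-law path loss. This sketch deliberately omits RISs and blockage, and every parameter value (density, path-loss exponent, window size) is an assumption for illustration, not a value from the paper.

```python
import math
import random

def _poisson(rng, lam):
    # Knuth's method; adequate for the small means used here
    limit, k, p = math.exp(-lam), 0, 1.0
    while p > limit:
        k += 1
        p *= rng.random()
    return k - 1

def sir_coverage(threshold_db, density=1e-5, alpha=4.0,
                 radius=1000.0, trials=2000, seed=7):
    """Monte Carlo estimate of P(SIR > threshold) for a user at the
    origin: BSs form a Poisson point process in a disk, the nearest BS
    serves, all others interfere; Rayleigh fading, path loss r**-alpha."""
    rng = random.Random(seed)
    thr = 10 ** (threshold_db / 10)
    hits = done = 0
    for _ in range(trials):
        n = _poisson(rng, density * math.pi * radius ** 2)
        if n == 0:
            continue  # empty realization, no serving BS
        # Only distances matter for the SIR, so sample them directly
        dists = sorted(radius * math.sqrt(rng.random()) for _ in range(n))
        powers = [-math.log(rng.random()) * d ** -alpha for d in dists]
        signal = powers[0]              # nearest BS serves the user
        interference = sum(powers[1:])  # all other BSs interfere
        done += 1
        if interference == 0 or signal / interference > thr:
            hits += 1
    return hits / done
```

As a sanity check, the classical interference-limited result for a Rayleigh/PPP network with alpha = 4 predicts coverage of roughly 0.56 at a 0 dB threshold, which this estimator reproduces to within Monte Carlo and edge-effect error; closed-form expressions like those derived in the paper replace such simulations entirely.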
Facilitating Internet of Things on the Edge
The evolution of electronics and wireless technologies has entered a new era, the Internet of Things (IoT). Presently, IoT technologies influence the global market, bringing benefits in many areas, including healthcare, manufacturing, transportation, and entertainment.
Modern IoT devices serve as thin clients, with data processing performed in a remote computing node such as a cloud server or a mobile edge compute unit. These computing units possess significant resources that enable prompt data processing. The user experience of such an approach depends heavily on the availability and quality of the internet connection: if the connection is unavailable, the operation of IoT applications can be completely disrupted. It is worth noting that emerging IoT applications are even more throughput-demanding and latency-sensitive, which makes communication networks a practical bottleneck for service provisioning. This thesis aims to eliminate the limitations of wireless access via the improvement of connectivity and throughput between devices on the edge, as well as their network identification, which is fundamentally important for IoT service management.
The introduction begins with a discussion of emerging IoT applications and their demands. Subsequent chapters introduce the scenarios of interest, describe the proposed solutions, and provide selected performance evaluation results. Specifically, we start with research on the use of degraded memory chips for network identification of IoT devices as an alternative to conventional methods such as IMEI; unlike those, the proposed methods are not vulnerable to tampering and cloning. Further, we introduce our contributions to improving connectivity and throughput among IoT devices on the edge in cases where the mobile network infrastructure is limited or totally unavailable. Finally, we conclude the introduction with a summary of the results achieved.
Stochastic Geometric Coverage Analysis in mmWave Cellular Networks With Realistic Channel and Antenna Radiation Models
Millimeter-wave (mmWave) bands will play an important role in 5G wireless systems. The system performance can be assessed by using models from stochastic geometry that cater for the directivity in the desired signal transmissions as well as the interference, and by calculating the signal-to-interference-plus-noise ratio (SINR) coverage. Nonetheless, the accuracy of the existing coverage expressions derived through stochastic geometry may be questioned, as it is not clear whether they capture the impact of the detailed mmWave channel and antenna features. In this paper, we propose an SINR coverage analysis framework that includes a realistic channel model and antenna element radiation patterns. We introduce and estimate two parameters, the aligned gain and the misaligned gain, associated with the desired signal beam and the interfering signal beams, respectively. The distributions of these gains are used to determine the distribution of the SINR, which is compared with the corresponding SINR coverage calculated through system-level simulations. The results show that both aligned and misaligned gains can be modeled as exponential-logarithmically distributed random variables with the highest accuracy, and can further be approximated as exponentially distributed random variables with reasonable accuracy. These approximations can be used as a tool to evaluate the system-level performance of various 5G connectivity scenarios in the mmWave band.
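The exponential-logarithmic model the paper identifies can be sampled with a short quantile-function sketch. This uses the standard two-parameter (p, beta) exponential-logarithmic distribution; the parameterization and the numeric values below are assumptions for illustration, since the paper's fitted parameters are not reproduced here.

```python
import math

def exp_log_quantile(u, p, beta):
    """Quantile (inverse CDF) of the exponential-logarithmic
    distribution with shape p in (0, 1) and rate beta > 0, whose CDF is
    F(x) = 1 - ln(1 - (1 - p) * exp(-beta * x)) / ln(p).
    Feeding in u ~ Uniform(0, 1) yields exponential-logarithmic samples."""
    return -math.log((1 - p ** (1 - u)) / (1 - p)) / beta

# As p -> 1 the distribution degenerates to Exponential(beta), which is
# consistent with the paper's coarser exponential approximation:
print(abs(exp_log_quantile(0.5, 0.999, 1.0) - math.log(2)) < 0.01)  # True
```

The limiting behavior shown in the last line mirrors the abstract's finding: the exponential-logarithmic fit is the more accurate model, and a plain exponential is a reasonable simplification of it.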