183 research outputs found
MAC Aspects of Millimeter-Wave Cellular Networks
The current demand for extremely high-data-rate wireless services and the spectrum scarcity in the sub-6 GHz bands are strongly motivating the use of millimeter-wave (mmWave) frequencies. MmWave communications are characterized by severe attenuation, a sparse-scattering environment, large bandwidth, high penetration loss, beamforming with massive antenna arrays, and possibly noise-limited operation. These characteristics differ substantially from those of legacy communication technologies, primarily designed for the sub-6 GHz bands, and pose major design challenges at the medium access control (MAC) layer. This book chapter discusses key MAC-layer issues in initial access and mobility management (e.g., synchronization, random access, and handover) as well as in resource allocation (interference management, scheduling, and association). The chapter provides an integrated view of MAC-layer issues for cellular networks and reviews the main challenges, the trade-offs, and the state-of-the-art proposals to address them.
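The initial-access issues the chapter covers typically start with directional beam sweeping: before a user is synchronized, the base station and device must find an aligned beam pair. The following is a minimal, hypothetical sketch of an exhaustive sweep; the beam counts, gain model, and best-pair location are illustrative assumptions, not details from the chapter.

```python
def beam_sweep(gain, tx_beams, rx_beams):
    """Exhaustive initial-access sweep: measure every (tx, rx) beam pair
    and return the strongest one. Real systems shorten this with
    hierarchical or context-aided search to reduce access latency."""
    best_pair, best_gain = None, float("-inf")
    for t in range(tx_beams):
        for r in range(rx_beams):
            g = gain(t, r)
            if g > best_gain:
                best_pair, best_gain = (t, r), g
    return best_pair, best_gain

# Toy deterministic channel: one dominant path aligned with beam pair
# (3, 5); gain drops 2 dB per beam-index step of misalignment.
def toy_gain(t, r):
    return 20.0 - 2.0 * abs(t - 3) - 2.0 * abs(r - 5)

pair, g = beam_sweep(toy_gain, tx_beams=8, rx_beams=8)
print(pair, g)  # -> (3, 5) 20.0
```

An exhaustive sweep costs tx_beams × rx_beams measurements, which is precisely the access-delay trade-off that makes mmWave initial access a MAC-layer design problem.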
Towards Massive Machine Type Communications in Ultra-Dense Cellular IoT Networks: Current Issues and Machine Learning-Assisted Solutions
The ever-increasing number of resource-constrained Machine-Type Communication
(MTC) devices is leading to the critical challenge of fulfilling diverse
communication requirements in dynamic and ultra-dense wireless environments.
Among different application scenarios that the upcoming 5G and beyond cellular
networks are expected to support, such as eMBB, mMTC and URLLC, mMTC brings the
unique technical challenge of supporting a huge number of MTC devices, which is
the main focus of this paper. The related challenges include QoS provisioning,
handling highly dynamic and sporadic MTC traffic, huge signalling overhead and
Radio Access Network (RAN) congestion. In this regard, this paper aims to
identify and analyze the involved technical issues, to review recent advances,
to highlight potential solutions and to propose new research directions. First,
starting with an overview of mMTC features and QoS provisioning issues, we
present the key enablers for mMTC in cellular networks. Along with the
highlights on the inefficiency of the legacy Random Access (RA) procedure in
the mMTC scenario, we then present the key features and channel access
mechanisms in the emerging cellular IoT standards, namely, LTE-M and NB-IoT.
Subsequently, we present a framework for the performance analysis of
transmission scheduling with QoS support, along with the issues involved in
short data packet transmission. Next, we provide a detailed overview of the
existing and emerging solutions for addressing the RAN congestion problem, and
then identify potential advantages, challenges and use cases for the
applications of emerging Machine Learning (ML) techniques in ultra-dense
cellular networks. Among several ML techniques, we focus on the application of
a low-complexity Q-learning approach in mMTC scenarios. Finally, we discuss
some open research challenges and promising future research directions.
Comment: 37 pages, 8 figures, 7 tables, submitted for possible future publication in IEEE Communications Surveys and Tutorials.
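As a rough illustration of the low-complexity Q-learning direction above, the sketch below has a single MTC device learn which of two random-access preamble groups is less congested. The success probabilities, learning rate, and exploration rate are illustrative assumptions, not values from the paper.

```python
import random

random.seed(1)

# Assumed per-group RA success probabilities (group 1 is less congested).
SUCCESS_PROB = {0: 0.3, 1: 0.8}

q = [0.0, 0.0]          # one Q-value per action (single-state problem)
alpha, eps = 0.1, 0.1   # learning rate and exploration probability

for _ in range(5000):
    # epsilon-greedy action selection over the two preamble groups
    a = random.randrange(2) if random.random() < eps else q.index(max(q))
    reward = 1.0 if random.random() < SUCCESS_PROB[a] else 0.0
    q[a] += alpha * (reward - q[a])   # stateless Q-learning update

print(q.index(max(q)))  # index of the group learned to be less congested
```

Because each device only stores a handful of Q-values and performs one multiply-add per attempt, this kind of learner fits the tight memory and energy budgets of resource-constrained MTC devices, which is why the survey highlights it over heavier ML techniques.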
Optimal and Approximation Algorithms for Joint Routing and Scheduling in Millimeter-Wave Cellular Networks
Millimeter-wave (mmWave) communication is a promising technology to cope with
the exponential increase in 5G data traffic.
Such networks typically require a very dense deployment of base stations.
A subset of these, so-called macro base stations, features a high-bandwidth
connection to the core network, while relay base stations are connected
wirelessly.
To reduce cost and increase flexibility, wireless backhauling is needed to
connect both macro to relay as well as relay to relay base stations.
The characteristics of mmWave communication mandate new paradigms for
routing and scheduling.
The paper investigates scheduling algorithms under different interference
models.
To showcase the scheduling methods, we study the maximum-throughput fair
scheduling problem, though the proposed algorithms extend readily to other
problems.
For a full-duplex network under the no-interference model, we propose an
efficient polynomial-time scheduling method, the {\em schedule-oriented
optimization}. Further, we prove that the problem is NP-hard under the
pairwise link interference model or with half-duplex radios.
Fractional weighted coloring based approximation algorithms are proposed for
these NP-hard cases.
Moreover, an approximation algorithm, parallel data stream scheduling, is
proposed for half-duplex networks under the no-interference model.
It achieves a better approximation ratio than the fractional weighted coloring
based algorithms and even attains the optimal solution in the special case of
uniform orthogonal backhaul networks.
Comment: accepted for publication in the IEEE/ACM Transactions on Networking.
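The coloring-based scheduling idea can be sketched with ordinary greedy graph coloring, a simplified integer-slot stand-in for the fractional weighted coloring the paper actually uses. The link topology and conflict set below are illustrative assumptions, not an instance from the paper.

```python
def greedy_slot_assignment(links, conflicts):
    """Assign each link the smallest time slot not used by any conflicting
    link (pairwise interference model): links in the same slot form an
    independent set of the conflict graph, so they can transmit together."""
    slot = {}
    for link in links:
        used = {slot[m] for m in slot if frozenset((link, m)) in conflicts}
        s = 0
        while s in used:
            s += 1
        slot[link] = s
    return slot

# Toy 5-link backhaul: links sharing a base station interfere pairwise.
links = ["A-B", "B-C", "C-D", "A-C", "B-D"]
conflicts = {frozenset(p) for p in [
    ("A-B", "B-C"), ("A-B", "A-C"), ("A-B", "B-D"),
    ("B-C", "C-D"), ("B-C", "A-C"), ("B-C", "B-D"),
    ("C-D", "A-C"), ("C-D", "B-D"),
]}

schedule = greedy_slot_assignment(links, conflicts)
print(schedule)  # -> {'A-B': 0, 'B-C': 1, 'C-D': 0, 'A-C': 2, 'B-D': 2}
```

The fractional version in the paper weights each independent set by the airtime it receives instead of giving every slot equal length, which is what yields the stated approximation guarantees for throughput-fair scheduling.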