Towards Massive Machine Type Communications in Ultra-Dense Cellular IoT Networks: Current Issues and Machine Learning-Assisted Solutions
The ever-increasing number of resource-constrained Machine-Type Communication
(MTC) devices is leading to the critical challenge of fulfilling diverse
communication requirements in dynamic and ultra-dense wireless environments.
Among different application scenarios that the upcoming 5G and beyond cellular
networks are expected to support, such as eMBB, mMTC and URLLC, mMTC brings the
unique technical challenge of supporting a huge number of MTC devices, which is
the main focus of this paper. The related challenges include QoS provisioning,
handling highly dynamic and sporadic MTC traffic, huge signalling overhead and
Radio Access Network (RAN) congestion. In this regard, this paper aims to
identify and analyze the involved technical issues, to review recent advances,
to highlight potential solutions and to propose new research directions. First,
starting with an overview of mMTC features and QoS provisioning issues, we
present the key enablers for mMTC in cellular networks. After highlighting the
inefficiency of the legacy Random Access (RA) procedure in
the mMTC scenario, we then present the key features and channel access
mechanisms in the emerging cellular IoT standards, namely, LTE-M and NB-IoT.
Subsequently, we present a framework for the performance analysis of
transmission scheduling with QoS support, along with the issues involved in
short data packet transmission. Next, we provide a detailed overview of
existing and emerging solutions for addressing the RAN congestion problem, and
then identify potential advantages, challenges and use cases for the
applications of emerging Machine Learning (ML) techniques in ultra-dense
cellular networks. Among several ML techniques, we focus on the application of
the low-complexity Q-learning approach in mMTC scenarios. Finally, we discuss
some open research challenges and promising future research directions.
Comment: 37 pages, 8 figures, 7 tables; submitted for possible publication in
IEEE Communications Surveys and Tutorials
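The survey's focus on low-complexity Q-learning for massive channel access can be illustrated with a minimal, self-contained sketch. It is not the paper's algorithm: the preamble count, collision probabilities, and hyperparameters below are all hypothetical, and with no next state the update reduces to a stateless, bandit-style Q-learner, which matches the "low-complexity" flavour the abstract emphasizes.

```python
import random

def q_learning_preamble_selection(n_preambles=5, episodes=2000,
                                  alpha=0.1, epsilon=0.1, seed=0):
    """Stateless Q-learning for RA preamble selection: each episode the
    device picks one preamble; a collision occurs with a fixed probability
    unknown to the agent.  Reward = 1 on success, 0 on collision."""
    rng = random.Random(seed)
    # hypothetical per-preamble collision probabilities (low indices busier)
    p_collision = [0.9, 0.7, 0.5, 0.3, 0.1][:n_preambles]
    q = [0.0] * n_preambles
    for _ in range(episodes):
        if rng.random() < epsilon:                       # explore
            a = rng.randrange(n_preambles)
        else:                                            # exploit
            a = max(range(n_preambles), key=q.__getitem__)
        reward = 0.0 if rng.random() < p_collision[a] else 1.0
        q[a] += alpha * (reward - q[a])   # no next-state term: one-shot episode
    return q
```

Over the run, the learned Q-values approach each preamble's success probability, so the device gravitates toward the least congested preamble without any explicit traffic model.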
Traffic classification and prediction, and fast uplink grant allocation for machine type communications via support vector machines and long short-term memory
Abstract. The current random access (RA) allocation techniques suffer from congestion and high signaling overhead when serving machine-type communication (MTC) applications. Therefore, 3GPP has introduced the need for fast uplink grant (FUG) allocation. This thesis proposes a novel FUG allocation scheme based on a support vector machine (SVM) and long short-term memory (LSTM). First, MTC devices are prioritized using an SVM classifier. Second, an LSTM architecture is used to predict the activation time of each device. Both results are combined into a resource scheduler that is efficient in terms of average latency and total throughput. Furthermore, a set of correction techniques is introduced to overcome classification and prediction errors. The Coupled Markov Modulated Poisson Process (CMMPP) traffic model is applied to compare the proposed FUG allocation with existing allocation techniques. In addition, an extended traffic model based on CMMPP is used to evaluate the proposed algorithm in a denser network. Our simulation results show that the proposed model outperforms existing RA allocation schemes, achieving the highest throughput and the lowest access delay when serving the target massive and critical MTC applications.
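The scheduling step that combines the two learned outputs can be sketched without the learning machinery itself. The function below assumes the SVM classifier has already produced a priority label and the LSTM a predicted activation time per device; the tuple encoding and the grant-then-defer policy are illustrative assumptions, not the thesis's actual scheduler.

```python
def fast_uplink_grant_schedule(devices, n_grants):
    """devices: iterable of (device_id, priority, predicted_activation),
    where priority 0 marks critical MTC and larger values lower classes,
    and predicted_activation is the (e.g. LSTM-predicted) activation time.
    Returns (granted, deferred): the n_grants highest-priority, earliest-
    activating devices receive a fast uplink grant; the rest fall back to
    the legacy random-access procedure."""
    order = sorted(devices, key=lambda d: (d[1], d[2]))
    granted = [d[0] for d in order[:n_grants]]
    deferred = [d[0] for d in order[n_grants:]]
    return granted, deferred
```

For example, with two grants available, a critical device predicted to activate soon is served before any massive-MTC device, regardless of the latter's activation time.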
D13.2 Techniques and performance analysis on energy- and bandwidth-efficient communications and networking
Deliverable D13.2 of the European project NEWCOM#. The report presents the status of the research work of the
various Joint Research Activities (JRA) in WP1.3 and the results
that were developed up to the second year of the project. For
each activity there is a description, an illustration of the
adherence to and relevance with the identified fundamental
open issues, a short presentation of the main results, and a
roadmap for the future joint research. In the Annex, for each
JRA, the main technical details on specific scientific activities
are described in detail.
EC-CENTRIC: An Energy- and Context-Centric Perspective on IoT Systems and Protocol Design
The radio transceiver of an IoT device is often where most of the energy is consumed. For this reason, most research so far has focused on low-power circuits and energy-efficient physical layer designs, with the goal of reducing the average energy per information bit required for communication. While these efforts are valuable per se, their actual effectiveness can be partially neutralized by ill-designed network, processing and resource management solutions, which can become a primary factor of performance degradation, in terms of throughput, responsiveness and energy efficiency. The objective of this paper is to describe an energy-centric and context-aware optimization framework that accounts for the energy impact of the fundamental functionalities of an IoT system and that proceeds along three main technical thrusts: 1) balancing signal-dependent processing techniques (compression and feature extraction) and communication tasks; 2) jointly designing channel access and routing protocols to maximize the network lifetime; 3) providing self-adaptability to different operating conditions through the adoption of suitable learning architectures and of flexible/reconfigurable algorithms and protocols. After discussing this framework, we present some preliminary results that validate the effectiveness of our proposed line of action, and show how the use of adaptive signal processing and channel access techniques allows an IoT network to dynamically trade lifetime for signal distortion, according to the requirements dictated by the application.
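The first thrust, balancing processing against communication energy, can be made concrete with a toy model. The linear energy terms below are an assumption for illustration only, not the paper's model: processing energy is taken to grow with the compression ratio while transmission energy shrinks with the number of bits left to send.

```python
def pick_compression_ratio(n_bits, ratios, e_tx_per_bit, e_proc_per_bit):
    """Choose the compression ratio r minimizing total energy under a
    hypothetical linear model:
      E(r) = n_bits * e_proc_per_bit * r     # more compression effort
           + (n_bits / r) * e_tx_per_bit     # fewer bits over the radio
    """
    def energy(r):
        return n_bits * e_proc_per_bit * r + (n_bits / r) * e_tx_per_bit
    best = min(ratios, key=energy)
    return best, energy(best)
```

When the radio dominates (high energy per transmitted bit), the minimum shifts toward aggressive compression; when processing and transmission costs are comparable, sending raw data wins, which is the trade-off the framework adapts at run time.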
Infinite Factorial Finite State Machine for Blind Multiuser Channel Estimation
New communication standards need to deal with machine-to-machine
communications, in which users may start or stop transmitting at any time in an
asynchronous manner. Thus, the number of users is an unknown and time-varying
parameter that needs to be accurately estimated in order to properly recover
the symbols transmitted by all users in the system. In this paper, we address
the problem of joint channel parameter and data estimation in a multiuser
communication channel in which the number of transmitters is not known. For
that purpose, we develop the infinite factorial finite state machine model, a
Bayesian nonparametric model based on the Markov Indian buffet that allows for
an unbounded number of transmitters with arbitrary channel length. We propose
an inference algorithm that makes use of slice sampling and particle Gibbs with
ancestor sampling. Our approach is fully blind as it does not require a prior
channel estimation step, prior knowledge of the number of transmitters, or any
signaling information. Our experimental results, loosely based on the LTE
random access channel, show that the proposed approach can effectively recover
the data-generating process for a wide range of scenarios, with varying number
of transmitters, number of receivers, constellation order, channel length, and
signal-to-noise ratio.
Comment: 15 pages, 15 figures
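A feel for the paper's nonparametric prior can be had from a small sketch of the plain Indian buffet process (IBP), of which the Markov Indian buffet process used in the paper is the temporal extension. Rows below play the role of observations and columns the unbounded set of candidate transmitters; this sketch omits the per-column Markov persistence and the whole inference machinery (slice sampling, particle Gibbs), so it illustrates only the "unbounded number of features" idea.

```python
import math
import random

def indian_buffet_process(n_customers, alpha, seed=0):
    """Draw a binary feature matrix from the Indian buffet process prior.
    Customer i takes each existing dish k with probability m_k / i
    (m_k = times dish k was taken so far), then samples Poisson(alpha/i)
    brand-new dishes."""
    rng = random.Random(seed)

    def poisson(lam):                        # Knuth's method
        limit, k, p = math.exp(-lam), 0, 1.0
        while True:
            p *= rng.random()
            if p <= limit:
                return k
            k += 1

    counts = []                              # m_k per dish
    rows = []
    for i in range(1, n_customers + 1):
        row = [1 if rng.random() < c / i else 0 for c in counts]
        for _ in range(poisson(alpha / i)):  # brand-new dishes
            row.append(1)
            counts.append(0)
        for j, taken in enumerate(row):
            counts[j] += taken
        rows.append(row)
    width = len(counts)
    return [r + [0] * (width - len(r)) for r in rows]  # pad to full width
```

Every column is introduced with a 1 by the customer who first samples it, so the number of active features grows with the data, mirroring how the model lets the number of transmitters grow with the observations.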
Optimized resource allocation techniques for critical machine-type communications in mixed LTE networks
To implement the revolutionary Internet of Things (IoT) paradigm, the evolution of
the communication networks to incorporate machine-type communications (MTC), in
addition to conventional human-type communications (HTC) has become inevitable.
Critical MTC, in contrast to massive MTC, denotes the type of communications
that requires high network availability, ultra-high reliability, very low latency, and
high security, to enable what is known as mission-critical IoT. Because cellular
networks are considered among the most promising wireless technologies to
serve critical MTC, the International Telecommunication Union (ITU) targets critical
MTC as a major use case, along with enhanced mobile broadband (eMBB)
and massive MTC, in the design of the upcoming generation of cellular networks.
Therefore, the Third Generation Partnership Project (3GPP) is evolving the current
Long-Term Evolution (LTE) standard to efficiently serve critical MTC to fulfill the
fifth-generation (5G) requirements using the evolved LTE (eLTE) in addition to the
new radio (NR). In this regard, 3GPP has introduced several enhancements in the
latest releases to support critical MTC in LTE, which is designed mainly for HTC.
However, guaranteeing stringent quality-of-service (QoS) for critical MTC while not
sacrificing that of conventional HTC is a challenging task from the radio resource
management perspective.
In this dissertation, we optimize the resource allocation and scheduling process
for critical MTC in mixed LTE networks in different operational and implementation
cases. We target maximizing the overall system utility while providing accurate guarantees for the QoS requirements of critical MTC, through a cross-layer design,
and that of HTC as well. For this purpose, we utilize advanced techniques from the
queueing theory and mathematical optimization. In addition, we adopt heuristic approaches
and matching-based techniques to design computationally-efficient resource
allocation schemes to be used in practice. In this regard, we analyze the proposed
methods from a practical perspective. Furthermore, we run extensive simulations to
evaluate the performance of the proposed techniques, validate the theoretical analysis,
and compare the performance with other schemes. The simulation results reveal
close-to-optimal performance for the proposed algorithms, which outperform other
techniques from the literature.
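The flavour of the matching-based, computationally efficient allocation can be conveyed with a small greedy stand-in. This is not any of the dissertation's actual algorithms: the one-RB-per-user rule, the utility table, and the serve-critical-first ordering are illustrative assumptions that merely show how strict QoS priority for critical MTC can coexist with utility maximization for HTC.

```python
def greedy_qos_matching(utility, critical):
    """utility[u][r]: utility of user u on resource block r.
    critical: set of user indices with strict QoS priority.
    One RB per user: critical users pick their best free RB first, then the
    remaining (HTC) users, each group served in order of peak utility."""
    n_users = len(utility)
    free = set(range(len(utility[0])))
    assignment = {}

    def serve(users):
        # users with the highest peak utility choose first within their group
        for u in sorted(users, key=lambda u: -max(utility[u])):
            if free:
                r = max(free, key=lambda r: utility[u][r])
                assignment[u] = r
                free.discard(r)

    serve([u for u in range(n_users) if u in critical])
    serve([u for u in range(n_users) if u not in critical])
    return assignment
```

In the example below, the critical user claims RB 0 even though an HTC user values it more, which is precisely the "guarantee critical QoS without starving HTC" behaviour the dissertation formalizes via optimization and matching theory.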