Towards Massive Machine Type Communications in Ultra-Dense Cellular IoT Networks: Current Issues and Machine Learning-Assisted Solutions
The ever-increasing number of resource-constrained Machine-Type Communication
(MTC) devices is leading to the critical challenge of fulfilling diverse
communication requirements in dynamic and ultra-dense wireless environments.
Among the application scenarios that upcoming 5G and beyond cellular
networks are expected to support, namely enhanced Mobile Broadband (eMBB),
massive Machine-Type Communications (mMTC), and Ultra-Reliable Low-Latency
Communications (URLLC), mMTC brings the unique technical challenge of
supporting a huge number of MTC devices, which is the main focus of this
paper. The related challenges include Quality of Service (QoS) provisioning,
handling highly dynamic and sporadic MTC traffic, huge signalling overhead and
Radio Access Network (RAN) congestion. In this regard, this paper aims to
identify and analyze the involved technical issues, to review recent advances,
to highlight potential solutions and to propose new research directions. First,
starting with an overview of mMTC features and QoS provisioning issues, we
present the key enablers for mMTC in cellular networks. After highlighting
the inefficiency of the legacy Random Access (RA) procedure in the mMTC
scenario, we then present the key features and channel access
mechanisms in the emerging cellular IoT standards, namely, LTE-M and NB-IoT.
Subsequently, we present a framework for the performance analysis of
transmission scheduling with QoS support, along with the issues involved in
short data packet transmission. Next, we provide a detailed overview of the
existing and emerging solutions towards addressing the RAN congestion problem, and
then identify potential advantages, challenges and use cases for the
applications of emerging Machine Learning (ML) techniques in ultra-dense
cellular networks. Out of the several ML techniques, we focus on the
application of a low-complexity Q-learning approach in mMTC scenarios.
Finally, we discuss some open research challenges and promising future
research directions.
Comment: 37 pages, 8 figures, 7 tables; submitted for possible publication in IEEE Communications Surveys and Tutorials.
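As a rough illustration of the low-complexity Q-learning approach highlighted in this abstract, the sketch below lets each MTC device learn a collision-free RA slot from binary collision feedback. The device count, slot count, rewards, and learning parameters are illustrative assumptions, not the paper's actual setup.

```python
import numpy as np

# Minimal sketch: each device runs stateless (bandit-style) Q-learning over
# the RA slots of a frame, rewarded for collision-free transmissions.
N_DEVICES, N_SLOTS = 10, 10      # assumed population and RA slots per frame
ALPHA, EPS, FRAMES = 0.1, 0.1, 3000

rng = np.random.default_rng(0)
Q = np.zeros((N_DEVICES, N_SLOTS))   # one Q-row per device, one entry per slot

for _ in range(FRAMES):
    # Epsilon-greedy slot choice for every device in this frame.
    greedy = Q.argmax(axis=1)
    explore = rng.random(N_DEVICES) < EPS
    choice = np.where(explore, rng.integers(N_SLOTS, size=N_DEVICES), greedy)

    # Reward +1 for a collision-free slot, -1 on collision.
    occupancy = np.bincount(choice, minlength=N_SLOTS)
    reward = np.where(occupancy[choice] == 1, 1.0, -1.0)

    # Stateless Q-update: Q <- Q + alpha * (r - Q).
    rows = np.arange(N_DEVICES)
    Q[rows, choice] += ALPHA * (reward - Q[rows, choice])

final = Q.argmax(axis=1)
occ = np.bincount(final, minlength=N_SLOTS)
print(f"collision-free devices after learning: {np.mean(occ[final] == 1):.0%}")
```

With matched device and slot counts, the greedy policies typically settle into a near-orthogonal slot allocation, which is the intuition behind using Q-learning to relieve RA contention.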
Location-Enabled IoT (LE-IoT): A Survey of Positioning Techniques, Error Sources, and Mitigation
The Internet of Things (IoT) has started to empower the future of many
industrial and mass-market applications. Localization techniques are becoming
key for adding location context to IoT data without human perception or
intervention. Meanwhile, the newly-emerged Low-Power Wide-Area Network (LPWAN)
technologies have advantages such as long-range, low power consumption, low
cost, massive connections, and the capability for communication in both indoor
and outdoor areas. These features make LPWAN signals strong candidates for
mass-market localization applications. However, various error sources limit
the localization performance achievable with such IoT signals. This
paper reviews IoT localization systems in the following sequence: IoT
localization system review -- localization data sources -- localization
algorithms -- localization error sources and mitigation -- localization
performance evaluation. Compared with related surveys, this paper provides a
more comprehensive and state-of-the-art review of IoT localization methods,
an original review of IoT localization error sources and their mitigation, an
original review of IoT localization performance evaluation, and a more
comprehensive review of IoT localization applications, opportunities, and
challenges. Thus, this survey provides comprehensive guidance for readers
interested in enabling localization capability in existing IoT systems, in
using IoT systems for localization, or in integrating IoT signals with
existing localization sensors.
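As a concrete instance of the data-source-to-algorithm pipeline this survey reviews, the sketch below implements one common LPWAN localization approach: RSSI-to-range conversion via a log-distance path-loss model followed by linearized least-squares multilateration. The gateway layout, path-loss parameters, and noise level are illustrative assumptions.

```python
import numpy as np

# Assumed log-distance path-loss model: PL(d) = PL0 + 10 n log10(d/d0).
PL0, N_EXP, D0 = 40.0, 2.7, 1.0   # path loss at d0 (dB), exponent, ref distance (m)

def rssi_to_range(path_loss_db):
    """Invert the path-loss model to estimate distance to a gateway."""
    return D0 * 10 ** ((path_loss_db - PL0) / (10 * N_EXP))

def multilaterate(gateways, ranges):
    """Linearized least squares: subtract the first anchor's circle equation."""
    x0, r0 = gateways[0], ranges[0]
    A = 2 * (gateways[1:] - x0)
    b = (r0**2 - ranges[1:]**2
         + np.sum(gateways[1:]**2, axis=1) - np.sum(x0**2))
    est, *_ = np.linalg.lstsq(A, b, rcond=None)
    return est

# Four assumed gateway positions (m) and one device to locate.
gateways = np.array([[0.0, 0.0], [100.0, 0.0], [0.0, 100.0], [100.0, 100.0]])
true_pos = np.array([30.0, 60.0])
d = np.linalg.norm(gateways - true_pos, axis=1)
# Simulated measured path loss with 2 dB shadowing noise.
pl = PL0 + 10 * N_EXP * np.log10(d / D0) + np.random.default_rng(1).normal(0, 2, d.size)
print("position estimate:", multilaterate(gateways, rssi_to_range(pl)))
```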
Preprint: Using RF-DNA Fingerprints To Classify OFDM Transmitters Under Rayleigh Fading Conditions
The Internet of Things (IoT) is a collection of Internet-connected devices
capable of interacting with the physical world and computer systems. It is
estimated that the IoT will consist of approximately fifty billion devices by
the year 2020. In addition to the sheer numbers, the need for IoT security is
exacerbated by the fact that many of the edge devices employ weak to no
encryption of the communication link. It has been estimated that almost 70% of
IoT devices use no form of encryption. Previous research has suggested the use
of Specific Emitter Identification (SEI), a physical layer technique, as a
means of augmenting bit-level security mechanisms such as encryption. The work
presented here integrates a Nelder-Mead-based approach for estimating the
Rayleigh fading channel coefficients prior to applying the SEI approach known
as RF-DNA fingerprinting. The performance of this estimator is assessed for
degrading signal-to-noise ratios and compared with least squares and minimum mean squared
error channel estimators. Additionally, this work presents classification
results using RF-DNA fingerprints that were extracted from received signals
that have undergone Rayleigh fading channel correction using Minimum Mean
Squared Error (MMSE) equalization. This work also performs radio discrimination
using RF-DNA fingerprints generated from the normalized magnitude-squared and
phase response of Gabor coefficients as well as two classifiers. Discrimination
of four 802.11a Wi-Fi radios achieves an average percent correct
classification of 90% or better at signal-to-noise ratios of 18 dB and 21 dB
or greater for Rayleigh fading channels comprising two and five paths,
respectively.
Comment: 13 pages, 14 figures/images; under review by the IEEE Transactions on Information Forensics and Security.
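As a simplified illustration of the MMSE channel correction step described above, the sketch below equalizes QPSK-loaded OFDM subcarriers under per-subcarrier Rayleigh fading. The subcarrier count, SNR, and modulation are assumptions; the paper's actual waveform processing and fingerprint extraction are more involved.

```python
import numpy as np

rng = np.random.default_rng(2)
N_SC, SNR_DB = 64, 18                     # assumed subcarriers and SNR
noise_var = 10 ** (-SNR_DB / 10)

# Rayleigh-faded channel: one complex Gaussian tap per subcarrier.
H = (rng.normal(size=N_SC) + 1j * rng.normal(size=N_SC)) / np.sqrt(2)

# Transmit unit-energy QPSK symbols, apply the channel, add AWGN.
X = (rng.choice([-1, 1], N_SC) + 1j * rng.choice([-1, 1], N_SC)) / np.sqrt(2)
noise = np.sqrt(noise_var / 2) * (rng.normal(size=N_SC) + 1j * rng.normal(size=N_SC))
Y = H * X + noise

# Per-tone MMSE equalizer: W = H* / (|H|^2 + sigma^2); tends to
# zero-forcing (1/H) as the noise variance goes to zero.
W = np.conj(H) / (np.abs(H) ** 2 + noise_var)
X_hat = W * Y

print("pre-equalization error: ", np.mean(np.abs(Y - X) ** 2))
print("post-equalization error:", np.mean(np.abs(X_hat - X) ** 2))
```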
Compressive Sensing-Based Grant-Free Massive Access for 6G Massive Communication
The advent of sixth-generation (6G) wireless communications has given
rise to the need to connect vast numbers of heterogeneous wireless
devices, which requires advanced system capabilities far beyond existing
network architectures. In particular, such massive communication has been
recognized as a prime driver that can empower the 6G vision of future
ubiquitous connectivity, supporting the Internet of Human-Machine-Things, for which
massive access is critical. This paper surveys the most recent advances toward
massive access in both academic and industry communities, focusing primarily on
the promising compressive sensing-based grant-free massive access paradigm. We
first specify the limitations of existing random access schemes and reveal that
the practical implementation of massive communication relies on a dramatically
different random access paradigm from the current ones mainly designed for
human-centric communications. Then, a compressive sensing-based grant-free
massive access roadmap is presented, where the evolutions from single-antenna
to large-scale antenna array-based base stations, from single-station to
cooperative massive multiple-input multiple-output systems, and from unsourced
to sourced random access scenarios are detailed. Finally, we discuss the key
challenges and open issues to shed light on the potential future research
directions of grant-free massive access.
Comment: Accepted by the IEEE IoT Journal.
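To illustrate why grant-free massive access maps naturally onto compressive sensing, the sketch below recovers a small set of active devices from the superposition of their signature sequences using orthogonal matching pursuit (OMP). The dimensions, Gaussian signatures, and known-sparsity stopping rule are illustrative assumptions, not a scheme from the paper.

```python
import numpy as np

rng = np.random.default_rng(3)
N_DEV, SIG_LEN, K_ACTIVE = 200, 40, 5    # devices, signature length, active devices

A = rng.normal(size=(SIG_LEN, N_DEV)) / np.sqrt(SIG_LEN)  # one signature per device
x = np.zeros(N_DEV)
active = rng.choice(N_DEV, K_ACTIVE, replace=False)
x[active] = 1.0                                            # unit gain when active
y = A @ x + 0.01 * rng.normal(size=SIG_LEN)                # superimposed uplink + noise

# OMP: greedily pick the column most correlated with the residual,
# re-fit the selected support by least squares, repeat.
support, residual = [], y.copy()
for _ in range(K_ACTIVE):
    support.append(int(np.argmax(np.abs(A.T @ residual))))
    coeffs, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
    residual = y - A[:, support] @ coeffs

print("true active set:    ", sorted(active))
print("detected active set:", sorted(support))
```

Because only a handful of devices transmit in any given slot, the unknown activity vector is sparse, which is exactly the structure greedy recovery algorithms such as OMP exploit.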
Thirty Years of Machine Learning: The Road to Pareto-Optimal Wireless Networks
Future wireless networks have substantial potential to support a broad range
of complex, compelling applications in both military and civilian
fields, where users can enjoy high-rate, low-latency, low-cost, and
reliable information services. Achieving this ambitious goal requires new radio
techniques for adaptive learning and intelligent decision making because of the
complex heterogeneous nature of the network structures and wireless services.
Machine learning (ML) algorithms have achieved great success in supporting big data
analytics, efficient parameter estimation and interactive decision making.
Hence, in this article, we review the thirty-year history of ML by elaborating
on supervised learning, unsupervised learning, reinforcement learning and deep
learning. Furthermore, we investigate their employment in the compelling
applications of wireless networks, including heterogeneous networks
(HetNets), cognitive radios (CR), the Internet of Things (IoT),
machine-to-machine (M2M) networks, and so on. This article aims to assist
readers in clarifying the motivation and methodology of the various ML
algorithms, so that they can be invoked for hitherto unexplored services and
scenarios in future wireless networks.
Comment: 46 pages, 22 figures.
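As one concrete instance of the ML-for-wireless pairings such a survey covers, the sketch below applies unsupervised k-means clustering to synthetic user locations to suggest small-cell sites in a HetNet. The user distribution and cluster count are assumptions made purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(4)
# Three synthetic user hotspots in a 2-D service area (coordinates in meters).
users = np.vstack([rng.normal(c, 15, size=(50, 2))
                   for c in ([0, 0], [100, 20], [40, 90])])

K = 3
centroids = users[rng.choice(len(users), K, replace=False)]  # random init
for _ in range(20):
    # Assign each user to the nearest candidate small-cell site.
    dists = np.linalg.norm(users[:, None, :] - centroids[None, :, :], axis=2)
    labels = dists.argmin(axis=1)
    # Move each site to the centroid of its assigned users (keep it if empty).
    centroids = np.array([users[labels == k].mean(axis=0)
                          if np.any(labels == k) else centroids[k]
                          for k in range(K)])

print("suggested small-cell sites:\n", centroids.round(1))
```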
Cellular, Wide-Area, and Non-Terrestrial IoT: A Survey on 5G Advances and the Road Towards 6G
The next wave of wireless technologies is proliferating in connecting things
both among themselves and to humans. In the era of the Internet of Things
(IoT), billions of sensors, machines, vehicles, drones, and robots will be
connected, making the world around us smarter. The IoT will encompass devices
that must wirelessly communicate a diverse set of data gathered from the
environment for myriad new applications. The ultimate goal is to extract
insights from this data and develop solutions that improve quality of life and
generate new revenue. Providing large-scale, long-lasting, reliable, and near
real-time connectivity is the major challenge in enabling a smart connected
world. This paper provides a comprehensive survey on existing and emerging
communication solutions for serving IoT applications in the context of
cellular, wide-area, as well as non-terrestrial networks. Specifically,
wireless technology enhancements for providing IoT access in fifth-generation
(5G) and beyond cellular networks, and communication networks over the
unlicensed spectrum are presented. Aligned with the main key performance
indicators of 5G and beyond 5G networks, we investigate solutions and standards
that enable energy efficiency, reliability, low latency, and scalability
(connection density) of current and future IoT networks. The solutions include
grant-free access and channel coding for short-packet communications,
non-orthogonal multiple access, and on-device intelligence. Further, a vision
of new paradigm shifts in communication networks in the 2030s is provided, and
the integration of the associated new technologies like artificial
intelligence, non-terrestrial networks, and new spectra is elaborated. Finally,
future research directions toward beyond-5G IoT networks are pointed out.
Comment: Submitted for review to IEEE CS&T.
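One way to quantify the short-packet effect mentioned in this abstract is the finite-blocklength normal approximation for the AWGN channel due to Polyanskiy et al.; the sketch below evaluates it for a few blocklengths. The SNR, blocklengths, and target block error rate are illustrative values, not figures from the survey.

```python
import math
from statistics import NormalDist

def short_packet_rate(snr, n, eps):
    """Normal approximation R(n, eps) ~ C - sqrt(V/n) * Qinv(eps), bits/use."""
    C = math.log2(1 + snr)                                       # asymptotic capacity
    V = (snr * (snr + 2) / (snr + 1) ** 2) * math.log2(math.e) ** 2  # channel dispersion
    q_inv = NormalDist().inv_cdf(1 - eps)                        # inverse Q-function
    return C - math.sqrt(V / n) * q_inv

snr = 10 ** (10 / 10)   # 10 dB, linear scale
for n in (100, 500, 2000):
    print(f"n={n:4d}: rate ~ {short_packet_rate(snr, n, 1e-5):.3f} bits/use "
          f"(capacity {math.log2(1 + snr):.3f})")
```

The gap between the achievable rate and capacity shrinks as the blocklength n grows, which is why short IoT packets need dedicated coding schemes rather than capacity-based design rules.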