On Improving Throughput of Multichannel ALOHA using Preamble-based Exploration
Machine-type communication (MTC) has been extensively studied to provide
connectivity for devices and sensors in the Internet-of-Things (IoT). Thanks to
the sparse activity, random access, e.g., ALOHA, is employed for MTC to lower
signaling overhead. In this paper, we propose to adopt exploration for
multichannel ALOHA by transmitting preambles before transmitting data packets
in MTC, and show that the maximum throughput can be improved by a factor of
2 - exp(-1) ≈ 1.632. In the proposed approach, a base station (BS) needs to send
feedback to active users informing them of the number of preambles transmitted
in each channel, which can be reliably estimated as in
compressive random access. A steady-state analysis is also performed with fast
retrial, which shows that the probability of packet collision becomes lower
and, as a result, the delay outage probability is greatly reduced for a lightly
loaded system. Simulation results confirm the analysis.
Comment: 10 pages, 7 figures, to appear in the Journal of Communications and
Networks. arXiv admin note: substantial text overlap with arXiv:2001.1111
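The improvement factor above can be sanity-checked numerically: plain multichannel slotted ALOHA with M channels peaks at M*exp(-1) successful packets per slot, and the abstract's exploration scheme scales this by 2 - exp(-1). Below is a minimal Python sketch of the baseline; the Poisson traffic model, uniform channel choice, and parameter values are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def aloha_throughput(M, G, n_slots=5000):
    """Average number of successful packets per slot for multichannel
    slotted ALOHA: M channels, Poisson offered load G packets/slot,
    each packet picking one of the M channels uniformly at random."""
    successes = 0
    for _ in range(n_slots):
        k = rng.poisson(G)                    # active packets this slot
        chosen = rng.integers(0, M, size=k)   # uniform channel choice
        counts = np.bincount(chosen, minlength=M)
        successes += np.sum(counts == 1)      # exactly one tx succeeds
    return successes / n_slots

M = 8
peak = max(aloha_throughput(M, G) for G in np.linspace(0.5, 3 * M, 30))
print(f"simulated peak throughput : {peak:.2f}")
print(f"theoretical M/e           : {M / np.e:.2f}")
print(f"scaled by 2 - exp(-1)     : {(2 - np.exp(-1)) * M / np.e:.2f}")
```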
Data-aided Sensing for Gaussian Process Regression in IoT Systems
In this paper, for efficient data collection with limited bandwidth,
data-aided sensing is applied to Gaussian process regression that is used to
learn data sets collected from sensors in Internet-of-Things systems. We focus
on the interpolation of sensors' measurements from a small number of
measurements uploaded by a fraction of sensors using Gaussian process
regression with data-aided sensing. Thanks to active sensor selection, it is
shown that Gaussian process regression with data-aided sensing provides a
better estimate of the complete data set than random selection does.
With multichannel ALOHA, data-aided sensing is generalized to distributed
selective uploading, where predictions of the sensors' measurements are fed
back so that each sensor can decide whether to upload by comparing its
measurement with the prediction. Numerical results show that the modified
multichannel ALOHA with predictions improves the performance of Gaussian
process regression with data-aided sensing compared to conventional
multichannel ALOHA with equal uploading probability.
Comment: 10 pages, 8 figures, to appear in IEEE IoT
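A rough sense of why active sensor selection beats random selection comes from a toy Gaussian process interpolation experiment. The sketch below greedily uploads the sensor with the largest posterior variance, a common active-learning proxy that may differ from the paper's data-aided sensing criterion; the synthetic field f, kernel length scale, and sensor count are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical sensor field: n sensors on [0, 1] with smooth measurements.
n = 100
x = np.linspace(0.0, 1.0, n)
f = np.sin(2 * np.pi * x) + 0.5 * np.sin(7 * x)

def rbf(a, b, ell=0.2):
    """Squared-exponential kernel."""
    return np.exp(-0.5 * ((a[:, None] - b[None, :]) / ell) ** 2)

def gp_posterior(idx, noise=1e-4):
    """GP posterior mean/variance at all sensors, given uploads at idx."""
    K = rbf(x, x)
    Kuu = K[np.ix_(idx, idx)] + noise * np.eye(len(idx))
    Ksu = K[:, idx]
    A = np.linalg.solve(Kuu, Ksu.T)          # Kuu^{-1} Ksu^T
    mean = A.T @ f[idx]
    var = np.diag(K) - np.sum(Ksu * A.T, axis=1)
    return mean, var

def select(k, active):
    """Greedy max-posterior-variance selection vs. random selection."""
    idx = [int(rng.integers(n))]
    while len(idx) < k:
        if active:
            _, var = gp_posterior(idx)
            var[idx] = -np.inf               # do not reselect
            idx.append(int(np.argmax(var)))
        else:
            idx.append(int(rng.choice(np.setdiff1d(np.arange(n), idx))))
    return idx

for active in (True, False):
    mean, _ = gp_posterior(select(10, active))
    mse = float(np.mean((mean - f) ** 2))
    print("active" if active else "random", "selection MSE:", round(mse, 5))
```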
Towards Massive Machine Type Communications in Ultra-Dense Cellular IoT Networks: Current Issues and Machine Learning-Assisted Solutions
The ever-increasing number of resource-constrained Machine-Type Communication
(MTC) devices is leading to the critical challenge of fulfilling diverse
communication requirements in dynamic and ultra-dense wireless environments.
Among different application scenarios that the upcoming 5G and beyond cellular
networks are expected to support, such as eMBB, mMTC and URLLC, mMTC brings the
unique technical challenge of supporting a huge number of MTC devices, which is
the main focus of this paper. The related challenges include QoS provisioning,
handling highly dynamic and sporadic MTC traffic, huge signalling overhead and
Radio Access Network (RAN) congestion. In this regard, this paper aims to
identify and analyze the involved technical issues, to review recent advances,
to highlight potential solutions and to propose new research directions. First,
starting with an overview of mMTC features and QoS provisioning issues, we
present the key enablers for mMTC in cellular networks. After highlighting the
inefficiency of the legacy Random Access (RA) procedure in the mMTC scenario,
we present the key features and channel access
mechanisms in the emerging cellular IoT standards, namely, LTE-M and NB-IoT.
Subsequently, we present a framework for the performance analysis of
transmission scheduling with QoS support, along with the issues involved in
short data packet transmission. Next, we provide a detailed overview of
existing and emerging solutions for addressing the RAN congestion problem, and
then identify potential advantages, challenges and use cases for the
applications of emerging Machine Learning (ML) techniques in ultra-dense
cellular networks. Among several ML techniques, we focus on the application of
a low-complexity Q-learning approach in mMTC scenarios. Finally, we discuss
some open research challenges and promising future research directions.
Comment: 37 pages, 8 figures, 7 tables, submitted for possible future
publication in IEEE Communications Surveys and Tutorials
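As a concrete illustration of the low-complexity Q-learning mentioned above, the sketch below lets each device learn a collision-free random-access slot with a stateless (bandit-style) Q-update. The frame structure, the +1/-1 reward, and the parameter values are common choices in the RA literature, not necessarily the exact formulation surveyed in the paper.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical setup: N devices learn to pick one of S RA slots per frame.
# Reward +1 for a collision-free transmission, -1 for a collision.
N, S = 20, 20
alpha, eps = 0.1, 0.1        # learning rate, epsilon-greedy exploration
Q = np.zeros((N, S))         # one Q-value per device per slot

def frame():
    """Run one frame: epsilon-greedy slot choice, then Q-update."""
    explore = rng.random(N) < eps
    choices = np.where(explore, rng.integers(0, S, N), Q.argmax(axis=1))
    counts = np.bincount(choices, minlength=S)
    reward = np.where(counts[choices] == 1, 1.0, -1.0)
    Q[np.arange(N), choices] += alpha * (reward - Q[np.arange(N), choices])
    return int(np.sum(counts == 1))          # successful devices this frame

success = [frame() for _ in range(3000)]
print("success rate, first 100 frames:", np.mean(success[:100]) / N)
print("success rate, last 100 frames :", np.mean(success[-100:]) / N)
```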
Enabling Technologies for Internet of Things: Licensed and Unlicensed Techniques
The Internet of Things (IoT) is a novel paradigm which is shaping the evolution of the future Internet. According to the vision underlying the IoT, the next step in increasing the ubiquity of the Internet, after connecting people anytime and everywhere, is to connect inanimate objects. By providing objects with embedded communication capabilities and a common addressing scheme, a highly distributed and ubiquitous network of seamlessly connected heterogeneous devices is formed, which can be fully integrated into the current
Internet and mobile networks, thus allowing for the development of new intelligent services available anytime, anywhere, by anyone and anything. Such a vision is also becoming known under the name of Machine-to-Machine (M2M), where the absence of human interaction in the system dynamics is further emphasized.
A massive number of wireless devices will have the ability to connect to the Internet through the IoT framework. As this framework is marketed at an accelerating pace, the new wireless communication standards are studying and proposing solutions to incorporate the services needed for the IoT. However, with an estimated 30 billion connected devices, current wireless technologies face many challenges.
In our research, we address a variety of candidate technologies for enabling such a massive framework. Mainly, we focus on underlay cognitive radio networks as the unlicensed candidate for the IoT. On the other hand, we look into the current efforts by the standardization bodies to accommodate the requirements of the IoT into the current cellular networks.
Specifically, we survey the new features and the new user equipment
categories added to the physical layer of the LTE-A.
In particular, we study the performance of a dual-hop cognitive radio network sharing the spectrum of a primary network in an underlay fashion. The cognitive network consists of a source, a destination, and multiple nodes employed as amplify-and-forward
relays. To improve the spectral efficiency, all relays are allowed to instantaneously transmit to the destination over the same frequency band. We present the optimal power allocation that maximizes the received signal-to-noise ratio (SNR) at the destination while satisfying the interference constraints of the primary network. The optimal power allocation is obtained through an eigen-solution of a channel-dependent matrix, and is shown to transform the transmission over the non-orthogonal relays into parallel channels. Furthermore, since the secondary destination is equipped with multiple antennas, we propose an antenna selection scheme to select the antenna with the highest SNR. To this end, we propose a clustering scheme to subgroup the available relays and use antenna selection at the receiver to extract the same diversity order. We show that random clustering causes the system to lose some of the available degrees of freedom. We provide analytical expressions for the outage probability of the system under both random clustering and the proposed maximum-SNR clustering scheme with antenna selection. In addition, we adapt our design to increase the energy efficiency of the overall network without significant loss in the data rate.
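The "eigen-solution of a channel-dependent matrix" can be illustrated with a generic Rayleigh-quotient formulation: maximizing the destination SNR subject to a quadratic interference-plus-power cost is solved by the principal generalized eigenvector. The matrices A and B below are illustrative stand-ins for this general idea, not the paper's exact construction, and the channels are randomly drawn.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical channels for K AF relays sharing one band:
# h: relay -> secondary destination, g: relay -> primary receiver.
K = 4
h = (rng.standard_normal(K) + 1j * rng.standard_normal(K)) / np.sqrt(2)
g = (rng.standard_normal(K) + 1j * rng.standard_normal(K)) / np.sqrt(2)

# Maximize destination SNR  w^H A w  subject to the quadratic cost
# w^H B w <= 1 (interference toward the primary plus a power term).
# The optimizer is the principal generalized eigenvector of (A, B).
A = np.outer(h, h.conj())                    # coherent combining gain
B = np.outer(g, g.conj()) + 0.1 * np.eye(K)  # interference + power cost

eigvals, eigvecs = np.linalg.eig(np.linalg.solve(B, A))
w = eigvecs[:, np.argmax(eigvals.real)]
w /= np.sqrt(np.real(w.conj() @ B @ w))      # scale to meet the constraint

print("achieved SNR :", round(float(np.real(w.conj() @ A @ w)), 3))
print("cost w^H B w :", round(float(np.real(w.conj() @ B @ w)), 3))
```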
In the second part of this thesis, we look into the current efforts by the standardization bodies to accommodate the requirements of the IoT into the current cellular networks. Specifically, we present the new features and the new user equipment categories added to the physical layer of the LTE-A. We study some of the challenges facing the LTE-A when dealing with Machine-Type Communications (MTC). Specifically, the MTC Physical Downlink Control Channel (MPDCCH) is among the newly introduced features in the LTE-A that carries the downlink control information (DCI) for MTC devices. Correctly decoding the MPDCCH depends mainly on the channel estimation used to compensate for channel errors during transmission, and the choice of estimation technique affects both the complexity and the performance of the user equipment. We propose and assess the performance of a simple channel estimation technique that relies on Least Squares (LS) estimates of the pilot signals and linear interpolation for the low-Doppler channels associated with MTC applications.
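A minimal sketch of the estimator just described: LS estimates at the pilot subcarriers followed by linear interpolation over the remaining subcarriers. The OFDM dimensions, pilot spacing, pilot symbols, and channel model are illustrative and not taken from the LTE-A/MPDCCH specification.

```python
import numpy as np

rng = np.random.default_rng(4)

n_sc = 72                                  # subcarriers (e.g., 6 LTE PRBs)
pilot_idx = np.arange(0, n_sc, 6)          # hypothetical pilot spacing
pilots = np.exp(1j * np.pi / 4) * np.ones(len(pilot_idx))  # known symbols

# Slowly varying (low-Doppler) frequency-domain channel plus noise.
taps = (rng.standard_normal(4) + 1j * rng.standard_normal(4)) / np.sqrt(8)
H = np.fft.fft(taps, n_sc)
noise = 0.05 * (rng.standard_normal(len(pilot_idx))
                + 1j * rng.standard_normal(len(pilot_idx)))
Y_p = H[pilot_idx] * pilots + noise        # received pilot observations

# LS estimate at the pilots, then linear interpolation (real & imag parts).
H_ls = Y_p / pilots
H_hat = (np.interp(np.arange(n_sc), pilot_idx, H_ls.real)
         + 1j * np.interp(np.arange(n_sc), pilot_idx, H_ls.imag))

mse = float(np.mean(np.abs(H_hat - H) ** 2))
print("channel estimation MSE:", round(mse, 5))
```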