147 research outputs found
Impact of Transceiver Impairments on the Capacity of Dual-Hop Relay Massive MIMO Systems
Despite the deleterious effect of hardware impairments on communication systems, most prior works have not investigated their impact on widely used relay systems. Most importantly, the use of inexpensive transceivers, which are prone to hardware impairments, is the most cost-efficient way to implement massive multiple-input multiple-output (MIMO) systems. Consequently, this paper investigates the impact of hardware impairments on MIMO relay networks with a large number of antennas. Specifically, we obtain a general expression for the ergodic capacity of dual-hop (DH) amplify-and-forward (AF) relay systems. Next, given the advantages of free probability (FP) theory over other known techniques in large random matrix theory, we pursue a large-limit analysis in the number of antennas and users, shedding light on the behavior of relay systems afflicted by hardware impairments.
Comment: 6 pages, 4 figures, accepted in IEEE Global Communications Conference (GLOBECOM 2015) - Workshop on Massive MIMO: From theory to practice, 2015
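The impairment model usually adopted in this line of work treats residual transceiver distortion as additive noise whose power scales with the signal power, parameterized by an EVM-like level per hop. The following Monte Carlo sketch is a hedged illustration of how such distortion caps the ergodic capacity of a dual-hop AF relay; the dimensions, SNR, the kappa values and the scalar end-to-end SNR combining rule are simplifying assumptions, not the paper's free-probability analysis.

```python
# Hedged sketch: Monte Carlo estimate of the ergodic capacity of a simplified
# dual-hop AF MIMO relay with additive transceiver distortion (power
# proportional to signal power, levels kappa1/kappa2 per hop). All parameters
# are illustrative placeholders.
import numpy as np

rng = np.random.default_rng(0)

def ergodic_capacity_dh_af(n_users=8, n_relay=64, snr_db=10.0,
                           kappa1=0.1, kappa2=0.1, trials=200):
    snr = 10 ** (snr_db / 10)
    # Distortion noise shrinks each hop's usable SNR: gamma / (1 + gamma*kappa^2)
    eff1 = snr / (1 + snr * kappa1 ** 2)
    eff2 = snr / (1 + snr * kappa2 ** 2)
    # Scalar DH-AF combining rule used here as a crude end-to-end SNR proxy
    e2e_snr = eff1 * eff2 / (eff1 + eff2 + 1)
    caps = []
    for _ in range(trials):
        # i.i.d. Rayleigh hops: users -> relay (H1), relay -> destination (H2)
        H1 = (rng.standard_normal((n_relay, n_users)) +
              1j * rng.standard_normal((n_relay, n_users))) / np.sqrt(2)
        H2 = (rng.standard_normal((n_users, n_relay)) +
              1j * rng.standard_normal((n_users, n_relay))) / np.sqrt(2)
        Heff = H2 @ H1 / np.sqrt(n_relay)          # normalized end-to-end channel
        _, logdet = np.linalg.slogdet(np.eye(n_users) +
                                      (e2e_snr / n_users) * (Heff @ Heff.conj().T))
        caps.append(logdet / np.log(2))
    return float(np.mean(caps))

print("ideal hardware       :", round(ergodic_capacity_dh_af(kappa1=0.0, kappa2=0.0), 2))
print("impaired transceivers:", round(ergodic_capacity_dh_af(kappa1=0.15, kappa2=0.15), 2))
```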
Towards Massive Machine Type Communications in Ultra-Dense Cellular IoT Networks: Current Issues and Machine Learning-Assisted Solutions
The ever-increasing number of resource-constrained Machine-Type Communication (MTC) devices is leading to the critical challenge of fulfilling diverse communication requirements in dynamic and ultra-dense wireless environments. Among the different application scenarios that the upcoming 5G and beyond cellular networks are expected to support, such as enhanced Mobile Broadband (eMBB), massive Machine Type Communications (mMTC) and Ultra-Reliable and Low Latency Communications (URLLC), mMTC brings the unique technical challenge of supporting a huge number of MTC devices, which is the main focus of this paper. The related challenges include Quality of Service (QoS) provisioning, handling highly dynamic and sporadic MTC traffic, huge signalling overhead and Radio Access Network (RAN) congestion. In this regard, this paper aims to identify and analyze the involved technical issues, to review recent advances, to highlight potential solutions and to propose new research directions. First, starting with an overview of mMTC features and QoS provisioning issues, we present the key enablers for mMTC in cellular networks. Along with highlighting the inefficiency of the legacy Random Access (RA) procedure in the mMTC scenario, we then present the key features and channel access mechanisms of the emerging cellular IoT standards, namely LTE-M and Narrowband IoT (NB-IoT). Subsequently, we present a framework for the performance analysis of transmission scheduling with QoS support, along with the issues involved in short data packet transmission. Next, we provide a detailed overview of existing and emerging solutions towards addressing the RAN congestion problem, and then identify potential advantages, challenges and use cases for the application of emerging Machine Learning (ML) techniques in ultra-dense cellular networks. Among several ML techniques, we focus on the application of a low-complexity Q-learning approach in the mMTC scenario, along with recent advances towards enhancing its learning performance and convergence. Finally, we discuss some open research challenges and promising future research directions.
Comment: 37 pages, 8 figures, 7 tables, submitted for possible future publication in IEEE Communications Surveys and Tutorials
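To make the RAN congestion bottleneck concrete, a minimal sketch of the legacy contention-based RA procedure follows: if N devices each pick one of M preambles uniformly at random in a RA opportunity, a device succeeds only when no other device picks the same preamble. The value M = 54 is a typical number of LTE contention preambles; the device counts are illustrative and not taken from the paper.

```python
# Hedged sketch: per-device success probability of legacy contention-based RA
# under growing MTC load, assuming uniform random preamble selection.
M = 54  # contention preambles per RA opportunity (typical LTE value)

for N in (10, 50, 100, 500, 1000):
    p_success = (1 - 1 / M) ** (N - 1)   # no other device picks the same preamble
    expected_ok = N * p_success          # expected collision-free devices per opportunity
    print(f"N={N:5d}  P(success)={p_success:.3f}  expected successes={expected_ok:.1f}")
```

The expected number of successful devices peaks when the load is on the order of the number of preambles and then collapses, which is the congestion behaviour the survey addresses.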
Sensing-Throughput Tradeoff for Interweave Cognitive Radio System: A Deployment-Centric Viewpoint
Secondary access to the licensed spectrum is viable only if interference is avoided at the primary system. In this regard, different paradigms have been conceptualized in the existing literature. Of these, Interweave Systems (ISs) that employ spectrum sensing have been widely investigated. Baseline models investigated in the literature characterize the performance of an IS in terms of a sensing-throughput tradeoff; however, this characterization assumes knowledge of the involved channels at the secondary transmitter, which is unavailable in practice. Motivated by this fact, we establish a novel approach that incorporates channel estimation in the system model, and consequently investigate the impact of imperfect channel estimation on the performance of the IS. More particularly, the variation induced in the detection probability affects the detector's performance at the secondary transmitter, which may result in severe interference to the primary users. In view of this, we propose to employ average and outage constraints on the detection probability in order to capture the performance of the IS. Our analysis reveals that, with an appropriate choice of the estimation time determined by the proposed model, the degradation in the performance of the IS can be effectively controlled, and subsequently the achievable secondary throughput can be significantly enhanced.
Comment: 13 pages, 10 figures, accepted to be published in IEEE Transactions on Wireless Communications
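For context, the baseline sensing-throughput tradeoff that the abstract builds on can be sketched as follows: an energy detector senses for tau seconds out of a frame of length T under a fixed detection-probability constraint, and the resulting false-alarm probability reduces the usable secondary throughput. This is the classical perfect-CSI formulation, not the channel-estimation extension developed in the paper, and all parameter values below are illustrative.

```python
# Hedged sketch of the classical sensing-throughput tradeoff for an energy
# detector with a target detection probability; numbers are illustrative.
import numpy as np
from scipy.stats import norm

T = 0.1                      # frame duration [s]
fs = 1e6                     # sampling rate [Hz]
snr_p = 10 ** (-10 / 10)     # received primary SNR at the detector (-10 dB)
pd_target = 0.9              # detection-probability constraint
c0 = np.log2(1 + 10)         # secondary rate when the channel is idle [bits/s/Hz]
p_idle = 0.8                 # probability that the primary channel is idle

taus = np.linspace(1e-4, 0.05, 500)
# False-alarm probability of the energy detector at the target Pd
pfa = norm.sf(np.sqrt(2 * snr_p + 1) * norm.isf(pd_target)
              + np.sqrt(taus * fs) * snr_p)
# Achievable secondary throughput (idle-and-correctly-detected term only)
R = (T - taus) / T * c0 * (1 - pfa) * p_idle

best = np.argmax(R)
print(f"optimal sensing time ~ {taus[best]*1e3:.2f} ms, "
      f"throughput ~ {R[best]:.3f} bits/s/Hz")
```

Sensing longer drives the false-alarm probability down but leaves less of the frame for data transmission, which is exactly the tradeoff the paper revisits under imperfect channel estimation.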
Multiple Access Techniques for Next Generation Wireless: Recent Advances and Future Perspectives
Advances in multiple access techniques have been one of the key drivers in moving from one cellular generation to another. Starting from the first generation, several multiple access techniques have been explored across generations, and various emerging multiplexing/multiple access techniques are being investigated for the next generation of cellular networks. In this context, this paper first provides a detailed review of existing work related to Space Division Multiple Access (SDMA). Subsequently, it highlights the main features and the drawbacks of various existing and emerging multiplexing/multiple access techniques. Finally, we propose a novel concept of clustered orthogonal signature division multiple access for the next generation of cellular networks. The proposed concept envisions employing joint antenna coding in order to enhance the orthogonality of SDMA beams, with the objective of improving the spectral efficiency of future cellular networks.
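As a point of reference for the notion of SDMA beam orthogonality mentioned above, the short sketch below builds standard zero-forcing beams from the pseudo-inverse of a multi-user channel and checks the residual inter-user leakage. It is only a conventional baseline with illustrative dimensions, not the proposed clustered orthogonal signature scheme.

```python
# Hedged sketch: a standard zero-forcing SDMA baseline. Each user's beam nulls
# the other users, so the effective channel H @ W is (close to) diagonal.
import numpy as np

rng = np.random.default_rng(1)
n_tx, n_users = 8, 4

# Multi-user downlink channel: one row per single-antenna user
H = (rng.standard_normal((n_users, n_tx)) +
     1j * rng.standard_normal((n_users, n_tx))) / np.sqrt(2)

# Zero-forcing beams: columns of the pseudo-inverse, normalized to unit power
W = np.linalg.pinv(H)
W /= np.linalg.norm(W, axis=0, keepdims=True)

# Off-diagonal entries measure inter-user leakage (ideally zero)
leakage = np.abs(H @ W)
print(np.round(leakage, 3))
```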
Live Data Analytics with Collaborative Edge and Cloud Processing in Wireless IoT Network
Recently, big data analytics has received significant attention in a variety of application domains including business, finance, space science, healthcare, telecommunications and the Internet of Things (IoT). Among these areas, IoT is considered an important platform for bringing people, processes, data and things/objects together in order to enhance the quality of our everyday lives. However, the key challenges are how to effectively extract useful features from the massive amount of heterogeneous data generated by resource-constrained IoT devices in order to provide real-time information and feedback to the end-users, and how to utilize this data-aware intelligence to enhance the performance of wireless IoT networks. Although there are parallel advances in cloud computing and edge computing for addressing some issues in data analytics, they have their own benefits and limitations. The convergence of these two computing paradigms, i.e., the massive virtually shared pool of computing and storage resources of the cloud and the real-time data processing of edge computing, could effectively enable live data analytics in wireless IoT networks.
In this regard, we propose a novel framework for coordinated processing between edge and cloud computing by integrating the advantages of both platforms. The proposed framework can exploit the network-wide knowledge and historical information available at the cloud center to guide edge computing units towards satisfying the various performance requirements of heterogeneous wireless IoT networks. Starting with the main features, key enablers and challenges of big data analytics, we discuss various synergies and distinctions between cloud and edge processing. More importantly, we identify and describe the potential key enablers for the proposed edge-cloud collaborative framework, the associated key challenges and some interesting future research directions.
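A minimal sketch of the edge-cloud division of labour described above follows, assuming a simple anomaly-detection workload: the edge unit filters raw samples in real time and forwards only compact summaries, while the cloud aggregates network-wide statistics and pushes an updated threshold back to the edge. All class names and the threshold rule are hypothetical illustrations, not an interface defined in the paper.

```python
# Hedged sketch of coordinated edge-cloud processing for IoT data analytics.
# EdgeUnit and CloudCenter are hypothetical names used only for illustration.
import numpy as np

class EdgeUnit:
    def __init__(self, threshold=3.0):
        self.threshold = threshold          # anomaly threshold pushed by the cloud

    def process(self, raw_samples):
        """Real-time local step: keep a compact summary plus anomalous samples."""
        raw = np.asarray(raw_samples, dtype=float)
        z = (raw - raw.mean()) / (raw.std() + 1e-9)
        anomalies = raw[np.abs(z) > self.threshold]
        summary = {"mean": raw.mean(), "var": raw.var(), "count": raw.size}
        return summary, anomalies            # summary goes to the cloud

class CloudCenter:
    def __init__(self):
        self.history = []

    def aggregate(self, summaries):
        """Non-real-time step: fuse edge summaries and tune the edge threshold."""
        self.history.extend(summaries)
        global_var = np.mean([s["var"] for s in self.history])
        # Illustrative rule: tighter threshold when network-wide variance is low
        return 2.5 if global_var < 1.0 else 3.5

edge, cloud = EdgeUnit(), CloudCenter()
summary, alerts = edge.process(np.random.default_rng(2).normal(size=1000))
edge.threshold = cloud.aggregate([summary])
print("local alerts:", alerts.size, "| new edge threshold:", edge.threshold)
```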
Collaborative Distributed Q-Learning for RACH Congestion Minimization in Cellular IoT Networks
Due to infrequent but massive concurrent access requests from the ever-increasing number of machine-type communication (MTC) devices, the existing contention-based random access (RA) protocols, such as slotted ALOHA, suffer from the severe problem of random access channel (RACH) congestion in emerging cellular IoT networks. To address this issue, we propose a novel collaborative distributed Q-learning mechanism for resource-constrained MTC devices in order to enable them to find unique RA slots for their transmissions, so that the number of possible collisions can be significantly reduced. In contrast to the independent Q-learning scheme, the proposed approach utilizes the congestion level of the RA slots as the global cost during the learning process and thus can notably lower the learning time for low-end MTC devices. Our results show that the proposed learning scheme can significantly minimize RACH congestion in cellular IoT networks.
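A minimal sketch of the learning mechanism described above is given below: each device keeps a Q-value per RA slot, selects slots epsilon-greedily, and updates its estimate using the broadcast congestion level of the chosen slot as the shared cost, so that collision-free slots are reinforced. The reward shaping, parameter values and convergence check are illustrative simplifications rather than the paper's exact algorithm.

```python
# Hedged sketch of collaborative distributed Q-learning for RA slot selection.
# The congestion level of each slot is assumed to be broadcast by the base
# station and used as the shared cost; all parameters are illustrative.
import numpy as np

rng = np.random.default_rng(3)
n_devices, n_slots, n_frames = 20, 20, 300
alpha, epsilon = 0.1, 0.05
Q = 0.01 * rng.random((n_devices, n_slots))   # small random init breaks ties

for frame in range(n_frames):
    # epsilon-greedy slot selection per device
    explore = rng.random(n_devices) < epsilon
    choices = np.where(explore,
                       rng.integers(0, n_slots, n_devices),
                       Q.argmax(axis=1))
    # congestion level of each slot (the "global cost" fed back to devices)
    load = np.bincount(choices, minlength=n_slots)
    # reward: +1 for a collision-free slot, otherwise minus the excess congestion
    reward = np.where(load[choices] == 1, 1.0, -(load[choices] - 1.0))
    idx = np.arange(n_devices)
    Q[idx, choices] += alpha * (reward - Q[idx, choices])

final = Q.argmax(axis=1)
shared = int(np.sum(np.bincount(final, minlength=n_slots) > 1))
print("slots still chosen by more than one device after learning:", shared)
```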
Towards Tactile Internet in Beyond 5G Era: Recent Advances, Current Issues and Future Directions
The Tactile Internet (TI) is envisioned to create a paradigm shift from content-oriented communications to steering/control-based communications by enabling the real-time transmission of haptic information (i.e., touch, actuation, motion, vibration, surface texture) over the Internet, in addition to conventional audiovisual and data traffic. This emerging TI technology, also considered the next evolutionary phase of the Internet of Things (IoT), is expected to create numerous opportunities for technology markets in a wide variety of applications ranging from teleoperation systems and Augmented/Virtual Reality (AR/VR) to automotive safety and eHealthcare, towards addressing the complex problems of human society. However, the realization of TI over wireless media in the upcoming Fifth Generation (5G) and beyond networks creates various non-conventional communication challenges and stringent requirements in terms of ultra-low latency, ultra-high reliability, high data-rate connectivity, resource allocation, multiple access and the quality-latency-rate tradeoff. To this end, this paper aims to provide a holistic view of wireless TI along with a thorough review of the existing state of the art, to identify and analyze the involved technical issues, to highlight potential solutions and to propose future research directions. First, starting with the vision of TI, recent advances and a review of related survey/overview articles, we present a generalized framework for wireless TI in the beyond-5G era, including a TI architecture, the main technical requirements, the key application areas and potential enabling technologies. Subsequently, we provide a comprehensive review of existing TI works by broadly categorizing them into three main paradigms, namely haptic communications, wireless AR/VR, and autonomous, intelligent and cooperative mobility systems. Next, potential enabling technologies across the physical/Medium Access Control (MAC) and network layers are identified and discussed in detail. In addition, security and privacy issues of TI applications are discussed along with some promising enablers. Finally, we present some open research challenges and recommend promising future research directions.
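As a hedged back-of-the-envelope illustration of why the latency requirement pushes TI computation towards the network edge, the sketch below subtracts typical radio-access and processing delays from a 1 ms round-trip budget and converts the remainder into a maximum fibre distance to the serving compute node. All delay components are indicative values, not figures from the paper.

```python
# Hedged latency-budget sketch for a ~1 ms Tactile Internet control loop.
# The per-component delays and fibre speed are typical illustrative values.
target_rtt_ms = 1.0                 # end-to-end round-trip budget for haptic control
air_interface_ms = 0.4              # uplink + downlink radio access (2 x 0.2 ms)
processing_ms = 0.3                 # baseband, core and application processing
fibre_speed_km_per_ms = 200.0       # roughly 2/3 of the speed of light in fibre

propagation_budget_ms = target_rtt_ms - air_interface_ms - processing_ms
max_server_distance_km = propagation_budget_ms / 2 * fibre_speed_km_per_ms

print(f"propagation budget: {propagation_budget_ms:.2f} ms round trip")
print(f"max distance to the serving compute node: ~{max_server_distance_km:.0f} km")
```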
Emerging Edge Computing Technologies for Distributed Internet of Things (IoT) Systems
The ever-increasing growth in the number of connected smart devices and various Internet of Things (IoT) verticals is leading to the crucial challenge of handling the massive amount of raw data generated by distributed IoT systems and providing real-time feedback to the end-users. Although the existing cloud-computing paradigm has an enormous amount of virtual computing power and storage capacity, it is not suitable for latency-sensitive applications and distributed systems due to the involved latency and its centralized mode of operation. To this end, edge/fog computing has recently emerged as the next generation of computing systems for extending cloud-computing functions to the edges of the network. Despite several benefits of edge computing such as geo-distribution, mobility support and location awareness, various communication- and computing-related challenges need to be addressed in realizing edge computing technologies for future IoT systems. In this regard, this paper provides a holistic view of the current issues and effective solutions by classifying the emerging technologies with regard to the joint coordination of radio and computing resources, system optimization and intelligent resource management. Furthermore, an optimization framework for edge-IoT systems is proposed to enhance various performance metrics such as throughput, delay, resource utilization and energy consumption. Finally, a Machine Learning (ML) based case study is presented along with some numerical results to illustrate the significance of edge computing.
Comment: 16 pages, 4 figures, 2 tables, submitted to IEEE Wireless Communications Magazine
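To illustrate the kind of trade-off such an optimization framework weighs, the sketch below compares local execution of an IoT task against offloading it to an edge server, using the standard cycles/rate/power formulation of mobile edge computing with illustrative numbers; it is not the specific framework or case study of the paper.

```python
# Hedged sketch of a latency/energy offloading comparison for an edge-IoT task.
# Task size, CPU speeds, uplink rate and power draws are illustrative values.
def offload_decision(task_bits=2e6, task_cycles=1e9,
                     f_local=0.5e9, f_edge=10e9,        # CPU speeds [cycles/s]
                     uplink_rate=5e6,                    # radio uplink [bits/s]
                     p_cpu=0.8, p_tx=1.2):               # device power draws [W]
    # Local execution: all cycles run on the device CPU
    t_local = task_cycles / f_local
    e_local = p_cpu * t_local
    # Offloading: transmit the input, then execute on the (faster) edge server
    t_offload = task_bits / uplink_rate + task_cycles / f_edge
    e_offload = p_tx * (task_bits / uplink_rate)         # device only pays for TX
    choice = "offload" if t_offload < t_local else "local"
    return choice, (t_local, e_local), (t_offload, e_offload)

choice, local, offload = offload_decision()
print(f"decision: {choice}")
print(f"local   : {local[0]:.2f} s, {local[1]:.2f} J")
print(f"offload : {offload[0]:.2f} s, {offload[1]:.2f} J")
```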