Towards a Realistic Assessment of Multiple Antenna HCNs: Residual Additive Transceiver Hardware Impairments and Channel Aging
Given the critical dependence of broadcast channels on the accuracy of
channel state information at the transmitter (CSIT), we develop a general
downlink model with zero-forcing (ZF) precoding, applied in realistic
heterogeneous cellular systems with multiple antenna base stations (BSs).
Specifically, we take into consideration imperfect CSIT due to pilot
contamination, channel aging due to the users' relative movement, and unavoidable
residual additive transceiver hardware impairments (RATHIs). Assuming that the
BSs are Poisson distributed, our main contributions are the derivation of an
upper bound on the coverage probability and of the achievable user rate for
this general model. We show that both the coverage probability and the user
rate are dependent on the imperfect CSIT and RATHIs. More concretely, we
quantify the resultant performance loss of the network due to these effects. We
demonstrate that the uplink RATHIs have an equal impact, while the downlink transmit BS
distortion has a greater impact than the receive hardware impairment of the
user. Thus, the transmit BS hardware should be of better quality than the user's
receive hardware. Furthermore, we characterise both the coverage probability
and user rate in terms of the time variation of the channel. It is shown that
both of them decrease with increasing user mobility, but after a specific value
of the normalised Doppler shift, they increase again. This non-monotonic
behaviour mirrors the Jakes autocorrelation function, which models the time
variation of the channel. Finally, we consider space division
multiple access (SDMA), single user beamforming (SU-BF), and baseline
single-input single-output (SISO) transmission. A comparison among these
schemes reveals that SU-BF outperforms SDMA in terms of coverage.
Comment: accepted in IEEE TV
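The non-monotonic effect of user mobility described above can be illustrated numerically. Under the Jakes model, the temporal correlation of the channel is rho = J0(2*pi*fD*Ts), where fD*Ts is the normalised Doppler shift and J0 is the zeroth-order Bessel function of the first kind. The sketch below (an illustration of the standard Jakes correlation, not the paper's derivation; the function name `j0` and the sampled shift values are my own choices) shows |rho| dropping towards zero and then partially recovering, which is consistent with coverage first degrading and then improving with mobility.

```python
import math

def j0(x, n=1000):
    # Bessel J0 via its integral representation:
    # J0(x) = (1/pi) * integral_0^pi cos(x * sin(theta)) d(theta),
    # evaluated with the composite trapezoidal rule on n subintervals.
    h = math.pi / n
    s = 0.5 * (math.cos(x * math.sin(0.0)) + math.cos(x * math.sin(math.pi)))
    for k in range(1, n):
        s += math.cos(x * math.sin(k * h))
    return s * h / math.pi

# Channel-aging correlation under the Jakes model:
# rho = J0(2 * pi * fD * Ts), with fD*Ts the normalised Doppler shift.
for fD_Ts in (0.0, 0.2, 0.4, 0.6, 0.8):
    rho = j0(2 * math.pi * fD_Ts)
    print(f"fD*Ts = {fD_Ts:.1f}  rho = {rho:+.3f}")
```

Because J0 oscillates with decaying amplitude, |rho| is smallest near the first zero of J0 (around fD*Ts of roughly 0.38) and grows again beyond it, rather than decaying monotonically.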
Learning and Management for Internet-of-Things: Accounting for Adaptivity and Scalability
Internet-of-Things (IoT) envisions an intelligent infrastructure of networked
smart devices offering task-specific monitoring and control services. The
unique features of IoT include extreme heterogeneity, a massive number of
devices, and unpredictable dynamics partially due to human interaction. These
call for foundational innovations in network design and management. Ideally, it
should allow efficient adaptation to changing environments, and low-cost
implementation scalable to a massive number of devices, subject to stringent
latency constraints. To this end, the overarching goal of this paper is to
outline a unified framework for online learning and management policies in IoT
through joint advances in communication, networking, learning, and
optimization. From the network architecture vantage point, the unified
framework leverages a promising fog architecture that enables smart devices to
have proximity access to cloud functionalities at the network edge, along the
cloud-to-things continuum. From the algorithmic perspective, key innovations
target online approaches adaptive to different degrees of nonstationarity in
IoT dynamics, and their scalable model-free implementation under limited
feedback, which motivates blind or bandit approaches. The proposed framework
aspires to offer a stepping stone that leads to systematic designs and analysis
of task-specific learning and management schemes for IoT, along with a host of
new research directions to build on.
Comment: Submitted on June 15 to the Proceedings of the IEEE Special Issue on
Adaptive and Scalable Communication Networks
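The "bandit approaches under limited feedback" motivated above can be made concrete with a minimal example: a learner repeatedly chooses one of several options and observes a noisy reward only for the chosen option. The sketch below is a generic epsilon-greedy bandit, not an algorithm from the paper; the function name, reward means, and parameter values are illustrative assumptions.

```python
import random

def epsilon_greedy_bandit(means, epsilon=0.1, horizon=5000, seed=0):
    # Minimal epsilon-greedy multi-armed bandit: with probability epsilon
    # explore a random arm, otherwise exploit the current best estimate.
    # Feedback is "limited": only the chosen arm's noisy reward is observed.
    rng = random.Random(seed)
    k = len(means)
    counts = [0] * k
    estimates = [0.0] * k          # running mean reward per arm
    total = 0.0
    for _ in range(horizon):
        if rng.random() < epsilon:
            arm = rng.randrange(k)                                # explore
        else:
            arm = max(range(k), key=lambda a: estimates[a])       # exploit
        reward = means[arm] + rng.gauss(0.0, 0.1)  # noisy bandit feedback
        counts[arm] += 1
        estimates[arm] += (reward - estimates[arm]) / counts[arm]
        total += reward
    return estimates, total / horizon

# e.g. three candidate management policies with unknown mean utilities
est, avg = epsilon_greedy_bandit([0.2, 0.5, 0.8])
```

The learner converges on the best policy without ever observing the unchosen options' rewards, which is the appeal of bandit methods when full feedback is too costly to collect at IoT scale.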
User Association in 5G Networks: A Survey and an Outlook
26 pages; accepted to appear in IEEE Communications Surveys and Tutorials