Low-latency Ultra Reliable 5G Communications: Finite-Blocklength Bounds and Coding Schemes
Future autonomous systems require wireless connectivity able to support
extremely stringent requirements on both latency and reliability. In this
paper, we leverage recent developments in the field of finite-blocklength
information theory to illustrate how to optimally design wireless systems in
the presence of such stringent constraints. Focusing on a multi-antenna
Rayleigh block-fading channel, we obtain bounds on the maximum number of bits
that can be transmitted within given bandwidth, latency, and reliability
constraints, using an orthogonal frequency-division multiplexing system similar
to LTE. These bounds unveil the fundamental interplay between latency,
bandwidth, rate, and reliability. Furthermore, they suggest how to optimally
use the available spatial and frequency diversity. Finally, we use our bounds
to benchmark the performance of an actual coding scheme involving the
transmission of short packets.
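The interplay between blocklength (latency), reliability, and rate that such bounds capture can be previewed with the well-known normal approximation from finite-blocklength information theory, sketched here for a complex AWGN channel. This is only an illustration under assumed parameters, not the paper's (tighter, Rayleigh block-fading) bounds:

```python
# Illustrative sketch (not the paper's bounds): the normal approximation
#   log M*(n, eps) ~ n*C - sqrt(n*V) * Qinv(eps) + 0.5*log2(n)
# for a complex AWGN channel, showing how fewer bits per channel use are
# supportable at short blocklengths (low latency) and high reliability.
from statistics import NormalDist
import math

def max_bits_awgn(n, snr, eps):
    """Approximate max bits transmittable in n channel uses at error prob eps."""
    cap = math.log2(1 + snr)                               # capacity, bits/use
    # channel dispersion of the complex AWGN channel, bits^2/use
    disp = snr * (snr + 2) / (1 + snr) ** 2 * math.log2(math.e) ** 2
    qinv = NormalDist().inv_cdf(1 - eps)                   # Q^{-1}(eps)
    return n * cap - math.sqrt(n * disp) * qinv + 0.5 * math.log2(n)

snr = 10 ** (10 / 10)        # assumed 10 dB SNR
for n in (100, 500, 2000):   # shorter blocklength -> lower rate at fixed eps
    print(n, round(max_bits_awgn(n, snr, 1e-5) / n, 3), "bits/use")
```

At a fixed error probability, the achievable rate approaches capacity only as the blocklength grows, which is exactly the latency/reliability penalty the abstract refers to.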
Enhanced Machine Learning Techniques for Early HARQ Feedback Prediction in 5G
We investigate Early Hybrid Automatic Repeat reQuest (E-HARQ) feedback
schemes enhanced by machine learning techniques as a path towards
ultra-reliable and low-latency communication (URLLC). To this end, we propose
machine learning methods to predict the outcome of the decoding process ahead
of the end of the transmission. We discuss different input features and
classification algorithms ranging from traditional methods to newly developed
supervised autoencoders. These methods are evaluated based on their prospects
of complying with the URLLC requirements on effective block error rate at
small latency overheads. We provide realistic performance
estimates in a system model incorporating scheduling effects to demonstrate the
feasibility of E-HARQ across different signal-to-noise ratios, subcode lengths,
channel conditions and system loads, and show the benefit over regular HARQ and
existing E-HARQ schemes without machine learning. Comment: 14 pages, 15 figures; accepted version
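The core idea of E-HARQ prediction can be sketched with a toy classifier. Everything below is assumed for illustration (the feature, the data, and the classifier), not the paper's method: a single partial-reception feature, the mean absolute LLR over an early subcode, is fed to a simple class-mean threshold standing in for the paper's classifiers:

```python
# Illustrative sketch (not the paper's models): predict decoding success
# before the transmission ends. Assumption: successful decodings tend to
# produce larger mean |LLR| over the first received subcode; a threshold
# halfway between the class means then separates the two outcomes.
import random

random.seed(0)

def sample():
    ok = random.random() < 0.5              # ground truth: decoding will succeed
    mu = 2.0 if ok else 1.0                 # assumed: success -> larger |LLR|
    feat = sum(abs(random.gauss(mu, 1.0)) for _ in range(64)) / 64
    return feat, ok

data = [sample() for _ in range(2000)]
mean_ok = sum(f for f, ok in data if ok) / sum(1 for _, ok in data if ok)
mean_ko = sum(f for f, ok in data if not ok) / sum(1 for _, ok in data if not ok)
thr = (mean_ok + mean_ko) / 2               # decision threshold between class means

acc = sum((f > thr) == ok for f, ok in data) / len(data)
print(f"early-feedback prediction accuracy: {acc:.3f}")
```

A correct early "NACK" prediction lets the transmitter start the retransmission before decoding finishes, which is where the latency saving over regular HARQ comes from.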
Single-Carrier Modulation versus OFDM for Millimeter-Wave Wireless MIMO
This paper presents results on the achievable spectral efficiency and on the
energy efficiency for a wireless multiple-input-multiple-output (MIMO) link
operating at millimeter-wave (mmWave) frequencies in a typical 5G scenario. Two
different single-carrier modem schemes are considered, i.e., a traditional
modulation scheme with linear equalization at the receiver, and a
single-carrier modulation with cyclic prefix, frequency-domain equalization and
FFT-based processing at the receiver; these two schemes are compared with a
conventional MIMO-OFDM transceiver structure. Our analysis jointly takes into
account the peculiar characteristics of MIMO channels at mmWave frequencies,
the use of hybrid (analog-digital) pre-coding and post-coding beamformers, the
finite cardinality of the modulation structure, and the non-linear behavior of
the transmitter power amplifiers. Our results show that the best performance is
achieved by single-carrier modulation with time-domain equalization, which
exhibits the smallest loss due to the non-linear distortion, and whose
performance can be further improved by using advanced equalization schemes.
Results also confirm that performance gets severely degraded when the link
length exceeds 90-100 meters and the transmit power falls below 0 dBW. Comment: accepted for publication in IEEE Transactions on Communications
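The cyclic-prefix/FDE scheme the abstract compares against can be sketched end to end: prepend a cyclic prefix, let the channel act by linear convolution, drop the prefix (which makes the convolution circular), and equalize with one complex tap per frequency bin. The 2-tap channel and BPSK block below are assumptions for illustration; the DFT is written out directly so the sketch is self-contained:

```python
# Illustrative sketch (not the paper's transceiver): single-carrier block
# transmission with a cyclic prefix and one-tap zero-forcing frequency-domain
# equalization (FDE), on an assumed noiseless 2-tap channel.
import cmath

def dft(x, inverse=False):
    n, s = len(x), (1 if inverse else -1)
    out = [sum(x[k] * cmath.exp(s * 2j * cmath.pi * j * k / n) for k in range(n))
           for j in range(n)]
    return [v / n for v in out] if inverse else out

h = [1.0, 0.4]                       # assumed channel impulse response (2 taps)
x = [1, -1, 1, 1, -1, 1, -1, -1]     # one BPSK block
cp = 2                               # cyclic prefix longer than channel memory

tx = x[-cp:] + x                     # prepend cyclic prefix
rx = [sum(h[t] * tx[i - t] for t in range(len(h)) if i - t >= 0)
      for i in range(len(tx))]       # linear convolution with the channel
rx = rx[cp:]                         # drop CP -> convolution is now circular

H = dft(h + [0.0] * (len(x) - len(h)))   # channel frequency response
Y = dft(rx)
xhat = dft([Y[k] / H[k] for k in range(len(x))], inverse=True)  # one-tap ZF
print([round(v.real) for v in xhat])     # recovers the BPSK block
```

The attraction at mmWave is complexity: equalization costs one division per bin plus FFTs, instead of a long time-domain filter.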
Capacity-Achieving Iterative LMMSE Detection for MIMO-NOMA Systems
This paper considers an iterative Linear Minimum Mean Square Error (LMMSE)
detection for the uplink Multiuser Multiple-Input and Multiple-Output (MU-MIMO)
systems with Non-Orthogonal Multiple Access (NOMA). The iterative LMMSE
detection greatly reduces the system's computational complexity by partitioning
the overall processing into many low-complexity distributed calculations. However,
it is generally considered to be sub-optimal and achieves relatively poor
performance. In this paper, we first present the matching conditions and area
theorems for the iterative detection of the MIMO-NOMA systems. Based on the
proposed matching conditions and area theorems, the achievable rate region of
the iterative LMMSE detection is analysed. We prove that a properly designed
iterative LMMSE detector can achieve (i) the optimal sum capacity of
MU-MIMO systems, (ii) all the maximal extreme points in the capacity region of
MU-MIMO systems, and (iii) the whole capacity region of two-user MIMO systems. Comment: 6 pages, 5 figures, accepted by IEEE ICC 2016, 23-27 May 2016, Kuala Lumpur, Malaysia
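The building block the paper iterates is the LMMSE estimate x_hat = (H^T H + s2*I)^{-1} H^T y. A one-shot version for a real 2x2 system is sketched below; the channel matrix, symbols, and noise variance are assumptions for illustration, and the paper's contribution (the distributed iterative structure and its matching conditions) is not reproduced here:

```python
# Illustrative sketch (not the paper's iterative receiver): one-shot LMMSE
# detection x_hat = (H^T H + s2*I)^{-1} H^T y for a real 2x2 system,
# with the 2x2 inverse written out explicitly.
def lmmse_2x2(H, y, s2):
    # A = H^T H + s2*I (symmetric 2x2), b = H^T y
    a = H[0][0]**2 + H[1][0]**2 + s2
    c = H[0][0]*H[0][1] + H[1][0]*H[1][1]
    d = H[0][1]**2 + H[1][1]**2 + s2
    b0 = H[0][0]*y[0] + H[1][0]*y[1]
    b1 = H[0][1]*y[0] + H[1][1]*y[1]
    det = a * d - c * c
    return [(d*b0 - c*b1) / det, (a*b1 - c*b0) / det]   # A^{-1} b

H = [[1.0, 0.3], [0.2, 1.0]]       # assumed channel (columns = users)
x = [1.0, -1.0]                    # transmitted BPSK symbols
y = [H[0][0]*x[0] + H[0][1]*x[1],  # noiseless received signal y = H x
     H[1][0]*x[0] + H[1][1]*x[1]]
est = lmmse_2x2(H, y, 0.01)        # small s2 -> near-exact recovery
print([round(v) for v in est])
```

The regularizer s2 (the noise variance) is what distinguishes LMMSE from zero-forcing and keeps the inverse well behaved for ill-conditioned channels.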
Spectral Efficiency of MIMO Millimeter-Wave Links with Single-Carrier Modulation for 5G Networks
Future wireless networks will extensively rely upon bandwidths centered on
carrier frequencies larger than 10 GHz. Indeed, recent research has shown that,
despite the large path-loss, millimeter wave (mmWave) frequencies can be
successfully exploited to transmit very large data-rates over short distances
to slowly moving users. Due to hardware complexity and cost constraints,
single-carrier modulation schemes, as opposed to the popular multi-carrier
schemes, are being considered for use at mmWave frequencies. This paper
presents preliminary studies on the achievable spectral efficiency on a
wireless MIMO link operating at mmWave in a typical 5G scenario. Two different
single-carrier modem schemes are considered, i.e., a traditional modulation
scheme with linear equalization at the receiver, and a single-carrier
modulation with cyclic prefix, frequency-domain equalization and FFT-based
processing at the receiver. Our results show that the former achieves a larger
spectral efficiency than the latter. Results also confirm that the spectral
efficiency increases with the dimension of the antenna array, as well as that
performance gets severely degraded when the link length exceeds 100 meters and
the transmit power falls below 0 dBW. Nonetheless, mmWave frequencies appear to
be well suited for providing very large data-rates over short distances. Comment: 8 pages, 8 figures, to appear in Proc. 20th International ITG Workshop on Smart Antennas (WSA 2016)
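Spectral efficiency in studies like this is typically the log-det mutual information with equal power allocation, log2 det(I + (snr/Nt) H H^H). A minimal sketch for a real-valued 2x2 channel follows; the channel matrix and SNR are assumptions, and the closed-form 2x2 determinant keeps it dependency-free:

```python
# Illustrative sketch (not the paper's evaluation): spectral efficiency of a
# MIMO link with equal power allocation,
#   SE = log2 det(I + (snr/Nt) * H H^T),
# computed for an assumed real 2x2 channel (Nt = 2).
import math

def se_2x2(H, snr):
    g00 = 1 + snr / 2 * (H[0][0]**2 + H[0][1]**2)   # I + (snr/2) H H^T ...
    g01 = snr / 2 * (H[0][0]*H[1][0] + H[0][1]*H[1][1])
    g11 = 1 + snr / 2 * (H[1][0]**2 + H[1][1]**2)
    return math.log2(g00 * g11 - g01 * g01)          # log2 of the 2x2 det

H = [[1.0, 0.1], [0.2, 0.9]]      # assumed well-conditioned channel
snr = 10 ** (10 / 10)             # 10 dB
print(round(se_2x2(H, snr), 2), "b/s/Hz")
```

With a well-conditioned channel the 2x2 link exceeds what a single unit-gain antenna achieves at the same SNR, which is the array-dimension gain the abstract reports.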
5G Wireless Network Slicing for eMBB, URLLC, and mMTC: A Communication-Theoretic View
The grand objective of 5G wireless technology is to support three generic
services with vastly heterogeneous requirements: enhanced mobile broadband
(eMBB), massive machine-type communications (mMTC), and ultra-reliable
low-latency communications (URLLC). Service heterogeneity can be accommodated
by network slicing, through which each service is allocated resources to
provide performance guarantees and isolation from the other services. Slicing
of the Radio Access Network (RAN) is typically done by means of orthogonal
resource allocation among the services. This work studies the potential
advantages of allowing for non-orthogonal sharing of RAN resources in uplink
communications from a set of eMBB, mMTC and URLLC devices to a common base
station. The approach is referred to as Heterogeneous Non-Orthogonal Multiple
Access (H-NOMA), in contrast to the conventional NOMA techniques that involve
users with homogeneous requirements and hence can be investigated through a
standard multiple access channel. The study devises a communication-theoretic
model that accounts for the heterogeneous requirements and characteristics of
the three services. The concept of reliability diversity is introduced as a
design principle that leverages the different reliability requirements across
the services in order to ensure performance guarantees with non-orthogonal RAN
slicing. This study reveals that H-NOMA can lead, in some regimes, to
significant gains in terms of performance trade-offs among the three generic
services as compared to orthogonal slicing. Comment: Submitted to IEEE
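The basic orthogonal-versus-non-orthogonal trade-off can be sketched with a two-user toy: under orthogonal slicing each user gets a fraction of the resources, while under superposition with successive interference cancellation (SIC) both transmit simultaneously. The power levels below are assumed for illustration and the model ignores the heterogeneous-reliability aspects that are the paper's actual focus:

```python
# Illustrative sketch (not the paper's H-NOMA model): two users, noise power
# normalized to 1. Orthogonal slicing splits the resources in half;
# non-orthogonal access uses superposition + SIC at the receiver.
import math

p_strong, p_weak = 10.0, 2.0       # assumed received powers of the two users

# Orthogonal: each user gets half the resources with its own full power.
r_orth = 0.5 * math.log2(1 + p_strong) + 0.5 * math.log2(1 + p_weak)

# Non-orthogonal: decode the strong user treating the weak one as
# interference, subtract it, then decode the weak user interference-free.
r_noma = math.log2(1 + p_strong / (1 + p_weak)) + math.log2(1 + p_weak)

print(f"orthogonal: {r_orth:.2f} b/s/Hz, non-orthogonal: {r_noma:.2f} b/s/Hz")
```

With SIC the sum rate telescopes to log2(1 + p_strong + p_weak), the multiple-access sum capacity, whereas orthogonal slicing generally falls short of it; reliability diversity is what lets the paper exploit this despite the services' very different error targets.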
Massive MIMO for Internet of Things (IoT) Connectivity
Massive MIMO is considered to be one of the key technologies in the emerging
5G systems, but also a concept applicable to other wireless systems. Exploiting
the large number of degrees of freedom (DoFs) of massive MIMO is essential for
achieving high spectral efficiency, high data rates and extreme spatial
multiplexing of densely distributed users. On the one hand, the benefits of
applying massive MIMO for broadband communication are well known and there has
been a large body of research on designing communication schemes to support
high rates. On the other hand, using massive MIMO for Internet-of-Things (IoT)
is still a developing topic, as IoT connectivity has requirements and
constraints that are significantly different from the broadband connections. In
this paper we investigate the applicability of massive MIMO to IoT
connectivity. Specifically, we treat the two generic types of IoT connections
envisioned in 5G: massive machine-type communication (mMTC) and ultra-reliable
low-latency communication (URLLC). This paper fills that gap by identifying the
opportunities and challenges in exploiting massive MIMO for IoT connectivity,
and provides insights into the trade-offs that emerge when massive
MIMO is applied to mMTC or URLLC and present a number of suitable communication
schemes. The discussion then turns to the questions of network slicing of the
wireless resources and the use of massive MIMO to simultaneously support IoT
connections with very heterogeneous requirements. The main conclusion is that
massive MIMO can bring benefits to the scenarios with IoT connectivity, but it
requires tight integration of the physical-layer techniques with the protocol
design. Comment: Submitted for publication
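One reason massive MIMO helps URLLC-style reliability is channel hardening: as the number of base-station antennas M grows, the per-user effective gain ||h||^2 / M concentrates around its mean, so deep fades become rare. A small Monte-Carlo sketch under an assumed i.i.d. Rayleigh model:

```python
# Illustrative sketch (not from the paper): channel hardening under assumed
# i.i.d. Rayleigh fading. Each complex channel entry has unit variance
# (real/imag parts with variance 1/2); ||h||^2 / M has mean ~1 and a spread
# that shrinks roughly as 1/sqrt(M).
import random, statistics

random.seed(1)

def gain_samples(m, trials=2000):
    out = []
    for _ in range(trials):
        g = sum(random.gauss(0, 0.5**0.5)**2 + random.gauss(0, 0.5**0.5)**2
                for _ in range(m))          # ||h||^2 for m antennas
        out.append(g / m)
    return out

for m in (1, 16, 256):
    s = gain_samples(m)
    print(m, round(statistics.mean(s), 2), round(statistics.stdev(s), 3))
# mean stays near 1 while the standard deviation shrinks with M
```

The single-antenna case (M = 1) shows the full exponential fading spread; with hundreds of antennas the link gain is nearly deterministic, which is what makes tight reliability guarantees feasible.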
A Novel Millimeter-Wave Channel Simulator and Applications for 5G Wireless Communications
This paper presents details and applications of a novel channel simulation
software named NYUSIM, which can be used to generate realistic temporal and
spatial channel responses to support realistic physical- and link-layer
simulations and design for fifth-generation (5G) cellular communications.
NYUSIM is built upon the statistical spatial channel model for broadband
millimeter-wave (mmWave) wireless communication systems developed by
researchers at New York University (NYU). The simulator is applicable for a
wide range of carrier frequencies (500 MHz to 100 GHz), radio frequency (RF)
bandwidths (0 to 800 MHz), antenna beamwidths (7 to 360 degrees for azimuth and
7 to 45 degrees for elevation), and operating scenarios (urban microcell, urban
macrocell, and rural macrocell), and also incorporates multiple-input
multiple-output (MIMO) antenna arrays at the transmitter and receiver. This
paper also provides examples to demonstrate how to use NYUSIM for analyzing
MIMO channel conditions and spectral efficiencies, which show that NYUSIM
offers a more realistic alternative to the 3rd Generation Partnership Project
(3GPP) and other channel models for mmWave bands. Comment: 7 pages, 8 figures, in 2017 IEEE International Conference on Communications (ICC), Paris, May 2017
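A flavor of the large-scale modeling underlying NYUSIM-style simulators is the close-in (CI) free-space reference distance path loss model, PL(f, d) = FSPL(f, 1 m) + 10 n log10(d) + shadowing. The sketch below is a generic illustration, not NYUSIM's implementation; the path-loss exponent n = 3.2 is an assumed NLOS-like value:

```python
# Illustrative sketch (not NYUSIM itself): the CI path loss model with a
# 1-meter free-space reference distance. FSPL at 1 m for frequency f in GHz
# is 32.4 + 20*log10(f) dB; n is the path-loss exponent.
import math

def ci_path_loss_db(f_ghz, d_m, n, shadow_db=0.0):
    fspl_1m = 32.4 + 20 * math.log10(f_ghz)   # free-space path loss at 1 m, dB
    return fspl_1m + 10 * n * math.log10(d_m) + shadow_db

for d in (10, 50, 100):                       # assumed 28 GHz link, NLOS-like n
    print(d, "m:", round(ci_path_loss_db(28.0, d, 3.2), 1), "dB")
```

The single anchor point at 1 m is what makes the CI model frequency-consistent across the simulator's 500 MHz to 100 GHz range; scenario differences enter only through the exponent n and the shadowing term.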