Massive MIMO is a Reality -- What is Next? Five Promising Research Directions for Antenna Arrays
Massive MIMO (multiple-input multiple-output) is no longer a "wild" or
"promising" concept for future cellular networks - in 2018 it became a reality.
Base stations (BSs) with 64 fully digital transceiver chains were commercially
deployed in several countries, the key ingredients of Massive MIMO have made it
into the 5G standard, the signal processing methods required to achieve
unprecedented spectral efficiency have been developed, and the limitation due
to pilot contamination has been resolved. Even the development of fully digital
Massive MIMO arrays for mmWave frequencies - once viewed as prohibitively
complicated and costly - is well underway. In a few years, Massive MIMO with
fully digital transceivers will be a mainstream feature at both sub-6 GHz and
mmWave frequencies. In this paper, we explain how the first chapter of the
Massive MIMO research saga has come to an end, while the story has just begun.
The coming wide-scale deployment of BSs with massive antenna arrays opens the
door to a brand new world where spatial processing capabilities are
omnipresent. In addition to mobile broadband services, the antennas can be used
for other communication applications, such as low-power machine-type or
ultra-reliable communications, as well as non-communication applications such
as radar, sensing and positioning. We outline five new Massive MIMO related
research directions: Extremely large aperture arrays, Holographic Massive MIMO,
Six-dimensional positioning, Large-scale MIMO radar, and Intelligent Massive
MIMO.
Comment: 20 pages, 9 figures, submitted to Digital Signal Processing
Exploitation of Robust AoA Estimation and Low Overhead Beamforming in mmWave MIMO System
The limited spectral resource for wireless communications and dramatic proliferation of new applications and services directly necessitate the exploitation of millimeter wave (mmWave) communications. One critical enabling technology for mmWave communications is multi-input multi-output (MIMO), which enables other important physical layer techniques, specifically beamforming and antenna array based angle of arrival (AoA) estimation. Deployment of beamforming and AoA estimation has many challenges. Significant training and feedback overhead is required for beamforming, while conventional AoA estimation methods are not fast or robust. Thus, in this thesis, new algorithms are designed for low overhead beamforming, and robust AoA estimation with significantly reduced signal samples (snapshots).
The basic principle behind the proposed low overhead beamforming algorithm in time-division duplex (TDD) systems is to increase the beam serving period for the reduction of the feedback frequency. With the knowledge of location and speed of each candidate user equipment (UE), the codeword can be selected from the designed multi-pattern codebook, and the corresponding serving period can be estimated. The UEs with long serving period and low interference are selected and served simultaneously. This algorithm is proved to be effective in keeping the high data rate of conventional codebook-based beamforming, while the feedback required for codeword selection can be cut down.
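A minimal sketch of this serving-period idea in Python, with hypothetical parameter names and a simplified geometric model (a UE crossing a beam of fixed angular width), not the thesis's actual codebook algorithm:

```python
# Hypothetical geometric model: a UE at distance d moving tangentially at
# speed v stays inside a beam of angular width `beam_width_rad` for
# roughly t = d * beam_width / v seconds.
def serving_period(distance_m, speed_mps, beam_width_rad):
    """Rough estimate of how long a beam keeps covering a moving UE."""
    if speed_mps <= 0:
        return float("inf")  # a static UE never leaves the beam
    return distance_m * beam_width_rad / speed_mps

def select_ues(ues, beam_width_rad, min_period_s, min_angle_sep_rad):
    """Greedily pick UEs with long serving periods whose beams are
    angularly separated enough to keep mutual interference low."""
    # ues: list of dicts with 'angle' (rad), 'distance' (m), 'speed' (m/s)
    candidates = [u for u in ues
                  if serving_period(u["distance"], u["speed"],
                                    beam_width_rad) >= min_period_s]
    candidates.sort(key=lambda u: -serving_period(u["distance"], u["speed"],
                                                  beam_width_rad))
    selected = []
    for u in candidates:
        if all(abs(u["angle"] - s["angle"]) >= min_angle_sep_rad
               for s in selected):
            selected.append(u)
    return selected

# Slow, distant UEs get long serving periods; a fast nearby UE, whose beam
# would need frequent codeword updates, is deferred to a later round.
ues = [{"angle": 0.0, "distance": 100.0, "speed": 1.0},
       {"angle": 0.05, "distance": 100.0, "speed": 1.0},
       {"angle": 1.0, "distance": 50.0, "speed": 20.0}]
served = select_ues(ues, beam_width_rad=0.1, min_period_s=5.0,
                    min_angle_sep_rad=0.2)
```

Here the two slow UEs conflict angularly, so only one is scheduled in this round, and the fast UE falls below the minimum serving period; feedback is saved because the selected beam stays valid for the whole period.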
A fast and robust AoA estimation algorithm is proposed as the basis of low overhead beamforming for frequency-division duplex (FDD) systems. This algorithm utilizes uplink transmission signals to estimate the real-time AoA for angle-based beamforming in environments with different signal-to-noise ratios (SNRs). Two-step neural network models are designed for AoA estimation: within the angular group classified by the first model, the second model further estimates the AoA with high accuracy. It is shown that these AoA estimation models work well with few signal snapshots and are robust in low-SNR environments. The proposed AoA-estimation-based beamforming generates beams without using reference signals; therefore, low overhead beamforming can be achieved in FDD systems.
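The two-step coarse-to-fine structure can be illustrated as follows. Simple matched-filter stages stand in for the trained neural models so the sketch runs without training data, and the array geometry (an 8-element half-wavelength ULA) and the angular group widths are assumptions, not the thesis's configuration:

```python
import numpy as np

N = 8  # assumed ULA size, half-wavelength element spacing

def steering(theta_deg):
    """Array response of the ULA toward angle theta (degrees)."""
    theta = np.deg2rad(theta_deg)
    return np.exp(1j * np.pi * np.arange(N) * np.sin(theta))

def coarse_group(snapshots, group_centers):
    """Step 1 stand-in: classify the angular group by beamformer power
    at each group center (the thesis uses a trained classifier here)."""
    R = snapshots @ snapshots.conj().T / snapshots.shape[1]  # sample covariance
    powers = [np.real(steering(c).conj() @ R @ steering(c))
              for c in group_centers]
    return int(np.argmax(powers))

def fine_aoa(snapshots, center, half_width=15.0, step=0.5):
    """Step 2 stand-in: refine the AoA on a dense grid inside the group."""
    R = snapshots @ snapshots.conj().T / snapshots.shape[1]
    grid = np.arange(center - half_width, center + half_width + step, step)
    powers = [np.real(steering(g).conj() @ R @ steering(g)) for g in grid]
    return float(grid[int(np.argmax(powers))])

# Few-snapshot example: 4 snapshots of a source at 20 degrees in light noise.
rng = np.random.default_rng(0)
true_aoa = 20.0
X = np.outer(steering(true_aoa), rng.standard_normal(4)) \
    + 0.1 * (rng.standard_normal((N, 4)) + 1j * rng.standard_normal((N, 4)))

groups = [-45.0, -15.0, 15.0, 45.0]   # assumed 30-degree angular groups
g = coarse_group(X, groups)
est = fine_aoa(X, groups[g])
```

The coarse stage narrows the search to one group, so the fine stage only needs to resolve a small angular range, mirroring the classify-then-refine split of the two neural models.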
With the support of the proposed algorithms, mmWave resources can be leveraged to meet the challenging requirements of new applications and services in wireless communication systems.
Physical Layer Security in the 5G Heterogeneous Wireless System with Imperfect CSI
5G is expected to serve completely heterogeneous scenarios where devices with low or high software and hardware complexity will coexist. This entails a security challenge because low complexity devices such as IoT sensors must still have secrecy in their communications. This project proposes tools to maximize the secrecy rate in a scenario with legitimate users and eavesdroppers considering: i) the limited computational power of low complexity users and ii) the eavesdroppers' unwillingness to provide their channel state information to the base station. The tools have been designed based on the physical layer security field and solve the resource allocation from two different approaches suited to different use cases: i) convex optimization theory or ii) classification neural networks. Results show that, while the convex approach provides the best secrecy performance, the learning approach is a good alternative for dynamic scenarios or when transmit power must be conserved.
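The quantity being maximized is the secrecy rate. A minimal sketch using its standard physical-layer-security definition follows (the channel gains, noise level, and power sweep are illustrative assumptions; the project's actual optimization with imperfect CSI is more involved):

```python
import math

# Standard secrecy-rate definition: the legitimate link capacity minus the
# eavesdropper's capacity, floored at zero (both in bits/s/Hz).
def secrecy_rate(snr_legit, snr_eve):
    return max(0.0, math.log2(1 + snr_legit) - math.log2(1 + snr_eve))

# Toy resource allocation: sweep a discrete set of transmit power levels and
# keep the one maximizing secrecy rate under fixed (assumed) channel gains.
def best_power(gain_legit, gain_eve, noise, levels):
    return max(levels,
               key=lambda p: secrecy_rate(p * gain_legit / noise,
                                          p * gain_eve / noise))
```

When the legitimate channel is stronger than the eavesdropper's, the secrecy rate grows with power, so the sweep picks the highest level; when the eavesdropper's channel dominates, the rate is zero regardless of power, which is exactly why smarter allocation (or beamforming) is needed.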
DeepTx: Deep Learning Beamforming with Channel Prediction
Machine learning algorithms have recently been considered for many tasks in
the field of wireless communications. Previously, we have proposed the use of a
deep fully convolutional neural network (CNN) for receiver processing and shown
it to provide considerable performance gains. In this study, we focus on
machine learning algorithms for the transmitter. In particular, we consider
beamforming and propose a CNN which, for a given uplink channel estimate as
input, outputs downlink channel information to be used for beamforming. The CNN
is trained in a supervised manner considering both uplink and downlink
transmissions with a loss function that is based on UE receiver performance.
The main task of the neural network is to predict the channel evolution between
uplink and downlink slots, but it can also learn to handle inefficiencies and
errors in the whole chain, including the actual beamforming phase. The provided
numerical experiments demonstrate the improved beamforming performance.
Comment: 27 pages, this work has been submitted to the IEEE for possible publication; v2: fixed typo in author name, v3: a revision
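As a shape-level illustration of such a pipeline (the dimensions, the single untrained convolution, and the maximum-ratio transmission step are assumptions for the sketch, not the paper's architecture):

```python
import numpy as np

rng = np.random.default_rng(1)
n_sc, n_ant, k = 64, 4, 3   # assumed: subcarriers, BS antennas, kernel size

# Complex uplink channel estimate, stacked as real/imag feature channels.
h_ul = (rng.standard_normal((n_sc, n_ant))
        + 1j * rng.standard_normal((n_sc, n_ant)))
x = np.stack([h_ul.real, h_ul.imag], axis=-1)          # (n_sc, n_ant, 2)

# One 'same'-padded 1D convolution along the subcarrier axis with random
# (untrained) weights, mapping 2 input channels to 2 output channels.
w = 0.1 * rng.standard_normal((k, 2, 2))
xp = np.pad(x, ((k // 2, k // 2), (0, 0), (0, 0)))
y = np.stack([np.tensordot(xp[i:i + k], w, axes=([0, 2], [0, 1]))
              for i in range(n_sc)])                   # (n_sc, n_ant, 2)

# Reassemble the predicted complex downlink channel and beamform along it
# (maximum-ratio transmission), one unit-norm vector per subcarrier.
h_dl = y[..., 0] + 1j * y[..., 1]
v = h_dl.conj() / np.linalg.norm(h_dl, axis=1, keepdims=True)
```

In the paper's setup the network is trained end to end against UE receiver performance, so the learned mapping absorbs both the uplink-to-downlink channel evolution and downstream imperfections; the sketch only shows how the tensors flow.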
6G White Paper on Machine Learning in Wireless Communication Networks
The focus of this white paper is on machine learning (ML) in wireless
communications. 6G wireless communication networks will be the backbone of the
digital transformation of societies by providing ubiquitous, reliable, and
near-instant wireless connectivity for humans and machines. Recent advances in
ML research have enabled a wide range of novel technologies such as
self-driving vehicles and voice assistants. Such innovation is possible as a
result of the availability of advanced ML models, large datasets, and high
computational power. On the other hand, the ever-increasing demand for
connectivity will require a lot of innovation in 6G wireless networks, and ML
tools will play a major role in solving problems in the wireless domain. In
this paper, we provide a vision of how ML will impact wireless communication
systems. We first give an overview of the ML methods
that have the highest potential to be used in wireless networks. Then, we
discuss the problems that can be solved by using ML in various layers of the
network such as the physical layer, medium access layer, and application layer.
Zero-touch optimization of wireless networks using ML is another interesting
aspect that is discussed in this paper. Finally, at the end of each section,
important research questions that the section aims to answer are presented.