218 research outputs found
Compressive Sensing-Based Grant-Free Massive Access for 6G Massive Communication
The advent of the sixth-generation (6G) of wireless communications has given
rise to the necessity to connect vast quantities of heterogeneous wireless
devices, which requires advanced system capabilities far beyond existing
network architectures. In particular, such massive communication has been
recognized as a prime driver that can empower the 6G vision of future
ubiquitous connectivity, supporting the Internet of Human-Machine-Things, for which
massive access is critical. This paper surveys the most recent advances toward
massive access in both academic and industry communities, focusing primarily on
the promising compressive sensing-based grant-free massive access paradigm. We
first specify the limitations of existing random access schemes and reveal that
the practical implementation of massive communication relies on a dramatically
different random access paradigm from the current ones mainly designed for
human-centric communications. Then, a compressive sensing-based grant-free
massive access roadmap is presented, where the evolutions from single-antenna
to large-scale antenna array-based base stations, from single-station to
cooperative massive multiple-input multiple-output systems, and from unsourced
to sourced random access scenarios are detailed. Finally, we discuss the key
challenges and open issues to shed light on the potential future research
directions of grant-free massive access.
Comment: Accepted by IEEE IoT Journal.
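The sparse-recovery core of compressive sensing-based grant-free access can be illustrated with a minimal sketch: each device is assigned a non-orthogonal pilot sequence (a column of A), only a few devices are active, and the base station recovers the sparse activity/channel vector from a short received pilot observation. The matrix sizes, Gaussian pilots, noiseless model, and the use of plain orthogonal matching pursuit (rather than the more advanced algorithms surveyed in the paper) are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

N, M, K = 128, 32, 4                          # devices, pilot length, active devices
A = rng.standard_normal((M, N)) / np.sqrt(M)  # non-orthogonal pilot (sensing) matrix
active = rng.choice(N, K, replace=False)      # indices of the active devices
x = np.zeros(N)
x[active] = rng.standard_normal(K)            # effective channel gains of active devices
y = A @ x                                     # received pilot signal (noiseless toy model)

def omp(A, y, k):
    """Orthogonal matching pursuit: greedily select k columns of A."""
    residual, support = y.copy(), []
    for _ in range(k):
        # Pick the column most correlated with the current residual
        support.append(int(np.argmax(np.abs(A.T @ residual))))
        # Re-fit the signal on the chosen support by least squares
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    x_hat = np.zeros(A.shape[1])
    x_hat[support] = coef
    return x_hat, set(support)

x_hat, detected = omp(A, y, K)
print(sorted(detected), sorted(active.tolist()))
```

Joint activity detection and channel estimation both fall out of the recovered sparse vector: the support gives the detected users and the coefficients give their channel gains.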
Signal Processing and Learning for Next Generation Multiple Access in 6G
Wireless communication systems to date primarily rely on the orthogonality of
resources to facilitate the design and implementation, from user access to data
transmission. Emerging applications and scenarios in the sixth generation (6G)
wireless systems will require massive connectivity and transmission of a deluge
of data, which calls for more flexibility in the design concept that goes
beyond orthogonality. Furthermore, recent advances in signal processing and
learning have attracted considerable attention, as they provide promising
approaches to various complex and previously intractable problems of signal
processing in many fields. This article provides an overview of research
efforts to date in the field of signal processing and learning for
next-generation multiple access (NGMA), with an emphasis on massive random
access and non-orthogonal multiple access. The promising interplay with new
technologies and the challenges in learning-based NGMA are discussed.
On Investigations of Machine Learning and Deep Learning Techniques for MIMO Detection
This paper reviews in detail the various types of multiple-input multiple-output (MIMO) detector algorithms. Current MIMO detectors are not well suited to massive MIMO (mMIMO) scenarios with very large numbers of antennas: their performance degrades as the number of antennas in the MIMO system grows. To combat these issues, machine learning (ML) and deep learning (DL) based detection algorithms are being researched and developed. An extensive survey of these detectors is provided in this paper, along with their advantages and challenges. The issues discussed have to be resolved before such detectors are ready for final deployment.
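As a point of reference for why classical detectors scale poorly, here is a minimal linear MMSE detector sketch (toy antenna counts, QPSK symbols, and SNR are all assumed for illustration): the Nt-by-Nt matrix inversion is the step whose cost grows cubically with the number of transmit antennas, which is one motivation for the ML/DL alternatives surveyed in the paper.

```python
import numpy as np

rng = np.random.default_rng(1)
Nt, Nr, snr_db = 4, 8, 20.0                  # tx antennas, rx antennas, SNR in dB
sigma2 = 10 ** (-snr_db / 10)                # noise variance

# Rayleigh-fading channel, unit-magnitude QPSK symbols, AWGN
H = (rng.standard_normal((Nr, Nt)) + 1j * rng.standard_normal((Nr, Nt))) / np.sqrt(2)
qpsk = np.array([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j]) / np.sqrt(2)
s = rng.choice(qpsk, Nt)
n = np.sqrt(sigma2 / 2) * (rng.standard_normal(Nr) + 1j * rng.standard_normal(Nr))
y = H @ s + n

# Linear MMSE: W = (H^H H + sigma^2 I)^(-1) H^H.  Solving this Nt x Nt system
# is the O(Nt^3) step that scales poorly as antenna counts grow.
W = np.linalg.solve(H.conj().T @ H + sigma2 * np.eye(Nt), H.conj().T)
s_hat = W @ y

# Hard decision back onto the QPSK constellation
s_dec = (np.sign(s_hat.real) + 1j * np.sign(s_hat.imag)) / np.sqrt(2)
print(np.allclose(s_dec, s))
```

Learned detectors aim to approach the performance of nonlinear (e.g. maximum-likelihood) detection while avoiding both this cubic cost and the exponential cost of exhaustive search.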
Channel Acquisition for HF Skywave Massive MIMO-OFDM Communications
In this paper, we investigate channel acquisition for high frequency (HF)
skywave massive multiple-input multiple-output (MIMO) communications with
orthogonal frequency division multiplexing (OFDM) modulation. We first
introduce the concept of triple beams (TBs) in the space-frequency-time (SFT)
domain and establish a TB based channel model using sampled triple steering
vectors. With the established channel model, we then investigate the optimal
channel estimation and pilot design for pilot segments. Specifically, we find
the conditions that allow pilot reuse among multiple user terminals (UTs),
which significantly reduces pilot overhead. Moreover, we propose a channel
prediction method for data segments based on the estimated TB domain channel.
To reduce the complexity, we exploit the channel sparsity in the TB domain to
formulate the channel estimation as a sparse signal recovery problem and then
obtain the channel with the proposed constrained Bethe free energy minimization
(CBFEM) based channel estimation algorithm, which can be implemented with low
complexity by exploiting the structure of the TB matrix together with the chirp
z-transform (CZT). Simulation results demonstrate the superior performance of
the proposed channel acquisition approach.
Comment: 30 pages, 4 figures.
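The CZT mentioned above evaluates a transform on a flexibly chosen frequency grid rather than the fixed FFT grid. A minimal numpy sketch of this "zoom" operation (a direct-evaluation stand-in for the fast chirp z-transform, with arbitrary toy parameters; the paper's actual TB-matrix structure is not modeled here) is:

```python
import numpy as np

rng = np.random.default_rng(2)
N = 64
x = rng.standard_normal(N)

def zoom_dft(x, f_start, f_step, m):
    """Evaluate X(f) = sum_n x[n] exp(-j 2 pi f n) at m points from f_start.

    Direct O(m*N) evaluation for clarity; the CZT computes the same samples
    in O((N+m) log(N+m)) via fast convolution.
    """
    n = np.arange(len(x))
    freqs = f_start + f_step * np.arange(m)
    return np.exp(-2j * np.pi * np.outer(freqs, n)) @ x, freqs

# Sanity check: on the uniform FFT grid the zoom-DFT must agree with np.fft.fft
X_full = np.fft.fft(x)
X_zoom, _ = zoom_dft(x, f_start=0.0, f_step=1.0 / N, m=N)
print(np.allclose(X_zoom, X_full))
```

The ability to place the m output samples on an arbitrary narrow band is what makes such transforms useful for refining estimates over a small region of the beam domain without a full-size FFT.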
Sensing User's Activity, Channel, and Location with Near-Field Extra-Large-Scale MIMO
This paper proposes a grant-free massive access scheme based on the
millimeter wave (mmWave) extra-large-scale multiple-input multiple-output
(XL-MIMO) to support massive Internet-of-Things (IoT) devices with low latency,
high data rate, and high localization accuracy in the upcoming sixth-generation
(6G) networks. The XL-MIMO consists of multiple antenna subarrays that are
widely spaced over the service area to ensure line-of-sight (LoS)
transmissions. First, we establish the XL-MIMO-based massive access model
considering the near-field spatial non-stationary (SNS) property. Then, by
exploiting the block sparsity of subarrays and the SNS property, we propose a
structured block orthogonal matching pursuit algorithm for efficient active
user detection (AUD) and channel estimation (CE). Furthermore, different
sensing matrices are applied in different pilot subcarriers for exploiting the
diversity gains. Additionally, a multi-subarray collaborative localization
algorithm is designed for localization. In particular, the angle of arrival
(AoA) and time difference of arrival (TDoA) of the LoS links between active
users and related subarrays are extracted from the estimated XL-MIMO channels,
and then the coordinates of active users are acquired by jointly utilizing the
AoAs and TDoAs. Simulation results show that the proposed algorithms outperform
existing algorithms in terms of AUD and CE performance and can achieve
centimeter-level localization accuracy.
Comment: Submitted to IEEE Transactions on Communications, major revision.
Code will be open to all at https://gaozhen16.github.io/ soon.
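The AoA-based part of the localization step can be illustrated with a minimal 2-D sketch (positions and geometry invented for illustration; the paper's scheme additionally fuses TDoAs and works with AoAs extracted from estimated, noisy XL-MIMO channels): each subarray's AoA defines a bearing line toward the user, and the user position is the least-squares intersection of those lines.

```python
import numpy as np

subarrays = np.array([[0.0, 0.0], [100.0, 0.0]])   # known subarray positions (m)
user = np.array([40.0, 30.0])                      # ground-truth user position (m)

# AoA of the LoS path seen at each subarray (here computed from geometry;
# in practice these come from the estimated channels)
aoas = np.arctan2(user[1] - subarrays[:, 1], user[0] - subarrays[:, 0])

# A bearing line through point p at angle theta has normal (-sin t, cos t):
#   -sin(t)*x + cos(t)*y = -sin(t)*p_x + cos(t)*p_y
A = np.column_stack([-np.sin(aoas), np.cos(aoas)])
b = np.sum(A * subarrays, axis=1)

# Least-squares intersection of all bearing lines (exact for two clean lines;
# with more subarrays and noisy AoAs it averages out measurement error)
estimate, *_ = np.linalg.lstsq(A, b, rcond=None)
print(np.round(estimate, 6))
```

Widely spaced subarrays help precisely because their bearing lines cross at large angles, which conditions this least-squares problem well.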
Review of Recent Trends
This work was partially supported by the European Regional Development Fund (FEDER), through the Regional Operational Programme of Centre (CENTRO 2020) of the Portugal 2020 framework, through projects SOCA (CENTRO-01-0145-FEDER-000010) and ORCIP (CENTRO-01-0145-FEDER-022141). Fernando P. Guiomar acknowledges a fellowship from “la Caixa” Foundation (ID100010434), code LCF/BQ/PR20/11770015. Houda Harkat acknowledges the financial support of the Programmatic Financing of the CTS R&D Unit (UIDP/00066/2020).
MIMO-OFDM is a key technology and a strong candidate for 5G telecommunication systems. The literature lacks a convenient survey that rounds up all the points to be investigated concerning such systems. This review paper inspects and interprets the state of the art and addresses several research axes related to MIMO-OFDM systems. Two topics receive special attention: MIMO waveforms and MIMO-OFDM channel estimation. Existing MIMO hardware and software innovations, along with MIMO-OFDM equalization techniques, are discussed concisely. In the literature, only a few authors have discussed the MIMO channel estimation and modeling problems for a variety of MIMO systems, and, to the best of our knowledge, there has until now been no review paper specifically discussing recent work on channel estimation and the equalization process for MIMO-OFDM systems. Hence, the current work focuses on analyzing the recently used algorithms in the field, which could be a rich reference for researchers. Moreover, some research perspectives are identified.