Terahertz Communications and Sensing for 6G and Beyond: A Comprehensive View
Next-generation wireless technologies, commonly referred to as the sixth
generation (6G), are envisioned to support extreme communication capacity
and, in particular, disruptive network sensing capabilities. The terahertz
(THz) band is one potential enabler of both, owing to its enormous unused
spectrum and the high spatial resolution afforded by short wavelengths and
wide bandwidths. Unlike earlier surveys, this paper presents
a comprehensive treatment and technology survey on THz communications and
sensing in terms of the advantages, applications, propagation characterization,
channel modeling, measurement campaigns, antennas, transceiver devices,
beamforming, networking, the integration of communications and sensing, and
experimental testbeds. Starting from the motivation and use cases, we survey
the development and historical perspective of THz communications and sensing
with the anticipated 6G requirements. We explore radio propagation, channel
modeling, and measurements for the THz band. We then discuss transceiver
requirements, architectures, technological challenges, and approaches,
together with means to compensate for the high propagation losses through
appropriate antenna and beamforming solutions. We also survey several system
technologies required by, or beneficial for, THz systems. The synergistic
design of sensing and communications is explored in depth. Practical trials,
demonstrations, and
experiments are also summarized. The paper gives a holistic view of the current
state of the art and highlights the issues and challenges that are open for
further research towards 6G.
Comment: 55 pages, 10 figures, 8 tables, submitted to IEEE Communications
Surveys & Tutorials
Emerging Approaches for THz Array Imaging: A Tutorial Review and Software Tool
Accelerated by the increasing attention drawn by 5G, 6G, and Internet of
Things applications, communication and sensing technologies have rapidly
evolved from millimeter-wave (mmWave) to terahertz (THz) in recent years.
Enabled by significant advancements in electromagnetic (EM) hardware, mmWave
and THz frequency regimes spanning 30 GHz to 300 GHz and 300 GHz to 3000 GHz,
respectively, can be employed for a host of applications. The main feature of
THz systems is high-bandwidth transmission, enabling ultra-high-resolution
imaging and high-throughput communications; however, challenges in both the
hardware and algorithmic arenas remain for the ubiquitous adoption of THz
technology. Spectra comprising mmWave and THz frequencies are well-suited for
synthetic aperture radar (SAR) imaging at sub-millimeter resolutions for a
wide range of tasks, such as material characterization and nondestructive testing
(NDT). This article provides a tutorial review of systems and algorithms for
THz SAR in the near-field with an emphasis on emerging algorithms that combine
signal processing and machine learning techniques. As part of this study, an
overview of classical and data-driven THz SAR algorithms is provided, focusing
on object detection for security applications and SAR image super-resolution.
We also discuss relevant issues, challenges, and future research directions for
emerging algorithms and THz SAR, including standardization of system and
algorithm benchmarking, adoption of state-of-the-art deep learning techniques,
signal processing-optimized machine learning, and hybrid data-driven signal
processing algorithms.
Comment: Submitted to Proceedings of the IEEE
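As a concrete illustration of the classical algorithms reviewed above, the following is a minimal sketch of matched-filter backprojection for near-field monostatic SAR with a linear aperture and a single point scatterer; the carrier frequency, geometry, and image grid are illustrative assumptions, not a system from the article.

```python
import numpy as np

c = 3e8
f = 300e9                        # 300 GHz carrier (low THz, assumed)
k = 2 * np.pi * f / c            # wavenumber

# Monostatic aperture positions along x (m) and a point scatterer
aperture = np.linspace(-0.05, 0.05, 201)
target_x, target_z = 0.01, 0.20

# Simulated echoes: two-way phase over the round-trip range
R = np.hypot(aperture - target_x, target_z)
echoes = np.exp(-2j * k * R)

# Backprojection: coherently sum echoes with the conjugate two-way phase
# for each candidate pixel; the true target location sums in phase
grid_x = np.linspace(-0.03, 0.03, 61)
image = np.empty_like(grid_x)
for i, gx in enumerate(grid_x):
    Rg = np.hypot(aperture - gx, target_z)
    image[i] = np.abs(np.sum(echoes * np.exp(2j * k * Rg)))

peak_x = grid_x[np.argmax(image)]  # should recover target_x
```

At the true scatterer position all 201 phasors align, so the image peaks there; elsewhere the phases decohere, which is the essence of the matched-filter interpretation.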
Non-Terrestrial Networks in the 6G Era: Challenges and Opportunities
Many organizations recognize non-terrestrial networks (NTNs) as a key
component to provide cost-effective and high-capacity connectivity in future
6th generation (6G) wireless networks. Despite this premise, there are still
many questions to be answered for proper network design, including those
associated with latency and coverage constraints. In this paper, after reviewing
research activities on NTNs, we present the characteristics and enabling
technologies of NTNs in the 6G landscape and shed light on the challenges in
the field that are still open for future research. As a case study, we evaluate
the performance of an NTN scenario in which satellites use millimeter wave
(mmWave) frequencies to provide access connectivity to on-the-ground mobile
terminals as a function of different networking configurations.
Comment: 8 pages, 4 figures, 2 tables, submitted for publication to the IEEE
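The latency constraint mentioned above can be made concrete with a back-of-envelope propagation-delay calculation; the LEO altitude below is an assumed typical value, not a figure from the paper.

```python
# One-way propagation delay for nadir links at typical satellite altitudes
C = 299_792_458.0          # speed of light, m/s
leo_alt_m = 600e3          # assumed LEO altitude (m)
geo_alt_m = 35_786e3       # GEO altitude (m)

leo_delay_ms = leo_alt_m / C * 1e3   # ~2 ms one way
geo_delay_ms = geo_alt_m / C * 1e3   # ~120 ms one way
```

The two-orders-of-magnitude gap between LEO and GEO delays is one reason LEO constellations dominate the NTN discussion for latency-sensitive 6G services.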
Applying Deep Learning for Phase-Array Antenna Design
Master of Engineering (Electrical Engineering), 2021.
Hybrid beamforming (HBF) can provide high data rates while reducing the
complexity and cost of massive multiple-input multiple-output (MIMO) systems.
However, channel state information (CSI) is imperfect in realistic downlink
channels, which complicates HBF design. A further difficulty is obtaining
proper training labels: if the optimized output of a traditional algorithm is
used as the label, the neural network can only be trained to approximate that
algorithm, never to outperform it. This thesis proposes a hybrid beamforming
neural network based on unsupervised deep learning (USDNN) that avoids the
labeling overhead of supervised learning and improves the achievable sum rate
under imperfect CSI. Compared with traditional HBF methods, the unsupervised
approach both avoids labeling overhead and achieves better performance. The
network consists of 5 dense layers, 4 batch normalization (BN) layers, and 5
activation functions. After training, the optimized beamforming vector can be
output directly. Simulation results show that the proposed method outperforms
manifold optimization (MO) by 74% and orthogonal matching pursuit (OMP) by
120%. Furthermore, the proposed USDNN can achieve near-optimal performance.
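The unsupervised idea described above, using the negative achievable rate itself as the loss instead of algorithm-generated labels, can be sketched as follows; the single-user MISO setup, dimensions, and toy loss are illustrative assumptions, not the thesis network.

```python
import numpy as np

rng = np.random.default_rng(0)
n_tx = 8
# Random Rayleigh MISO channel (one user, n_tx transmit antennas)
h = (rng.standard_normal(n_tx) + 1j * rng.standard_normal(n_tx)) / np.sqrt(2)

def neg_rate_loss(w, h, snr=10.0):
    """Unsupervised loss: negative achievable rate of beamformer w.

    A network trained to minimize this needs no labels; gradients push
    it directly toward higher-rate beamformers.
    """
    w = w / np.linalg.norm(w)              # unit transmit-power constraint
    gain = np.abs(np.vdot(h, w)) ** 2      # |h^H w|^2
    return -np.log2(1.0 + snr * gain)

w_random = rng.standard_normal(n_tx) + 1j * rng.standard_normal(n_tx)
w_matched = h                              # matched (MRT) direction, optimal here

loss_rand = neg_rate_loss(w_random, h)
loss_mrt = neg_rate_loss(w_matched, h)     # lower loss = higher rate
```

For this single-user case the matched beamformer is the known optimum, so it attains the lowest loss; in the thesis, a dense network replaces the closed-form choice and the same objective trains it end to end.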
Contextual Beamforming: Exploiting Location and AI for Enhanced Wireless Telecommunication Performance
The pervasive nature of wireless telecommunication has made it the foundation
for mainstream technologies like automation, smart vehicles, virtual reality,
and unmanned aerial vehicles. As these technologies experience widespread
adoption in our daily lives, ensuring the reliable performance of cellular
networks in mobile scenarios has become a paramount challenge. Beamforming, an
integral component of modern mobile networks, enables spatial selectivity and
improves network quality. However, many beamforming techniques are iterative,
introducing unwanted latency to the system. In recent times, there has been a
growing interest in leveraging mobile users' location information to expedite
beamforming processes. This paper explores the concept of contextual
beamforming, discussing its advantages, disadvantages and implications.
Notably, the study presents a 53% improvement in signal-to-noise ratio (SNR)
from implementing the maximum ratio transmission (MRT) adaptive beamforming
algorithm, compared to scenarios without beamforming. It further elucidates
how MRT contributes to
contextual beamforming. The importance of localization in implementing
contextual beamforming is also examined. Additionally, the paper delves into
the use of artificial intelligence schemes, including machine learning and deep
learning, in implementing contextual beamforming techniques that leverage user
location information. Based on the comprehensive review, the results suggest
that the combination of MRT and zero-forcing (ZF) techniques, alongside deep
neural networks (DNN) employing Bayesian Optimization (BO), represents the most
promising approach for contextual beamforming. Furthermore, the study discusses
the future potential of programmable switches, such as Tofino, in enabling
location-aware beamforming.
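To make the contextual idea above concrete, here is a minimal sketch of how a user's known direction yields an MRT beam for a line-of-sight channel without iterative search; the uniform linear array size and angle are illustrative assumptions.

```python
import numpy as np

n_ant = 16
theta = np.deg2rad(25.0)                   # user direction from location info
n = np.arange(n_ant)
# Half-wavelength-spaced ULA steering vector toward the user
a = np.exp(1j * np.pi * n * np.sin(theta))

h = a                                      # ideal LoS channel (assumption)
w_mrt = h / np.linalg.norm(h)              # MRT: conjugate-matched, unit power
w_omni = np.zeros(n_ant, dtype=complex)    # single-antenna baseline
w_omni[0] = 1.0

snr_mrt = np.abs(np.vdot(h, w_mrt)) ** 2   # received power, noise normalized
snr_omni = np.abs(np.vdot(h, w_omni)) ** 2
array_gain_db = 10 * np.log10(snr_mrt / snr_omni)  # ~12 dB for 16 antennas
```

With location information supplying the steering direction, the beamformer is computed in closed form, which is exactly the latency argument for contextual beamforming made in the paper.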
Optimized Switching Between Sensing and Communication for mmWave MU-MISO Systems
In this paper, we propose a scheme that optimizes the per-user channel sensing duration in millimeter-wave (mmWave) multi-user multiple-input single-output (MU-MISO) systems. For each user, the base station (BS) predicts the effective rate to be achieved after pilot transmission. The channel sensing duration of each user is then optimized by ending pilot transmission once the predicted rate falls below the current rate. The robust regularized zero-forcing (RRZF) precoder and equal power allocation (EPA) are adopted to transmit sensing pilots and data. Numerical results show that the more severe the inter-user interference, the longer the required channel sensing duration. Moreover, the proposed scheme achieves a higher sum rate than benchmark schemes.
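The stopping rule described above can be sketched as follows; the toy MMSE-style estimation-error model, frame length, and SNR are assumptions for illustration, not the paper's system model.

```python
import numpy as np

def effective_rate(t_pilot, t_frame=100, snr=10.0):
    """Rate after t_pilot pilot symbols in a t_frame-symbol frame.

    More pilots improve the channel estimate (toy MMSE-like model) but
    leave fewer symbols for data, creating the sensing/communication
    trade-off the scheme exploits.
    """
    est_snr = snr * (t_pilot * snr) / (1.0 + t_pilot * snr)
    return (1.0 - t_pilot / t_frame) * np.log2(1.0 + est_snr)

def optimize_sensing_duration(t_frame=100, snr=10.0):
    """Keep sensing while one more pilot is predicted to raise the rate."""
    t = 1
    while t < t_frame and effective_rate(t + 1, t_frame, snr) > effective_rate(t, t_frame, snr):
        t += 1
    return t

t_star = optimize_sensing_duration()
```

The returned duration sits where the marginal estimation gain of one more pilot no longer outweighs its overhead, mirroring the per-user switching criterion in the abstract.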