475 research outputs found

    Machine Learning Solutions for Context Information-aware Beam Management in Millimeter Wave Communications


    A Deep Learning Approach to Location- and Orientation-aided 3D Beam Selection for mmWave Communications

    Position-aided beam selection methods have been shown to be an effective approach for achieving high beamforming gain while limiting the overhead and latency of initial access in millimeter wave (mmWave) communications. Most research in the area, however, has focused on vehicular applications, where the orientation of the user terminal (UT) is mostly fixed at each position in the environment. This paper proposes a location- and orientation-based beam selection method to enable context information (CI)-based beam alignment in applications where the UT can take an arbitrary orientation at each location. We propose three network structures with different numbers of trainable parameters, suited to different training dataset sizes. A professional 3-dimensional ray tracing tool is used to generate datasets for an IEEE standard indoor scenario. Numerical results show that the proposed networks outperform both a CI-aided benchmark, the generalized inverse fingerprinting (GIFP) method, and a non-CI-based approach, hierarchical beam search. Moreover, compared to the GIFP method, the proposed deep learning-based beam selection is more robust to mismatches in line-of-sight blockage probability between the training and test datasets and less sensitive to inaccuracies in the position and orientation information. Comment: 30 pages, 12 figures. This article was submitted to IEEE Transactions on Wireless Communications on Oct 11 202
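    The general flavour of such CI-aided beam selection can be illustrated with a minimal sketch (not the paper's actual architectures): a small network maps the UT position and orientation to one score per codebook beam, so beam selection becomes classification. The codebook size, layer widths, and training data below are placeholder assumptions.

```python
import torch
import torch.nn as nn

# Illustrative only: a small MLP maps UT position (x, y, z) and orientation
# (roll, pitch, yaw) to one score per codebook beam, so beam selection becomes
# classification. The 64-beam codebook and layer widths are assumptions.
NUM_BEAMS = 64

class BeamSelector(nn.Module):
    def __init__(self, num_beams: int = NUM_BEAMS):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(6, 128), nn.ReLU(),
            nn.Linear(128, 128), nn.ReLU(),
            nn.Linear(128, num_beams),   # one logit per candidate beam
        )

    def forward(self, pos_orient: torch.Tensor) -> torch.Tensor:
        return self.net(pos_orient)      # logits over the beam codebook

# Training-step sketch: best-beam indices (e.g. from ray-tracing data) act as labels.
model = BeamSelector()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

features = torch.randn(32, 6)                 # [x, y, z, roll, pitch, yaw] per sample
labels = torch.randint(0, NUM_BEAMS, (32,))   # ground-truth best beam index
loss = loss_fn(model(features), labels)
optimizer.zero_grad()
loss.backward()
optimizer.step()
```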

    Twenty-Five Years of Advances in Beamforming: From Convex and Nonconvex Optimization to Learning Techniques

    Beamforming is a signal processing technique that uses an array of sensors to steer, shape, and focus an electromagnetic wave toward a desired direction. It has been used in several engineering applications such as radar, sonar, acoustics, astronomy, seismology, medical imaging, and communications. With the advances in multi-antenna technologies, largely for radar and communications, there has been great interest in beamformer design, mostly relying on convex/nonconvex optimization. Recently, machine learning has been leveraged to obtain attractive solutions to more complex beamforming problems. This article captures the evolution of beamforming over the last twenty-five years, from convex to nonconvex optimization and from optimization to learning approaches. It provides a glimpse of this important signal processing technique across a variety of transmit-receive architectures, propagation zones, paths, and conventional/emerging applications.
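    As a minimal reference point for the classical end of this evolution, the sketch below forms a conventional (matched-filter, delay-and-sum) beamformer for a uniform linear array and evaluates its beam pattern; the array size, element spacing, and steering angle are arbitrary illustrative choices, not tied to any paper in this list.

```python
import numpy as np

def ula_steering_vector(num_antennas: int, theta_rad: float, d_over_lambda: float = 0.5):
    """Steering vector of a uniform linear array, element spacing d in wavelengths."""
    n = np.arange(num_antennas)
    return np.exp(1j * 2 * np.pi * d_over_lambda * n * np.sin(theta_rad))

# Conventional (matched-filter / delay-and-sum) beamformer steered to 20 degrees.
N = 16
w = ula_steering_vector(N, np.deg2rad(20.0)) / np.sqrt(N)

# Beam pattern: array gain versus incoming angle; the peak is ~N at the steered angle.
angles = np.deg2rad(np.linspace(-90, 90, 361))
pattern = np.array([np.abs(w.conj() @ ula_steering_vector(N, a)) ** 2 for a in angles])
print("peak gain (linear):", pattern.max())
```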

    A Generalized Framework on Beamformer Design and CSI Acquisition for Single-Carrier Massive MIMO Systems in Millimeter Wave Channels

    In this paper, we establish a general framework for reduced-dimensional channel state information (CSI) estimation and pre-beamformer design for frequency-selective massive multiple-input multiple-output (MIMO) systems employing single-carrier (SC) modulation in time division duplex (TDD) mode, exploiting the joint angle-delay domain channel sparsity at millimeter (mm) wave frequencies. First, based on a generic subspace projection that takes the joint angle-delay power profile and user grouping into account, the reduced-rank minimum mean square error (RR-MMSE) instantaneous CSI estimator is derived for spatially correlated wideband MIMO channels. Second, the statistical pre-beamformer design is considered for frequency-selective SC massive MIMO channels. We examine the dimension reduction problem and the construction of the subspace (beamspace) on which the RR-MMSE estimation can be realized as accurately as possible. Finally, a spatio-temporal domain correlator-type reduced-rank channel estimator, as an approximation of the RR-MMSE estimate, is obtained by carrying out least squares (LS) estimation in a properly reduced-dimensional beamspace. It is observed that the proposed techniques show remarkable robustness to pilot interference (or contamination) with a significant reduction in pilot overhead.
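    The reduced-rank idea can be illustrated with a toy numpy sketch (not the paper's derivation): a noisy pilot observation is projected onto a low-dimensional beamspace spanned by the dominant eigenvectors of the channel covariance, and the MMSE (or a cheaper LS/correlator-style) estimate is formed there. The dimensions, noise power, and covariance model below are placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)
M, r, sigma2 = 64, 8, 0.1          # antennas, beamspace rank, noise power (illustrative)

# Placeholder spatial covariance with a dominant low-rank component.
A = rng.standard_normal((M, r)) + 1j * rng.standard_normal((M, r))
R = A @ A.conj().T / r + 1e-3 * np.eye(M)

# Beamspace: dominant eigenvectors of the (assumed known) channel covariance.
eigval, eigvec = np.linalg.eigh(R)
U, lam = eigvec[:, -r:], eigval[-r:]

# One noisy pilot observation of the channel, y = h + n.
z = (rng.standard_normal(M) + 1j * rng.standard_normal(M)) / np.sqrt(2)
h = np.linalg.cholesky(R) @ z
y = h + np.sqrt(sigma2 / 2) * (rng.standard_normal(M) + 1j * rng.standard_normal(M))

# Reduced-rank MMSE in the beamspace, mapped back to the antenna domain.
h_rrmmse = U @ (lam / (lam + sigma2) * (U.conj().T @ y))

# Cheaper correlator/LS-style alternative: plain projection onto the beamspace.
h_proj = U @ (U.conj().T @ y)

print("RR-MMSE NMSE:", np.linalg.norm(h - h_rrmmse) ** 2 / np.linalg.norm(h) ** 2)
print("projection NMSE:", np.linalg.norm(h - h_proj) ** 2 / np.linalg.norm(h) ** 2)
```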

    HBF MU-MIMO with Interference-Aware Beam Pair Link Allocation for Beyond-5G mm-Wave Networks

    Hybrid beamforming (HBF) multi-user multiple-input multiple-output (MU-MIMO) is a key technology for exploiting the directional nature of millimeter-wave (mm-wave) channels for spatial multiplexing beyond current codebook-based 5G-NR networks. To suppress co-scheduled users' interference, HBF MU-MIMO is predicated on having sufficient radio frequency chains and accurate channel state information (CSI); otherwise, imperfect interference cancellation leads to performance losses. In this work, we propose IABA, a 5G-NR standard-compliant beam pair link (BPL) allocation scheme for mitigating spatial interference in practical HBF MU-MIMO networks. IABA solves the network sum throughput optimization via either a distributed or a centralized BPL allocation, using dedicated CSI reference signals for candidate BPL monitoring. We present a comprehensive study of practical multi-cell mm-wave networks and demonstrate that HBF MU-MIMO without interference-aware BPL allocation experiences strong residual interference, which limits the achievable network performance. Our results show that IABA offers significant performance gains over the default interference-agnostic 5G-NR BPL allocation, and even allows HBF MU-MIMO to outperform the fully digital MU-MIMO baseline, by facilitating allocation of secondary BPLs other than the strongest BPL found during initial access. We further demonstrate the scalability of IABA with increased gNB antennas and densification for beyond-5G mm-wave networks. Comment: 13 pages, 11 figures. This work has been submitted to the IEEE for possible publication (copyright may be transferred without notice, after which this version may no longer be accessible).
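    IABA itself is tied to 5G-NR procedures, but the underlying idea, choosing each user's beam pair link to maximize sum throughput under mutual interference rather than simply keeping the strongest beam from initial access, can be sketched with a toy coordinate-wise greedy search over a synthetic gain model. The gain matrices, noise level, and greedy rule below are illustrative assumptions, not the proposed algorithm.

```python
import numpy as np

rng = np.random.default_rng(1)
K, B, noise = 3, 4, 1e-2           # users, candidate BPLs per user, noise power (toy values)

# S[u, b]: desired-link gain of user u on its b-th candidate beam pair link.
# X[u, b, v]: interference that user u's choice of BPL b creates at user v.
S = rng.exponential(1.0, size=(K, B))
X = rng.exponential(0.05, size=(K, B, K))

def sum_rate(choice):
    """Shannon sum throughput for one BPL choice per user, under a toy SINR model."""
    total = 0.0
    for u in range(K):
        interference = sum(X[v, choice[v], u] for v in range(K) if v != u)
        total += np.log2(1.0 + S[u, choice[u]] / (noise + interference))
    return total

# Interference-agnostic baseline: every user keeps its strongest BPL from initial access.
strongest = [int(np.argmax(S[u])) for u in range(K)]

# Toy interference-aware allocation: coordinate-wise greedy re-selection, switching a
# user's BPL only when it strictly improves the network sum rate.
choice, improved = strongest.copy(), True
while improved:
    improved = False
    for u in range(K):
        best_b = max(range(B), key=lambda b: sum_rate(choice[:u] + [b] + choice[u + 1:]))
        if sum_rate(choice[:u] + [best_b] + choice[u + 1:]) > sum_rate(choice) + 1e-12:
            choice[u], improved = best_b, True

print("strongest-BPL sum rate:", sum_rate(strongest))
print("interference-aware sum rate:", sum_rate(choice))
```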

    Exploitation of Robust AoA Estimation and Low Overhead Beamforming in mmWave MIMO System

    The limited spectral resources for wireless communications and the dramatic proliferation of new applications and services directly necessitate the exploitation of millimeter wave (mmWave) communications. One critical enabling technology for mmWave communications is multi-input multi-output (MIMO), which enables other important physical layer techniques, specifically beamforming and antenna-array-based angle of arrival (AoA) estimation. Deploying beamforming and AoA estimation presents many challenges: significant training and feedback overhead is required for beamforming, while conventional AoA estimation methods are neither fast nor robust. In this thesis, new algorithms are therefore designed for low-overhead beamforming and for robust AoA estimation with significantly fewer signal samples (snapshots).

    The basic principle behind the proposed low-overhead beamforming algorithm for time-division duplex (TDD) systems is to extend the beam serving period in order to reduce the feedback frequency. With knowledge of the location and speed of each candidate user equipment (UE), a codeword can be selected from the designed multi-pattern codebook and the corresponding serving period can be estimated. UEs with long serving periods and low interference are selected and served simultaneously. This algorithm is shown to maintain the high data rate of conventional codebook-based beamforming while cutting down the feedback required for codeword selection.

    A fast and robust AoA estimation algorithm is proposed as the basis of low-overhead beamforming for frequency-division duplex (FDD) systems. This algorithm uses uplink transmission signals to estimate the real-time AoA for angle-based beamforming in environments with different signal-to-noise ratios (SNRs). Two-step neural network models are designed for AoA estimation: within the angular group classified by the first model, the second model further estimates the AoA with high accuracy. These models are shown to work well with few signal snapshots and to remain robust in low-SNR environments. The proposed AoA-estimation-based beamforming generates beams without using reference signals, so low-overhead beamforming can be achieved in FDD systems. With the support of the proposed algorithms, mmWave resources can be leveraged to meet the challenging requirements of new applications and services in wireless communication systems.
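    A rough sketch of the two-step neural AoA estimator described above is given below; the array size, sector count, feature choice, and layer widths are assumptions, not the thesis's actual models. The first network classifies the AoA into a coarse angular group from few-snapshot covariance features, and a group-specific second network refines the estimate within that sector.

```python
import torch
import torch.nn as nn

# Illustrative two-step AoA pipeline; array size, sector count, features, and
# layer widths are assumptions rather than the thesis's actual configurations.
NUM_ANTENNAS, NUM_GROUPS = 8, 6
FEAT_DIM = 2 * NUM_ANTENNAS * NUM_ANTENNAS   # real + imaginary parts of the covariance

def covariance_features(snapshots: torch.Tensor) -> torch.Tensor:
    """Few-snapshot sample covariance flattened into a real-valued feature vector."""
    R = snapshots @ snapshots.conj().transpose(-2, -1) / snapshots.shape[-1]
    return torch.cat([R.real.reshape(-1), R.imag.reshape(-1)])

# Step 1: coarse classification into one of NUM_GROUPS angular sectors.
group_classifier = nn.Sequential(
    nn.Linear(FEAT_DIM, 128), nn.ReLU(), nn.Linear(128, NUM_GROUPS))

# Step 2: one fine regressor per sector, predicting an offset inside that sector.
fine_regressors = nn.ModuleList(
    nn.Sequential(nn.Linear(FEAT_DIM, 128), nn.ReLU(), nn.Linear(128, 1))
    for _ in range(NUM_GROUPS))

def estimate_aoa(snapshots: torch.Tensor) -> torch.Tensor:
    """AoA estimate (degrees) for a single few-snapshot observation."""
    feats = covariance_features(snapshots)
    group = group_classifier(feats).argmax()
    sector_width = 180.0 / NUM_GROUPS            # toy mapping of sectors onto [-90, 90)
    offset = fine_regressors[int(group)](feats).squeeze(-1)
    return -90.0 + sector_width * group + offset

# Example call with 4 random snapshots from an 8-element array (placeholder data).
x = torch.randn(NUM_ANTENNAS, 4, dtype=torch.complex64)
print(float(estimate_aoa(x)))
```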

    Hybrid MIMO Architectures for Millimeter Wave Communications: Phase Shifters or Switches?

    Hybrid analog/digital MIMO architectures were recently proposed as an alternative to fully digital precoding in millimeter wave (mmWave) wireless communication systems. This is motivated by the possible reduction in the number of RF chains and analog-to-digital converters. In these architectures, the analog processing network is usually based on variable phase shifters. In this paper, we propose hybrid architectures based on switching networks to reduce the complexity and power consumption of the structures based on phase shifters. We define a power consumption model and use it to evaluate the energy efficiency of both structures. To estimate the complete MIMO channel, we propose an open-loop compressive channel estimation technique that is independent of the hardware used in the analog processing stage. We analyze the performance of the new estimation algorithm for hybrid architectures based on phase shifters and switches. Using the estimated channel, we develop two algorithms for the design of the hybrid combiner based on switches and analyze the achieved spectral efficiency. Finally, we study the trade-offs between power consumption, hardware complexity, and spectral efficiency for hybrid architectures based on phase shifting networks and switching networks. Numerical results show that architectures based on switches obtain channel estimation performance equal to or better than that obtained using phase shifters, while reducing hardware complexity and power consumption. For equal power consumption, all the hybrid architectures provide similar spectral efficiencies. Comment: Submitted to IEEE Access.
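    The hardware difference shows up directly in the analog measurement matrix used for compressive channel estimation: a phase-shifter network gives unit-modulus combining weights, while a switching network reduces each measurement to an antenna selection (a 0/1 column). The sketch below contrasts the two with a generic OMP recovery over an angular dictionary; it is an illustration under assumed dimensions, not the paper's estimator or combiner designs.

```python
import numpy as np

rng = np.random.default_rng(2)
Nr, M, G, K = 32, 20, 64, 3        # antennas, measurements, dictionary size, paths (toy)

# Angular (DFT-like) dictionary: the mmWave channel is approximately sparse in beamspace.
spatial_freqs = np.linspace(-1, 1, G, endpoint=False)
A_dict = np.exp(1j * np.pi * np.outer(np.arange(Nr), spatial_freqs)) / np.sqrt(Nr)

# Sparse angular channel with K dominant paths.
x_true = np.zeros(G, dtype=complex)
x_true[rng.choice(G, K, replace=False)] = rng.standard_normal(K) + 1j * rng.standard_normal(K)
h = A_dict @ x_true

# Analog training combiners for the two hardware options.
W_ps = np.exp(1j * 2 * np.pi * rng.random((Nr, M))) / np.sqrt(Nr)  # phase shifters: unit modulus
W_sw = np.zeros((Nr, M))                                           # switches: one antenna per measurement
W_sw[rng.integers(0, Nr, M), np.arange(M)] = 1.0

def omp(Phi, y, k):
    """Generic orthogonal matching pursuit: recover a k-sparse x from y = Phi @ x."""
    residual, support = y.copy(), []
    for _ in range(k):
        support.append(int(np.argmax(np.abs(Phi.conj().T @ residual))))
        x_s, *_ = np.linalg.lstsq(Phi[:, support], y, rcond=None)
        residual = y - Phi[:, support] @ x_s
    x_hat = np.zeros(Phi.shape[1], dtype=complex)
    x_hat[support] = x_s
    return x_hat

for name, W in [("phase shifters", W_ps), ("switches", W_sw)]:
    noise = 0.01 * (rng.standard_normal(M) + 1j * rng.standard_normal(M))
    y = W.conj().T @ h + noise                       # compressive measurements of the channel
    h_hat = A_dict @ omp(W.conj().T @ A_dict, y, K)  # sparse recovery in the angular domain
    print(name, "NMSE:", np.linalg.norm(h - h_hat) ** 2 / np.linalg.norm(h) ** 2)
```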