
    Channel Estimation with Dynamic Metasurface Antennas via Model-Based Learning

    The dynamic metasurface antenna (DMA) is a cutting-edge antenna technology offering scalable and sustainable solutions for large antenna arrays. The effectiveness of DMAs stems from their inherent configurable analog signal-processing capabilities, which facilitate cost-constrained implementations. However, when DMAs are used in multiple-input multiple-output (MIMO) communication systems, their analog compression makes channel estimation challenging. In this paper, we propose two model-based learning methods to overcome this challenge. Our approach starts by casting channel estimation as a compressed sensing problem, in which the sensing matrix is formed from a random DMA weighting matrix combined with a spatial gridding dictionary. We then employ the learned iterative shrinkage and thresholding algorithm (LISTA) to recover the sparse channel parameters. LISTA unfolds the iterative shrinkage and thresholding algorithm into a neural network and trains it into a highly efficient channel estimator fitted to previously observed channels. Because the sensing matrix is crucial to the accuracy of LISTA recovery, we introduce a second, data-aided method, LISTA with sensing-matrix optimization (LISTA-SMO), to jointly optimize the sensing matrix. LISTA-SMO takes LISTA as a backbone and embeds sensing-matrix optimization layers in LISTA's neural network, allowing the sensing matrix to be optimized along with the training of LISTA. Furthermore, we propose a self-supervised learning technique to address the difficulty of acquiring noise-free training data. Our numerical results demonstrate that LISTA outperforms traditional sparse recovery methods in both channel estimation accuracy and efficiency. Moreover, LISTA-SMO achieves better estimation accuracy than LISTA, demonstrating the effectiveness of optimizing the sensing matrix.
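
    To illustrate the unfolding idea behind LISTA, the sketch below implements a generic unrolled ISTA with learnable matrices and per-layer thresholds in PyTorch. This is a minimal sketch under generic compressed-sensing assumptions: the class name LISTA, the ISTA-style initialization, the layer count, and the random real-valued stand-in for the DMA sensing matrix are illustrative and not taken from the paper.

        # Minimal LISTA sketch (not the authors' exact architecture).
        # Each "layer" is one unrolled ISTA iteration x <- soft(W1 y + W2 x, theta),
        # with W1, W2 and the thresholds theta learned from data.
        import torch
        import torch.nn as nn

        def soft_threshold(x, theta):
            # Soft-thresholding: the proximal operator of the L1 norm.
            return torch.sign(x) * torch.clamp(x.abs() - theta, min=0.0)

        class LISTA(nn.Module):
            def __init__(self, A, num_layers=10, step=0.1, lam=0.1):
                super().__init__()
                m, n = A.shape                      # A: sensing matrix (DMA weights x dictionary), assumed real here
                self.num_layers = num_layers
                # Initialize from classical ISTA: W1 = step*A^T, W2 = I - step*A^T A.
                self.W1 = nn.Parameter(step * A.t().clone())
                self.W2 = nn.Parameter(torch.eye(n) - step * (A.t() @ A))
                self.theta = nn.Parameter(torch.full((num_layers,), step * lam))

            def forward(self, y):
                x = torch.zeros(y.shape[0], self.W2.shape[0], device=y.device)
                for k in range(self.num_layers):
                    x = soft_threshold(y @ self.W1.t() + x @ self.W2.t(), self.theta[k])
                return x

        # Hypothetical usage: m measurements, n dictionary atoms, a batch of B pilots.
        m, n, B = 32, 128, 64
        A = torch.randn(m, n) / m ** 0.5                         # stand-in for the random DMA sensing matrix
        model = LISTA(A)
        x_true = torch.zeros(B, n)
        x_true[:, :4] = torch.randn(B, 4)                        # toy sparse channel coefficients
        y = x_true @ A.t() + 0.01 * torch.randn(B, m)            # noisy compressed observations
        loss = torch.mean((model(y) - x_true) ** 2)              # supervised training loss
        loss.backward()

    In this framing, LISTA-SMO would additionally treat the rows of the sensing matrix itself as trainable layers, which the toy above does not attempt.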

    Topology Control, Scheduling, and Spectrum Sensing in 5G Networks

    The proliferation of intelligent wireless devices is remarkable. To address this phenomenal traffic growth, a key objective of next-generation wireless networks such as 5G is to provide significantly larger bandwidth. To this end, the millimeter-wave (mmWave) band (20 GHz to 300 GHz) has been identified as a promising candidate for 5G and WiFi networks to support user data rates of multiple gigabits per second. However, path loss at mmWave frequencies is significantly higher than in today's cellular bands. Fortunately, this higher path loss can be compensated through antenna beamforming, in which a transmitter focuses a signal toward a specific direction to achieve high signal gain at the receiver. In beamforming mmWave networks, two fundamental challenges are network topology control and user association and scheduling. This dissertation proposes solutions to address these two challenges. We also study a spectrum sensing scheme that is important for spectrum sharing in next-generation wireless networks. Because of beamforming, topology control in mmWave networks, i.e., determining the number of beams for each base station and the coverage of each beam, is a great challenge. We present a novel framework to solve this problem, termed Beamforming Oriented tOpology coNtrol (BOON). The objective is to reduce the total downlink transmit power of the base stations while covering all users with a minimum quality of service. BOON groups nearby user equipment into clusters to dramatically reduce interference between beams and base stations, which in turn significantly reduces base-station transmit power. We have found that, on average, BOON uses only 10%, 32%, and 25% of the transmit power of three state-of-the-art schemes in the literature. Another fundamental problem in mmWave networks is user association and traffic scheduling, i.e., associating users with base stations and scheduling the transmission of user traffic over time slots. This is challenging because each base station has a limited power budget and users have very diverse traffic demands while still requiring a minimum quality of service. User association is difficult because it generally does not depend on a user's distance to the surrounding base stations but on whether the user is covered by a beam. We develop a novel framework for user association and scheduling in multi-base-station mmWave networks, termed the clustering Based dOwnlink user assOciation Scheduling, beamforming with power allocaTion (BOOST). The objective is to reduce the downlink transmission time of all users' traffic. On average, BOOST reduces the transmission time by 37%, 30%, and 26% compared with three state-of-the-art user scheduling schemes in the literature. Finally, we present a wavelet-transform-based spectrum sensing scheme that can simultaneously sense multiple subbands even without knowing how the subbands are divided, i.e., their boundaries. It can adaptively detect all active subband signals and thus discover residual spectrum that can be used by unlicensed devices.
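
    As a rough illustration of wavelet-based wideband sensing, the sketch below detects candidate subband boundaries as edges in the estimated power spectral density that persist across scales, using a multiscale product of derivative-of-Gaussian responses. This is a generic sketch of that idea, not the dissertation's scheme; the function name detect_subband_edges, the chosen scales, and the threshold quantile are all assumptions.

        # Hedged sketch of wavelet-style wideband spectrum sensing via multiscale edge detection.
        import numpy as np
        from scipy.signal import welch
        from scipy.ndimage import gaussian_filter1d

        def detect_subband_edges(samples, fs, scales=(2, 4, 8), edge_quantile=0.99):
            """Locate candidate subband boundaries as sharp edges in the PSD."""
            freqs, psd = welch(samples, fs=fs, nperseg=4096)
            log_psd = np.log10(psd + 1e-12)
            # Derivative-of-Gaussian responses at several scales act as a wavelet transform;
            # multiplying them suppresses noise while keeping edges present at every scale.
            product = np.ones_like(log_psd)
            for s in scales:
                product *= gaussian_filter1d(log_psd, sigma=s, order=1)
            score = np.abs(product)
            edges = freqs[score > np.quantile(score, edge_quantile)]
            return freqs, psd, edges

        # Toy usage: two narrowband occupants in white noise.
        fs = 20e6
        t = np.arange(200_000) / fs
        x = (np.cos(2 * np.pi * 3e6 * t) + 0.5 * np.cos(2 * np.pi * 7e6 * t)
             + 0.1 * np.random.randn(t.size))
        freqs, psd, edges = detect_subband_edges(x, fs)
        print("candidate subband edges (MHz):", np.round(edges / 1e6, 2))

    Frequencies between detected edges whose average PSD stays near the noise floor would then be reported as residual (white-space) spectrum.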

    Design and Realization of Fully-digital Microwave and Mm-wave Multi-beam Arrays with FPGA/RF-SOC Signal Processing

    There has been a constant increase in data traffic and device connections in mobile wireless communications, which has led fifth-generation (5G) implementations to exploit mm-wave bands at 24/28 GHz. Next-generation wireless access points (6G and beyond) will need to adopt large-scale transceiver arrays that combine multiple-input multiple-output (MIMO) theory with fully digital multi-beam beamforming. The resulting high-gain array factors will overcome the high path losses at mm-wave bands, and the simultaneous multi-beams will exploit the multi-directional channels created by multi-path propagation and improve the signal-to-noise ratio. Such access points will be based on electronic systems that heavily depend on the integration of RF electronics with digital signal processing performed in field-programmable gate arrays (FPGAs) or RF systems-on-chip (RF-SoCs). This dissertation is directed toward the investigation and realization of fully digital phased arrays that can produce wideband simultaneous multi-beams with FPGA or RF-SoC digital back-ends. The first proposed approach is a spatial bandpass (SBP) IIR filter-based beamformer built on the concept of space-time network resonance. A 2.4 GHz, 16-element array receiver has been built for real-time experimental verification of this approach. The second and third approaches are based on discrete Fourier transform (DFT) theory and on a lens with a focal-plane array, respectively; the lens-based approach is essentially an analog realization of the DFT. These two approaches are verified for a 28 GHz mm-wave implementation with 800 MHz bandwidth and an RF-SoC as the digital back-end. For all proposed multibeam beamformer implementations, the measured beams are shown to align well with the simulated ones. The proposed approaches differ in their architectures, hardware complexity, and cost, which are discussed throughout this dissertation. This dissertation also presents an application of the multi-beam approaches to RF directional sensing for exploring white spaces within spatio-temporal spectral regions. A real-time directional sensing system is proposed to capture the white spaces within the 2.4 GHz Wi-Fi band. Further, this dissertation investigates the effect of electromagnetic (EM) mutual coupling in antenna arrays on the real-time performance of fully digital transceivers. Different algorithms are proposed to compensate for mutual coupling in the digital domain. The first is based on deriving the mutual-coupling transfer function from the measured S-parameters of the antenna array and employing it in a Frost FIR filter in the beamforming back-end. The second method uses fast algorithms to realize the inverse of the mutual-coupling matrix via tridiagonal Toeplitz matrices with sparse factors. A 5.8 GHz 32-element array and a 1-7 GHz 7-element tightly coupled dipole array (TCDA) have been employed to demonstrate proof of concept for these algorithms.
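
    The DFT-based multi-beam idea can be illustrated with a short sketch: an N-point FFT taken across the element dimension of an N-element array forms N orthogonal simultaneous beams, the digital counterpart of a lens or Butler-matrix beamformer. The example below assumes an idealized half-wavelength-spaced uniform linear array with one digitized stream per element; the function name dft_multibeam and all parameters are illustrative, not taken from the dissertation's RF-SoC design.

        # Hedged sketch of DFT-based simultaneous multi-beam forming on a uniform linear array.
        import numpy as np

        def dft_multibeam(element_samples):
            """element_samples: (N, T) complex baseband snapshots from an N-element ULA.
            Returns (N, T) beam outputs; beam k points toward sin(theta_k) = 2*k/N (wrapped)."""
            # FFT across the array dimension forms N orthogonal simultaneous beams.
            return np.fft.fft(element_samples, axis=0) / np.sqrt(element_samples.shape[0])

        # Toy usage: a single plane wave from 20 degrees lands mostly in one beam bin.
        N, T = 16, 1024
        theta = np.deg2rad(20.0)
        n = np.arange(N)[:, None]
        steering = np.exp(1j * np.pi * n * np.sin(theta))                      # half-wavelength spacing
        snapshots = steering * np.exp(1j * 2 * np.pi * 0.01 * np.arange(T))    # narrowband tone
        snapshots += 0.05 * (np.random.randn(N, T) + 1j * np.random.randn(N, T))
        beams = dft_multibeam(snapshots)
        print("strongest beam index:", int(np.argmax(np.mean(np.abs(beams) ** 2, axis=1))))

    A wideband implementation, as targeted in the dissertation, would apply such spatial processing per frequency bin or use true-time-delay structures rather than the single narrowband FFT shown here.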