
    Contextual Beamforming: Exploiting Location and AI for Enhanced Wireless Telecommunication Performance

    The pervasive nature of wireless telecommunication has made it the foundation for mainstream technologies like automation, smart vehicles, virtual reality, and unmanned aerial vehicles. As these technologies experience widespread adoption in our daily lives, ensuring the reliable performance of cellular networks in mobile scenarios has become a paramount challenge. Beamforming, an integral component of modern mobile networks, enables spatial selectivity and improves network quality. However, many beamforming techniques are iterative, introducing unwanted latency to the system. In recent times, there has been growing interest in leveraging mobile users' location information to expedite beamforming processes. This paper explores the concept of contextual beamforming, discussing its advantages, disadvantages, and implications. Notably, the study reports a 53% improvement in signal-to-noise ratio (SNR) by implementing the adaptive maximum ratio transmission (MRT) beamforming algorithm compared to scenarios without beamforming. It further elucidates how MRT contributes to contextual beamforming. The importance of localization in implementing contextual beamforming is also examined. Additionally, the paper delves into the use of artificial intelligence schemes, including machine learning and deep learning, in implementing contextual beamforming techniques that leverage user location information. Based on the comprehensive review, the results suggest that the combination of MRT and zero forcing (ZF) techniques, alongside deep neural networks (DNNs) employing Bayesian Optimization (BO), represents the most promising approach for contextual beamforming. Furthermore, the study discusses the future potential of programmable switches, such as Tofino, in enabling location-aware beamforming.
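    As a rough illustration of why MRT raises SNR relative to no beamforming (the 53% figure above comes from the paper's own simulations, not from this snippet), the minimal sketch below assumes a single-user MISO link with i.i.d. Rayleigh fading and compares MRT weights against transmission from a single antenna; the dimensions and variable names are illustrative assumptions, not the paper's code.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative single-user MISO link: 8 transmit antennas, Rayleigh fading (assumption).
n_tx = 8
h = (rng.standard_normal(n_tx) + 1j * rng.standard_normal(n_tx)) / np.sqrt(2)

# MRT (maximum ratio transmission): w = conj(h) / ||h||, which aligns the
# transmit weights with the channel and maximizes the received SNR.
w_mrt = h.conj() / np.linalg.norm(h)

# Baseline "no beamforming": all power on a single antenna.
w_single = np.zeros(n_tx, dtype=complex)
w_single[0] = 1.0

noise_power = 1.0
snr_mrt = np.abs(h @ w_mrt) ** 2 / noise_power
snr_single = np.abs(h @ w_single) ** 2 / noise_power
print(f"MRT gain over single-antenna transmission: {10 * np.log10(snr_mrt / snr_single):.1f} dB")
```

    In expectation this ratio equals the number of transmit antennas (an array gain of 10*log10(n_tx) dB for i.i.d. Rayleigh fading); contextual beamforming aims to capture such gains with lower latency by exploiting user location instead of iterative search.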

    Multi-layer Utilization of Beamforming in Millimeter Wave MIMO Systems

    mmWave frequencies, ranging between 30 and 300 GHz, have been considered the ideal solution to the scarcity of bandwidth in the traditional sub-6 GHz band and to the ever-increasing demand of many emerging applications in today's era. 5G and beyond standards all consider mmWave an essential part of their networks. Beamforming is one of the most important enabling technologies for mmWave: it compensates for the huge propagation loss of these frequencies compared to sub-6 GHz frequencies and ensures better spatial and spectral utilization of the mmWave channel space. In this work, we developed different techniques to improve the performance of systems that use mmWave. In the physical layer, we proposed several hybrid beamforming architectures that are both relatively simple and spectrally efficient, achieving fully-digital-like spectral efficiency (bits/sec/Hz). For mobility management, we derived the expected degradation that can affect the performance of a particular type of beamforming called random beamforming (RBF) and optimized the tunable parameters for such systems when operating in different environments. Finally, at the network layer, we first studied the effect of using mmWave frequencies on routing performance compared to that achieved with sub-6 GHz frequencies. We then developed a novel opportunistic routing protocol for Mobile Ad-Hoc Networks (MANETs) that uses a modified version of random beamforming (RBF) to achieve better end-to-end performance and to reduce the overall delay in delivering data from transmitting nodes to the intended receiving nodes. From all these designs and studies, we conclude that mmWave frequencies and their enabling technologies (i.e., beamforming, massive MIMO, etc.) are indeed the future of wireless communications in a highly demanding world of Internet of Things (IoT), Augmented Reality (AR), Virtual Reality (VR), and self-driving cars.
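    The specific hybrid architectures proposed in this thesis are not reproduced here; the sketch below only illustrates the generic hybrid-precoding idea they build on: approximating a fully digital precoder with a phase-shifter-only analog stage and a small digital stage. The dimensions, the phase-extraction heuristic, and the least-squares digital fit are all assumptions made for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative dimensions: 32 antennas, 4 RF chains, 2 data streams (assumptions).
n_ant, n_rf, n_s = 32, 4, 2

# Fully digital precoder we would like to approximate (in practice it would come
# from the channel, e.g. its SVD); here a random semi-unitary matrix stands in.
F_opt, _ = np.linalg.qr(rng.standard_normal((n_ant, n_s))
                        + 1j * rng.standard_normal((n_ant, n_s)))

# Analog precoder: phase shifters only (unit-modulus entries). One simple heuristic
# reuses the phases of F_opt and fills the remaining RF chains with random phases.
phases = rng.uniform(0, 2 * np.pi, size=(n_ant, n_rf))
phases[:, :n_s] = np.angle(F_opt)
F_rf = np.exp(1j * phases) / np.sqrt(n_ant)

# Digital precoder: least-squares fit so that F_rf @ F_bb approximates F_opt.
F_bb, *_ = np.linalg.lstsq(F_rf, F_opt, rcond=None)

approx_err = np.linalg.norm(F_opt - F_rf @ F_bb) / np.linalg.norm(F_opt)
print(f"Relative approximation error of the hybrid precoder: {approx_err:.3f}")
```

    The closer F_rf @ F_bb gets to the fully digital precoder, the closer the hybrid design comes to fully-digital-like spectral efficiency while using far fewer RF chains than antennas, which is the motivation behind such architectures.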

    Deep Learning Aided Parametric Channel Covariance Matrix Estimation for Millimeter Wave Hybrid Massive MIMO

    Millimeter-wave (mmWave) channels, which occupy frequency ranges much higher than those used in previous wireless communication systems, are utilized to meet the increased throughput requirements that come with 5G communications. The high levels of attenuation experienced by electromagnetic waves at these frequencies cause MIMO channels to have high spatial correlation. To attain desirable error performance, systems require knowledge of the channel correlations. In this thesis, a deep neural network aided method is proposed for the parametric estimation of the channel covariance matrix (CCM), which contains information regarding the channel correlations. When compared to some methods found in the literature, the proposed method yields satisfactory performance in terms of both computational complexity and channel estimation errors. Comment: M.Sc. thesis, published at: https://open.metu.edu.tr/handle/11511/9319
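    The thesis's neural network is not sketched here; the snippet below only shows the standard parametric model such an estimator targets, in which the CCM is assembled from per-path angles and powers via array steering vectors, R = sum_l p_l a(theta_l) a(theta_l)^H. The uniform linear array geometry, path count, example values, and function names are assumptions for illustration, not the thesis's code.

```python
import numpy as np

def steering_vector(theta, n_ant, spacing=0.5):
    """ULA steering vector for angle theta (radians), half-wavelength spacing."""
    k = np.arange(n_ant)
    return np.exp(1j * 2 * np.pi * spacing * k * np.sin(theta)) / np.sqrt(n_ant)

def parametric_ccm(angles, powers, n_ant):
    """Channel covariance matrix R = sum_l p_l a(theta_l) a(theta_l)^H."""
    R = np.zeros((n_ant, n_ant), dtype=complex)
    for theta, p in zip(angles, powers):
        a = steering_vector(theta, n_ant)
        R += p * np.outer(a, a.conj())
    return R

# Example: a sparse mmWave channel with 3 dominant paths (illustrative values).
n_ant = 16
angles = np.deg2rad([-30.0, 5.0, 45.0])   # per-path angles
powers = np.array([1.0, 0.5, 0.2])        # per-path average powers

R = parametric_ccm(angles, powers, n_ant)
print("CCM shape:", R.shape, "| trace (total power):", np.real(np.trace(R)).round(3))
```

    Estimating only the handful of path parameters (angles and powers) rather than all entries of R is what makes the parametric approach attractive for sparse, highly correlated mmWave channels.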