354 research outputs found

    Reliable indoor optical wireless communication in the presence of fixed and random blockers

    Get PDF
    The advanced innovation of smartphones has led to exponential growth in the number of internet users, which is expected to reach 71% of the global population by the end of 2027. This in turn has given rise to the demand for wireless data and internet devices that are capable of providing energy-efficient, reliable data transmission and high-speed wireless data services. Light-fidelity (LiFi), known as one of the optical wireless communication (OWC) technologies, is envisioned as a promising solution to accommodate these demands. However, the indoor LiFi channel is highly environment-dependent and can be influenced by several crucial factors (e.g., the presence of people, furniture, random users' device orientation and the limited field of view (FOV) of optical receivers) which may contribute to the blockage of the line-of-sight (LOS) link. In this thesis, it is investigated whether deep learning (DL) techniques can effectively learn the distinct features of the indoor LiFi environment in order to provide superior performance compared to conventional channel estimation techniques (e.g., minimum mean square error (MMSE) and least squares (LS)). This performance gain is seen particularly when access to real-time channel state information (CSI) is restricted, and it comes at the cost of collecting large and meaningful datasets to train the DL neural networks and of the training time, with training conducted offline. Two DL-based schemes are designed for signal detection and resource allocation, and it is shown that the proposed methods offer performance close to the optimal conventional schemes and demonstrate substantial gains in terms of bit-error ratio (BER) and throughput, especially in more realistic or complex indoor environments. Performance analysis of LiFi networks under the influence of fixed and random blockers is essential, and efficient solutions capable of diminishing the blockage effect are required. In this thesis, a CSI acquisition technique for a reconfigurable intelligent surface (RIS)-aided LiFi network is proposed to significantly reduce the dimension of the decision variables required for RIS beamforming. Furthermore, it is shown that several RIS attributes such as shape, size, height and distribution play important roles in increasing the network performance. Finally, the performance analysis for an RIS-aided realistic indoor LiFi network is presented. The proposed RIS configuration shows outstanding performance in reducing the network outage probability under the effect of blockages, random device orientation, limited receiver FOV, furniture and user behavior. Establishing a LOS link that achieves uninterrupted wireless connectivity in a realistic indoor environment can be challenging. In this thesis, an analysis of link blockage is presented for an indoor LiFi system considering fixed and random blockers. In particular, novel analytical frameworks of the coverage probability for a single source and for multiple sources are derived. Using the proposed analytical framework, link blockages of the indoor LiFi network are carefully investigated and it is shown that the incorporation of multiple sources and RIS can significantly reduce the LOS coverage blockage probability in indoor LiFi systems.
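
    As a minimal sketch of the conventional pilot-based estimators (LS and linear MMSE) that the thesis uses as baselines, the NumPy example below estimates a toy scalar channel from noisy pilots; the pilot length, SNR and channel statistics are illustrative assumptions, not values from the thesis.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup (hypothetical parameters): a scalar channel h observed
# through N unit-power pilot symbols in AWGN.
N = 16
snr_db = 0.0
noise_var = 10 ** (-snr_db / 10)

h = (rng.normal() + 1j * rng.normal()) / np.sqrt(2)   # true channel, CN(0, 1)
x = np.ones(N)                                        # unit-power pilots
n = np.sqrt(noise_var / 2) * (rng.normal(size=N) + 1j * rng.normal(size=N))
y = h * x + n                                         # received pilots

# Least-squares estimate: ignores the noise and channel statistics.
h_ls = (x.conj() @ y) / (x.conj() @ x)

# Linear MMSE estimate: shrinks the LS estimate towards the prior mean (0)
# using the channel variance (1) and the effective noise variance on h_ls.
prior_var = 1.0
h_mmse = prior_var / (prior_var + noise_var / N) * h_ls

print(f"LS estimate error   : {abs(h - h_ls):.3e}")
print(f"MMSE estimate error : {abs(h - h_mmse):.3e}")
```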

    Analysis and Design of Non-Orthogonal Multiple Access (NOMA) Techniques for Next Generation Wireless Communication Systems

    Get PDF
    The current surge in wireless connectivity, anticipated to amplify significantly in future wireless technologies, brings a new wave of users. Given the impracticality of an endlessly expanding bandwidth, there is a pressing need for communication techniques that efficiently serve this burgeoning user base with limited resources. Multiple Access (MA) techniques, notably Orthogonal Multiple Access (OMA), have long addressed bandwidth constraints. However, with escalating user numbers, OMA's orthogonality becomes limiting for emerging wireless technologies. Non-Orthogonal Multiple Access (NOMA), employing superposition coding, serves more users within the same bandwidth as OMA by allocating different power levels to users, whose signals can then be separated at the receiver by exploiting the power difference between them, thus offering superior spectral efficiency and massive connectivity. This thesis examines the integration of NOMA techniques with cooperative relaying, EXtrinsic Information Transfer (EXIT) chart analysis, and deep learning for enhancing 6G and beyond communication systems. The adopted methodology aims to optimize system performance, spanning from bit-error rate (BER) versus signal-to-noise ratio (SNR) to overall system efficiency and data rates. The primary focus of this thesis is the investigation of the integration of NOMA with cooperative relaying, EXIT chart analysis, and deep learning techniques. In the cooperative relaying context, NOMA notably improved diversity gains, thereby demonstrating the superiority of combining NOMA with cooperative relaying over NOMA alone. With EXIT chart analysis, NOMA achieved low BER at mid-range SNR as well as optimal user fairness in the power allocation stage. Additionally, employing a trained neural network enhanced signal detection for NOMA in the deep learning scenario, thereby producing a simpler signal detection scheme for NOMA which addresses NOMA's complex receiver problem.
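
    As a minimal sketch of the power-domain NOMA principle described above (superposition coding at the transmitter, detection by exploiting the power gap at the receiver), the NumPy example below superimposes two BPSK users and recovers both via successive interference cancellation at the near user; the power split and SNR are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)

num_bits = 10_000
p_weak, p_strong = 0.8, 0.2        # hypothetical power split: far (weak) user gets more power
snr_db = 15.0
noise_var = 10 ** (-snr_db / 10)

# BPSK symbols for the two users.
b1 = rng.integers(0, 2, num_bits)          # far (weak) user bits
b2 = rng.integers(0, 2, num_bits)          # near (strong) user bits
s1 = 2 * b1 - 1.0
s2 = 2 * b2 - 1.0

# Superposition coding: weighted sum of both users' symbols.
x = np.sqrt(p_weak) * s1 + np.sqrt(p_strong) * s2

# Received signal at the near user over an AWGN channel.
y = x + np.sqrt(noise_var) * rng.normal(size=num_bits)

# Successive interference cancellation at the near user:
# 1) detect the high-power user's symbols treating the rest as noise,
# 2) subtract their contribution, 3) detect its own symbols.
s1_hat = np.sign(y)
y_clean = y - np.sqrt(p_weak) * s1_hat
b2_hat = (np.sign(y_clean) + 1) / 2

# In this single-channel sketch, the far user's detection is the same first step:
# decode the high-power symbols directly, treating the weaker signal as noise.
b1_hat = (s1_hat + 1) / 2

print("BER far (weak) user  :", np.mean(b1 != b1_hat))
print("BER near (strong) user:", np.mean(b2 != b2_hat))
```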

    Beam scanning by liquid-crystal biasing in a modified SIW structure

    Get PDF
    A fixed-frequency beam-scanning 1D antenna based on Liquid Crystals (LCs) is designed for application in 2D scanning with lateral alignment. The 2D array environment imposes full decoupling of adjacent 1D antennas, which often conflicts with the LC requirement of DC biasing: the proposed design accommodates both. The LC medium is placed inside a Substrate Integrated Waveguide (SIW) modified to work as a Groove Gap Waveguide, with radiating slots etched on the upper broad wall, so that it radiates as a Leaky-Wave Antenna (LWA). This allows effective application of the DC bias voltage needed for tuning the LCs. At the same time, the RF field remains laterally confined, making it possible to place several antennas in parallel and achieve 2D beam scanning. The design is validated by simulation employing the actual properties of a commercial LC medium.
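
    As a brief illustration of the fixed-frequency scanning mechanism exploited by such LC-based leaky-wave antennas, the sketch below evaluates the classical pointing relation theta = arcsin(beta/k0), where tuning the LC permittivity changes the guided phase constant beta and hence steers the beam. It uses a simplified dielectric-filled rectangular-waveguide model rather than the paper's modified SIW/groove-gap structure, and the frequency, broad-wall width and LC permittivity range are hypothetical.

```python
import numpy as np

c0 = 299_792_458.0               # speed of light (m/s)
f = 28e9                         # operating frequency (Hz), hypothetical
k0 = 2 * np.pi * f / c0          # free-space wavenumber
a = 3.5e-3                       # waveguide broad-wall width (m), hypothetical

# Hypothetical LC permittivity tuning range (microwave nematic LCs span roughly 2.4-3.3).
for eps_r in (2.4, 2.7, 3.0, 3.3):
    # TE10-like phase constant of a dielectric-filled rectangular guide.
    beta = np.sqrt(eps_r * k0**2 - (np.pi / a) ** 2)
    # Leaky-wave antenna main-beam direction measured from broadside.
    theta = np.degrees(np.arcsin(beta / k0))
    print(f"eps_r = {eps_r:.1f}  ->  beam angle ~ {theta:5.1f} deg from broadside")
```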

    Integration of hybrid networks, AI, Ultra Massive-MIMO, THz frequency, and FBMC modulation toward 6G requirements: A Review

    Get PDF
    The fifth-generation (5G) wireless communication systems have been deployed in many countries with the following features: a peak data rate of 20 Gbps, a latency of 1 ms, reliability of 99.999%, maximum mobility of 500 km/h, a bandwidth of 1 GHz, a connection density of 10^6 devices/km2, and an area traffic capacity of up to 10 Mbps/m2. Nonetheless, the rapid growth of applications, such as extended/virtual reality (XR/VR), online gaming, telemedicine, cloud computing, smart cities, the Internet of Everything (IoE), and others, demands lower latency, higher data rates, ubiquitous coverage, and better reliability. These higher requirements are the main problems that have challenged 5G while concurrently encouraging researchers and practitioners to introduce viable solutions. This review paper discusses how sixth-generation (6G) technology could overcome the 5G limitations, achieve higher requirements, and support future applications. The integration of multiple access techniques, terahertz (THz) communication, visible light communications (VLC), ultra-massive multiple-input multiple-output (UM-MIMO), hybrid networks, cell-free massive MIMO, and artificial intelligence (AI)/machine learning (ML) has been proposed for 6G. The main contributions of this paper are a comprehensive review of the 6G vision, key performance indicators (KPIs), and the advanced potential technologies proposed, together with their operating principles. In addition, this paper reviews multiple access and modulation techniques, concentrating on Filter-Bank Multicarrier (FBMC) as a potential technology for 6G. The paper ends by discussing potential applications, along with challenges and lessons identified from prior studies, to pave the path for future research.
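
    For readers unfamiliar with FBMC, a compact way to state the idea is the standard FBMC/OQAM synthesis equation (a textbook formulation, not reproduced from the paper), in which each subcarrier is shaped by a prototype filter g(t) rather than the rectangular pulse of OFDM:

```latex
% FBMC/OQAM transmit signal: M subcarriers with spacing F = 1/T,
% real-valued OQAM symbols a_{m,n} sent every T/2 seconds on subcarrier m,
% shaped by a prototype filter g(t) and phase-staggered by phi_{m,n}.
s(t) = \sum_{m=0}^{M-1} \sum_{n \in \mathbb{Z}}
       a_{m,n}\, g\!\left(t - n\tfrac{T}{2}\right)
       e^{\,j 2\pi m F t}\, e^{\,j \phi_{m,n}},
\qquad \phi_{m,n} = \frac{\pi}{2}(m+n).
```

    The good time-frequency localization of g(t) (e.g., the PHYDYAS prototype filter) is what gives FBMC its low out-of-band emission relative to CP-OFDM, which is the main property motivating its consideration for 6G.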

    Security and Privacy for Modern Wireless Communication Systems

    Get PDF
    This reprint focuses on the latest protocol research, software/hardware development and implementation, and system architecture design addressing emerging security and privacy issues in modern wireless communication networks. Relevant topics include, but are not limited to, the following: deep-learning-based security and privacy design; covert communications; information-theoretic foundations for advanced security and privacy techniques; lightweight cryptography for power-constrained networks; physical layer key generation; prototypes and testbeds for security and privacy solutions; encryption and decryption algorithms for low-latency constrained networks; security protocols for modern wireless communication networks; network intrusion detection; physical layer design with security considerations; anonymity in data transmission; vulnerabilities in security and privacy in modern wireless communication networks; challenges of security and privacy in node–edge–cloud computation; security and privacy design for low-power wide-area IoT networks; security and privacy design for vehicle networks; and security and privacy design for underwater communications networks.

    Distributed Implementation of eXtended Reality Technologies over 5G Networks

    Get PDF
    The revolution of Extended Reality (XR) has already started and is rapidly expanding as technology advances. Announcements such as Meta's Metaverse have boosted the general interest in XR technologies, producing novel use cases. With the advent of the fifth generation of cellular networks (5G), XR technologies are expected to improve significantly by offloading heavy computational processes from the XR Head Mounted Display (HMD) to an edge server. XR offloading can rapidly boost XR technologies by considerably reducing the burden on the XR hardware, while improving the overall user experience by enabling smoother graphics and more realistic interactions. Overall, the combination of XR and 5G has the potential to revolutionize the way we interact with technology and experience the world around us. However, XR offloading is a complex task that requires state-of-the-art tools and solutions, as well as an advanced wireless network that can meet the demanding throughput, latency, and reliability requirements of XR. The definition of these requirements strongly depends on the use case and the particular XR offloading implementation. Therefore, it is crucial to perform a thorough Key Performance Indicator (KPI) analysis to ensure a successful design of any XR offloading solution. Additionally, distributed XR implementations can be intricate systems with multiple processes running on different devices or virtual instances. All these agents must be well handled and synchronized to achieve XR real-time requirements and ensure the expected user experience, guaranteeing a low processing overhead. XR offloading requires a carefully designed architecture which complies with the required KPIs while efficiently synchronizing and handling multiple heterogeneous devices. Offloading XR has become an essential use case for 5G and beyond-5G technologies. However, testing distributed XR implementations requires access to advanced 5G deployments that are often unavailable to most XR application developers. Conversely, the development of 5G technologies requires constant feedback from potential applications and use cases. Unfortunately, most 5G providers, engineers, or researchers lack access to cutting-edge XR hardware or applications, which can hinder the fast implementation and improvement of 5G's most advanced features. Both technology fields require ongoing input and continuous development from each other to fully realize their potential. As a result, XR and 5G researchers and developers must have access to the necessary tools and knowledge to ensure the rapid and satisfactory development of both technology fields. In this thesis, we focus on these challenges, providing knowledge, tools and solutions towards the implementation of advanced offloading technologies, opening the door to more immersive, comfortable and accessible XR technologies. Our contributions to the field of XR offloading include a detailed study and description of the necessary network throughput and latency KPIs for XR offloading, an architecture for low-latency XR offloading, and our full end-to-end XR offloading implementation ready for a commercial XR HMD. Besides, we also present a set of tools which can facilitate the joint development of 5G networks and XR offloading technologies: our 5G RAN real-time emulator and a multi-scenario XR IP traffic dataset.
    Firstly, in this thesis, we thoroughly examine and explain the KPIs that are required to achieve the expected Quality of Experience (QoE) and enhanced immersiveness in XR offloading solutions. Our analysis focuses on individual XR algorithms rather than potential use cases. Additionally, we provide an initial description of feasible 5G deployments that could fulfill some of the proposed KPIs for different offloading scenarios. We also present our low-latency multi-modal XR offloading architecture, which has already been tested on a commercial XR device and advanced 5G deployments, such as millimeter-wave (mmW) technologies. Besides, we describe our full end-to-end complex XR offloading system, which relies on our offloading architecture to provide low-latency communication between a commercial XR device and a server running a Machine Learning (ML) algorithm. To the best of our knowledge, this is one of the first successful XR offloading implementations for complex ML algorithms on a commercial device. With the goal of providing XR developers and researchers access to complex 5G deployments and accelerating the development of future XR technologies, we present FikoRE, our 5G RAN real-time emulator. FikoRE has been specifically designed not only to model the network with sufficient accuracy but also to support the emulation of a massive number of users and actual IP throughput. As FikoRE can handle actual IP traffic above 1 Gbps, it can directly be used to test distributed XR solutions. As we describe in the thesis, its emulation capabilities make FikoRE a potential candidate to become a reference testbed for distributed XR developers and researchers. Finally, we used our XR offloading tools to generate an XR IP traffic dataset which can accelerate the development of 5G technologies by providing a straightforward manner of testing novel 5G solutions with realistic XR data. This dataset is generated for two relevant XR offloading scenarios: split rendering, in which the rendering step is moved to an edge server, and heavy ML algorithm offloading. Besides, we derive the corresponding IP traffic models from the captured data, which can be used to generate realistic XR IP traffic. We also present the validation experiments performed on the derived models and their results.
    This work has received funding from the European Union (EU) Horizon 2020 research and innovation programme under the Marie Skłodowska-Curie ETN TeamUp5G, grant agreement No. 813391. International Mention in the doctoral degree. Doctoral Programme in Multimedia and Communications, Universidad Carlos III de Madrid and Universidad Rey Juan Carlos. Committee: President: Narciso García Santos; Secretary: Fernando Díaz de María; Vocal: Aryan Kaushi
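
    As a back-of-the-envelope illustration of the kind of KPI analysis described above, the sketch below estimates the downlink throughput and per-frame latency budget of a hypothetical split-rendering session; the resolution, frame rate, compression ratio and latency targets are illustrative assumptions, not figures from the thesis or the dataset.

```python
# Rough KPI estimate for a hypothetical split-rendering XR session.
# All parameters are illustrative assumptions, not values from the thesis.

width, height = 3840, 1920        # combined per-eye panel resolution (pixels)
fps = 72                          # target frame rate (frames/s)
bits_per_pixel = 24               # raw RGB depth
compression_ratio = 100           # assumed video-encoder compression

raw_rate = width * height * bits_per_pixel * fps          # bit/s, uncompressed
dl_throughput = raw_rate / compression_ratio              # bit/s after encoding
print(f"Downlink throughput ~ {dl_throughput / 1e6:.0f} Mbps")

# Motion-to-photon budget split across the offloading pipeline (ms).
budget_ms = {
    "pose sampling + uplink": 3.0,
    "remote rendering": 6.0,
    "encoding": 3.0,
    "downlink transmission": 4.0,
    "decoding + display": 4.0,
}
total = sum(budget_ms.values())
print(f"End-to-end latency ~ {total:.0f} ms "
      f"({'within' if total <= 20 else 'exceeds'} a 20 ms motion-to-photon target)")
```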

    Machine Learning Empowered Reconfigurable Intelligent Surfaces

    Get PDF
    Reconfigurable intelligent surfaces (RISs), also known as intelligent reflecting surfaces (IRSs), have emerged as potential auxiliary equipment for future wireless networks, attracting extensive research interest in their characteristics, applications, and potential. An RIS is a panel surface equipped with a number of reflective elements, which can artificially modify the propagation environment of electromagnetic signals. Specifically, RISs have the ability to precisely adjust the propagation direction, amplitude, and phase-shift of the signals, providing users with a set of cascaded channels in addition to direct channels, and thereby improving the communication performance for users. Compared with other candidate technologies such as active relays, the RIS has advantages in terms of flexible deployment, economical cost, and high energy efficiency. Thus, RISs have been considered a potential candidate technique for future wireless networks. In this thesis, a wireless network paradigm for the sixth generation (6G) of wireless networks is proposed, where RISs are invoked to construct smart radio environments (SRE) to enhance communication performance for mobile users. In addition, beyond the conventional reflecting-only RIS, a novel model of RIS is originally proposed, namely, the simultaneously transmitting and reflecting reconfigurable intelligent surface (STAR-RIS). The STAR-RIS splits the incident signal into transmitted and reflected signals, making full use of both to generate 360° coverage around the STAR-RIS panel, thereby improving the coverage of the RIS. In order to fully exert the channel domination and beamforming ability of the RISs and STAR-RISs to construct SREs, several machine learning algorithms, including deep learning (DL), deep reinforcement learning (DRL), and federated learning (FL) approaches, are developed to optimize the communication performance with respect to sum data rate or energy efficiency for the RIS-assisted networks. Specifically, several problems are investigated: 1) the passive beamforming problem of the RIS with consideration of configuration overhead is solved by a DL and a DRL algorithm, where the time overhead of configuring the RIS is successfully reduced by the machine learning algorithms; consequently, the throughput during a time frame is improved by 95.2% by invoking the proposed algorithms; 2) a novel framework of mobile-RIS-enhanced indoor wireless networks is proposed, and an FL-enhanced DRL algorithm is proposed for the deployment and beamforming optimization of the RIS; the average throughput of the indoor users served by the mobile RIS is improved by 15.1% compared to the case of a conventional fixed RIS; 3) a STAR-RIS-assisted multi-user downlink multiple-input single-output (MISO) communication system is investigated, and a pair of hybrid reinforcement learning algorithms are proposed for the hybrid control of the transmitting and reflecting beamforming of the STAR-RIS, which improve the energy efficiency of the STAR-RIS-assisted networks by 7%; 4) a tile-based low-complexity beamforming approach is proposed for STAR-RISs, and the proposed tile-based approach is capable of achieving comparable data rate performance to element-based beamforming with appreciably lower complexity.
    Through computer simulations, this thesis demonstrates 1) the performance gain in terms of sum data rate or energy efficiency achieved by invoking the proposed RIS in wireless communication networks; 2) the data rate or energy efficiency gain of the proposed STAR-RIS compared to the existing reflecting-only RIS; and 3) the performance of the proposed machine learning algorithms in terms of convergence rate, optimality, and complexity compared to benchmark algorithms.
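
    As a minimal numerical sketch of the passive-beamforming principle underlying the RIS designs above, the NumPy example below aligns the phases of the cascaded (BS-RIS-user) channel with the direct channel for a single-antenna user, which is the classical closed-form solution for a single-user reflecting-only RIS; the channel model and element count are hypothetical and unrelated to the thesis' learning-based algorithms.

```python
import numpy as np

rng = np.random.default_rng(2)

N = 64  # number of RIS reflective elements (hypothetical)

# Single-antenna BS, single-antenna user, flat Rayleigh-fading channels.
h_d = (rng.normal() + 1j * rng.normal()) / np.sqrt(2)              # direct BS-user channel
g = (rng.normal(size=N) + 1j * rng.normal(size=N)) / np.sqrt(2)    # BS-RIS channel
h_r = (rng.normal(size=N) + 1j * rng.normal(size=N)) / np.sqrt(2)  # RIS-user channel

cascaded = g * h_r   # per-element cascaded channel coefficients

# Closed-form optimal phase shifts for a single user:
# rotate each cascaded path so it adds coherently with the direct path.
theta = np.angle(h_d) - np.angle(cascaded)
phi = np.exp(1j * theta)  # unit-modulus RIS reflection coefficients

h_eff_random = h_d + cascaded @ np.exp(1j * rng.uniform(0, 2 * np.pi, N))
h_eff_aligned = h_d + cascaded @ phi

print(f"|h_direct|          = {abs(h_d):.2f}")
print(f"|h_eff| random RIS  = {abs(h_eff_random):.2f}")
print(f"|h_eff| aligned RIS = {abs(h_eff_aligned):.2f}")
```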

    Deep Reinforcement Learning for Multi-user Massive MIMO with Channel Aging

    Full text link
    The design of beamforming for downlink multi-user massive multi-input multi-output (MIMO) relies on accurate downlink channel state information (CSI) at the transmitter (CSIT). In practice, it is difficult for the base station (BS) to obtain perfect CSIT due to user mobility and latency/feedback delay (between downlink data transmission and CSI acquisition). Hence, robust beamforming under imperfect CSIT is needed. In this paper, considering multiple antennas at all nodes (base station and user terminals), we develop a multi-agent deep reinforcement learning (DRL) framework for massive MIMO under imperfect CSIT, where the transmit and receive beamforming are jointly designed to maximize the average information rate of all users. Leveraging this DRL-based framework, interference management is explored and three DRL-based schemes, namely the distributed-learning-distributed-processing, partial-distributed-learning-distributed-processing, and central-learning-distributed-processing schemes, are proposed and analyzed. This paper 1) highlights the fact that the DRL-based strategies outperform the random action-chosen strategy and the delay-sensitive strategy known as the sample-and-hold (SAH) approach, and achieve over 90% of the information rate of two selected benchmarks, the zero-forcing channel-inversion (ZF-CI) with perfect CSIT and the Greedy Beam Selection strategy, at lower complexity, and 2) demonstrates the inherent robustness of the proposed designs in the presence of user mobility.
    Comment: submitted for publication
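
    As a small reference point for the ZF-CI benchmark mentioned in the abstract, the sketch below computes zero-forcing (channel-inversion) precoding vectors from a known multi-user channel via the pseudo-inverse; the dimensions, single-antenna users and power normalization are illustrative assumptions, not the paper's exact multi-antenna setup.

```python
import numpy as np

rng = np.random.default_rng(3)

num_tx = 8   # BS antennas (hypothetical)
num_ue = 4   # single-antenna users for this simplified sketch

# Downlink channel matrix: row k is user k's channel (Rayleigh fading).
H = (rng.normal(size=(num_ue, num_tx)) + 1j * rng.normal(size=(num_ue, num_tx))) / np.sqrt(2)

# Zero-forcing (channel-inversion) precoder: W = H^H (H H^H)^{-1},
# then normalize columns so each user's stream has unit transmit power.
W = H.conj().T @ np.linalg.inv(H @ H.conj().T)
W = W / np.linalg.norm(W, axis=0, keepdims=True)

# The effective channel H @ W is diagonal: no inter-user interference remains.
H_eff = H @ W
interference = np.abs(H_eff - np.diag(np.diag(H_eff))).max()
print(f"Max residual inter-user interference: {interference:.2e}")
```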

    On the Road to 6G: Visions, Requirements, Key Technologies and Testbeds

    Get PDF
    Fifth generation (5G) mobile communication systems have entered the stage of commercial development, providing users with new services and improved user experiences as well as offering a host of novel opportunities to various industries. However, 5G still faces many challenges. To address these challenges, international industrial, academic, and standards organizations have commenced research on sixth generation (6G) wireless communication systems. A series of white papers and survey papers have been published, which aim to define 6G in terms of requirements, application scenarios, key technologies, etc. Although ITU-R has been working on the 6G vision and is expected to reach a consensus on what 6G will be by mid-2023, the related global discussions are still wide open and the existing literature has identified numerous open issues. This paper first provides a comprehensive portrayal of the 6G vision, technical requirements, and application scenarios, covering the current common understanding of 6G. Then, a critical appraisal of the 6G network architecture and key technologies is presented. Furthermore, existing testbeds and advanced 6G verification platforms are detailed for the first time. In addition, future research directions and open challenges are identified for stimulating the on-going global debate. Finally, lessons learned to date concerning 6G networks are discussed.