
    Towards versatile access networks (Chapter 3)

    Compared to previous generations, the 5th generation (5G) cellular network features an additional type of densification: a large number of active antennas can be deployed per access point (AP). This technique is known as massive multiple-input multiple-output (mMIMO) [1]. Meanwhile, multiple-input multiple-output (MIMO) evolution, e.g., channel state information (CSI) enhancement and the study of a larger number of orthogonal demodulation reference signal (DMRS) ports for MU-MIMO, was one of the work items of Release 18 of the 3rd Generation Partnership Project (3GPP Rel-18). The approval of the Rel-18 package, in the fourth quarter of 2021, marked the start of the 5G-Advanced evolution in 3GPP. The other items in 3GPP Rel-18 study and add functionality in the areas of network energy savings, coverage, mobility support, multicast and broadcast services, and positioning.
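
    To make the mMIMO idea concrete, the following minimal sketch, not tied to any 3GPP procedure, shows zero-forcing multi-user precoding at an AP with many more antennas than users; the array size, user count, and the assumption of perfect CSI are all illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

M, K = 64, 8            # illustrative: 64 AP antennas, 8 single-antenna users
# Rayleigh-fading channel matrix H (K x M), assumed perfectly known (ideal CSI)
H = (rng.standard_normal((K, M)) + 1j * rng.standard_normal((K, M))) / np.sqrt(2)

# Zero-forcing precoder W = H^H (H H^H)^{-1}, then normalise transmit power
W = H.conj().T @ np.linalg.inv(H @ H.conj().T)
W /= np.linalg.norm(W, 'fro')

s = rng.standard_normal(K) + 1j * rng.standard_normal(K)   # user symbols
y = H @ (W @ s)                                            # received signals

# The effective channel H @ W is diagonal (up to numerics): each user sees
# only its own symbol, which is the point of having M >> K antennas.
print(np.round(np.abs(H @ W), 3))
```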

    State-of-the-art assessment of 5G mmWave communications

    Deliverable D2.1 of the 5Gwireless project. The main objective of the European 5Gwireless project, which is part of the H2020 Marie Skłodowska-Curie ITN (Innovative Training Networks) programme, is the training and involvement of young researchers in the elaboration of future mobile communication networks, focusing on innovative wireless technologies, heterogeneous network architectures, new topologies (including ultra-dense deployments), and appropriate tools. The present document D2.1 is the first deliverable of Work Package 2 (WP2), which is specifically devoted to the modeling of millimeter-wave (mmWave) propagation channels and the development of appropriate mmWave beamforming and signal processing techniques. Deliverable D2.1 gives a state of the art on mmWave channel measurement, characterization and modeling; existing antenna array technologies, channel estimation and precoding algorithms; proposed deployment and networking techniques; some performance studies; as well as a review of the evaluation and analysis tools.

    Study, Measurements and Characterisation of a 5G system using a Mobile Network Operator Testbed

    The goals for 5G are aggressive. It promises to deliver an enhanced end-user experience by offering new applications and services through gigabit speeds, and significantly improved performance and reliability. The enhanced mobile broadband (eMBB) 5G use case, for instance, targets peak data rates as high as 20 Gbps in the downlink (DL) and 10 Gbps in the uplink (UL). While there are different ways to improve data rates, spectrum is at the core of enabling higher mobile broadband data rates. 5G New Radio (NR) specifies new frequency bands below 6 GHz and also extends into mmWave frequencies, where more contiguous bandwidth is available for transmitting large amounts of data. However, at mmWave frequencies, signals are more susceptible to impairments. Hence, extra consideration is needed to determine test approaches that provide the precision required to accurately evaluate 5G components and devices. Therefore, the aim of this thesis is to provide a deep dive into 5G technology, explore its testing and validation, and thereafter present the OTE (Hellenic Telecommunications Organisation) 5G testbed, including the measurement results obtained and its characterisation based on key performance indicators (KPIs).
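
    For context on where figures like 20 Gbps come from, the sketch below evaluates the approximate 5G NR peak data rate formula from 3GPP TS 38.306; the chosen parameters (an FR2 carrier with 120 kHz subcarrier spacing, 264 PRBs, 256-QAM, four layers) are illustrative assumptions and do not describe the OTE testbed.

```python
# Approximate 5G NR peak data rate per carrier, after 3GPP TS 38.306:
#   rate = v_layers * Q_m * f * R_max * (N_PRB * 12) / T_s_mu * (1 - OH)
# All parameter values below are illustrative assumptions.

def nr_peak_rate_gbps(v_layers, q_m, n_prb, mu, overhead, f=1.0):
    r_max = 948 / 1024                   # maximum LDPC code rate
    t_s = 1e-3 / (14 * 2 ** mu)          # average OFDM symbol duration (s)
    rate_bps = v_layers * q_m * f * r_max * (n_prb * 12) / t_s * (1 - overhead)
    return rate_bps / 1e9

# One FR2 carrier: 120 kHz SCS (mu=3), 400 MHz -> 264 PRBs,
# 256-QAM (8 bits/symbol), 4 MIMO layers, FR2 downlink overhead 0.18.
per_carrier = nr_peak_rate_gbps(v_layers=4, q_m=8, n_prb=264, mu=3, overhead=0.18)
print(f"{per_carrier:.1f} Gbps per carrier")            # ~8.6 Gbps

# Aggregating several such carriers is how headline DL rates approach 20 Gbps.
print(f"{2 * per_carrier:.1f} Gbps with 2-carrier aggregation")
```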

    Federated learning empowered ultra-dense next-generation wireless networks

    The evolution of wireless networks, from the first generation (1G) to the fifth generation (5G), has facilitated real-time services and intelligent applications powered by artificial intelligence (AI) and machine learning (ML). Nevertheless, prospective applications like autonomous driving and haptic communications necessitate the exploration of beyond-fifth-generation (B5G) and sixth-generation (6G) networks, leveraging millimeter-wave (mmWave) and terahertz (THz) technologies. However, these high-frequency bands experience significant atmospheric attenuation, resulting in high signal propagation loss, which necessitates a fundamental reconfiguration of network architectures and paves the way for the emergence of ultra-dense networks (UDNs). Equipped with massive multiple-input multiple-output (mMIMO) and beamforming technologies, UDNs mitigate propagation losses by utilising narrow line-of-sight (LoS) beams to direct radio waves toward specific receiving points, thereby enhancing signal quality. Despite these advancements, UDNs face critical challenges, which include worsened mobility issues in dynamic UDNs due to the susceptibility of LoS links to blockages, data privacy concerns at the network edge when implementing centralised ML training, and power consumption challenges stemming from the deployment of dense small base stations (SBSs) and the integration of cutting-edge techniques like edge learning.

    In this context, this thesis begins by investigating the prevailing issue of beam blockage in UDNs and introduces novel frameworks to address this emerging challenge. The main theme of the first three contributions is to tackle beam blockages and frequent handovers (HOs) through innovative sensing-aided wireless communications. This approach seeks to enhance the situational awareness of UDNs regarding their surroundings by using a variety of sensors commonly found in urban areas, such as vision and radar sensors. While all these contributions share the common goal of proposing sensing-aided proactive HO (PHO) frameworks that intelligently predict blockage events in advance and perform PHO, each of them presents distinctive framework features, contributing significantly to the improvement of UDN operations. In further detail, the first contribution adheres to conventional centralised model training, while the other contributions employ federated learning (FL), a decentralised collaborative training approach primarily designed to safeguard data privacy. The utilisation of FL technology offers several advantages, including enhanced data privacy, scalability, and adaptability. Simulation results from all these frameworks have demonstrated the remarkable performance of the proposed latency-aware frameworks in improving UDNs’ reliability, maintaining user connectivity, and delivering high levels of quality of experience (QoE) and throughput when compared to existing reactive HO procedures lacking proactive blockage prediction.

    The fourth contribution is centred on optimising energy management in UDNs and introduces FedraTrees, a lightweight algorithm that integrates decision tree (DT)-based models into the FL setup. FedraTrees challenges the conventional belief that FL is exclusively suited to neural network (NN) models by enabling the incorporation of DT models within the FL context. While FedraTrees offers versatility across various applications, this thesis specifically applies it to energy forecasting tasks with the aim of meeting the energy efficiency requirement of UDNs. Simulation results demonstrate that FedraTrees performs remarkably well in predicting short-term energy patterns and surpasses the state-of-the-art long short-term memory (LSTM)-based federated averaging (FedAvg) algorithm in terms of reducing computational and communication resource demands.
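
    As a rough illustration of the FedAvg baseline mentioned above, the sketch below performs one round of federated averaging, i.e., a data-size-weighted average of client model parameters; the model shapes and client dataset sizes are placeholders, and the DT-specific aggregation of FedraTrees is not reproduced here.

```python
import numpy as np

def fedavg(client_weights, client_sizes):
    """One FedAvg round: average each parameter array across clients,
    weighted by the number of local training samples."""
    total = sum(client_sizes)
    return [
        sum(w[i] * (n / total) for w, n in zip(client_weights, client_sizes))
        for i in range(len(client_weights[0]))
    ]

# Toy example: 3 clients, each holding a 2-layer linear model's parameters.
rng = np.random.default_rng(1)
clients = [[rng.standard_normal((4, 2)), rng.standard_normal(2)] for _ in range(3)]
sizes = [120, 300, 80]   # illustrative local dataset sizes

global_model = fedavg(clients, sizes)
print(global_model[0].shape, global_model[1].shape)   # (4, 2) (2,)
```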

    Machine learning enabled millimeter wave cellular system and beyond

    Millimeter-wave (mmWave) communication, with the advantages of abundant bandwidth and immunity to interference, has been deemed a promising technology for the next-generation network and beyond. With the help of mmWave, the requirements envisioned for the future mobile network could be met, such as addressing the massive growth required in coverage, capacity and traffic, providing a better quality of service and experience to users, supporting ultra-high data rates and reliability, and ensuring ultra-low latency. However, due to the characteristics of mmWave, such as short transmission distance, high sensitivity to blockage, and large propagation path loss, there are challenges for mmWave cellular network design. In this context, to enjoy the benefits of mmWave networks, the architecture of the next-generation cellular network will be more complex, and with a more complex network come more complex problems. The plethora of possibilities makes planning and managing such a complex network system more difficult. Specifically, to provide better quality of service and quality of experience for users in such a network, efficient and effective handover for mobile users is important. The probability of triggering a handover will significantly increase in the next-generation network due to the dense small-cell deployment. Since the resources in a base station (BS) are limited, handover management will be a great challenge. Further, to achieve the maximum transmission rate for the users, the line-of-sight (LOS) channel would be the main transmission channel. However, due to the characteristics of mmWave and the complexity of the environment, the LOS channel is not always feasible. Non-line-of-sight (NLOS) channels should be explored and used as backup links to serve the users. With all these problems trending toward being complex and nonlinear, and with data traffic dramatically increasing, conventional methods are no longer effective or efficient. In this case, how to solve these problems in the most efficient manner becomes important, and some new concepts, as well as novel technologies, need to be explored. Among them, one promising solution is the utilization of machine learning (ML) in the mmWave cellular network. On the one hand, with the aid of ML approaches, the network can learn from mobile data, which allows the system to use adaptable strategies while avoiding unnecessary human intervention. On the other hand, when ML is integrated into the network, the complexity and workload can be reduced, while the huge number of devices and the data can be efficiently managed.

    Therefore, in this thesis, different ML techniques that assist in optimizing different areas of the mmWave cellular network are explored, in terms of non-line-of-sight (NLOS) beam tracking, handover management, and beam management. First, a procedure to predict the angle of arrival (AOA) and angle of departure (AOD), in both azimuth and elevation, in NLOS mmWave communications is proposed, based on a deep neural network. Moreover, along with the AOA and AOD prediction, a trajectory prediction is employed based on the dynamic window approach (DWA). The simulation scenario is built with ray-tracing technology and used to generate data. Based on the generated data, two deep neural networks (DNNs) predict the AOA/AOD in azimuth (AAOA/AAOD) and the AOA/AOD in elevation (EAOA/EAOD). Furthermore, under the assumption that the UE mobility and precise location are unknown, the UE trajectory is predicted and input into the trained DNNs as a parameter to predict the AAOA/AAOD and EAOA/EAOD, to show the performance under a realistic assumption. The robustness of both procedures is evaluated in the presence of errors, and the results show that the DNN is a promising tool for predicting the AOA and AOD in an NLOS scenario.

    Second, a novel handover scheme is designed, aiming to optimize the overall system throughput and the total system delay while guaranteeing the quality of service (QoS) of each user equipment (UE). Specifically, the proposed handover scheme, called O-MAPPO, integrates a reinforcement learning (RL) algorithm and optimization theory. An RL algorithm known as multi-agent proximal policy optimization (MAPPO) plays a role in determining the handover trigger conditions. Further, an optimization problem is proposed in conjunction with MAPPO to select the target base station and determine the beam selection. It aims to evaluate and optimize the system performance in terms of total throughput and delay while guaranteeing the QoS of each UE after the handover decision is made. Third, a multi-agent RL-based beam management scheme is proposed, where multi-agent deep deterministic policy gradient (MADDPG) is applied at each small-cell base station (SCBS) to maximize the system throughput while guaranteeing the quality of service. With MADDPG, smart beam management methods can serve the UEs more efficiently and accurately. Specifically, the mobility of UEs causes dynamic changes in the network environment, and the MADDPG algorithm learns from the experience of these changes. Based on that, the beam management at the SCBS is optimized according to the reward or penalty received when serving different UEs. The approach improves the overall system throughput and delay performance compared with traditional beam management methods.

    The works presented in this thesis demonstrate the potential of ML for addressing problems in the mmWave cellular network, and they provide specific solutions for optimizing NLOS beam tracking, handover management, and beam management. For the NLOS beam tracking part, simulation results show that the prediction errors of the AOA and AOD can be maintained within an acceptable range of ±2. Further, for the handover optimization part, the numerical results show that the system throughput and delay are improved by 10% and 25%, respectively, when compared with two typical RL algorithms, deep deterministic policy gradient (DDPG) and deep Q-learning (DQL). Lastly, for the intelligent beam management part, numerical results reveal the convergence performance of the MADDPG algorithm and its superiority in improving the system throughput compared with other typical RL algorithms and the traditional beam management method.
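
    As a hedged sketch of the kind of DNN regressor described for NLOS beam tracking, the snippet below trains a small multilayer perceptron to map trajectory features to the four angles (AAOA, AAOD, EAOA, EAOD); the feature layout, network sizes, and random placeholder data are assumptions, not the thesis's actual ray-tracing dataset or configuration.

```python
import torch
import torch.nn as nn

# Hypothetical feature layout: 5 past (x, y) trajectory points -> 10 inputs.
# Outputs: azimuth AOA/AOD and elevation AOA/AOD -> 4 regression targets.
class AngleDNN(nn.Module):
    def __init__(self, in_dim=10, hidden=128, out_dim=4):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, out_dim),
        )

    def forward(self, x):
        return self.net(x)

model = AngleDNN()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

# Random placeholders standing in for ray-tracing-generated samples.
x = torch.randn(256, 10)          # predicted-trajectory features
y = torch.randn(256, 4) * 45.0    # AAOA, AAOD, EAOA, EAOD labels

for epoch in range(200):          # plain supervised regression loop
    opt.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    opt.step()
```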

    6G Wireless Systems: Vision, Requirements, Challenges, Insights, and Opportunities

    Mobile communications have been undergoing a generational change every ten years or so. However, the time difference between the so-called "G's" is also decreasing. While fifth-generation (5G) systems are becoming a commercial reality, there is already significant interest in systems beyond 5G, which we refer to as the sixth generation (6G) of wireless systems. In contrast to the already published papers on the topic, we take a top-down approach to 6G. We present a holistic discussion of 6G systems, beginning with the lifestyle and societal changes driving the need for next-generation networks. This is followed by a discussion of the technical requirements needed to enable 6G applications, based on which we dissect key challenges, as well as possibilities for practically realizable system solutions across all layers of the Open Systems Interconnection stack. Since many of the 6G applications will need access to an order of magnitude more spectrum, utilization of frequencies between 100 GHz and 1 THz becomes of paramount importance. As such, the 6G ecosystem will feature a diverse range of frequency bands, ranging from below 6 GHz up to 1 THz. We comprehensively characterize the limitations that must be overcome to realize working systems in these bands, and provide a unique perspective on the physical- as well as higher-layer challenges relating to the design of next-generation core networks, new modulation and coding methods, novel multiple access techniques, antenna arrays, wave propagation, radio-frequency transceiver design, as well as real-time signal processing. We rigorously discuss the fundamental changes required in the core networks of the future, which serve as a major source of latency for time-sensitive applications. While evaluating the strengths and weaknesses of key 6G technologies, we differentiate what may be achievable over the next decade, relative to what is possible. (Accepted for publication in the Proceedings of the IEEE; 32 pages, 10 figures, 5 tables.)

    Interface Selection in 5G vehicular networks

    In recent years, the amount of data shared around the world has increased exponentially, thanks to novel applications for security (e.g. home automation, smart cities, traffic control, autonomous vehicles) and infotainment (e.g. audio and video streaming, web browsing, massive online videogames). To support this trend, the major companies in the telecommunications industry are developing new standards that will be available to end users in the coming years and that will be presented as the Fifth Generation of Cellular Networks (5G). These standards provide improvements over the 4G standards (e.g. LTE, WiMax, DSRC) and brand-new technologies (e.g. mmWaves, Visible Light Communication) to enable new services that demand extremely high throughput and low latency. In most cases these technologies will cooperate to ensure a reliable and accessible network in every situation. One of the most promising applications of these new-generation technologies is vehicular networks, a set of services that includes communication with infrastructure, such as downloading a film from the Internet or receiving information about the surrounding environment (e.g. a traffic light sends a message to an incoming vehicle to make it stop), and communication directly between vehicles, where the data rate is typically lower since the typical use will be, for example, to send information about the closest cars in order to decrease the number of accidents or to manage the traffic. This thesis focuses on vehicular network applications: it aims to analyze the performance of the IEEE 802.11p protocol at different data rates in a typical V2V scenario, and to compare LTE and mmWaves using V2I communication in different circumstances, to show how each technology offers advantages for some applications while being unsuitable for others.
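
    One way to see why each technology suits different circumstances is the free-space path loss gap between the two bands; the sketch below applies the Friis formula at 5.9 GHz (IEEE 802.11p) and 28 GHz (a common mmWave band) over an illustrative 100 m link, noting that real V2X channels deviate from free space.

```python
import math

def fspl_db(freq_hz, dist_m):
    """Free-space path loss (Friis): 20*log10(4*pi*d*f/c) in dB."""
    c = 3e8
    return 20 * math.log10(4 * math.pi * dist_m * freq_hz / c)

d = 100.0                               # illustrative V2X link distance (m)
for label, f in [("802.11p @ 5.9 GHz", 5.9e9), ("mmWave @ 28 GHz", 28e9)]:
    print(f"{label}: {fspl_db(f, d):.1f} dB at {d:.0f} m")

# 28 GHz loses ~13.5 dB more than 5.9 GHz over the same distance
# (20*log10(28/5.9)), which is why mmWave V2I links lean on beamforming gain
# while 802.11p can serve V2V with omnidirectional antennas.
```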