
    Artificial Intelligence Enabled Wireless Networking for 5G and Beyond: Recent Advances and Future Challenges

    5G wireless communication networks are currently being deployed, and beyond-5G (B5G) networks are expected to be developed over the next decade. Artificial intelligence (AI) technologies, and machine learning (ML) in particular, have the potential to efficiently solve the unstructured and seemingly intractable problems that arise from the large amounts of data that B5G networks must handle. This article studies how AI and ML can be leveraged for the design and operation of B5G networks. We first provide a comprehensive survey of recent advances and future challenges that result from bringing AI/ML technologies into B5G wireless networks. Our survey touches on different aspects of wireless network design and optimization, including channel measurements, modeling, and estimation; physical-layer research; and network management and optimization. Then ML algorithms and applications to B5G networks are reviewed, followed by an overview of standards developments on applying AI/ML algorithms to B5G networks. We conclude this study with future challenges of applying AI/ML to B5G networks.
    Funding Agencies: National Key R&D Program of China [2018YFB1801101]; National Natural Science Foundation of China (NSFC) [61960206006]; High Level Innovation and Entrepreneurial Talent Introduction Program in Jiangsu; Research Fund of National Mobile Communications Research Laboratory, Southeast University [2020B01]; Fundamental Research Funds for the Central Universities [2242019R30001]; EU H2020 RISE TESTBED2 project [872172]

    An Overview on Application of Machine Learning Techniques in Optical Networks

    Today's telecommunication networks have become sources of enormous amounts of widely heterogeneous data. This information can be retrieved from network traffic traces, network alarms, signal quality indicators, users' behavioral data, etc. Advanced mathematical tools are required to extract meaningful information from these data and to make decisions pertaining to the proper functioning of the networks. Among these mathematical tools, Machine Learning (ML) is regarded as one of the most promising methodological approaches to perform network-data analysis and enable automated network self-configuration and fault management. The adoption of ML techniques in the field of optical communication networks is motivated by the unprecedented growth of network complexity faced by optical networks in the last few years. This increase in complexity is due to the introduction of a huge number of adjustable and interdependent system parameters (e.g., routing configurations, modulation format, symbol rate, coding schemes) enabled by the use of coherent transmission/reception technologies, advanced digital signal processing, and compensation of nonlinear effects in optical fiber propagation. In this paper, we provide an overview of the application of ML to optical communications and networking. We classify and survey relevant literature dealing with the topic, and we also provide an introductory tutorial on ML for researchers and practitioners interested in this field. Although a good number of research papers have recently appeared, the application of ML to optical networks is still in its infancy; to stimulate further work in this area, we conclude the paper by proposing new possible research directions.

    6G White Paper on Edge Intelligence

    In this white paper, we provide a vision for 6G Edge Intelligence. As we move beyond 5G towards future 6G networks, intelligent solutions utilizing data-driven machine learning and artificial intelligence become crucial for several real-world applications, including but not limited to more efficient manufacturing, novel personal smart-device environments and experiences, urban computing, and autonomous traffic settings. We present edge computing, along with other 6G enablers, as a key component in establishing the future 2030 intelligent Internet technologies, as shown in this series of 6G White Papers. In this white paper, we focus on the domains of edge computing infrastructure and platforms, data and edge network management, software development for the edge, and real-time and distributed training of ML/AI algorithms, along with security, privacy, pricing, and end-user aspects. We discuss the key enablers and challenges and identify the key research questions for the development of Intelligent Edge services. As the main outcome of this white paper, we envision a transition from the Internet of Things to the Intelligent Internet of Intelligent Things and provide a roadmap for the development of the 6G Intelligent Edge.

    Networks, Communication, and Computing Vol. 2

    Networks, communications, and computing have become ubiquitous and inseparable parts of everyday life. This book is based on a Special Issue of the Algorithms journal, and it is devoted to the exploration of the many-faceted relationship of networks, communications, and computing. The included papers explore the current state-of-the-art research in these areas, with a particular interest in the interactions among the fields.

    Results and achievements of the ALLIANCE Project: New network solutions for 5G and beyond

    Leaving the current 4th generation of mobile communications behind, 5G will represent a disruptive paradigm shift, integrating 5G Radio Access Networks (RANs), ultra-high-capacity access/metro/core optical networks, and intra-datacentre (DC) network and computational resources into a single converged 5G network infrastructure. The present paper overviews the main achievements of the ALLIANCE project. This project ambitiously aims at architecting a converged 5G-enabled network infrastructure satisfying the needs required to effectively realise the envisioned upcoming Digital Society. In particular, we present two networking solutions for 5G and beyond 5G (B5G): Software Defined Networking/Network Function Virtualisation (SDN/NFV) on top of an ultra-high-capacity, spatially and spectrally flexible all-optical network infrastructure, and the clean-slate Recursive Inter-Network Architecture (RINA) over packet networks, including access, metro, core, and DC segments. The common umbrella of all these solutions is the Knowledge-Defined Networking (KDN)-based orchestration layer which, by implementing Artificial Intelligence (AI) techniques, enables optimal end-to-end service provisioning. Finally, the cross-layer manager of the ALLIANCE architecture includes two novel elements, namely the monitoring element, providing network and user data in real time to the KDN, and the blockchain-based trust element, in charge of exchanging reliable and confident information with external domains.
    This work has been partially funded by the Spanish Ministry of Economy and Competitiveness under contract FEDER TEC2017-90034-C2 (ALLIANCE project) and by the Generalitat de Catalunya under contracts 2017SGR-1037 and 2017SGR-605.

    A survey of machine learning techniques applied to self organizing cellular networks

    In this paper, a survey of the literature of the past fifteen years involving Machine Learning (ML) algorithms applied to self-organizing cellular networks is performed. In order for future networks to overcome the current limitations and address the issues of current cellular systems, it is clear that more intelligence needs to be deployed so that a fully autonomous and flexible network can be enabled. This paper focuses on the learning perspective of Self Organizing Networks (SON) solutions and provides not only an overview of the most common ML techniques encountered in cellular networks, but also a classification of each paper in terms of its learning solution, along with examples. The authors also classify each paper in terms of its self-organizing use case and discuss how each proposed solution performed. In addition, a comparison between the most commonly found ML algorithms in terms of certain SON metrics is performed, and general guidelines on when to choose each ML algorithm for each SON function are proposed. Lastly, this work also discusses future research directions and the new paradigms that the use of more robust and intelligent algorithms, together with data gathered by operators, can bring to the cellular networks domain, fully enabling the concept of SON in the near future.

    Exploiting optical signal analysis for autonomous communications

    Optical communications have been extensively investigated and enhanced in the last decades. Nowadays, they are responsible for transporting all the data traffic generated around the world, from the access to the core network segments. As data traffic increases and changes in both type and patterns, optical communication networks and systems must readapt and continuously advance to face future data traffic demands in an efficient and cost-effective way. This PhD thesis focuses on investigating and analyzing optical signals in order to extract useful knowledge from them to support autonomous lightpath operation, as well as for lightpath characterization. The first objective of this PhD thesis is to investigate the transmission feasibility of optical signals based on high-order modulation formats (MF) and high symbol rates (SR) in hybrid filterless, filtered, and flexible optical networks. A higher impact of physical-layer impairments is expected on these kinds of optical signals, which can degrade the quality of transmission. In particular, the impact of the optical filter narrowing arising from the node cascade is evaluated. The simulation results obtained for the required optical signal-to-noise ratio in a cascade of up to 10 optical nodes foresee the applicability of these kinds of optical signals in such scenarios. By using high-order MF and high SR, the number of optical transponders can be reduced and the spectral efficiency enhanced. The second objective focuses on MF and SR identification at the optical receiver side to support autonomous lightpath operation. Nowadays, optical transmitters can generate several optical signal configurations in terms of MF and SR. To increase the autonomous operation of the optical receiver, it is desirable that it can autonomously recognize the MF and SR of the incoming optical signals.
In this PhD thesis, we propose an accurate and low-complexity MF and SR identification algorithm based on optical signal analysis and the minimum Euclidean distance to the expected constellation points when the received signals are decoded with the several available MF and SR. Extensive simulation results show remarkable accuracy under several realistic lightpath scenarios, based on different fiber types, including linear and nonlinear noise interference, in both single-carrier and multicarrier optical systems. The final objective of this PhD thesis is the deployment of a machine learning-based digital twin for optical constellation analysis and modeling. An optical signal is impaired by several effects along its lightpath in the optical network. These effects can be linear, e.g., the noise coming from optical amplification, or nonlinear, e.g., the Kerr effects from fiber propagation. Optical constellations are a good source of information regarding these effects, both linear and nonlinear. Thus, by an accurate and deep analysis of the received optical signals, visualized as optical constellations, we can extract useful information to better understand the several impairments along the traversed lightpath. Furthermore, by learning the impacts of different optical network elements on the optical signal, we can accurately model it in order to create a partial digital twin of the optical physical layer. The proposed digital twin shows accurate results for modeled lightpaths including both linear and nonlinear interference noise, in several lightpath configurations, i.e., based on different kinds of optical links, optical powers, and optical fiber parameters.
In addition, the proposed digital twin can be useful for predicting quality-of-transmission metrics, such as the bit error rate, in typical lightpath scenarios, as well as for detecting possible misconfigurations in optical network elements through cooperation with the software-defined networking controller and the monitoring and data analytics agents.
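The minimum-Euclidean-distance identification described in the thesis abstract can be sketched in a few lines. This is a minimal illustration under stated assumptions, not the thesis's actual algorithm: it covers only modulation-format identification for two formats (QPSK and 16-QAM), the symbol-rate identification and decoding stages are omitted, and all function names and the noise level are invented for the example.

```python
import numpy as np

def ideal_constellation(mf):
    """Unit-average-power reference points for two common formats
    (illustrative only; the thesis covers more formats and symbol rates)."""
    if mf == "QPSK":
        pts = np.array([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j])
    elif mf == "16QAM":
        levels = np.array([-3, -1, 1, 3])
        pts = np.array([x + 1j * y for x in levels for y in levels])
    else:
        raise ValueError(f"unknown format: {mf}")
    return pts / np.sqrt(np.mean(np.abs(pts) ** 2))  # normalize average power

def identify_mf(rx, candidates=("QPSK", "16QAM")):
    """Pick the candidate format whose ideal points minimize the mean
    minimum Euclidean distance to the received symbols."""
    rx = rx / np.sqrt(np.mean(np.abs(rx) ** 2))  # power-normalize the input
    scores = {}
    for mf in candidates:
        pts = ideal_constellation(mf)
        d = np.abs(rx[:, None] - pts[None, :])   # pairwise symbol distances
        scores[mf] = d.min(axis=1).mean()        # mean distance to nearest point
    return min(scores, key=scores.get)

# A noisy QPSK burst should be identified as QPSK.
rng = np.random.default_rng(0)
tx = ideal_constellation("QPSK")[rng.integers(0, 4, 2000)]
rx = tx + 0.05 * (rng.standard_normal(2000) + 1j * rng.standard_normal(2000))
print(identify_mf(rx))  # → QPSK
```

The key design point is that both the received burst and each reference constellation are normalized to unit average power before comparison, so the decision depends only on the geometry of the point cloud, not on the absolute signal level.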

    Machine Learning Techniques for 5G and beyond

    Wireless communication systems play a crucial role in modern society for entertainment, business, commercial, health, and safety applications. These systems keep evolving from one generation to the next, and we are currently seeing the deployment of fifth generation (5G) wireless systems around the world. Academia and industry are already discussing beyond-5G wireless systems, which will constitute the sixth generation (6G) of this evolution. One of the key components of 6G systems will be the use of Artificial Intelligence (AI) and Machine Learning (ML) in such wireless networks. Every component and building block of a wireless system that we are currently familiar with from wireless technologies up to 5G, such as the physical, network, and application layers, will involve one or another AI/ML technique. This overview paper presents an up-to-date review of future wireless system concepts such as 6G and the role of ML techniques in these future wireless systems. In particular, we present a conceptual model for 6G and show the use and role of ML techniques in each layer of the model. We review some classical and contemporary ML techniques, such as supervised and unsupervised learning, Reinforcement Learning (RL), Deep Learning (DL), and Federated Learning (FL), in the context of wireless communication systems. We conclude the paper with some future applications and research challenges in the area of ML and AI for 6G networks. © 2013 IEEE
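Of the techniques this overview lists, Federated Learning is the one most specific to distributed wireless settings: clients train on local data and only model weights travel to the server. The following is a hypothetical minimal sketch of federated averaging (FedAvg) on a toy least-squares problem; all names, the client count, and the learning parameters are invented for illustration and are not from the paper.

```python
import numpy as np

def local_sgd(w, X, y, lr=0.1, epochs=5):
    """A few full-batch gradient steps of least squares on one client's data."""
    for _ in range(epochs):
        grad = 2 * X.T @ (X @ w - y) / len(y)
        w = w - lr * grad
    return w

def fedavg(w, clients, rounds=20):
    """Federated averaging: each client trains locally on its own data;
    the server only sees and averages the resulting model weights."""
    for _ in range(rounds):
        updates = [local_sgd(w.copy(), X, y) for X, y in clients]
        w = np.mean(updates, axis=0)
    return w

# Four clients, each holding a private slice of data from the same true model.
rng = np.random.default_rng(1)
true_w = np.array([2.0, -1.0])
clients = []
for _ in range(4):
    X = rng.standard_normal((50, 2))
    clients.append((X, X @ true_w + 0.01 * rng.standard_normal(50)))

w = fedavg(np.zeros(2), clients)
print(np.round(w, 2))  # recovers approximately [2., -1.]
```

The privacy-relevant property is visible in `fedavg`: the server loop touches only `updates` (weight vectors), never the clients' raw `(X, y)` data, which is what makes the scheme attractive for user devices in 5G/6G networks.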