361 research outputs found

    An Overview on Application of Machine Learning Techniques in Optical Networks

    Today's telecommunication networks have become sources of enormous amounts of widely heterogeneous data. This information can be retrieved from network traffic traces, network alarms, signal quality indicators, users' behavioral data, etc. Advanced mathematical tools are required to extract meaningful information from these data and to make decisions that ensure the proper functioning of the networks. Among these tools, Machine Learning (ML) is regarded as one of the most promising methodological approaches for performing network-data analysis and enabling automated network self-configuration and fault management. The adoption of ML techniques in the field of optical communication networks is motivated by the unprecedented growth in network complexity that optical networks have faced in recent years. This increase in complexity is due to the introduction of a large number of adjustable and interdependent system parameters (e.g., routing configurations, modulation format, symbol rate, coding schemes, etc.) enabled by coherent transmission/reception technologies, advanced digital signal processing, and the compensation of nonlinear effects in optical fiber propagation. In this paper we provide an overview of the application of ML to optical communications and networking. We classify and survey the relevant literature and provide an introductory tutorial on ML for researchers and practitioners interested in this field. Although a good number of research papers have appeared recently, the application of ML to optical networks is still in its infancy; to stimulate further work in this area, we conclude the paper by proposing possible new research directions.

    Artificial intelligence (AI) methods in optical networks: A comprehensive survey

    Artificial intelligence (AI) is an extensive scientific discipline that enables computer systems to solve problems by emulating complex biological processes such as learning, reasoning, and self-correction. This paper presents a comprehensive review of the application of AI techniques for improving the performance of optical communication systems and networks. The use of AI-based techniques is first studied in applications related to optical transmission, ranging from the characterization and operation of network components to performance monitoring, mitigation of nonlinearities, and quality of transmission estimation. Applications related to optical network control and management are then reviewed, including topics such as optical network planning and operation in both transport and access networks. Finally, the paper presents a summary of opportunities and challenges in optical networking where AI is expected to play a key role in the near future. Ministerio de Economía, Industria y Competitividad (Project EC2014-53071-C3-2-P, TEC2015-71932-REDT).

    Machine learning-based routing and wavelength assignment in software-defined optical networks

    Recently, machine learning (ML) has attracted the attention of both researchers and practitioners as a way to address several issues in the optical networking field. This trend has been driven mainly by the huge amount of available data (i.e., signal quality indicators, network alarms, etc.) and by the large number of optimization parameters that characterize current optical networks (such as modulation format, lightpath routes, transport wavelength, etc.). In this paper, we leverage techniques from the ML discipline to efficiently accomplish routing and wavelength assignment (RWA) for an input traffic matrix in an optical WDM network. Numerical results show that near-optimal RWA can be obtained with our approach while reducing computational time by up to 93% in comparison with a traditional optimization approach based on integer linear programming. Moreover, to further demonstrate the effectiveness of our approach, we deployed the ML classifier in an ONOS-based software-defined optical network laboratory testbed, where we evaluated the performance of the overall RWA process in terms of computational time. The authors would like to acknowledge the support of the project TEXEO (TEC2016-80339-R), funded by the Spanish MINECO, and the EU H2020 Metrohaul project (grant no. 761727).
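
    A minimal sketch of how such an ML-driven RWA step might look, assuming the problem is cast as supervised classification over a set of precomputed RWA configurations (the abstract does not specify the formulation, so the features, labeling rule, and RandomForestClassifier choice below are illustrative, not the paper's actual method):

        # Hypothetical sketch: RWA as supervised classification.
        # Each sample is a flattened traffic matrix; its label is the index of
        # the best precomputed RWA configuration (in practice obtained offline
        # from an ILP solver). All data here is synthetic.
        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(0)
        n_nodes, n_samples, n_configs = 6, 2000, 20

        # Synthetic traffic matrices: demand between every node pair.
        X = rng.integers(0, 10, size=(n_samples, n_nodes * n_nodes)).astype(float)
        # Toy labeling rule standing in for ILP-derived optima: the node pair
        # with the heaviest demand determines the configuration.
        y = X.argmax(axis=1) % n_configs

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2,
                                                  random_state=0)
        clf = RandomForestClassifier(n_estimators=100, random_state=0)
        clf.fit(X_tr, y_tr)

        # At run time a single prediction replaces a full ILP solve, which is
        # where a speed-up like the reported ~93% would come from.
        print("held-out accuracy:", clf.score(X_te, y_te))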

    Assessment of Cross-train Machine Learning Techniques for QoT-Estimation in agnostic Optical Networks

    With the evolution of 5G technology, high-definition video, virtual reality, and the Internet of Things (IoT), the demand for high-capacity optical networks has been increasing dramatically. To support this capacity demand, low-margin optical networks have attracted operator interest. To realize this techno-economic interest, planning tools with higher accuracy and accurate models for quality of transmission estimation (QoT-E) are needed. However, given the heterogeneity of state-of-the-art optical networks, it is challenging to develop such an accurate planning tool and low-margin QoT-E models using traditional analytical approaches. Fortunately, data-driven machine learning (ML) provides a promising path. This paper reports the use of cross-trained ML-based learning methods to predict the QoT of an unestablished lightpath (LP) in an agnostic network, based on data retrieved from the already established LPs of an in-service network. This advance prediction of the QoT of an unestablished LP in an agnostic network is a key enabler not only for the optimal planning of the network but also for automatically deploying LPs with a minimum margin in a reliable manner. The QoT metric of the LPs is defined by the generalized signal-to-noise ratio (GSNR), which includes the effects of both amplified spontaneous emission (ASE) noise and nonlinear interference (NLI) accumulation. Real field data is mimicked using the well-tested network simulation tool GNPy. Using the generated synthetic data set, supervised ML techniques such as a wide deep neural network, a deep neural network, a multi-layer perceptron regressor, a boosted tree regressor, a decision tree regressor, and a random forest regressor are applied, demonstrating GSNR prediction for an unestablished LP in an agnostic network with a maximum error of 0.40 dB.
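
    The GSNR metric is commonly defined in linear units as GSNR = P_ch / (P_ASE + P_NLI), i.e., 1/GSNR ≈ 1/SNR_ASE + 1/SNR_NLI. Below is a minimal sketch of the regression setup on synthetic stand-in data (a toy closed-form GSNR model rather than actual GNPy output; the feature names and the random forest choice, one of the regressor families the abstract lists, are assumptions):

        # Hedged sketch: predicting the GSNR [dB] of an unestablished lightpath
        # with one of the regressors named in the abstract (random forest).
        # The data below is a synthetic stand-in, NOT real GNPy output.
        import numpy as np
        from sklearn.ensemble import RandomForestRegressor
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(1)
        n = 5000
        # Hypothetical lightpath features: span count, total length [km],
        # per-channel launch power [dBm], number of co-propagating channels.
        spans = rng.integers(1, 20, n)
        length_km = spans * rng.uniform(60.0, 100.0, n)
        p_dbm = rng.uniform(-2.0, 3.0, n)
        n_ch = rng.integers(10, 96, n)
        X = np.column_stack([spans, length_km, p_dbm, n_ch])

        # Toy GSNR model: degrades with distance (ASE) and channel load (NLI).
        y = 42.0 - 10.0 * np.log10(length_km) - 0.03 * n_ch \
            + 0.5 * p_dbm + rng.normal(0.0, 0.3, n)

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2,
                                                  random_state=1)
        model = RandomForestRegressor(n_estimators=200, random_state=1)
        model.fit(X_tr, y_tr)
        err = np.abs(model.predict(X_te) - y_te)
        print(f"max |error|: {err.max():.2f} dB, mean: {err.mean():.2f} dB")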

    Cross-feature trained machine learning models for QoT-estimation in optical networks

    The ever-increasing demand for global internet traffic, together with the evolving concepts of software-defined networks and elastic optical networks, demands not only full capacity utilization of the underlying infrastructure but also a dynamic, flexible, and transparent optical network. In general, worst-case assumptions are used to calculate the quality of transmission (QoT), resulting in the provisioning of high margins. Precise estimation of the QoT before lightpath (LP) establishment is therefore crucial for reducing provisioning margins. We propose and compare several data-driven machine learning (ML) models that accurately estimate the QoT before the actual establishment of the LP in an unseen network. The proposed models are trained on data acquired from the already established LPs of a completely different network. The metric used to evaluate the QoT of an LP is the generalized signal-to-noise ratio (GSNR), which accumulates the impact of both nonlinear interference and amplified spontaneous emission noise. The dataset is generated synthetically using the well-tested GNPy simulation tool. Promising results are achieved, showing that the proposed neural network considerably reduces the GSNR uncertainty and, consequently, the provisioning margin. Furthermore, we also analyze the impact of cross-feature and relevant-feature training on the performance of the proposed ML models.
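
    A minimal sketch of the cross-network training idea, again on synthetic stand-in data: a neural-network regressor (the abstract's best performer, though its architecture is not given here, so the MLP below is an assumption) is fitted on lightpaths from one topology and evaluated on lightpaths from a different, unseen one:

        # Hedged sketch: train on "network A", predict GSNR on unseen
        # "network B". The toy GSNR model and features are hypothetical.
        import numpy as np
        from sklearn.neural_network import MLPRegressor
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler

        def make_lightpaths(n, max_spans, seed):
            """Synthetic lightpath features and a toy GSNR target [dB]."""
            r = np.random.default_rng(seed)
            spans = r.integers(1, max_spans, n)
            length_km = spans * r.uniform(60.0, 100.0, n)
            n_ch = r.integers(10, 96, n)
            X = np.column_stack([spans, length_km, n_ch])
            y = 42.0 - 10.0 * np.log10(length_km) - 0.03 * n_ch \
                + r.normal(0.0, 0.3, n)
            return X, y

        # Network A has short routes; the unseen network B has longer ones.
        X_a, y_a = make_lightpaths(4000, 10, seed=3)
        X_b, y_b = make_lightpaths(1000, 20, seed=4)

        model = make_pipeline(StandardScaler(),
                              MLPRegressor(hidden_layer_sizes=(64, 64),
                                           max_iter=2000, random_state=2))
        model.fit(X_a, y_a)
        err = np.abs(model.predict(X_b) - y_b)
        print(f"unseen-network mean |error|: {err.mean():.2f} dB")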

    Smart Sensor Technologies for IoT

    The recent development of wireless networks and devices has led to novel services that will utilize wireless communication on a new level. Much effort and many resources have been dedicated to establishing new communication networks that will support machine-to-machine communication and the Internet of Things (IoT). In these systems, various smart and sensory devices are deployed and connected, enabling large amounts of data to be streamed. Smart services represent new trends in mobile services, i.e., a completely new spectrum of context-aware, personalized, and intelligent services and applications. A variety of existing services utilize information about the position of the user or mobile device. The position of mobile devices is often obtained using Global Navigation Satellite System (GNSS) chips that are integrated into all modern mobile devices (smartphones). However, GNSS is not always a reliable source of position estimates due to multipath propagation and signal blockage. Moreover, integrating GNSS chips into all devices might have a negative impact on the battery life of future IoT applications. Therefore, alternative solutions for position estimation should be investigated and implemented in IoT applications. This Special Issue, "Smart Sensor Technologies for IoT", aims to report on some of the recent research efforts on this increasingly important topic. The twelve accepted papers in this issue cover various aspects of smart sensor technologies for IoT.
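
    One common GNSS alternative alluded to above is positioning from received signal strength; a minimal, self-contained sketch follows (the beacon layout, path-loss parameters, and noise level are all hypothetical):

        # Hedged sketch of RSSI-based positioning as a GNSS alternative:
        # distances come from a log-distance path-loss model, the position
        # from nonlinear least squares. All numbers are illustrative.
        import numpy as np
        from scipy.optimize import least_squares

        rng = np.random.default_rng(5)
        # Known beacon positions [m] and assumed path-loss parameters.
        anchors = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0]])
        p0_dbm, n_exp = -40.0, 2.0    # RSSI at 1 m, path-loss exponent

        true_pos = np.array([3.0, 6.0])
        d_true = np.linalg.norm(anchors - true_pos, axis=1)
        rssi = p0_dbm - 10.0 * n_exp * np.log10(d_true) + rng.normal(0, 1, 4)

        # Invert the path-loss model to estimate the beacon distances.
        d_est = 10.0 ** ((p0_dbm - rssi) / (10.0 * n_exp))

        # Multilateration: the point whose anchor distances best match d_est.
        fit = least_squares(
            lambda p: np.linalg.norm(anchors - p, axis=1) - d_est,
            x0=np.array([5.0, 5.0]))
        print("estimated position [m]:", fit.x.round(2))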