3,554 research outputs found

    An Overview on Application of Machine Learning Techniques in Optical Networks

    Today's telecommunication networks have become sources of enormous amounts of widely heterogeneous data. This information can be retrieved from network traffic traces, network alarms, signal quality indicators, users' behavioral data, etc. Advanced mathematical tools are required to extract meaningful information from these data and to make decisions pertaining to the proper functioning of the networks. Among these mathematical tools, Machine Learning (ML) is regarded as one of the most promising methodological approaches to perform network-data analysis and enable automated network self-configuration and fault management. The adoption of ML techniques in the field of optical communication networks is motivated by the unprecedented growth of network complexity faced by optical networks in the last few years. This increase in complexity is due to the introduction of a huge number of adjustable and interdependent system parameters (e.g., routing configurations, modulation format, symbol rate, coding schemes, etc.) enabled by the use of coherent transmission/reception technologies, advanced digital signal processing, and compensation of nonlinear effects in optical fiber propagation. In this paper we provide an overview of the application of ML to optical communications and networking. We classify and survey relevant literature dealing with the topic, and we also provide an introductory tutorial on ML for researchers and practitioners interested in this field. Although a good number of research papers have recently appeared, the application of ML to optical networks is still in its infancy: to stimulate further work in this area, we conclude the paper by proposing new possible research directions.

    Artificial intelligence (AI) methods in optical networks: A comprehensive survey

    Artificial intelligence (AI) is an extensive scientific discipline which enables computer systems to solve problems by emulating complex biological processes such as learning, reasoning and self-correction. This paper presents a comprehensive review of the application of AI techniques for improving the performance of optical communication systems and networks. The use of AI-based techniques is first studied in applications related to optical transmission, ranging from the characterization and operation of network components to performance monitoring, mitigation of nonlinearities, and quality of transmission estimation. Then, applications related to optical network control and management are also reviewed, including topics like optical network planning and operation in both transport and access networks. Finally, the paper also presents a summary of opportunities and challenges in optical networking where AI is expected to play a key role in the near future. Ministerio de Economía, Industria y Competitividad (Projects EC2014-53071-C3-2-P, TEC2015-71932-REDT).

    Multilayer optical learning networks

    A new approach to learning in a multilayer optical neural network based on holographically interconnected nonlinear devices is presented. The proposed network can learn the interconnections that form a distributed representation of a desired pattern transformation operation. The interconnections are formed in an adaptive and self-aligning fashion as volume holographic gratings in photorefractive crystals. Parallel arrays of globally space-integrated inner products diffracted by the interconnecting hologram illuminate arrays of nonlinear Fabry-Perot etalons for fast thresholding of the transformed patterns. A phase-conjugated reference wave interferes with a backward-propagating error signal to form holographic interference patterns, which are time-integrated in the volume of a photorefractive crystal to slowly modify and learn the appropriate self-aligning interconnections. This multilayer system performs an approximate implementation of the backpropagation learning procedure in a massively parallel, high-speed nonlinear optical network.
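
    To make the learning procedure concrete, the sketch below simulates the standard two-layer backpropagation update that the holographic system approximates; the layer sizes, the sigmoid nonlinearity (standing in for the Fabry-Perot thresholding), the learning rate, and the random training data are illustrative assumptions only and do not model the optical implementation itself.

```python
import numpy as np

# Minimal two-layer backpropagation sketch (all sizes and data are arbitrary).
# In the optical system the weight matrices correspond to volume holographic
# gratings and the sigmoid-like thresholding to nonlinear Fabry-Perot etalons;
# here both are simulated numerically.

rng = np.random.default_rng(0)
n_in, n_hidden, n_out = 16, 8, 4          # assumed layer sizes
lr = 0.5                                  # assumed learning rate

X = rng.random((32, n_in))                # toy input patterns
T = rng.random((32, n_out))               # toy target patterns

W1 = rng.normal(scale=0.1, size=(n_in, n_hidden))
W2 = rng.normal(scale=0.1, size=(n_hidden, n_out))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for epoch in range(200):
    # Forward pass: inner products followed by elementwise thresholding.
    h = sigmoid(X @ W1)
    y = sigmoid(h @ W2)

    # Backward pass: error signals propagated layer by layer.
    err_out = (y - T) * y * (1.0 - y)
    err_hid = (err_out @ W2.T) * h * (1.0 - h)

    # Weight updates; in the optical system these correspond to slow,
    # time-integrated modifications of the holographic gratings.
    W2 -= lr * h.T @ err_out
    W1 -= lr * X.T @ err_hid

print("final MSE:", float(np.mean((sigmoid(sigmoid(X @ W1) @ W2) - T) ** 2)))
```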

    Dual-Stage Planning for Elastic Optical Networks Integrating Machine-Learning-Assisted QoT Estimation

    Following the emergence of Elastic Optical Networks (EONs), Machine Learning (ML) has been intensively investigated as a promising methodology to address complex network management tasks, including, e.g., Quality of Transmission (QoT) estimation, fault management, and automatic adjustment of transmission parameters. Though several ML-based solutions for specific tasks have been proposed, how to integrate the outcome of such ML approaches into Routing and Spectrum Assignment (RSA) models (which address the fundamental planning problem in EONs) is still an open research problem. In this study, we propose a dual-stage iterative RSA optimization framework that incorporates the QoT estimations provided by an ML regressor, used to define lightpaths' reach constraints, into a Mixed Integer Linear Programming (MILP) formulation. The first stage minimizes the overall spectrum occupation, whereas the second stage maximizes the minimum inter-channel spacing between neighboring channels without increasing the overall spectrum occupation obtained in the previous stage. During the second stage, additional interference constraints are generated, and these constraints are added to the MILP at the next iteration round to exclude lightpath combinations that would exhibit unacceptable QoT. Our illustrative numerical results on realistic EON instances show that the proposed ML-assisted framework achieves spectrum occupation savings of up to 52.4% (around 33% on average) in comparison to a traditional MILP-based RSA framework that uses conservative reach constraints based on margined analytical models.
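
    As an illustration of how a trained QoT regressor can be turned into the reach constraints that feed a MILP, the sketch below fits a regressor on synthetic (path length, channel spacing) data and derives, for each spacing, the longest path whose predicted QoT stays above a threshold. The feature set, the toy QoT model, the threshold value, and the data are assumptions made for this example and not the paper's exact formulation.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(1)

# Synthetic training set: features = (path length [km], inter-channel spacing [GHz]),
# target = a toy QoT metric in dB that degrades with length and tight spacing.
X_train = rng.uniform([100, 25], [3000, 200], size=(1000, 2))
y_train = 30.0 - 0.006 * X_train[:, 0] + 0.02 * X_train[:, 1] + rng.normal(0, 0.3, 1000)

qot_model = RandomForestRegressor(n_estimators=100, random_state=0)
qot_model.fit(X_train, y_train)

QOT_THRESHOLD_DB = 18.0   # assumed minimum acceptable QoT

def max_reach_km(spacing_ghz, lengths=np.arange(100, 3001, 100)):
    """Largest path length whose predicted QoT stays above the threshold for a
    given inter-channel spacing; this value would enter the MILP as a
    lightpath reach constraint."""
    feats = np.column_stack([lengths, np.full_like(lengths, spacing_ghz, dtype=float)])
    ok = qot_model.predict(feats) >= QOT_THRESHOLD_DB
    return int(lengths[ok].max()) if ok.any() else 0

for spacing in (37.5, 50.0, 75.0, 100.0):
    print(f"spacing {spacing:6.1f} GHz -> reach constraint {max_reach_km(spacing)} km")
```

    In the iterative framework described above, such reach values would constrain the first-stage MILP, while lightpath combinations whose predicted QoT falls below the threshold would be fed back as interference constraints for the next iteration.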

    Neural network-assisted decision-making for adaptive routing strategy in optical datacenter networks

    To improve blocking probability (BP) performance and enhance resource utilization, adaptive routing in optical datacenter networks (DCNs) requires correctly selecting the routing strategy that is best adapted to the network configuration and traffic dynamics. A neural network (NN)-assisted decision-making scheme is proposed to find the optimal routing strategy in optical DCNs by predicting the BP performance of various candidate routing strategies. The features of an optical DCN architecture (i.e., the rack number N, connection degree D, spectral slot number S and optical transceiver number M) and the traffic pattern (i.e., the ratio of requests of various capacities R, and the load of arriving requests) are used as the input to the NN to estimate the optimal routing strategy. A case of two-strategy decision in a transparent, multi-hop interconnected optical DCN is studied. Three metrics are defined for performance evaluation: (a) the ratio of the load range with a wrong decision over the whole load range of interest (i.e., the decision error E), (b) the maximum BP loss (BPL), and (c) the resource utilization loss (UL) caused by the wrong decision. Numerical results show that the ratio of error-free cases over tested cases always surpasses 83%, and the average values of E, BPL and UL are less than 3.0%, 4.0% and 1.2%, respectively, which implies the high accuracy of the proposed scheme. The results validate the feasibility of the proposed scheme, which facilitates the autonomous implementation of adaptive routing in optical DCNs.
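
    A minimal sketch of this kind of decision scheme follows: one regressor per candidate strategy is trained to predict BP from the (N, D, S, M, R, load) features, and the strategy with the lowest predicted BP is selected. The strategy names, the toy BP curves used as training targets, and the model sizes are hypothetical and serve only to illustrate the mechanism.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(2)

# Features: rack number N, degree D, spectral slots S, transceivers M,
# capacity ratio R, offered load (all synthetic and arbitrarily scaled).
X = rng.uniform([4, 2, 80, 4, 0.1, 0.1], [64, 8, 320, 32, 0.9, 1.0], size=(2000, 6))

# Toy ground-truth BP curves for two hypothetical routing strategies.
bp_strategy = {
    "shortest-path":   0.020 * X[:, 5] * (1 + 10 / X[:, 2]),
    "least-congested": 0.015 * X[:, 5] * (1 + 20 / X[:, 3]),
}

# Train one NN regressor per candidate strategy.
models = {}
for name, bp in bp_strategy.items():
    m = MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=2000, random_state=0)
    m.fit(X, bp)
    models[name] = m

def choose_strategy(features):
    """Return the candidate strategy with the lowest predicted BP."""
    preds = {name: float(m.predict([features])[0]) for name, m in models.items()}
    return min(preds, key=preds.get), preds

sample = [32, 4, 160, 16, 0.5, 0.7]   # N, D, S, M, R, load
best, preds = choose_strategy(sample)
print("selected strategy:", best, "| predicted BP:", preds)
```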

    Evolution towards Smart Optical Networking: Where Artificial Intelligence (AI) meets the World of Photonics

    Smart optical networks are the next evolution of programmable networking and programmable automation of optical networks, with human-in-the-loop network control and management. The paper discusses this evolution and the role of Artificial Intelligence (AI).

    Machine learning-based routing and wavelength assignment in software-defined optical networks

    Recently, machine learning (ML) has attracted the attention of both researchers and practitioners to address several issues in the optical networking field. This trend has been mainly driven by the huge amount of available data (i.e., signal quality indicators, network alarms, etc.) and by the large number of optimization parameters that characterize current optical networks (such as modulation format, lightpath routes, transport wavelength, etc.). In this paper, we leverage techniques from the ML discipline to efficiently accomplish routing and wavelength assignment (RWA) for an input traffic matrix in an optical WDM network. Numerical results show that near-optimal RWA can be obtained with our approach, while reducing computational time by up to 93% in comparison to a traditional optimization approach based on integer linear programming. Moreover, to further demonstrate the effectiveness of our approach, we deployed the ML classifier into an ONOS-based software-defined optical network laboratory testbed, where we evaluate the performance of the overall RWA process in terms of computational time. The authors would like to acknowledge the support of the project TEXEO (TEC2016-80339-R), funded by Spanish MINECO, and the EU-H2020 Metrohaul project (grant no. 761727).
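
    One plausible reading of such a classifier-based RWA scheme, sketched below under stated assumptions, is to solve the ILP offline for a set of training traffic matrices, store the resulting RWA solutions as labeled templates, and train a classifier that maps a new traffic matrix to one of those templates. The template library, features, labels, and data here are synthetic placeholders, not the paper's actual pipeline.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(3)
n_nodes = 6
n_templates = 4            # assumed number of precomputed RWA solutions

# Synthetic training set: flattened traffic matrices labeled with the index of
# the precomputed RWA solution that an (offline) ILP would have selected.
X_train = rng.integers(0, 10, size=(500, n_nodes * n_nodes)).astype(float)
y_train = np.minimum(X_train.sum(axis=1) // 60, n_templates - 1).astype(int)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)

def assign_rwa(traffic_matrix):
    """Pick the precomputed RWA template predicted for this traffic matrix;
    the returned label indexes an offline library of ILP solutions."""
    return int(clf.predict(traffic_matrix.reshape(1, -1))[0])

new_tm = rng.integers(0, 10, size=(n_nodes, n_nodes)).astype(float)
print("selected RWA template:", assign_rwa(new_tm))
```

    The online step is then a single classifier inference rather than an ILP solve, which is consistent with the large reduction in computational time reported above, at the cost of a small optimality gap.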