    An Overview on Application of Machine Learning Techniques in Optical Networks

    Today's telecommunication networks have become sources of enormous amounts of widely heterogeneous data, which can be retrieved from network traffic traces, network alarms, signal quality indicators, users' behavioral data, and more. Advanced mathematical tools are required to extract meaningful information from these data and to make decisions about the proper functioning of the networks. Among these tools, Machine Learning (ML) is regarded as one of the most promising methodological approaches for performing network-data analysis and enabling automated network self-configuration and fault management. The adoption of ML techniques in optical communication networks is motivated by the unprecedented growth in network complexity that optical networks have faced in recent years. This increase in complexity stems from the introduction of a huge number of adjustable and interdependent system parameters (e.g., routing configurations, modulation format, symbol rate, coding schemes) enabled by coherent transmission/reception technologies, advanced digital signal processing, and compensation of nonlinear effects in optical fiber propagation. In this paper we provide an overview of the application of ML to optical communications and networking. We classify and survey the relevant literature, and we also provide an introductory tutorial on ML for researchers and practitioners interested in this field. Although a good number of research papers have appeared recently, the application of ML to optical networks is still in its infancy; to stimulate further work in this area, we conclude the paper by proposing possible new research directions.

    Genetic Algorithm-Holt-Winters Based Minute Spectrum Occupancy Prediction: An Investigation

    In this research, the suitability of a genetic algorithm (GA)-modified Holt-Winters (HW) exponential model for predicting spectrum occupancy data was investigated. Firstly, we describe spectrum measurements taken over a two-week period at locations (8.511 °N, 4.594 °E) and (8.487 °N, 4.573 °E) in the 900 MHz and 1800 MHz bands. In computing the spectrum duty cycle, a different decision threshold was employed for each band link because of differing noise levels. A frequency point with a power spectral density below the decision threshold was considered unoccupied and assigned a value of 0, while a frequency point with a power spectral density above the threshold was considered occupied and assigned a value of 1. Secondly, the spectrum duty cycle was used to evaluate the behavior of the forecasting methods. The HW approach applies exponential smoothing to the spectrum data and uses the smoothed values to forecast present and future states. The mean square error (MSE) of prediction was minimized by a GA that iteratively adjusted the HW discount factors to improve forecast accuracy. A decrease in MSE of between 8.33% and 44.6% was observed.
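
    The abstract describes three concrete steps: thresholding power spectral density into binary occupancy, aggregating occupancy into a per-minute duty cycle, and forecasting that series with a Holt-Winters model whose discount factors are tuned by a GA. The sketch below illustrates this pipeline on synthetic data; the additive-seasonal HW recursions, the -92 dBm threshold, and the GA operators (truncation selection, blend crossover, Gaussian mutation) are illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch of GA-tuned Holt-Winters duty-cycle forecasting
# on synthetic data. All parameter values below are assumptions.
import random

def holt_winters_forecast(series, alpha, beta, gamma, season_len):
    """Additive Holt-Winters; returns one-step-ahead in-sample forecasts."""
    # Initialize level, trend, and seasonal components from the first
    # two seasons (requires len(series) >= 2 * season_len).
    level = sum(series[:season_len]) / season_len
    trend = (sum(series[season_len:2 * season_len])
             - sum(series[:season_len])) / season_len ** 2
    seasonal = [series[i] - level for i in range(season_len)]
    forecasts = []
    for t, obs in enumerate(series):
        s = seasonal[t % season_len]
        forecasts.append(level + trend + s)          # forecast before updating
        prev_level = level
        level = alpha * (obs - s) + (1 - alpha) * (level + trend)
        trend = beta * (level - prev_level) + (1 - beta) * trend
        seasonal[t % season_len] = gamma * (obs - level) + (1 - gamma) * s
    return forecasts

def mse(series, forecasts):
    return sum((y - f) ** 2 for y, f in zip(series, forecasts)) / len(series)

def ga_tune(series, season_len, pop_size=30, generations=40, sigma=0.1):
    """Toy GA over the three HW discount factors, minimizing forecast MSE."""
    def fitness(ind):
        return mse(series, holt_winters_forecast(series, *ind, season_len))
    pop = [[random.random() for _ in range(3)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        parents = pop[: pop_size // 2]               # truncation selection
        children = []
        while len(parents) + len(children) < pop_size:
            a, b = random.sample(parents, 2)
            child = [(x + y) / 2 for x, y in zip(a, b)]      # blend crossover
            child = [min(1.0, max(0.0, g + random.gauss(0, sigma)))
                     for g in child]                         # Gaussian mutation
            children.append(child)
        pop = parents + children
    return min(pop, key=fitness)

if __name__ == "__main__":
    random.seed(0)
    threshold = -92.0        # assumed per-band decision threshold, in dBm
    season_len, minutes, points = 60, 600, 100
    duty = []
    for t in range(minutes):
        # Synthetic PSD samples for 100 frequency points; a periodic "busy"
        # window raises the power level for the first 20 minutes of each hour.
        boost = 6.0 if (t % season_len) < 20 else 0.0
        occupied = sum(1 for _ in range(points)
                       if -100 + 10 * random.random() + boost > threshold)
        duty.append(occupied / points)               # per-minute duty cycle
    alpha, beta, gamma = ga_tune(duty, season_len)
    fit = mse(duty, holt_winters_forecast(duty, alpha, beta, gamma, season_len))
    print(f"tuned discount factors: {alpha:.3f} {beta:.3f} {gamma:.3f}, "
          f"MSE: {fit:.5f}")
```

    A grid search over the three factors would serve equally well at this scale; the GA simply mirrors the paper's strategy of iteratively adjusting the discount factors to reduce the prediction MSE.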

    Thirty Years of Machine Learning: The Road to Pareto-Optimal Wireless Networks

    Future wireless networks have substantial potential to support a broad range of complex, compelling applications in both military and civilian fields, where users can enjoy high-rate, low-latency, low-cost, and reliable information services. Achieving this ambitious goal requires new radio techniques for adaptive learning and intelligent decision making, owing to the complex, heterogeneous nature of network structures and wireless services. Machine learning (ML) algorithms have achieved great success in supporting big data analytics, efficient parameter estimation, and interactive decision making. Hence, in this article we review the thirty-year history of ML, elaborating on supervised learning, unsupervised learning, reinforcement learning, and deep learning. Furthermore, we investigate their employment in compelling wireless-network applications, including heterogeneous networks (HetNets), cognitive radio (CR), the Internet of Things (IoT), machine-to-machine (M2M) networks, and so on. This article aims to help readers clarify the motivation and methodology of the various ML algorithms, so as to invoke them for hitherto unexplored services and scenarios of future wireless networks.