
    Thirty Years of Machine Learning: The Road to Pareto-Optimal Wireless Networks

    Future wireless networks have substantial potential to support a broad range of complex, compelling applications in both military and civilian fields, where users can enjoy high-rate, low-latency, low-cost and reliable information services. Achieving this ambitious goal requires new radio techniques for adaptive learning and intelligent decision making, owing to the complex, heterogeneous nature of the network structures and wireless services. Machine learning (ML) algorithms have achieved great success in supporting big data analytics, efficient parameter estimation and interactive decision making. Hence, in this article, we review the thirty-year history of ML by elaborating on supervised learning, unsupervised learning, reinforcement learning and deep learning. Furthermore, we investigate their employment in compelling applications of wireless networks, including heterogeneous networks (HetNets), cognitive radio (CR), the Internet of Things (IoT), machine-to-machine (M2M) networks, and so on. This article aims to assist readers in clarifying the motivation and methodology of the various ML algorithms, so as to invoke them for hitherto unexplored services and scenarios of future wireless networks.

    A survey of machine learning techniques applied to self organizing cellular networks

    In this paper, a survey of the literature of the past fifteen years involving Machine Learning (ML) algorithms applied to self-organizing cellular networks is performed. For future networks to overcome the limitations and issues of current cellular systems, it is clear that more intelligence needs to be deployed, so that a fully autonomous and flexible network can be enabled. This paper focuses on the learning perspective of Self Organizing Networks (SON) solutions and provides not only an overview of the most common ML techniques encountered in cellular networks, but also a classification of each paper in terms of its learning solution, together with examples. The authors also classify each paper in terms of its self-organizing use case and discuss how each proposed solution performed. In addition, a comparison of the most commonly found ML algorithms in terms of certain SON metrics is performed, and general guidelines on when to choose each ML algorithm for each SON function are proposed. Lastly, this work provides future research directions and discusses the new paradigms that more robust and intelligent algorithms, together with data gathered by operators, can bring to the cellular networks domain, fully enabling the concept of SON in the near future.

    An Overview on Application of Machine Learning Techniques in Optical Networks

    Today's telecommunication networks have become sources of enormous amounts of widely heterogeneous data. This information can be retrieved from network traffic traces, network alarms, signal quality indicators, users' behavioral data, etc. Advanced mathematical tools are required to extract meaningful information from these data and to make decisions pertaining to the proper functioning of the networks. Among these mathematical tools, Machine Learning (ML) is regarded as one of the most promising methodological approaches for performing network-data analysis and enabling automated network self-configuration and fault management. The adoption of ML techniques in the field of optical communication networks is motivated by the unprecedented growth in network complexity that optical networks have faced in the last few years. This increase in complexity is due to the introduction of a huge number of adjustable and interdependent system parameters (e.g., routing configurations, modulation format, symbol rate, coding schemes, etc.) enabled by the use of coherent transmission/reception technologies, advanced digital signal processing and compensation of nonlinear effects in optical fiber propagation. In this paper, we provide an overview of the application of ML to optical communications and networking. We classify and survey the relevant literature on the topic, and we also provide an introductory tutorial on ML for researchers and practitioners interested in this field. Although a good number of research papers have recently appeared, the application of ML to optical networks is still in its infancy; to stimulate further work in this area, we conclude the paper by proposing possible new research directions.

    Cognitive Radio Systems

    Cognitive radio has been a hot research area for future wireless communications in recent years. To increase spectrum utilization, cognitive radio makes it possible for unlicensed users to access spectrum left unoccupied by licensed users. Cognitive radio makes equipment intelligent enough to communicate in a spectrum-aware manner and provides a new approach to the co-existence of multiple wireless systems. The goal of this book is to highlight current research topics in the field of cognitive radio systems. The book consists of 17 chapters addressing various problems in cognitive radio systems.

    Genetic Algorithm-Holt-Winters Based Minute Spectrum Occupancy Prediction: An Investigation

    In this research, the suitability of a genetic algorithm (GA)-modified Holt-Winters (HW) exponential model for the prediction of spectrum occupancy data was investigated. Firstly, a description is given of spectrum measurements taken over a two-week period at locations (8.511 °N, 4.594 °E) and (8.487 °N, 4.573 °E) in the 900 MHz and 1800 MHz bands. In computing the spectrum duty cycle, different decision thresholds were employed per band link owing to differing noise levels. A frequency point with a power spectral density below the decision threshold was considered unoccupied and assigned a value of 0, while a frequency point with a power spectral density above the decision threshold was considered occupied and assigned a value of 1. Secondly, the spectrum duty cycle was used to evaluate the forecasting behavior of the methods. The HW approach uses exponential smoothing to encode the spectrum data and to forecast typical values in present and future states. The mean square error (MSE) of prediction was minimized using a GA that iteratively adjusts the HW discount factors to improve forecast accuracy. A decrease in MSE of between 8.33% and 44.6% was observed.
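    The pipeline this abstract describes — aggregate thresholded 0/1 occupancy samples into a duty-cycle series, fit a Holt-Winters model, and let a GA tune the smoothing ("discount") factors to minimize one-step-ahead MSE — can be sketched as below. The additive seasonal form, the specific GA operators, and the synthetic duty-cycle series are illustrative assumptions, not the authors' exact implementation.

```python
import math
import random

def holt_winters_one_step(series, alpha, beta, gamma, m):
    """One-step-ahead additive Holt-Winters predictions for series[m:]."""
    level = sum(series[:m]) / m
    trend = (sum(series[m:2 * m]) - sum(series[:m])) / (m * m)
    season = [x - level for x in series[:m]]  # initial seasonal offsets
    preds = []
    for i in range(m, len(series)):
        preds.append(level + trend + season[i % m])
        obs = series[i]
        prev_level = level
        level = alpha * (obs - season[i % m]) + (1 - alpha) * (level + trend)
        trend = beta * (level - prev_level) + (1 - beta) * trend
        season[i % m] = gamma * (obs - level) + (1 - gamma) * season[i % m]
    return preds

def mse(series, params, m):
    preds = holt_winters_one_step(series, *params, m)
    return sum((p - o) ** 2 for p, o in zip(preds, series[m:])) / len(preds)

def ga_tune(series, m, pop_size=20, generations=30, seed=1):
    """Minimize one-step-ahead MSE over the HW factors (alpha, beta, gamma)."""
    rng = random.Random(seed)
    # seed the population with a plain HW baseline plus random candidates
    pop = [[0.5, 0.5, 0.5]] + [[rng.random() for _ in range(3)]
                               for _ in range(pop_size - 1)]
    for _ in range(generations):
        pop.sort(key=lambda p: mse(series, p, m))
        elite = pop[:pop_size // 2]  # elitism: best half survives intact
        children = []
        while len(elite) + len(children) < pop_size:
            pa, pb = rng.sample(elite, 2)
            child = [(x + y) / 2 for x, y in zip(pa, pb)]  # arithmetic crossover
            j = rng.randrange(3)                           # mutate one factor
            child[j] = min(1.0, max(0.0, child[j] + rng.gauss(0, 0.1)))
            children.append(child)
        pop = elite + children
    return min(pop, key=lambda p: mse(series, p, m))

# synthetic per-minute duty cycle with a 60-sample seasonal pattern
rng = random.Random(0)
duty = [min(1.0, max(0.0, 0.5 + 0.3 * math.sin(2 * math.pi * i / 60)
                     + rng.gauss(0, 0.05))) for i in range(600)]
best = ga_tune(duty, m=60)
print("tuned (alpha, beta, gamma):", [round(p, 3) for p in best])
print("baseline MSE:", round(mse(duty, [0.5, 0.5, 0.5], 60), 5))
print("tuned MSE   :", round(mse(duty, best, 60), 5))
```

    Because the untuned baseline is seeded into the initial population and the best half of each generation survives unchanged, the tuned MSE can never be worse than the baseline's — mirroring the abstract's reported MSE reduction, though the magnitude here depends entirely on the synthetic data.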
