
    Cell fault management using machine learning techniques

    This paper surveys the literature relating to the application of machine learning to fault management in cellular networks from an operational perspective. We summarise the main issues as 5G networks evolve and their implications for fault management. We describe the relevant machine learning techniques through to deep learning, and survey the progress made in their application, structured around the building blocks of a typical fault management system. We review recent work to develop the ability of deep learning systems to explain and justify their recommendations to network operators. We discuss forthcoming changes in network architecture which are likely to impact fault management, and offer a vision of how fault management systems can exploit deep learning in the future. We identify a series of research topics for further study in order to achieve this.

    Transform-Based Multiresolution Decomposition for Degradation Detection in Cellular Networks

    Anomaly detection in the performance of the huge number of elements that make up cellular networks (base stations, core entities and user equipment) is one of the most time-consuming and important activities for supporting failure management procedures and ensuring the required performance of telecommunication services. This activity originally relied on direct human inspection of cellular metrics (counters, key performance indicators, etc.). Degradation detection procedures have since evolved towards automatic mechanisms based on statistical analysis and machine learning. However, pre-existing solutions typically rely on the manual definition of the values to be considered abnormal or on large sets of labeled data, which greatly reduces their performance in the presence of long-term trends in the metrics or previously unknown patterns of degradation. In this field, the present work proposes a novel application of transform-based analysis, using the wavelet transform, for the detection and study of network degradations. The proposed system is tested using cell-level metrics obtained from a real-world LTE cellular network, showing its capability to detect and characterize anomalies with different patterns and in the presence of varied temporal trends. This is achieved without the need to manually establish normality thresholds, taking advantage of the wavelet transform's ability to separate the metrics into multiple time-frequency components. Our results show how direct statistical analysis of these components allows anomalies to be detected successfully, beyond the detection capabilities of previous methods.
    Funding: Optimi-Ericsson; Junta de Andalucia; European Union (EU) 59288; Proyecto de Investigacion de Excelencia P12-TIC-2905; project IDADE-5G UMA18-FEDERJA-201; European Union (EU) ICT-76080.
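    The following minimal sketch illustrates the general idea of wavelet-based degradation detection on a cell-level KPI series: the approximation component (trend and seasonality) is discarded and anomalies are flagged on the remaining detail signal with a robust statistic rather than a hand-tuned threshold. It assumes the PyWavelets and NumPy packages; the synthetic KPI, the db4 wavelet and the decomposition level are illustrative choices, not the configuration used in the paper.

        # Hypothetical sketch: wavelet-based detrending and anomaly flagging on a
        # cell-level KPI series. The data, wavelet and thresholds are assumptions.
        import numpy as np
        import pywt

        rng = np.random.default_rng(0)
        t = np.arange(1024)
        # Synthetic KPI: daily-like periodicity + slow trend + noise + injected drops.
        kpi = 50 + 10 * np.sin(2 * np.pi * t / 96) + 0.01 * t + rng.normal(0, 1, t.size)
        kpi[[300, 301, 302, 700]] -= 25  # simulated degradation events

        # Multiresolution decomposition: the approximation carries trend/seasonality,
        # the detail coefficients carry the fast, potentially anomalous fluctuations.
        coeffs = pywt.wavedec(kpi, "db4", level=4)
        coeffs[0] = np.zeros_like(coeffs[0])              # drop the approximation (trend)
        detail_signal = pywt.waverec(coeffs, "db4")[: kpi.size]

        # Robust z-score on the detail signal avoids hand-tuned normality thresholds.
        med = np.median(detail_signal)
        mad = np.median(np.abs(detail_signal - med)) + 1e-9
        robust_z = 0.6745 * (detail_signal - med) / mad
        anomalies = np.where(np.abs(robust_z) > 3.5)[0]
        print("flagged samples:", anomalies)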

    Self-Organizing Networks use cases in commercial deployments

    Since their origins, mobile communication networks have undergone major changes imposed by the need for networks to adapt to user demand. To do this, networks have had to increase in complexity. In turn, complexity has made networks increasingly difficult to design and maintain. To mitigate the impact of network complexity, the concept of Self-Organizing Networks (SON) emerged. Self-organized networks aim to reduce the complexity of designing and maintaining mobile communication networks by automating processes. Three major blocks in the automation of networks are identified: self-configuration, self-optimization and self-healing. This thesis contributes to the state of the art of self-organized networks through the identification and subsequent resolution of a problem in each of these three blocks. With the advent of 5G networks and the speeds they promise to deliver to users, new use cases have emerged. One of these use cases is known as Fixed Wireless Access, in which the last mile of fiber is replaced by broadband radio access based on mobile technologies. Until now, regarding self-configuration, greenfield design methodologies for wireless networks based on mobile communication technologies have assumed that users are mobile. However, in Fixed Wireless Access networks, the users' antennas are in fixed locations. Therefore, this thesis proposes a novel methodology for finding the optimal locations where to deploy network equipment, as well as the configuration of their radio parameters, in Fixed Wireless Access networks. Regarding self-optimization, current algorithms make use of signal maps of the cells in the network so that the changes those maps would experience after modifying any network parameter can be estimated. To obtain these maps, operators use predictive models calibrated with real network measurements. These measurements can be obtained from different sources, but those sources are either expensive or not applicable to every network. To solve this problem, this thesis proposes a method that uses information available in any network, so that the calibration of predictive maps becomes universal without losing accuracy with respect to current methods. Furthermore, the complexity of today's networks makes them prone to failure. To save costs, operators employ network self-healing techniques so that networks are able to self-diagnose and even self-fix when possible. Among the various failures that can occur in mobile communication networks, a common case is the existence of sectors whose radiated signals have been exchanged. This issue appears during the network roll-out, when engineers accidentally cross the feeders of several antennas. Currently, a manual methodology is used to identify this problem; therefore, this thesis presents an automatic system to detect these cases. Finally, special attention has been paid to the computational efficiency of the algorithms developed in this thesis, since they have been integrated into commercial tools.
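    As a toy illustration of the kind of calibration mentioned above (not the thesis' actual procedure), the sketch below fits a simple log-distance path-loss model to measurement samples with an ordinary least-squares fit; the synthetic measurements and the model constants are assumptions for demonstration only.

        # Illustrative sketch: calibrating a log-distance path-loss model against
        # drive-test-style measurements, the kind of fit that underlies predictive
        # signal maps. NumPy only; the measurement arrays are synthetic placeholders.
        import numpy as np

        rng = np.random.default_rng(1)
        d = rng.uniform(50, 2000, 500)     # distances to the serving cell [m]
        measured_pl = 128.1 + 37.6 * np.log10(d / 1000) + rng.normal(0, 6, d.size)  # path loss [dB]

        # Least-squares fit of PL(d) = A + B * log10(d / 1 km) to the measurements.
        X = np.column_stack([np.ones_like(d), np.log10(d / 1000)])
        (A, B), *_ = np.linalg.lstsq(X, measured_pl, rcond=None)
        print(f"calibrated model: PL(d) = {A:.1f} + {B:.1f} log10(d/km) dB")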

    Investigating the Effects of Network Dynamics on Quality of Delivery Prediction and Monitoring for Video Delivery Networks

    Video streaming over the Internet requires an optimized delivery system given the advances in network architecture, for example, Software Defined Networks. Machine Learning (ML) models have been deployed in an attempt to predict the quality of video streams. Some of these efforts have considered the prediction of Quality of Delivery (QoD) metrics of the video stream, in an effort to measure the quality of the video stream from the network perspective. In most cases, these models have either treated the ML algorithms as black boxes or failed to capture the network dynamics of the associated video streams. This PhD investigates the effects of network dynamics on QoD prediction using ML techniques. The hypothesis investigated in this thesis is that ML techniques that model the underlying network dynamics achieve accurate QoD and video quality predictions and measurements. The results demonstrate that the proposed techniques offer performance gains over approaches that fail to consider network dynamics, and highlight that adopting the correct model, by modelling the dynamics of the network infrastructure, is crucial to the accuracy of the ML predictions. These results are significant as they demonstrate that the improved performance is achieved at no additional computational or storage cost. The techniques can help network managers, data center operators and video service providers take proactive and corrective actions for improved network efficiency and effectiveness.
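    The sketch below illustrates the central claim in miniature: a regressor trained with simple lagged (temporal) throughput features, standing in for the network dynamics, versus the same regressor trained only on instantaneous features. The synthetic trace, the feature set and the scikit-learn model are assumptions for illustration, not the thesis' datasets or techniques.

        # Minimal sketch of the idea that modelling network dynamics (here, lagged
        # throughput features) can improve QoD prediction over a purely static model.
        import numpy as np
        from sklearn.ensemble import RandomForestRegressor
        from sklearn.metrics import mean_absolute_error

        rng = np.random.default_rng(2)
        n = 2000
        throughput = 20 + 5 * np.sin(np.arange(n) / 50) + rng.normal(0, 1, n)   # Mbit/s
        rtt = 40 + 10 * np.cos(np.arange(n) / 80) + rng.normal(0, 2, n)         # ms
        # Toy QoD target that depends on recent history, not just the current sample.
        qod = 0.6 * throughput - 0.2 * rtt + 0.3 * np.roll(throughput, 3) + rng.normal(0, 0.5, n)

        def build(lags):
            # Feature matrix: current throughput/RTT plus `lags` past throughput samples.
            cols = [throughput, rtt] + [np.roll(throughput, k) for k in range(1, lags + 1)]
            X = np.column_stack(cols)[lags:]
            return X, qod[lags:]

        for lags in (0, 5):                      # 0 = static features, 5 = with dynamics
            X, y = build(lags)
            split = int(0.8 * len(y))
            model = RandomForestRegressor(n_estimators=100, random_state=0)
            model.fit(X[:split], y[:split])
            err = mean_absolute_error(y[split:], model.predict(X[split:]))
            print(f"lags={lags}: test MAE = {err:.2f}")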

    Optimization of Mobility Parameters using Fuzzy Logic and Reinforcement Learning in Self-Organizing Networks

    In this thesis, several optimization techniques for next-generation wireless networks are proposed to solve different problems in the field of Self-Organizing Networks and heterogeneous networks. The common basis of these problems is that network parameters are automatically tuned to deal with the specific problem. As the set of network parameters is extremely large, this work mainly focuses on parameters involved in mobility management. The proposed self-tuning schemes are based on Fuzzy Logic Controllers (FLC), whose potential lies in their capability to express knowledge in a way similar to human perception and reasoning. In those cases in which a mathematical approach has been required to optimize the behavior of the FLC, the selected solution has been Reinforcement Learning, since this methodology is especially appropriate for learning from interaction, which becomes essential in complex systems such as wireless networks. Taking this into account, firstly, a new Mobility Load Balancing (MLB) scheme is proposed to solve persistent congestion problems in next-generation wireless networks, in particular those due to an uneven spatial traffic distribution, which typically leads to an inefficient usage of resources. A key feature of the proposed algorithm is that not only the parameters but also the parameter tuning strategy are optimized. Secondly, a novel MLB algorithm for enterprise femtocell scenarios is proposed. Such scenarios are characterized by the lack of a thorough deployment of these low-cost nodes, meaning that a more efficient use of radio resources can be achieved by applying effective MLB schemes. As in the previous problem, the optimization of the self-tuning process is also studied in this case. Thirdly, a new self-tuning algorithm for Mobility Robustness Optimization (MRO) is proposed. This study includes the impact of context factors such as system load and user speed, as well as a proposal for coordination between the designed MLB and MRO functions. Fourthly, a novel self-tuning algorithm for Traffic Steering (TS) in heterogeneous networks is proposed. The main features of the proposed algorithm are its flexibility to support different operator policies and its capability to adapt to network variations. Finally, with the aim of validating the proposed techniques, a dynamic system-level simulator for Long-Term Evolution (LTE) networks has been designed.
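    The toy controller below sketches the flavour of an FLC-based Mobility Load Balancing loop: the load imbalance between two neighbouring cells is fuzzified with triangular membership functions and defuzzified into a handover-margin step. The membership functions, rule base and step sizes are illustrative assumptions and do not reproduce the thesis' design or its Reinforcement Learning optimization.

        # Hedged sketch of an FLC-style Mobility Load Balancing step. All constants
        # (membership functions, rule consequents, step sizes) are assumptions.
        import numpy as np

        def tri(x, a, b, c):
            """Triangular membership function with support [a, c] and peak at b."""
            return np.maximum(np.minimum((x - a) / (b - a + 1e-9), (c - x) / (c - b + 1e-9)), 0.0)

        def fuzzy_margin_step(load_diff):
            """Map the load difference (source - target, in %) to a handover-margin step [dB]."""
            # Fuzzification of the input.
            neg  = tri(load_diff, -100, -50, 0)    # target cell more loaded
            zero = tri(load_diff, -30, 0, 30)      # roughly balanced
            pos  = tri(load_diff, 0, 50, 100)      # source cell more loaded
            # Rule consequents (crisp margin steps, dB): lower the margin towards the
            # neighbour when the source is congested so its users hand over earlier,
            # raise it when the neighbour is the congested one.
            steps = np.array([-1.0, 0.0, +1.0])
            weights = np.array([pos, zero, neg])
            return float(np.dot(weights, steps) / (weights.sum() + 1e-9))  # weighted defuzzification

        for diff in (-60, -10, 0, 20, 80):
            print(f"load difference {diff:+d}% -> margin step {fuzzy_margin_step(diff):+.2f} dB")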