    A survey of machine learning techniques applied to self organizing cellular networks

    In this paper, a survey of the literature of the past fifteen years on Machine Learning (ML) algorithms applied to self-organizing cellular networks is presented. For future networks to overcome current limitations and address the issues of today's cellular systems, it is clear that more intelligence needs to be deployed, so that a fully autonomous and flexible network can be enabled. This paper focuses on the learning perspective of Self Organizing Network (SON) solutions and provides not only an overview of the most common ML techniques encountered in cellular networks but also a classification of each paper in terms of its learning solution, with examples along the way. The authors also classify each paper in terms of its self-organizing use case and discuss how each proposed solution performed. In addition, the most commonly found ML algorithms are compared in terms of certain SON metrics, and general guidelines on when to choose each ML algorithm for each SON function are proposed. Lastly, this work also outlines future research directions and the new paradigms that more robust and intelligent algorithms, together with data gathered by operators, can bring to the cellular networks domain, fully enabling the concept of SON in the near future.

    Thirty Years of Machine Learning: The Road to Pareto-Optimal Wireless Networks

    Future wireless networks have substantial potential for supporting a broad range of complex and compelling applications in both military and civilian fields, where users can enjoy high-rate, low-latency, low-cost and reliable information services. Achieving this ambitious goal requires new radio techniques for adaptive learning and intelligent decision making, owing to the complex, heterogeneous nature of network structures and wireless services. Machine learning (ML) algorithms have had great success in supporting big data analytics, efficient parameter estimation and interactive decision making. Hence, in this article, we review the thirty-year history of ML by elaborating on supervised learning, unsupervised learning, reinforcement learning and deep learning. Furthermore, we investigate their employment in compelling applications of wireless networks, including heterogeneous networks (HetNets), cognitive radio (CR), the Internet of Things (IoT), machine-to-machine (M2M) networks, and so on. This article aims to assist readers in clarifying the motivation and methodology of the various ML algorithms, so as to invoke them for hitherto unexplored services and scenarios of future wireless networks. (46 pages, 22 figures)

    Unsupervised clustering for 5G network planning assisted by real data

    The fifth generation (5G) of networks is being deployed to provide a wide range of new services and to manage the accelerated traffic load of existing networks. In present-day networks, data has become more noteworthy than ever for inferring the traffic load and existing network infrastructure so as to minimize the cost of new 5G deployments. Identifying the region of highest traffic density in megabytes (MB) per km2 has an important implication in minimizing the cost per bit for mobile network operators (MNOs). In this study, we propose a base station (BS) clustering framework based on unsupervised learning to identify the target area, known as the highest traffic cluster (HTC), for 5G deployments. We propose a novel approach assisted by real data to determine the appropriate number of clusters k and to identify the HTC. The algorithm, named NetClustering, determines the HTC and the appropriate value of k by fulfilling the MNO's requirements on the highest traffic density in MB/km2 and the target deployment area in km2. To compare the appropriate value of k and other performance parameters, we use the Elbow heuristic as a benchmark. The simulation results show that the proposed algorithm fulfills the MNO's requirements on the target deployment area in km2 and the highest traffic density in MB/km2 with significant cost savings, and achieves higher network utilization compared to the Elbow heuristic. In brief, the proposed algorithm provides a more meaningful interpretation of the underlying data in the context of clustering performed for network planning. This work was supported by the Spanish National Project IRENE-EARTH (PID2020-115323RB-C33/AEI/10.13039/501100011033).
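    The abstract does not spell out NetClustering's internals, so the following is only a minimal, hypothetical sketch of the general idea it describes: cluster base stations with plain k-means and grow k until the highest-traffic cluster satisfies the operator's density (MB/km2) and area (km2) requirements, instead of picking k with the Elbow heuristic. The function names (`find_htc`, `bbox_area_km2`), the farthest-point initialization, and the bounding-box area approximation are illustrative assumptions, not the paper's actual method.

```python
import random

def kmeans(points, k, iters=50):
    """Plain k-means on (x, y) coordinates of base stations.
    Uses deterministic farthest-point initialization, which is
    robust when clusters are well separated."""
    centroids = [points[0][:2]]
    while len(centroids) < k:
        far = max(points, key=lambda p: min((p[0] - cx) ** 2 + (p[1] - cy) ** 2
                                            for cx, cy in centroids))
        centroids.append(far[:2])
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            j = min(range(k), key=lambda i: (p[0] - centroids[i][0]) ** 2 +
                                            (p[1] - centroids[i][1]) ** 2)
            clusters[j].append(p)
        centroids = [(sum(p[0] for p in c) / len(c), sum(p[1] for p in c) / len(c))
                     if c else centroids[i] for i, c in enumerate(clusters)]
    return clusters

def bbox_area_km2(cluster):
    """Crude cluster-area proxy: axis-aligned bounding box (km^2)."""
    xs = [p[0] for p in cluster]
    ys = [p[1] for p in cluster]
    return max(max(xs) - min(xs), 1e-6) * max(max(ys) - min(ys), 1e-6)

def find_htc(points, density_req_mb_km2, area_req_km2, k_max=8):
    """Increase k until the highest-traffic cluster (HTC) meets the
    operator's density and area targets; points are (x_km, y_km, traffic_mb).
    Returns (k, htc, density, area) or None if no k in range qualifies."""
    for k in range(2, k_max + 1):
        clusters = kmeans(points, k)
        htc = max(clusters, key=lambda c: sum(p[2] for p in c))
        area = bbox_area_km2(htc)
        density = sum(p[2] for p in htc) / area
        if density >= density_req_mb_km2 and area <= area_req_km2:
            return k, htc, density, area
    return None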