
    A survey of machine learning techniques applied to self-organizing cellular networks

    In this paper, a survey of the literature of the past fifteen years on Machine Learning (ML) algorithms applied to self-organizing cellular networks is presented. For future networks to overcome the limitations of current cellular systems, it is clear that more intelligence needs to be deployed, so that a fully autonomous and flexible network can be enabled. This paper focuses on the learning perspective of Self Organizing Network (SON) solutions and provides not only an overview of the most common ML techniques encountered in cellular networks, but also a classification of each paper in terms of its learning solution, with illustrative examples. The authors also classify each paper in terms of its self-organizing use case and discuss how each proposed solution performed. In addition, the most commonly found ML algorithms are compared in terms of relevant SON metrics, and general guidelines are proposed on when to choose each ML algorithm for each SON function. Lastly, this work outlines future research directions and the new paradigms that more robust and intelligent algorithms, together with data gathered by operators, can bring to the cellular networks domain, fully enabling the concept of SON in the near future.

    Towards Massive Machine Type Communications in Ultra-Dense Cellular IoT Networks: Current Issues and Machine Learning-Assisted Solutions

    The ever-increasing number of resource-constrained Machine-Type Communication (MTC) devices is leading to the critical challenge of fulfilling diverse communication requirements in dynamic and ultra-dense wireless environments. Among the application scenarios that the upcoming 5G and beyond cellular networks are expected to support, such as eMBB, mMTC and URLLC, mMTC brings the unique technical challenge of supporting a huge number of MTC devices, which is the main focus of this paper. The related challenges include QoS provisioning, handling highly dynamic and sporadic MTC traffic, huge signalling overhead and Radio Access Network (RAN) congestion. In this regard, this paper aims to identify and analyze the involved technical issues, to review recent advances, to highlight potential solutions and to propose new research directions. First, starting with an overview of mMTC features and QoS provisioning issues, we present the key enablers for mMTC in cellular networks. Along with the highlights on the inefficiency of the legacy Random Access (RA) procedure in the mMTC scenario, we then present the key features and channel access mechanisms in the emerging cellular IoT standards, namely LTE-M and NB-IoT. Subsequently, we present a framework for the performance analysis of transmission scheduling with QoS support, along with the issues involved in short data packet transmission. Next, we provide a detailed overview of existing and emerging solutions towards addressing the RAN congestion problem, and then identify potential advantages, challenges and use cases for the application of emerging Machine Learning (ML) techniques in ultra-dense cellular networks. Out of several ML techniques, we focus on the application of a low-complexity Q-learning approach in mMTC scenarios.
    Finally, we discuss some open research challenges and promising future research directions.
    Comment: 37 pages, 8 figures, 7 tables; submitted for possible publication in IEEE Communications Surveys and Tutorials.
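    The low-complexity Q-learning approach highlighted in this abstract can be illustrated with a toy sketch (our own assumption, not the paper's exact formulation): each MTC device treats the choice of random-access slot as a stateless bandit and uses epsilon-greedy Q-learning to settle on a slot where it rarely collides with other devices. Device counts, rewards and hyper-parameters (`alpha`, `eps`) are illustrative only.

    ```python
    import random

    def q_learning_ra(num_devices=6, num_slots=6, episodes=2000,
                      alpha=0.1, eps=0.1, seed=0):
        """Toy Q-learning for slotted random access: each device learns
        which slot to transmit in so that collisions become rare."""
        rng = random.Random(seed)
        # One Q-row per device, one value per candidate slot (stateless form).
        Q = [[0.0] * num_slots for _ in range(num_devices)]
        for _ in range(episodes):
            # Epsilon-greedy slot choice for every device in this frame.
            choices = []
            for d in range(num_devices):
                if rng.random() < eps:
                    choices.append(rng.randrange(num_slots))
                else:
                    choices.append(max(range(num_slots), key=lambda s: Q[d][s]))
            # Reward: +1 if the device was alone in its slot, -1 on collision.
            for d, s in enumerate(choices):
                reward = 1.0 if choices.count(s) == 1 else -1.0
                Q[d][s] += alpha * (reward - Q[d][s])
        return Q

    Q = q_learning_ra()
    # Greedy slot per device after learning; ideally few or no collisions.
    greedy = [row.index(max(row)) for row in Q]
    ```

    With matched device and slot counts the devices tend to spread out over distinct slots, which is the intuition behind using Q-learning to relieve RAN congestion; real schemes operate on preambles and must handle far more devices than slots.
    
    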

    Mobility management in 5G heterogeneous networks

    In recent years, mobile data traffic has increased exponentially as a result of the widespread popularity and uptake of portable devices, such as smartphones, tablets and laptops. This growth has placed enormous stress on network service providers, who are committed to offering the best quality of service to consumer groups. Consequently, telecommunication engineers are investigating innovative solutions to accommodate the additional load generated by growing numbers of mobile users. The fifth generation (5G) wireless communication standard is expected to provide numerous innovative solutions to meet this growing demand. Accordingly, the ultimate goal is to achieve several key technological milestones, including up to 1000 times higher wireless area capacity and a significant cut in power consumption. Massive deployment of small cells is likely to be a key innovation in 5G, enabling denser frequency reuse and higher data rates. Small cells, however, present a major challenge for nodes moving at vehicular speeds: their smaller coverage areas result in frequent handovers, which lead to lower throughput and longer delays. In this thesis, a new mobility management technique is introduced that reduces the number of handovers in a 5G heterogeneous network. This research also investigates techniques to accommodate low-latency applications for nodes moving at vehicular speeds.
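    One common family of techniques for reducing handovers in this setting is handover skipping. As a hedged illustration (the thesis's actual algorithm may differ), the sketch below skips a small cell when the expected dwell time, estimated from the chord length through the cell's coverage circle and the node's speed, falls below a threshold. The geometry, the `min_dwell_s` threshold, and all parameter names are our assumptions.

    ```python
    import math

    def should_handover(cell_radius_m, chord_offset_m, speed_mps,
                        min_dwell_s=2.0):
        """Decide whether to hand over to a small cell.

        The node crosses the cell's circular coverage area along a chord
        whose perpendicular distance from the centre is chord_offset_m.
        If the resulting dwell time (chord length / speed) is too short,
        the handover is skipped to avoid ping-pong signalling.
        Returns (handover_decision, expected_dwell_seconds).
        """
        half_chord = math.sqrt(max(cell_radius_m**2 - chord_offset_m**2, 0.0))
        dwell = 2.0 * half_chord / speed_mps
        return dwell >= min_dwell_s, dwell

    # A 50 m small cell crossed near its centre at 30 m/s (~108 km/h):
    ok, dwell = should_handover(50.0, 10.0, 30.0)       # handover worthwhile
    # The same cell clipped near its edge: dwell time too short, skip it.
    skip, short_dwell = should_handover(50.0, 45.0, 30.0)
    ```

    The trade-off is that a skipped cell's capacity goes unused, so practical schemes weigh dwell time against load and link quality rather than using a fixed threshold alone.
    
    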

    Latency reduction by dynamic channel estimator selection in C-RAN networks using fuzzy logic

    Due to a dramatic increase in the number of mobile users, operators are forced to expand their networks accordingly. Cloud Radio Access Network (C-RAN) was introduced to tackle the problems of the current generation of mobile networks and to support future 5G networks. However, the centralised structure of C-RAN raises many challenges. Acquiring accurate channel state information in the C-RAN for large numbers of remote radio heads and user equipment is one of the main challenges in this architecture. To minimise the time required to acquire channel information in C-RAN and to reduce end-to-end latency, this paper proposes a dynamic channel estimator selection algorithm. The idea is to assign different channel estimation algorithms to the users of the mobile network based on their link status (in particular, an SNR threshold). For automatic and adaptive selection of channel estimators, a fuzzy logic algorithm is employed as a decision maker that selects the best SNR threshold by utilising bit error rate measurements. The results demonstrate a reduction in estimation time with little loss in data throughput. It is also observed that the gain of the proposed algorithm increases at high SNR values.
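    A minimal sketch of the fuzzy decision step described above, assuming (our assumption, not stated in the abstract) that the choice is between a cheap least-squares (LS) estimator and a costlier MMSE estimator: triangular membership functions fuzzify the SNR and BER measurements, and a Mamdani-style min rule picks the estimator. All membership-function breakpoints are illustrative.

    ```python
    def tri(x, a, b, c):
        """Triangular membership function: 0 outside (a, c), peak 1 at b."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x < b else (c - x) / (c - b)

    def select_estimator(snr_db, ber):
        """Fuzzy decision: prefer the cheap LS estimator when the link is
        good (high SNR, low BER), the costlier MMSE estimator otherwise."""
        good_snr = tri(snr_db, 5.0, 20.0, 35.0)
        low_ber = tri(ber, -1e-3, 0.0, 1e-2)   # membership peaks at BER ~ 0
        # Mamdani-style min gives the strength of the "link is good" rule.
        use_ls = min(good_snr, low_ber)
        use_mmse = 1.0 - use_ls
        return "LS" if use_ls >= use_mmse else "MMSE"

    cheap = select_estimator(25.0, 1e-4)   # good link -> low-cost estimator
    robust = select_estimator(2.0, 5e-2)   # poor link -> accurate estimator
    ```

    In the paper's setting the fuzzy stage tunes the SNR threshold itself from BER feedback; this sketch only shows the flavour of a fuzzified estimator choice per user.
    
    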