
    A cell outage management framework for dense heterogeneous networks

    In this paper, we present a novel cell outage management (COM) framework for heterogeneous networks with split control and data planes, a candidate architecture for meeting future capacity, quality-of-service, and energy efficiency demands. In such an architecture, the control and data functionalities are not necessarily handled by the same node. The control base stations (BSs) manage the transmission of control information and user equipment (UE) mobility, whereas the data BSs handle UE data. An implication of this split architecture is that an outage to a BS in one plane has to be compensated for by other BSs in the same plane. Our COM framework addresses this challenge by incorporating two distinct cell outage detection (COD) algorithms that cope with the idiosyncrasies of the data and control planes. The COD algorithm for control cells leverages the relatively large number of UEs in the control cell to gather large-scale minimization-of-drive-test report data and detects an outage by applying machine learning and anomaly detection techniques. To improve outage detection accuracy, we also investigate and compare the performance of two anomaly detectors within the control COD, one based on k-nearest neighbors and one on the local outlier factor. For data cell COD, on the other hand, we propose a heuristic Grey-prediction-based approach that can work with the small number of UEs in a data cell, exploiting the fact that the control BS manages UE-data BS connectivity and receives periodic updates of the Reference Signal Received Power (RSRP) statistics between the UEs and the data BSs in its coverage. The detection accuracy of the heuristic data COD algorithm is further improved by exploiting the Fourier series of the residual error that is inherent to a Grey prediction model. Our COM framework integrates these two COD algorithms with a cell outage compensation (COC) algorithm that can be applied to both planes. Our COC solution utilizes an actor-critic reinforcement learning algorithm, which optimizes the capacity and coverage of the identified outage zone in a plane by adjusting the antenna gain and transmission power of the surrounding BSs in that plane. The simulation results show that the proposed framework can detect both data and control cell outages and compensate for them reliably.
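
    To illustrate the Grey-prediction step, the sketch below fits a GM(1,1) model to a short RSRP series and forecasts the next report; a sustained gap between forecast and measurement would then flag a candidate outage. This is a minimal sketch of the standard GM(1,1) recursion under assumed data, not the authors' implementation, and the Fourier residual correction is omitted.

        import numpy as np

        def gm11_forecast(x0, steps=1):
            """Fit a GM(1,1) Grey model to a positive series x0 and forecast `steps` ahead."""
            x0 = np.asarray(x0, dtype=float)
            n = len(x0)
            x1 = np.cumsum(x0)                       # accumulated generating operation (AGO)
            z1 = 0.5 * (x1[1:] + x1[:-1])            # background values z1(k)
            # least-squares fit of the whitened equation x0(k) = -a*z1(k) + b, k = 2..n
            B = np.column_stack([-z1, np.ones(n - 1)])
            a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]
            # time-response function, then inverse AGO to recover point forecasts
            k = np.arange(n, n + steps)
            x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a
            x1_prev = (x0[0] - b / a) * np.exp(-a * (k - 1)) + b / a
            return x1_hat - x1_prev

        # hypothetical RSRP reports (dBm), shifted positive since GM(1,1) needs a positive series
        rsrp = [-92.1, -91.8, -92.6, -93.0, -92.4, -93.1]
        print(gm11_forecast([v + 120.0 for v in rsrp])[0] - 120.0)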

    A survey of self organisation in future cellular networks

    This article surveys the literature of the last decade on the emerging field of self organisation as applied to wireless cellular communication networks. Self organisation has been extensively studied and applied in ad hoc networks, wireless sensor networks and autonomic computer networks; in the context of wireless cellular networks, however, this is the first attempt to put the various efforts in perspective in the form of a tutorial/survey. We provide a comprehensive survey of the existing literature, projects and standards in self organising cellular networks. Additionally, we aim to present a clear understanding of this active research area, identifying a clear taxonomy and guidelines for the design of self organising mechanisms. We compare the strengths and weaknesses of existing solutions and highlight the key research areas for further development. This paper serves as a guide and a starting point for anyone willing to delve into research on self organisation in wireless cellular communication networks.

    Distributed Cognitive RAT Selection in 5G Heterogeneous Networks: A Machine Learning Approach

    The leading role of the HetNet (Heterogeneous Networks) strategy as the key Radio Access Network (RAN) architecture for future 5G networks poses serious challenges to the cell selection mechanisms used in current cellular networks. The max-SINR algorithm, although historically effective at performing this most essential networking function of wireless networks, is inefficient at best and obsolete at worst in 5G HetNets. The foreseen embarrassment of riches and the diversified propagation characteristics of network attachment points spanning multiple Radio Access Technologies (RATs) require novel and creative context-aware system designs. The association and routing decisions, in the context of single-RAT or multi-RAT connections, need to be optimized to efficiently exploit the benefits of the architecture. However, the high computational complexity of multi-parametric optimization of utility functions, the difficulty of modeling and solving Markov Decision Processes, the lack of stability guarantees for Game Theory algorithms, and the rigidity of simpler methods such as Cell Range Expansion and operator policies managed by the Access Network Discovery and Selection Function (ANDSF) make none of these state-of-the-art approaches a clear favorite. This Thesis proposes a framework that relies on Machine Learning techniques at the terminal device level for Cognitive RAT Selection. The use of cognition allows the terminal device to learn both a multi-parametric state model and effective decision policies, based on the experience of the device itself. This implies that a terminal, after observing its environment during a learning period, may formulate a system characterization and optimize its own association decisions without any external intervention. In our proposal, this is achieved through clustering of appropriately defined feature vectors to build a system state model, supervised classification to identify the current system state, and reinforcement learning to learn good policies. This Thesis describes the above framework in detail and recommends adaptations based on experimentation with the X-means, k-Nearest Neighbors, and Q-learning algorithms, the building blocks of the solution. The network performance of the proposed framework is evaluated in a multi-agent environment implemented in MATLAB, where it is compared with alternative RAT selection mechanisms.
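
    To make the reinforcement-learning building block concrete, the sketch below shows tabular Q-learning with epsilon-greedy exploration over candidate RATs. The state count, reward definition, and episode dynamics are illustrative assumptions; in the thesis, states come from X-means clustering of measured feature vectors and rewards from observed QoS.

        import numpy as np

        rng = np.random.default_rng(0)
        n_states, n_rats = 8, 3                  # assumed: 8 clustered context states, 3 RATs
        alpha, gamma, eps = 0.1, 0.9, 0.1
        Q = np.zeros((n_states, n_rats))

        def select_rat(state):
            """Epsilon-greedy action selection over the Q-table row for this state."""
            if rng.random() < eps:
                return int(rng.integers(n_rats))
            return int(Q[state].argmax())

        def update(state, rat, reward, next_state):
            """Standard one-step Q-learning update."""
            td_target = reward + gamma * Q[next_state].max()
            Q[state, rat] += alpha * (td_target - Q[state, rat])

        # toy episode: random walk over states with a made-up reward favouring RAT 2
        s = 0
        for _ in range(1000):
            a = select_rat(s)
            r = 1.0 if a == 2 else 0.0           # placeholder reward, not a real QoS metric
            s_next = int(rng.integers(n_states))
            update(s, a, r, s_next)
            s = s_next
        print(Q.argmax(axis=1))                  # learned best RAT per state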

    Performance Enhancing of Heterogeneous Network through Optimisation and Machine Learning Techniques

    In the last two decades, thanks to advances in wireless technology, growing data services have caused explosive traffic demand, bringing many new challenges to network operators. To match the growing traffic demand, operators must deploy new base stations to increase the total cellular network capacity. In particular, a new type of low-power base station is now frequently deployed within the network, providing extra access points to subscribers. However, even though these new base stations operate at low power, total network energy consumption still grows in proportion to the number of base stations, and this considerable energy consumption is becoming one of the main issues for network operators. Reducing network energy consumption is therefore crucial, especially in 5G, where multiple antennas are deployed at each site. At the same time, base stations cannot always be operated at low power, because doing so degrades network performance; power can only be reduced in light-traffic periods. Balancing traffic demand against energy consumption is therefore the main direction of investigation in this thesis, which studies how to link the operating power of base stations to the current traffic demand. Algorithms and optimisation techniques are used to reduce network energy consumption and improve network performance.

    To reduce energy consumption in light-traffic periods, a base station switch-off strategy is proposed in the first chapter. Network performance must be carefully estimated before the switch-off strategy is applied. The underlying energy efficiency optimisation problem is NP-hard, and the chapter shows that some base stations can be grouped together, owing to the limited interference from other pico cells, which reduces the complexity of the optimisation problem. Simulated annealing is then used to find the combination of active base stations that yields optimal energy efficiency, obtaining the optimal pico cell combination without sacrificing overall network throughput. The simulation results show that the switch-off strategy not only reduces energy consumption but also achieves a significant energy efficiency improvement, averaging 17.06% over thirty simulations.

    The second chapter tackles the issue of how to raise the power of base stations after they have been switched off: these base stations must return to their regular power level to prepare for incoming traffic. Owing to the uneven traffic distribution, however, not all base stations need to return to normal power. By analysing the information in collected subscriber data, such as moving speed, direction, downlink usage and time, a naive Bayes classifier is used to learn user movement patterns and predict the future traffic distribution, so that the system knows which base stations will become users' destinations. Load-adaptive power control then instructs the corresponding base stations to increase their transmission power, so they can prepare for the incoming traffic and avoid performance degradation. The simulation results show that the machine learning approach accurately predicts subscriber destinations, achieving an average accuracy of 90.8% over thirty simulations. After the load-adaptive function is applied, network energy is saved without damaging network performance: the average energy efficiency improvement across three scenarios is 4.3%. This significant improvement demonstrates that the proposed machine learning and load-adaptive power modification method helps the network reduce its energy consumption.

    The last chapter uses cell range expansion to tackle the resource issue at cooperative base stations in joint transmission, improving downlink performance and addressing the cell-edge problem. The uneven traffic distribution causes insufficient resources at cooperative base stations in joint transmission, and system throughput suffers if a cooperative base station executes joint transmission under high load. Cell range expansion is therefore used to balance traffic between base station tiers, and a water-filling algorithm is used to handle resource distribution during traffic offloading. The simulations show that this NP-hard problem can be solved efficiently by the water-filling algorithm, with downlink throughput gains of 26% in the M-P scenario and 24% in the P-M scenario. These results show that the proposed method provides a significant gain to subscribers without losing any total network throughput.
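
    As a concrete illustration of the first chapter's search step, the sketch below applies simulated annealing to a binary on/off vector of pico cells. The objective function passed in as `score` is a placeholder assumption; the thesis computes energy efficiency from throughput and power models not reproduced here.

        import math, random

        def simulated_annealing(score, n_bs, iters=5000, T0=1.0, cooling=0.999):
            """Search over on/off vectors for n_bs pico cells, maximising `score`."""
            state = [True] * n_bs                      # start with all pico cells on
            best, best_score = state[:], score(state)
            cur_score, T = best_score, T0
            for _ in range(iters):
                cand = state[:]
                cand[random.randrange(n_bs)] = not cand[random.randrange(n_bs)]  # flip one BS
                s = score(cand)
                # accept improvements always; accept worse moves with a probability
                # that shrinks as the temperature T cools, to escape local optima
                if s >= cur_score or random.random() < math.exp((s - cur_score) / T):
                    state, cur_score = cand, s
                    if s > best_score:
                        best, best_score = cand[:], s
                T *= cooling
            return best, best_score

        # toy objective for demonstration: prefer roughly half the pico cells switched off
        demo_score = lambda st: -abs(sum(st) - len(st) // 2)
        print(simulated_annealing(demo_score, n_bs=10))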

    A Comprehensive Survey of the Tactile Internet: State of the art and Research Directions

    The Internet has made several giant leaps over the years, from a fixed to a mobile Internet, then to the Internet of Things, and now to a Tactile Internet. The Tactile Internet goes far beyond data, audio and video delivery over fixed and mobile networks, and even beyond allowing communication and collaboration among things. It is expected to enable haptic communication and allow skill set delivery over networks. Some examples of potential applications are tele-surgery, vehicle fleets, augmented reality and industrial process automation. Several papers already cover many of the Tactile Internet-related concepts and technologies, such as haptic codecs, applications, and supporting technologies. However, none of them offers a comprehensive survey of the Tactile Internet, including its architectures and algorithms, and none provides a systematic and critical review of the existing solutions. To address these lacunae, we provide a comprehensive survey of the architectures and algorithms proposed to date for the Tactile Internet. In addition, we critically review them using a well-defined set of requirements and discuss some of the lessons learned as well as the most promising research directions.

    D13.2 Techniques and performance analysis on energy- and bandwidth-efficient communications and networking

    Deliverable D13.2 of the European project NEWCOM#. This report presents the status of the research work of the various Joint Research Activities (JRAs) in WP1.3 and the results developed up to the second year of the project. For each activity there is a description, an illustration of its adherence and relevance to the identified fundamental open issues, a short presentation of the main results, and a roadmap for the future joint research. In the Annex, the main technical details of the specific scientific activities of each JRA are described.

    Towards Massive Machine Type Communications in Ultra-Dense Cellular IoT Networks: Current Issues and Machine Learning-Assisted Solutions

    The ever-increasing number of resource-constrained Machine-Type Communication (MTC) devices is leading to the critical challenge of fulfilling diverse communication requirements in dynamic and ultra-dense wireless environments. Among the different application scenarios that the upcoming 5G and beyond cellular networks are expected to support, such as eMBB, mMTC and URLLC, mMTC brings the unique technical challenge of supporting a huge number of MTC devices, which is the main focus of this paper. The related challenges include QoS provisioning, handling highly dynamic and sporadic MTC traffic, huge signalling overhead and Radio Access Network (RAN) congestion. In this regard, this paper aims to identify and analyze the involved technical issues, to review recent advances, to highlight potential solutions and to propose new research directions. First, starting with an overview of mMTC features and QoS provisioning issues, we present the key enablers for mMTC in cellular networks. Along with highlighting the inefficiency of the legacy Random Access (RA) procedure in the mMTC scenario, we then present the key features and channel access mechanisms in the emerging cellular IoT standards, namely LTE-M and NB-IoT. Subsequently, we present a framework for the performance analysis of transmission scheduling with QoS support, along with the issues involved in short data packet transmission. Next, we provide a detailed overview of the existing and emerging solutions towards addressing the RAN congestion problem, and then identify potential advantages, challenges and use cases for the application of emerging Machine Learning (ML) techniques in ultra-dense cellular networks. Out of several ML techniques, we focus on the application of a low-complexity Q-learning approach in mMTC scenarios. Finally, we discuss some open research challenges and promising future research directions.
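
    As one concrete instance of low-complexity Q-learning in this setting, the sketch below lets each MTC device learn a preferred random-access slot from collision feedback. The device count, frame size, and reward values are illustrative assumptions, and this is the stateless (single-state) form of Q-learning often studied for RA slot selection, not a specific scheme from the paper.

        import numpy as np

        rng = np.random.default_rng(1)
        n_devices, n_slots, n_frames = 20, 25, 3000   # assumed: 20 devices, 25 RA slots per frame
        alpha, eps = 0.1, 0.1
        Q = np.zeros((n_devices, n_slots))            # one row of slot preferences per device

        for _ in range(n_frames):
            # epsilon-greedy slot choice for every device in this frame
            explore = rng.random(n_devices) < eps
            slots = np.where(explore, rng.integers(n_slots, size=n_devices), Q.argmax(axis=1))
            occupancy = np.bincount(slots, minlength=n_slots)
            reward = np.where(occupancy[slots] == 1, 1.0, -1.0)  # success iff the slot is unshared
            Q[np.arange(n_devices), slots] += alpha * (reward - Q[np.arange(n_devices), slots])

        final = Q.argmax(axis=1)                      # greedy slot per device after learning
        collided = n_devices - int((np.bincount(final, minlength=n_slots)[final] == 1).sum())
        print("devices still colliding:", collided)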

    Air Interface for Next Generation Mobile Communication Networks: Physical Layer Design: An LTE-A Uplink Case Study
