
    A paradigm shifting approach in SON for future cellular networks

    The race to next-generation cellular networks is on, with a general consensus in academia and industry that massive densification orchestrated by self-organizing networks (SONs) is the cost-effective solution to the impending mobile capacity crunch. While research on SON commenced a decade ago and is still ongoing, its current form (i.e., reactive operation, conflict-prone design, limited degrees of freedom, and lack of intelligence) hinders the SON paradigm from meeting the requirements of 5G. The ambitious quality of experience (QoE) requirements and the emerging multifarious vision of 5G, along with the associated scale of complexity and cost, demand a significantly different, if not totally new, approach to SON in order to make 5G technically as well as financially feasible. This dissertation addresses these limitations of state-of-the-art SON. It first presents a generic low-complexity optimization framework to allow agile, online, multi-objective optimization of future mobile cellular networks (MCNs) through only top-level policy input that prioritizes otherwise conflicting key performance indicators (KPIs) such as capacity, QoE, and power consumption. The hybrid, semi-analytical approach can be used for a wide range of cellular optimization scenarios with low complexity. The dissertation then presents two novel user-mobility-prediction-based proactive self-optimization frameworks (AURORA and OPERA) that transform mobility from a challenge into an advantage. The proposed frameworks leverage mobility to overcome the inherent reactiveness of state-of-the-art self-optimization schemes and to meet the extremely low latency and high QoE expected from future cellular networks vis-à-vis 5G and beyond. The proactiveness stems from the frameworks' novel capability of utilizing past handover (HO) traces to determine future cell loads instead of passively observing changes in cell loads and then reacting to them.
A semi-Markov renewal process is leveraged to build a model that can predict the cell and the time of each user's next HO. A low-complexity algorithm has been developed to transform the predicted mobility attributes to user-coordinate-level resolution. The learned knowledge base is used to predict the user distribution among cells. This prediction is then used to formulate a novel (i) proactive energy saving (ES) optimization problem (AURORA) that proactively schedules cell sleep cycles and (ii) proactive load balancing (LB) optimization problem (OPERA). The proposed frameworks also incorporate the effect of cell individual offset (CIO) for balancing the load among cells, thus exploiting an additional ultra-dense network (UDN)-specific mechanism to ensure QoE while maximizing ES and/or LB. The frameworks also incorporate capacity and coverage constraints and a load-aware association strategy to ensure conflict-free operation of the ES, LB, and coverage and capacity optimization (CCO) SON functions. Although the resulting optimization problems are combinatorial and NP-hard, proactive prediction of cell loads instead of reactive measurement allows ample time for a combination of heuristics, such as genetic programming and pattern search, to find solutions with high ES and LB yields compared to the state of the art. To address the significantly higher cell outage rates anticipated in 5G and beyond, due to higher operational complexity and cell density than in legacy networks, the dissertation's fourth key contribution is a stochastic analytical model to analyze the effects of the arrival of faults on the reliability behavior of a cellular network. Assuming exponential distributions for failures and recoveries, a reliability model is developed using a continuous-time Markov chain (CTMC).
Unlike previous studies on network reliability, the proposed model is not limited to structural aspects of base stations (BSs), and it takes into account diverse potential fault scenarios; it is also capable of predicting the expected time of the first occurrence of a fault and the long-term reliability behavior of the BS. The contributions of this dissertation mark a paradigm shift from the reactive, semi-manual, sub-optimal SON towards a conflict-free, agile, proactive SON. By paving the way for future MCNs' commercial and technical viability, the new SON paradigm presented in this dissertation can act as a key enabler for next-generation MCNs.
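Under the exponential failure and recovery assumption, the basic quantities of such a CTMC reliability model have closed forms. The following is a minimal two-state (up/down) sketch for a single BS, not the dissertation's full multi-fault model; `lam` and `mu` denote the assumed failure and repair rates.

```python
import math

def mean_time_to_first_failure(lam):
    # Expected time until the first fault occurs (exponential failures).
    return 1.0 / lam

def reliability(lam, t):
    # Probability the BS survives without any fault until time t.
    return math.exp(-lam * t)

def steady_state_availability(lam, mu):
    # Long-run fraction of time the BS is operational in the
    # two-state (up/down) CTMC with repair rate mu.
    return mu / (lam + mu)
```

For example, a BS failing on average once per 100 days (`lam = 0.01`) and repaired at rate `mu = 0.99` per day has a mean time to first failure of 100 days and a steady-state availability of 99%.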

    Mobility management-based autonomous energy-aware framework using machine learning approach in dense mobile networks

    A paramount challenge is to deliver the Fifth Generation (5G) cellular capacity and connectivity demands through network densification without increasing CO2 emissions, while maintaining a greener, healthier, and more prosperous environment. Energy consumption is a pressing concern in the 5G era, compounded by challenges such as the reactive mode of operation, high-latency wake-up times, incorrect user association with cells, and the cross-functional operation of multiple Self-Organising Network (SON) functions. To address this challenge, we propose a novel Mobility Management-Based Autonomous Energy-Aware Framework that analyses bus passenger ridership through statistical Machine Learning (ML) and achieves proactive energy savings, coupled with CO2 emission reductions, in a Heterogeneous Network (HetNet) architecture using Reinforcement Learning (RL). Furthermore, we compare and report various ML algorithms using bus passenger ridership obtained from a London Overground (LO) dataset. Extensive spatiotemporal simulations show that our proposed framework can achieve up to 98.82% prediction accuracy and CO2 reduction gains of up to 31.83%.

    Mobility management in multi-RAT multi-band heterogeneous networks

    Support for user mobility is the raison d'être of mobile cellular networks. However, mounting pressure for more capacity is leading to the adoption of multi-band multi-RAT ultra-dense network designs, particularly with the increased use of mmWave-based small cells. While such a design is expected to offer manyfold more capacity in emerging cellular networks, it gives rise to a new set of challenges in user mobility management. Among others, the most critical challenges are frequent handovers (HOs), and thus a higher impact of poor mobility management on quality of user experience (QoE) and link capacity; the lack of an intelligent solution to manage the activation/deactivation of dual connectivity (of a user with both 4G and 5G cells); and mmWave cell discovery. In this dissertation, I propose and evaluate a set of solutions to address these challenges. The first outcome of our investigations is the first-ever taxonomy of mobility-related 3GPP-defined network parameters and Key Performance Indicators (KPIs), followed by a tutorial on 3GPP-based 5G mobility management procedures. The first major contribution of the thesis is a novel framework to characterize the relationship between the 28 critical mobility-related network parameters and the 8 most vital KPIs. A critical hurdle in addressing all mobility-related challenges in emerging networks is the complexity of modeling realistic mobility and the HO process. Mathematical models are not suitable here, as they cannot capture the dynamics or the myriad parameters and KPIs involved. Existing simulators also mostly either omit or overly abstract the HO process and user mobility, chiefly because poor HO management had relatively little impact on overall performance in legacy networks: not being multi-RAT and multi-band, they incurred a much smaller number of HOs than emerging networks.
The second key contribution of this dissertation is the development of a first-of-its-kind system-level simulator, called SyntheticNET, that can help the research community overcome the hurdle of realistic mobility and HO process modeling. SyntheticNET is the very first Python-based simulator that fully conforms to the 3GPP Release 15 5G standard. Compared to existing simulators, SyntheticNET features a modular structure, flexible propagation modeling, adaptive numerology, realistic mobility patterns, and detailed HO evaluation criteria. SyntheticNET's Python-based platform allows the effective application of Artificial Intelligence (AI) to various network functionalities. Another key challenge in emerging multi-RAT technologies is the lack of an intelligent solution to manage the dual connectivity with both 4G and 5G cells that a user needs to access 5G infrastructure. The third contribution of this thesis is a solution to this challenge. I present a QoE-aware E-UTRAN New Radio-Dual Connectivity (EN-DC) activation scheme in which AI is leveraged to develop a model that can accurately predict radio link failure (RLF) and voice muting using low-level measurements collected from a real network. The insights from the AI-based RLF and mute prediction models are then leveraged to configure sets of 3GPP parameters that maximize EN-DC activation while keeping the QoE-affecting RLF and mute anomalies to a minimum. The last contribution of this dissertation is a novel solution to the mmWave cell discovery problem, which stems from the highly directional nature of mmWave transmission. The proposed mmWave cell discovery scheme builds upon a joint search method in which mmWave cells exploit an overlay coverage layer from macro cells that share the UE location with the mmWave cell. The proposed scheme is made more practical by investigating and developing solutions for the data sparsity issue in model training.
The ability to work with sparse data makes the proposed scheme feasible in realistic scenarios, where user density is often not high enough to provide coverage reports from each bin of the coverage area. Simulation results show that the proposed scheme efficiently activates EN-DC to a nearby mmWave 5G cell and thus substantially reduces mmWave cell discovery failures compared to state-of-the-art cell discovery methods.
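The location-assisted discovery idea can be illustrated with a toy geometric sketch: instead of an exhaustive beam sweep, the mmWave cell maps the UE position reported over the macro overlay to a single candidate beam. The uniform circular beam grid and all names below are assumptions for illustration, not the dissertation's actual scheme.

```python
import math

def select_beam(cell_xy, ue_xy, n_beams=32):
    """Map a UE location (shared via the macro overlay) to the index of
    the mmWave beam pointing in that direction, avoiding a full sweep.
    Assumes n_beams uniformly spaced beams covering 360 degrees."""
    dx, dy = ue_xy[0] - cell_xy[0], ue_xy[1] - cell_xy[1]
    angle = math.atan2(dy, dx) % (2 * math.pi)  # bearing in [0, 2*pi)
    return int(angle / (2 * math.pi / n_beams)) % n_beams
```

For instance, with 32 beams, a UE due east of the cell maps to beam 0 and a UE due north to beam 8, so only that beam (or a small neighbourhood of it) needs to be probed.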

    User mobility prediction and management using machine learning

    Next-generation mobile networks (NGMNs) are envisioned to overcome current user mobility limitations while improving network performance. Mobility management in future mobile networks must address stringent requirements such as handling massive traffic growth, providing better quality of experience to end users, supporting ultra-high data rates, ensuring ultra-low latency, and enabling seamless handovers (HOs) from one base station (BS) to another. For future networks to manage user mobility under all of these stringent requirements, artificial intelligence (AI) is deemed to play a key role in automating the end-to-end process through machine learning (ML). The objective of this thesis is to explore user mobility prediction and management use-cases using ML. First, a background and literature review is presented, covering an overview of current mobile networks and ML-driven applications for user mobility management. Then, use-cases of mobility prediction in dense mobile networks are analysed and optimised with ML algorithms. An overall framework test accuracy of 91.17% was obtained with an artificial neural network (ANN), outperforming all other mobility prediction algorithms considered. Furthermore, a concept of mobility-prediction-based energy consumption is discussed to automate the classification of user mobility and reduce carbon emissions in smart-city transportation, achieving 98.82% accuracy with a k-nearest neighbour (KNN) classifier as the optimal result, along with a 31.83% energy savings gain. Finally, a context-aware handover (HO) skipping scenario is analysed in order to improve overall quality of service (QoS) as a mobility management framework for next-generation networks (NGNs).
The framework relies on passenger mobility, train trajectories, travelling times and frequencies, network load, and signal-ratio data in the cardinal directions, i.e., North, East, West, and South (NEWS), achieving an optimum result of 94.51% with a support vector machine (SVM) classifier. These results were fed into HO skipping techniques to analyse coverage probability, throughput, and HO cost. This work is extended by a blockchain-enabled privacy-preservation mechanism that provides an end-to-end secure platform for train passengers' mobility.
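As an illustration of the classification step, a k-nearest-neighbour vote over mobility features can be sketched in a few lines. The feature vectors and labels below are hypothetical stand-ins for the NEWS mobility attributes used in the thesis, not its actual data.

```python
import math
from collections import Counter

def knn_predict(train, query, k=3):
    """Minimal k-nearest-neighbour classifier. `train` is a list of
    (feature_vector, label) pairs; the query is assigned the majority
    label among its k nearest training samples (Euclidean distance)."""
    nearest = sorted(train, key=lambda sample: math.dist(sample[0], query))
    votes = Counter(label for _, label in nearest[:k])
    return votes.most_common(1)[0][0]
```

A classifier like this only memorises the training pairs; prediction cost grows with the training-set size, which is why the thesis compares it against model-based alternatives such as SVM and ANN.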

    The role of artificial intelligence driven 5G networks in COVID-19 outbreak: opportunities, challenges, and future outlook

    There is no doubt that the world is currently experiencing a global pandemic that is reshaping our daily lives as well as the way business activities are conducted. With the emphasis on social distancing as an effective means of curbing the rapid spread of infection, many individuals, institutions, and industries have had to rely on telecommunications to ensure service continuity and prevent a complete shutdown of their operations. This has put enormous pressure on both fixed and mobile networks. Though fifth-generation mobile networks (5G) are in their infancy in terms of deployment, they possess a broad category of services, including enhanced mobile broadband (eMBB), ultra-reliable low-latency communications (URLLC), and massive machine-type communications (mMTC), that can help in tackling pandemic-related challenges. Therefore, in this paper, we identify the challenges facing existing networks due to the surge in traffic demand as a result of the COVID-19 pandemic and emphasize the role of 5G, empowered by artificial intelligence, in tackling these problems. In addition, we provide a brief insight into the use of artificial-intelligence-driven 5G networks in predicting future pandemic outbreaks and in the development of a pandemic-resilient society in case of future outbreaks.

    Mobility Prediction-Based Optimisation and Encryption of Passenger Traffic-Flows Using Machine Learning

    Information and Communication Technology (ICT)-enabled optimisation of train passenger traffic flows is a key consideration of transportation under Smart City Planning (SCP). Traditional mobility-prediction-based optimisation and encryption approaches are reactive in nature; Artificial Intelligence (AI)-driven proactive solutions are required for near-real-time optimisation. Leveraging historical passenger data recorded via Radio Frequency Identification (RFID) sensors installed at train stations, mobility prediction models can be developed to support and improve railway operational performance vis-à-vis 5G and beyond. In this paper, we analyse passenger traffic flows based on an Access, Egress and Interchange (AEI) framework to support train infrastructure against congestion, accidents, carriage overloading, and maintenance issues. This paper predominantly focuses on developing passenger flow predictions using Machine Learning (ML), along with a novel encryption model capable of handling heavy passenger traffic flows in real time. We compare and report the performance of various ML-driven flow prediction models using real-world passenger flow data obtained from London Underground and Overground (LUO). Extensive spatio-temporal simulations leveraging realistic mobility prediction models show that the AEI framework can achieve 91.17% prediction accuracy along with secure and lightweight encryption capabilities. Security parameters such as correlation coefficient (7.70), number of pixel change rate (>99%), unified average change intensity (>33), contrast (>10), homogeneity
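The pixel-level security metrics quoted above have standard definitions in the image-encryption literature; a minimal sketch for NPCR and UACI on flattened 8-bit images (the sample values in the usage note are illustrative):

```python
def npcr(img1, img2):
    """Number of Pixel Change Rate (%): fraction of pixel positions
    whose values differ between two equal-size images (flat lists)."""
    changed = sum(a != b for a, b in zip(img1, img2))
    return 100.0 * changed / len(img1)

def uaci(img1, img2, max_val=255):
    """Unified Average Change Intensity (%): mean absolute pixel
    difference normalised by the maximum pixel value."""
    total = sum(abs(a - b) for a, b in zip(img1, img2))
    return 100.0 * total / (max_val * len(img1))
```

A strong image cipher typically targets NPCR above 99% and UACI around 33%, which matches the ranges reported in the abstract.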

    A New Paradigm for Proactive Self-Healing in Future Self-Organizing Mobile Cellular Networks

    Mobile cellular network operators spend nearly a quarter of their revenue on network management and maintenance. Remarkably, a significant proportion of that budget is spent on resolving outages that degrade or disrupt cellular services. Historically, operators have mainly relied on human expertise to identify, diagnose, and resolve such outages while also compensating for them in the short term. However, with ambitious quality of experience expectations from 5th generation and beyond mobile cellular networks spurring research towards technologies such as ultra-dense heterogeneous networks and millimeter-wave spectrum utilization, discovering and compensating for coverage lapses in future networks will be a major challenge. Numerous studies have explored heuristic, analytical, and machine-learning-based solutions to autonomously detect, diagnose, and compensate for cell outages in legacy mobile cellular networks, a branch of research known as self-healing. This dissertation focuses on self-healing techniques for future mobile cellular networks, with special focus on the outage detection and avoidance components of self-healing. Network outages can be classified into two primary types: 1) full and 2) partial. Full outages result from failed soft or hard components of network entities, while partial outages are generally a consequence of parametric misconfiguration. To this end, chapter 2 of this dissertation is dedicated to a detailed survey of research on detecting, diagnosing, and compensating for full outages, as well as a detailed analysis of studies on proactive outage avoidance schemes and their challenges. A key observation from the analysis of state-of-the-art outage detection techniques is their dependence on full network coverage data, their susceptibility to noise or randomness in the data, and their inability to characterize outages in both the spatial and temporal domains. To overcome these limitations, chapters 3 and 4 present two unique and novel outage detection techniques.
Chapter 3 presents an outage detection technique based on entropy field decomposition, which combines information field theory and entropy spectrum pathways theory and is robust to noise variance. Chapter 4 presents a deep neural network algorithm that is robust to data sparsity and compares it with entropy field decomposition and other state-of-the-art machine-learning-based outage detection algorithms, including support vector machines, K-means clustering, independent component analysis, and deep auto-encoders. Based on the insights obtained regarding the impact of partial outages, chapter 5 presents a complete framework for 5th generation and beyond mobile cellular networks designed to avoid partial outages caused by parametric misconfiguration. The power of the proposed framework is demonstrated by leveraging it to design a solution to one of the most common problems in ultra-dense heterogeneous networks: imbalanced load among small and macro cells and, as a consequence, poor resource utilization. The optimization problem is formulated as a function of two hard parameters, namely antenna tilt and transmit power, and a soft parameter, cell individual offset, which directly affect coverage, capacity, and load. The resulting solution is a combination of the otherwise conflicting coverage and capacity optimization and load balancing self-organizing network functions.
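As a point of reference for the kind of input such detectors consume, a naive statistical baseline over per-cell KPI reports can be sketched as follows. This is a simple z-score flagger over hypothetical KPI series, not the entropy-field-decomposition or deep-learning methods of chapters 3 and 4.

```python
def detect_outage(history, current, threshold=3.0):
    """Flag cells whose current KPI (e.g., mean reported RSRP) deviates
    from its historical mean by more than `threshold` standard deviations.
    `history` maps cell -> list of past KPI values; `current` maps
    cell -> latest KPI value."""
    flagged = []
    for cell, series in history.items():
        mean = sum(series) / len(series)
        std = (sum((x - mean) ** 2 for x in series) / len(series)) ** 0.5
        if std > 0 and abs(current[cell] - mean) / std > threshold:
            flagged.append(cell)
    return flagged
```

Such a baseline illustrates the weaknesses the chapters address: it needs a dense report history per cell and reacts badly to noisy or sparse data.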

    Cognitive networking for next generation of cellular communication systems

    This thesis presents a comprehensive study of cognitive networking for cellular networks, with contributions that enable them to be more dynamic, agile, and efficient. To achieve this, machine learning (ML) algorithms, a subset of artificial intelligence, are employed to bring such cognition to cellular networks. More specifically, three major branches of ML, namely supervised, unsupervised, and reinforcement learning (RL), are utilised for various purposes: unsupervised learning is used for data clustering, supervised learning is employed for predicting the future behaviour of networks and users, and RL is utilised for optimisation, owing to its inherent adaptability and minimal need for knowledge of the environment. Energy optimisation, capacity enhancement, and spectrum access are identified as the primary design challenges for cellular networks, given that they are envisioned to play crucial roles in 5G and beyond due to the increased number of connected devices as well as data rates. Each design challenge and its corresponding proposed solution are discussed thoroughly in separate chapters. Regarding energy optimisation, user-side energy consumption is investigated for Internet of Things (IoT) networks. An RL-based intelligent model, which jointly optimises the wireless connection type and the data processing entity, is proposed. In particular, a Q-learning algorithm is developed through which the energy consumption of an IoT device is minimised while keeping the application requirements--in terms of response time and security--satisfied. The proposed methodology achieves a 0% normalised joint cost--where all the considered metrics are combined--while the benchmarks average 54.84%.
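To make the Q-learning component concrete, here is a tabular sketch under toy assumptions: a single state and two hypothetical actions with fixed costs, with cost negated into reward. None of the names, states, or actions below are the thesis's actual formulation.

```python
import random

def q_learning(states, actions, reward_fn, transition_fn,
               episodes=500, alpha=0.1, gamma=0.9, epsilon=0.1):
    """Tabular Q-learning sketch: learn which action (e.g., a hypothetical
    connection/compute choice) maximises long-run reward. Epsilon-greedy
    exploration; standard one-step Q update."""
    q = {(s, a): 0.0 for s in states for a in actions}
    for _ in range(episodes):
        s = random.choice(states)
        for _ in range(20):  # fixed-length episodes for simplicity
            if random.random() < epsilon:
                a = random.choice(actions)
            else:
                a = max(actions, key=lambda act: q[(s, act)])
            r = reward_fn(s, a)
            s2 = transition_fn(s, a)
            best_next = max(q[(s2, act)] for act in actions)
            q[(s, a)] += alpha * (r + gamma * best_next - q[(s, a)])
            s = s2
    return q
```

With one state and rewards of -1 for a cheap "local" action versus -5 for an expensive "cloud" action, the greedy policy learned from the Q-table picks "local", mirroring how a cost-minimising agent settles on the lower-energy choice.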
Next, the energy consumption of radio access networks (RANs) is targeted, and a traffic-aware cell switching algorithm is designed to reduce the energy consumption of a RAN without compromising user quality-of-service (QoS). The proposed technique employs a SARSA algorithm with value function approximation, since conventional RL methods struggle with problems that have huge state spaces. The results reveal that up to a 52% gain in total energy consumption is achieved with the proposed technique, though the gain is observed to shrink as the scenario becomes more realistic. Capacity enhancement, on the other hand, is studied from two different perspectives, namely mobility management and unmanned aerial vehicle (UAV) assistance. To that end, a predictive handover (HO) mechanism is designed for mobility management in cellular networks by identifying two major issues of Markov-chain-based HO prediction. First, revisits--situations whereby a user visits the same cell more than once within the same day--are diagnosed as causing similar transition probabilities, which in turn increases the likelihood of incorrect predictions. This problem is addressed with a structural change: rather than storing a 2-D transition matrix, a 3-D one that also includes HO orders is stored. The obtained results show that the 3-D transition matrix is capable of reducing the HO signalling cost by up to 25.37%, a gain that is observed to drop with increasing randomness in the data set. Second, making HO predictions from insufficient criteria is identified as another issue with conventional Markov-chain-based predictors. Thus, a prediction confidence level is derived, such that there is a lower bound below which HO predictions are not performed, since predictions are not always advantageous owing to the HO signalling cost incurred by incorrect ones.
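The 2-D versus 3-D distinction can be illustrated with a short sketch: keying transition counts by the last two cells (i.e., including the HO order) separates trajectories that a plain (current -> next) matrix would conflate under revisits. The cell names are hypothetical:

```python
from collections import defaultdict

def build_transitions(trajectory, order=2):
    """Count transitions keyed by the last `order` cells, mimicking the
    move from a 2-D (current -> next) to a 3-D (previous, current -> next)
    transition structure that disambiguates same-day revisits."""
    counts = defaultdict(lambda: defaultdict(int))
    for i in range(order, len(trajectory)):
        key = tuple(trajectory[i - order:i])
        counts[key][trajectory[i]] += 1
    return counts

def predict(counts, recent):
    """Predict the most frequent next cell given the recent cell history."""
    history = counts.get(tuple(recent))
    return max(history, key=history.get) if history else None
```

In a revisit pattern such as Home -> Work -> Gym -> Work -> Home, the 2-D view sees "Work" followed equally often by "Gym" and "Home", while the order-2 key resolves the ambiguity from the previous cell.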
The outcomes of the simulations confirm that the derived confidence level mechanism helps improve the prediction accuracy by up to 8.23%. Furthermore, still considering capacity enhancement, a UAV-assisted cellular network is considered, and an unsupervised-learning-based UAV positioning algorithm is presented. A comprehensive analysis is conducted on the impacts of the overlapping footprints of multiple UAVs, which are controlled by their altitudes. The developed k-means-clustering-based UAV positioning approach is shown to reduce the number of users in outage by up to 80.47% compared to the benchmark symmetric deployment. Lastly, a QoS-aware dynamic spectrum access approach, in which all the aforementioned types of ML methods are employed, is developed to tackle challenges related to spectrum access. More specifically, by leveraging future traffic load predictions of radio access technologies (RATs) and a Q-learning algorithm, a novel proactive spectrum sensing technique is introduced. Two sensing strategies are developed: the first focuses solely on reducing sensing latency, while the second jointly optimises sensing latency and user requirements. In particular, the proposed Q-learning algorithm takes the future load predictions of the RATs and the requirements of secondary users--in terms of mobility and bandwidth--as inputs and directs the users to the spectrum of the optimum RAT to perform sensing. The strategy to employ can be selected based on the needs of the application: if latency is the only concern, the first strategy should be selected, since the second is computationally more demanding; however, the second strategy reduces sensing latency while also satisfying the other user requirements.
The simulation results demonstrate that, compared to random sensing, the first strategy reduces the sensing latency by 85.25%, while the second strategy enhances the full-satisfaction rate, where both the mobility and bandwidth requirements of the user are simultaneously satisfied, by 95.7%. In summary, three key design challenges of the next generation of cellular networks are identified and addressed via the concept of cognitive networking, providing a utilitarian tool for mobile network operators to plug into their systems. The proposed solutions can be generalised to various network scenarios owing to the sophisticated ML implementations, which renders them both practical and sustainable.
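The k-means positioning step can be sketched with plain Lloyd iterations over 2-D user coordinates; the altitude and footprint-overlap control analysed in the thesis is omitted, and the data layout is an assumption for illustration.

```python
import math
import random

def kmeans_uav_positions(users, n_uavs, iters=50, seed=0):
    """Place UAVs at k-means centroids of user locations so each UAV
    serves a nearby cluster of users. Plain Lloyd's algorithm sketch:
    assign users to the closest centroid, then move each centroid to
    the mean of its assigned users."""
    rng = random.Random(seed)
    centroids = rng.sample(users, n_uavs)
    for _ in range(iters):
        clusters = [[] for _ in range(n_uavs)]
        for u in users:
            j = min(range(n_uavs), key=lambda idx: math.dist(u, centroids[idx]))
            clusters[j].append(u)
        for j, cluster in enumerate(clusters):
            if cluster:  # keep the old centroid if a cluster goes empty
                centroids[j] = (sum(x for x, _ in cluster) / len(cluster),
                                sum(y for _, y in cluster) / len(cluster))
    return centroids
```

With users in two well-separated groups, the two returned centroids settle at the group means, which is the sense in which each UAV ends up hovering over its own cluster of users.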