25 research outputs found
A cell outage management framework for dense heterogeneous networks
In this paper, we present a novel cell outage management (COM) framework for heterogeneous networks with split control and data planes, a candidate architecture for meeting future capacity, quality-of-service, and energy-efficiency demands. In such an architecture, the control and data functionalities are not necessarily handled by the same node. The control base stations (BSs) manage the transmission of control information and user equipment (UE) mobility, whereas the data BSs handle UE data. An implication of this split architecture is that an outage to a BS in one plane has to be compensated for by other BSs in the same plane. Our COM framework addresses this challenge by incorporating two distinct cell outage detection (COD) algorithms to cope with the idiosyncrasies of the data and control planes. The COD algorithm for control cells leverages the relatively large number of UEs in a control cell to gather large-scale minimization-of-drive-test report data and detects an outage by applying machine learning and anomaly detection techniques. To improve outage detection accuracy, we also investigate and compare the performance of two anomaly-detection algorithms, i.e., k-nearest-neighbor- and local-outlier-factor-based anomaly detectors, within the control COD. For data cell COD, on the other hand, we propose a heuristic Grey-prediction-based approach, which can work with the small number of UEs in a data cell, by exploiting the fact that the control BS manages UE-data BS connectivity and receives periodic updates of the reference signal received power (RSRP) statistics between the UEs and the data BSs in its coverage. The detection accuracy of the heuristic data COD algorithm is further improved by exploiting the Fourier series of the residual error that is inherent to a Grey prediction model. Our COM framework integrates these two COD algorithms with a cell outage compensation (COC) algorithm that can be applied to both planes.
Our COC solution utilizes an actor-critic-based reinforcement learning algorithm, which optimizes the capacity and coverage of the identified outage zone in a plane by adjusting the antenna gain and transmission power of the surrounding BSs in that plane. The simulation results show that the proposed framework can detect both data and control cell outages and compensate for the detected outages in a reliable manner.
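The k-nearest-neighbor anomaly detector used for control-cell COD can be sketched as a toy example. This is an illustrative simplification, not the paper's implementation: the feature tuples, the choice of k, and the flagging threshold are all assumptions.

```python
import math

def knn_anomaly_scores(points, k=3):
    """Score each point by its mean Euclidean distance to its k nearest
    neighbours; larger scores indicate likelier anomalies (e.g. MDT
    reports originating from an outage-affected zone)."""
    scores = []
    for i, p in enumerate(points):
        dists = sorted(
            math.dist(p, q) for j, q in enumerate(points) if j != i
        )
        scores.append(sum(dists[:k]) / k)
    return scores

# Toy MDT-style features per report: (RSRP dBm, RSRQ dB) -- illustrative only.
reports = [(-80, -10), (-82, -11), (-79, -9), (-81, -10), (-120, -20)]
scores = knn_anomaly_scores(reports, k=3)
# Flag reports whose score is far above the best (smallest) score.
outliers = [i for i, s in enumerate(scores) if s > 3 * min(scores)]
```

Here the last report, with a drastically weaker signal, is the one flagged; a local-outlier-factor detector would instead compare each point's local density to that of its neighbours.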
A Data-Driven Traffic Steering Algorithm for Optimizing User Experience in Multi-Tier LTE Networks
Multi-tier cellular networks are a cost-effective solution for capacity enhancement in urban scenarios. In these networks, effective mobility strategies are required to assign users to the most adequate layer. In this paper, a data-driven self-tuning algorithm for traffic steering is proposed to improve the overall Quality of Experience (QoE) in multi-carrier Long Term Evolution (LTE) networks. Traffic steering is achieved by changing Reference Signal Received Quality (RSRQ)-based inter-frequency handover margins. Unlike classical approaches considering cell-aggregated counters to drive the tuning process, the proposed algorithm relies
on a novel indicator, derived from connection traces, showing the impact of handovers on user QoE. Method assessment is carried out in a dynamic system-level simulator implementing a real multicarrier LTE scenario. Results show that the proposed algorithm significantly improves QoE figures obtained with classical load
balancing techniques.
This work was supported in part by the Spanish Ministry of Economy and Competitiveness under Grant TEC2015-69982-R, in part by the Spanish Ministry of Education, Culture and Sports under FPU Grant FPU17/04286, and in part by the Horizon 2020 Project ONE5G under Grant ICT-76080.
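The trace-driven self-tuning loop described above can be sketched as a simple feedback controller on the handover margin. The indicator semantics, step size, and margin bounds below are illustrative assumptions, not the paper's actual rule.

```python
def tune_handover_margin(margin_db, qoe_gain_db, step_db=1.0,
                         min_db=-8.0, max_db=8.0):
    """One iteration of a self-tuning loop for an RSRQ-based
    inter-frequency handover margin.  `qoe_gain_db` stands in for the
    trace-derived indicator of how handovers affected user QoE
    (positive: handovers helped, so make them easier; negative: make
    them harder).  Names and step sizes are illustrative."""
    if qoe_gain_db > 0:
        margin_db -= step_db   # lower margin -> hand over to target layer earlier
    elif qoe_gain_db < 0:
        margin_db += step_db   # raise margin -> keep users on serving layer
    return max(min_db, min(max_db, margin_db))

# Simulated optimisation loops with per-loop indicator values.
m = 3.0
for gain in (1.2, 0.8, -0.5):
    m = tune_handover_margin(m, gain)
```

The clamping keeps the margin inside a sane operating range, mirroring how SON parameter tuning is normally bounded to avoid destabilising the network.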
Cell identity allocation and optimisation of handover parameters in self-organised LTE femtocell networks
A thesis submitted to the University of Bedfordshire in partial fulfilment of the requirements for the degree of Doctor of Philosophy.
Femtocell is a small cellular base station used by operators to extend indoor service coverage and enhance overall network performance. In Long Term Evolution (LTE), a femtocell works under macrocell coverage and combines with the macrocell to constitute a two-tier network. Compared to the traditional single-tier network, the two-tier scenario creates many new challenges, which led the 3rd Generation Partnership Project (3GPP) to implement an automation technology called Self-Organising Network (SON) in order to achieve lower cost and enhanced network performance.
This thesis focuses on the inbound and outbound handovers (handover between femtocell and macrocell); in detail, it provides suitable solutions for the intensity of femtocell handover prediction, Physical Cell Identity (PCI) allocation and handover triggering parameter optimisation. Moreover, those solutions are implemented in the structure of SON.
In order to efficiently manage radio resource allocation, this research investigates the conventional UE-based prediction model and proposes a cell-based prediction model to predict the intensity of a femtocell's handovers, which overcomes the drawbacks of the conventional models in the two-tier scenario. The predictor is then used in the proposed dynamic group PCI allocation approach to solve the problem of PCI allocation for the femtocells.
In addition, based on SON, this approach is implemented in the structure of a centralised Automated Configuration of Physical Cell Identity (ACPCI). It overcomes the drawbacks of the conventional method by reducing inbound handover failures of Cell Global Identity (CGI). This thesis also tackles optimisation of the handover triggering parameters to minimise handover failure. A dynamic hysteresis-adjusting approach for each User Equipment (UE) is proposed, using the received average Reference Signal-Signal to Interference plus Noise Ratio (RS-SINR) of the UE as a criterion. Furthermore, based on SON, this approach is implemented in the structure of hybrid Mobility Robustness Optimisation (MRO). It is able to offer a uniquely optimised hysteresis value to each individual UE in the network.
In order to evaluate the performance of the proposed approach against existing methods, a System Level Simulation (SLS) tool, provided by the Centre for Wireless Network Design (CWiND) research group, is utilised, which models the structure of two-tier communication in LTE femtocell-based networks.
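A per-UE hysteresis rule of the kind the thesis describes might look like the sketch below; the RS-SINR thresholds and offsets are hypothetical stand-ins for the values the optimisation would produce.

```python
def hysteresis_for_ue(avg_rs_sinr_db, base_hys_db=2.0):
    """Choose a per-UE handover hysteresis from its average RS-SINR.
    UEs in poor radio conditions get a smaller hysteresis so they can
    escape the serving cell sooner; UEs in good conditions get a larger
    one to suppress ping-pong.  Thresholds and offsets are illustrative."""
    if avg_rs_sinr_db < 0.0:
        return max(0.0, base_hys_db - 1.5)
    if avg_rs_sinr_db > 15.0:
        return base_hys_db + 1.5
    return base_hys_db

# Each UE receives its own optimised value rather than a cell-wide one.
per_ue = {ue: hysteresis_for_ue(sinr)
          for ue, sinr in {"ue1": -3.0, "ue2": 7.0, "ue3": 20.0}.items()}
```

The contrast with classical MRO, which tunes a single cell-wide hysteresis, is that the mapping above is evaluated per UE at each measurement update.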
Handover performance evaluation between 450 MHz and 2600 MHz LTE networks
This thesis evaluates handover performance between two different LTE frequency bands, band 31 and band 38, which are operated on 450 MHz and 2600 MHz frequencies, respectively. Mobile network operators are deploying multiple LTE frequency bands within same geographical areas in order to meet demand created by continuously growing mobile data usage. This creates additional challenges to network design, performance optimization and mobility management. Studied bands 31 and 38 differ on their propagation characteristics, as well as on their specified transmission capabilities. Bands also utilize different duplex methods, Frequency Division Duplex and Time Division Duplex. Performance evaluation was conducted in order to allow efficient usage of both bands.
Evaluation is based on information obtained from 3GPP specifications and laboratory measurements conducted with commercially available equipment. Current handover parameters of the studied network have been optimized for 450 MHz cells only, and mostly utilize default configurations introduced by the device manufacturer. This configuration is evaluated and a more suitable handover strategy is proposed. The proposed strategy is then compared with the default strategy through measurements conducted in a laboratory environment.
Conducted measurements confirm that, with proper handover parameter optimization, the 2600 MHz frequency band can be prioritized over the less capable 450 MHz band, which is likely to improve user-perceived service quality. By utilizing the collected results, the associated network operator could improve its offered services and achieve savings in network equipment costs.
Small cells deployment for traffic handling in centralized heterogeneous network
This thesis was submitted for the award of Doctor of Philosophy and was awarded by Brunel University London.
As the next phase of mobile technology, 5G comes with a new vision characterized by a connected society, in which everything will be effectively connected, providing a variety of services and diverse business models that require more than just higher data rates and more capacity, targeting new kinds of ultra-reliable and flexible connections. The next generation of applications, services and use cases will have extreme variation in requirements, which in turn amplifies the demand on network resources. Therefore, 5G will require a whole new design that takes efficient resource management and utilisation into consideration. An observation made throughout this research is that the demand for more capacity, reduced latency, and increased density is a common factor of many next generation use cases. This implies that the use of small cells is an ideal solution for next generation application requirements, provided that the necessary storage and computing resources are distributed closer to the actual user. In this context, this research proposes an architecture of a centralized heterogeneous network, consisting of macro and small cells with storage and computing resources, all controlled by a centralized functionality embedded within a gateway at the edge of the network. Compared to the basic network, the proposed solutions have been shown to enhance overall system performance. This involves extending the system by adding small cells to serve dedicated services for User Equipment (UE) with dual connectivity from a local server, which reduces the overall system delay while increasing the overall system throughput.
The added centralized mobility management was shown to be capable of tracing the mobility of UEs within the system coverage by keeping one connection with the main cell while moving between small cells, improving the handover delay by 11% without service interruptions. Finally, the proposed slicing model demonstrated the system's ability to provide different levels of service to users based on different Quality of Service (QoS) requirements and to differentiate between various applications without affecting the performance of other services, benefiting from a more flexible infrastructure than the traditional network. In addition, a 50% improvement was observed in CPU utilization. In such an architecture, the required capacity can be added exactly where and when it is needed, coverage problems can be directly addressed, and higher throughput, lower latency, and efficient mobility management can be achieved as a result of efficient resource management and distribution, which is one of the key factors in the deployment of next generation mobile network systems.
SON for LTE-WLAN access network selection : design and performance
Mobile network operators (MNOs) are deploying carrier-grade Wireless Local Area Network (WLAN) as an important complementary system to cellular networks. Access network selection (ANS) between cellular and WLAN is an essential component for improving network performance and user quality-of-service (QoS) via controlled loading of these systems. In emerging heterogeneous networks characterized by different cell sizes and diverse WLAN deployments, automatic tuning of the network selection functionality plays a crucial role. In this article, we present two distinct Self-Organizing Network (SON) schemes for tuning the ANS between the Long-Term Evolution (LTE) and WLAN systems. The SON functions differ in terms of the availability of inter-system information exchange and the internal algorithm design for traffic load control. System level simulations in a site-specific dense urban network show that the proposed schemes significantly improve user QoS and network capacity over the reference scheme, in which offloading to WLAN is performed simply based on signal coverage.
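A load- and coverage-aware selection rule of the kind the SON functions would tune might be sketched as follows; the thresholds and the load metric (fractional utilisation) are illustrative assumptions, not the article's parameters.

```python
def select_access(lte_load, wlan_load, wlan_rssi_dbm,
                  rssi_min_dbm=-75.0, load_margin=0.15):
    """Steer a new flow to WLAN only when WLAN coverage is adequate and
    WLAN is sufficiently less loaded than LTE.  Loads are fractional
    utilisation in [0, 1]; thresholds are hypothetical stand-ins for
    the SON-tuned parameters."""
    if wlan_rssi_dbm < rssi_min_dbm:
        return "LTE"           # WLAN coverage too weak for offloading
    if wlan_load + load_margin < lte_load:
        return "WLAN"          # offload to the clearly lighter system
    return "LTE"
```

A SON function would then adapt `rssi_min_dbm` and `load_margin` per cell from observed load and QoS, rather than fixing them network-wide.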
Adaptive vehicular networking with Deep Learning
Vehicular networks have been identified as a key enabler for future smart traffic applications aiming to improve on-road safety, increase road traffic efficiency, or provide advanced infotainment services to improve on-board comfort. However, the requirements of smart traffic applications also place demands on vehicular networks’ quality in terms of high data rates, low latency, and reliability, while simultaneously meeting the challenges of sustainability, green network development goals and energy efficiency. The advances in vehicular communication technologies combined with the peculiar characteristics of vehicular networks have brought challenges to traditional networking solutions designed around fixed parameters using complex mathematical optimisation. These challenges necessitate greater intelligence to be embedded in vehicular networks to realise adaptive network optimisation. As such, one promising solution is the use of Machine Learning (ML) algorithms to extract hidden patterns from collected data thus formulating adaptive network optimisation solutions with strong generalisation capabilities.
In this thesis, an overview of the underlying technologies, applications, and characteristics of vehicular networks is presented, followed by the motivation of using ML and a general introduction of ML background. Additionally, a literature review of ML applications in vehicular networks is also presented drawing on the state-of-the-art of ML technology adoption. Three key challenging research topics have been identified centred around network optimisation and ML deployment aspects.
The first research question and contribution focus on mobile Handover (HO) optimisation as vehicles pass between base stations; a Deep Reinforcement Learning (DRL) handover algorithm is proposed and evaluated against the currently deployed method. Simulation results suggest that the proposed algorithm can guarantee optimal HO decisions in a realistic simulation setup.
The second contribution explores distributed radio resource management optimisation. Two versions of a Federated Learning (FL) enhanced DRL algorithm are proposed and evaluated against other state-of-the-art ML solutions. Simulation results suggest that the proposed solution outperformed other benchmarks in overall resource utilisation efficiency, especially in generalisation scenarios.
The third contribution looks at energy efficiency optimisation on the network side against a backdrop of sustainability and green networking. A cell switching algorithm was developed based on a Graph Neural Network (GNN) model; the proposed scheme achieves almost 95% of the normalised energy efficiency of the "ideal" optimal benchmark and can be applied to many more general network configurations than the state-of-the-art ML benchmark.
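The DRL handover agent in the first contribution learns a value for each (serving cell, handover action) pair. A deliberately simplified tabular Q-learning stand-in conveys the update rule; the thesis uses a deep network instead of a table, and the states, actions, and reward here are invented for illustration.

```python
def q_update(Q, state, action, reward, next_state, alpha=0.1, gamma=0.9):
    """One tabular Q-learning update: move Q(s, a) toward the observed
    reward plus the discounted value of the best action available in
    the next state."""
    best_next = max(Q[next_state].values())
    Q[state][action] += alpha * (reward + gamma * best_next - Q[state][action])

# Toy handover MDP: state = serving cell, action = stay or hand over.
Q = {"cellA": {"stay": 0.0, "to_cellB": 0.0},
     "cellB": {"stay": 0.0, "to_cellA": 0.0}}
# Reward could encode throughput gained minus handover interruption cost.
q_update(Q, "cellA", "to_cellB", reward=1.0, next_state="cellB")
```

In the deep variant, the table lookup is replaced by a network mapping radio measurements to action values, which is what lets the agent generalise to unseen mobility patterns.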