56 research outputs found

    A data-driven user steering algorithm for optimizing user experience in multi-tier LTE networks

    Multi-tier cellular networks are a cost-effective solution for capacity enhancement in urban scenarios. In these networks, effective handover schemes are required to assign users to the most adequate layer. In this paper, a data-driven self-tuning algorithm for user steering is proposed to improve the overall Quality of Experience (QoE) in multi-carrier Long Term Evolution (LTE) networks. Unlike classical approaches, user steering is achieved by changing Reference Signal Received Quality (RSRQ)-based inter-frequency handover margins. To drive the tuning process, a novel indicator showing throughput changes in the vicinity of handovers is derived from connection traces. Method assessment is carried out in a dynamic system-level LTE simulator implementing a real multi-carrier scenario. Results show that the proposed algorithm significantly improves the QoE figures obtained with a classical inter-frequency handover scheme based on Reference Signal Received Power (RSRP) measurements. Universidad de Málaga. Campus de Excelencia Internacional Andalucía Tech
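    As an illustration of the kind of trace-based indicator described in this abstract, the sketch below averages the user throughput change observed shortly before and after each inter-frequency handover. The trace schema (time, user_id, throughput_mbps, event columns) and the window length are assumptions for illustration, not the paper's actual format.

```python
# Minimal sketch (not the paper's exact definition): estimate the average
# throughput change around inter-frequency handovers from connection traces.
# Column names and the window length are assumptions.
import pandas as pd

def handover_throughput_delta(traces: pd.DataFrame, window_s: float = 5.0) -> float:
    """Mean throughput gain (Mbps) in a window after vs. before each handover."""
    deltas = []
    for _, ho in traces[traces["event"] == "inter_freq_ho"].iterrows():
        user = traces[traces["user_id"] == ho["user_id"]]
        before = user[(user["time"] >= ho["time"] - window_s) & (user["time"] < ho["time"])]
        after = user[(user["time"] > ho["time"]) & (user["time"] <= ho["time"] + window_s)]
        if len(before) and len(after):
            deltas.append(after["throughput_mbps"].mean() - before["throughput_mbps"].mean())
    return sum(deltas) / len(deltas) if deltas else 0.0
```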

    Modelling Slice Performance in Radio Access Networks through Supervised Learning

    In 5G systems, the Network Slicing (NS) feature allows several logical networks customized for specific verticals to be deployed over a common physical infrastructure. To make the most of this feature, cellular operators need models reflecting performance at the slice level for re-dimensioning the Radio Access Network (RAN). Throughput is often regarded as a key performance metric due to its strong impact on users demanding enhanced mobile broadband services. In this work, we present the first comprehensive analysis tackling slice throughput estimation in the downlink of RAN-sliced networks through Supervised Learning (SL), based on information collected in the operations support system. The considered SL algorithms include support vector regression, k-nearest neighbors, ensemble methods based on decision trees and neural networks. All these algorithms are tested in two NS scenarios with single-service and multi-service slices. To this end, synthetic datasets with performance indicators and connection traces are generated with a system-level simulator emulating the activity of a live cellular network. Results show that the best model (i.e., combination of SL algorithm and input features) to estimate slice throughput may vary depending on the NS scenario. In all cases, the best models show adequate accuracy (i.e., error below 10%). Universidad de Málaga. Campus de Excelencia Internacional Andalucía Tech
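    A minimal sketch of the kind of regressor comparison named in this abstract, run on a toy synthetic dataset (the real features and KPIs come from the operator's OSS and are not reproduced here). The chosen features, hyperparameters and error metric are assumptions.

```python
# Compare the SL regressors mentioned in the abstract on synthetic data,
# using cross-validated mean absolute percentage error (MAPE).
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsRegressor
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

rng = np.random.default_rng(0)
X = rng.uniform(size=(500, 4))  # e.g., PRB utilization, active users, avg. CQI, traffic mix
y = 50 * X[:, 0] * X[:, 2] + 5 * X[:, 1] + rng.normal(0, 1, 500)  # slice throughput (Mbps)

models = {
    "SVR": make_pipeline(StandardScaler(), SVR(C=10)),
    "kNN": make_pipeline(StandardScaler(), KNeighborsRegressor(n_neighbors=5)),
    "RandomForest": RandomForestRegressor(n_estimators=200, random_state=0),
    "MLP": make_pipeline(StandardScaler(),
                         MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000, random_state=0)),
}
for name, model in models.items():
    mape = -cross_val_score(model, X, y, cv=5,
                            scoring="neg_mean_absolute_percentage_error").mean()
    print(f"{name}: MAPE = {100 * mape:.1f}%")
```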

    Un nuevo criterio basado en calidad de experiencia para el balance de carga en redes LTE

    The increase in traffic and services in mobile networks has made network management a very complex task. This fact has motivated the development of many algorithms in a Self-Organized Network (SON) framework, such as Mobility Load Balancing (MLB). MLB solves congestion problems by sharing traffic demand among neighbour cells through the modification of handover parameters. However, it presents some limitations in current LTE networks. These limitations have a negative impact on end-user throughput and thus on the Quality of Experience (QoE) perceived by end-users. In this paper, a sensitivity analysis of throughput with respect to Handover (HO) margins is presented and an alternative indicator for tuning HO margins is introduced, focusing on end-user throughput. The assessment is carried out in a trial LTE network. Results show that the proposed indicator improves network performance in terms of end-user throughput compared with that obtained with classical MLB algorithms. Universidad de Málaga. Campus de Excelencia Internacional Andalucía Tech

    A QoE-driven traffic steering algorithm for LTE Networks

    Due to the huge increase in traffic and services in mobile networks, network management has shifted its main focus from a Quality of Service (QoS) to a Quality of Experience (QoE) perspective. In addition, Self-Organizing Network (SON) techniques have been developed to automate network management, with load balancing as a key use case. The aim of load balancing is to balance the traffic among adjacent cells. This balance is expected to decrease the overall blocking ratio, thus increasing the total carried traffic in the network. Nevertheless, these techniques may fail when the QoE perspective is considered. In this work, a novel QoE balancing algorithm is proposed to reach QoE equilibrium in a realistic LTE network with different services. The proposed balancing approach is tested and compared with classical techniques by means of simulations. Universidad de Málaga. Campus de Excelencia Internacional Andalucía Tech
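    A minimal sketch of a QoE-balancing rule in the spirit of this abstract: per-service throughput is mapped to a MOS-like utility, and the handover margin between two neighbour cells is nudged towards the cell with worse QoE. The utility curves, step size and limits are illustrative assumptions, not the paper's values.

```python
# Toy QoE-balancing step: map throughput to a MOS-like score per service and
# shift the handover margin towards the cell with lower QoE.
import math

def mos(service: str, thr_mbps: float) -> float:
    """Toy throughput-to-QoE mapping per service (1..5 scale); curves are assumptions."""
    if service == "video":
        return max(1.0, min(5.0, 1 + 4 * thr_mbps / 8.0))   # assume ~8 Mbps gives MOS 5
    return max(1.0, min(5.0, 1 + math.log1p(thr_mbps)))      # web/FTP-like service

def update_margin(margin_db: float, qoe_a: float, qoe_b: float,
                  step_db: float = 1.0, lo: float = -6.0, hi: float = 6.0) -> float:
    """Shift traffic from the cell with lower QoE towards its neighbour."""
    if qoe_a < qoe_b:
        margin_db -= step_db   # make handovers from A to B easier
    elif qoe_b < qoe_a:
        margin_db += step_db
    return max(lo, min(hi, margin_db))

print(update_margin(3.0, mos("video", 2.0), mos("video", 6.0)))
```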

    A Data-Driven Traffic Steering Algorithm for Optimizing User Experience in Multi-Tier LTE Networks

    Multi-tier cellular networks are a cost-effective solution for capacity enhancement in urban scenarios. In these networks, effective mobility strategies are required to assign users to the most adequate layer. In this paper, a data-driven self-tuning algorithm for traffic steering is proposed to improve the overall Quality of Experience (QoE) in multi-carrier Long Term Evolution (LTE) networks. Traffic steering is achieved by changing Reference Signal Received Quality (RSRQ)-based inter-frequency handover margins. Unlike classical approaches considering cell-aggregated counters to drive the tuning process, the proposed algorithm relies on a novel indicator, derived from connection traces, showing the impact of handovers on user QoE. Method assessment is carried out in a dynamic system-level simulator implementing a real multi-carrier LTE scenario. Results show that the proposed algorithm significantly improves the QoE figures obtained with classical load balancing techniques. Supported in part by the Spanish Ministry of Economy and Competitiveness under Grant TEC2015-69982-R, in part by the Spanish Ministry of Education, Culture and Sports under FPU Grant FPU17/04286, and in part by the Horizon 2020 Project ONE5G under Grant ICT-76080.
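    A minimal sketch of a per-adjacency self-tuning loop of the kind described here: the RSRQ-based inter-frequency handover margin is raised or lowered depending on the sign of a QoE-impact indicator computed from traces (such as the one sketched after the first abstract above). The step size, margin limits and deadband are illustrative assumptions.

```python
# Toy self-tuning iteration over (source cell, target carrier) adjacencies,
# driven by a hypothetical per-adjacency QoE-gain indicator.
def tune_margins(margins_db: dict, qoe_gain: dict,
                 step_db: float = 1.0, lo: float = -8.0, hi: float = 8.0,
                 deadband: float = 0.1) -> dict:
    """margins_db / qoe_gain are keyed by (source_cell, target_carrier)."""
    new = {}
    for adj, margin in margins_db.items():
        gain = qoe_gain.get(adj, 0.0)
        if gain > deadband:        # handovers on this adjacency improve QoE -> ease them
            margin -= step_db
        elif gain < -deadband:     # handovers degrade QoE -> make them harder
            margin += step_db
        new[adj] = max(lo, min(hi, margin))
    return new

# Example: one optimization loop iteration with hypothetical indicator values
margins = {("cellA", "L2100"): 3.0, ("cellB", "L800"): 3.0}
gains = {("cellA", "L2100"): 0.8, ("cellB", "L800"): -0.5}
print(tune_margins(margins, gains))
```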

    Coordination and load analysis of C-RAN in HetNets by graph-partitioning

    In 5G systems, ultra-dense networks are a promising technique to cope with the strong increase of data traffic in mobile communications. In addition, the deployment of indoor small cells offloads macrocells at the cost of increasing network complexity. In this work, a method for the capacity analysis of Centralized Radio Access Networks (C-RANs) comprising macrocells and small cells is proposed. Remote Radio Heads (RRHs) are grouped into Base Band Unit (BBU) pools using graph theory techniques. For this purpose, the impact of Inter-Cell Interference Coordination (ICIC) and Coordinated Multi-Point Transmission/Reception (CoMP) techniques on the network is assessed under different load levels and coordination restrictions. The assessment is carried out with a radio planning tool that characterizes spectral efficiency and the allocation of shared resources per cell over a realistic Long-Term Evolution (LTE) heterogeneous network. Results show that load and coordination conditions between cells are key to improving system capacity. Universidad de Málaga. Campus de Excelencia Internacional Andalucía Tech
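    A minimal sketch of grouping cells (RRHs) into BBU pools via graph partitioning: nodes are cells and edge weights model interference coupling or coordination benefit. The toy weights and the use of modularity-based clustering are assumptions and not necessarily the partitioning technique used in the paper.

```python
# Group RRHs into BBU pools by clustering a weighted coordination graph.
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

G = nx.Graph()
coupling = {                      # hypothetical coupling between macro and small cells
    ("macro1", "small1"): 0.9, ("macro1", "small2"): 0.8, ("small1", "small2"): 0.7,
    ("macro2", "small3"): 0.9, ("macro2", "small4"): 0.6, ("small3", "small4"): 0.8,
    ("macro1", "macro2"): 0.1,    # weak coupling -> likely different BBU pools
}
for (a, b), w in coupling.items():
    G.add_edge(a, b, weight=w)

pools = greedy_modularity_communities(G, weight="weight")
for i, pool in enumerate(pools):
    print(f"BBU pool {i}: {sorted(pool)}")
```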

    On the Improvement of Cellular Coverage Maps by Filtering MDT Measurements

    In cellular systems, network re-planning aims to update the network configuration to cope with permanent changes in the environment. In this task, terminal measurements are often used to calibrate performance models integrated in radio network planning tools. In Release 10 of the 3GPP standard, the Minimization of Drive Tests (MDT) feature allows the collection of user positions correlated with performance statistics or radio events. In practice, positioning errors severely limit the potential of MDT measurements. In this work, a preliminary analysis of a large MDT dataset taken from a commercial Long-Term Evolution (LTE) network shows for the first time several sources of positioning errors not previously reported in the literature. Then, a heuristic filtering algorithm is proposed to discard samples with inaccurate location data. Method assessment is done by checking the impact of filtering on the coverage map built with a real MDT dataset. Results show that the proposed method significantly improves the accuracy of coverage maps by filtering unreliable measurements. European Union’s Horizon 2020 Research and Innovation Programme, H2020 LOCUS project (Grant 871249).
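    A minimal sketch of one plausible MDT filtering heuristic (not the paper's exact rules): discard samples whose reported horizontal accuracy is poor or whose implied speed between consecutive fixes of the same terminal is implausible, then bin the surviving RSRP samples into a coverage grid. Column names and thresholds are assumptions.

```python
# Toy MDT cleaning and coverage-map aggregation on a per-sample DataFrame.
import numpy as np
import pandas as pd

def filter_mdt(df: pd.DataFrame, max_acc_m: float = 50.0, max_speed_mps: float = 60.0):
    """Drop samples with poor reported accuracy or implausible implied speed."""
    df = df[df["h_accuracy_m"] <= max_acc_m].sort_values(["ue_id", "time_s"]).copy()
    d = np.hypot(df.groupby("ue_id")["x_m"].diff(), df.groupby("ue_id")["y_m"].diff())
    dt = df.groupby("ue_id")["time_s"].diff()
    speed = d / dt
    return df[speed.isna() | (speed <= max_speed_mps)]

def coverage_map(df: pd.DataFrame, bin_m: float = 50.0) -> pd.Series:
    """Average RSRP (dBm) per square grid bin."""
    gx = (df["x_m"] // bin_m).astype(int)
    gy = (df["y_m"] // bin_m).astype(int)
    return df.groupby([gx, gy])["rsrp_dbm"].mean()
```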

    A predictive analysis of slice performance in B5G Systems with network slicing

    In 5G systems and beyond, Network Slicing (NS) enables the simultaneous operation of multiple logical networks customized for specific vertical sectors over a common physical infrastructure. In the radio access network, operators need to forecast slice performance for an efficient (re)distribution of radio resources among slices. In recent years, models based on Supervised Learning (SL) have shown excellent performance for prediction tasks in many fields. Even so, a preliminary analysis of the time series of slice-level Key Performance Indicators (KPIs) is key to selecting the optimal SL-based predictor. This work presents a dataset of slice-level KPIs created with a dynamic simulator emulating a realistic 5G radio access network with NS. The dataset includes historical measurements of several KPIs collected per cell and slice during 15 minutes of network activity. On this dataset, a cross-correlation, autocorrelation and seasonality analysis is carried out to characterize the time series of KPIs collected at the slice level. Results show that some key aspects for the design of prediction models (e.g., seasonal behaviour, predictability or the correlation between different KPIs) strongly depend on both the temporal resolution of the data and the slice. Multi-KPI prediction models with automatic seasonality detection, trained specifically for each slice, are expected to achieve the best results. Universidad de Málaga. Campus de Excelencia Internacional Andalucía Tech
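    A minimal sketch of the kind of time-series characterization described here, applied to a synthetic slice-level KPI series (the real per-cell, per-slice data come from the simulator and are not reproduced). The sampling period and seasonality are assumptions.

```python
# Autocorrelation and seasonal decomposition of a synthetic slice KPI series.
import numpy as np
import pandas as pd
from statsmodels.tsa.seasonal import seasonal_decompose
from statsmodels.tsa.stattools import acf

rng = np.random.default_rng(0)
n, period = 900, 60                       # e.g., 1-second samples, 60-sample seasonality
t = np.arange(n)
kpi = 20 + 5 * np.sin(2 * np.pi * t / period) + rng.normal(0, 1, n)  # throughput (Mbps)
series = pd.Series(kpi, index=pd.date_range("2024-01-01", periods=n, freq="s"))

autocorr = acf(series, nlags=2 * period, fft=True)
print("autocorrelation at the assumed seasonal lag:", round(float(autocorr[period]), 3))

decomp = seasonal_decompose(series, model="additive", period=period)
print("seasonal component amplitude:",
      round(float(decomp.seasonal.max() - decomp.seasonal.min()), 2))
```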

    A data-driven scheduler model for QoE assessment in a LTE radio network planning tool

    The use of static system-level simulators is common practice for estimating the impact of re-planning actions in cellular networks. In this paper, a modification of a classical static Long Term Evolution (LTE) simulator is proposed to estimate the Quality of Experience (QoE) provided in each location on a per-service basis. The core of the simulator is the estimation of radio connection throughput on a per-location and per-service basis. For this purpose, a new analytical performance model for the packet scheduling process in a multi-service scenario is developed. Model parameters can easily be adjusted with information from radio connection traces available in the network management system. The simulation tool is validated with a large trace dataset taken from a live LTE network.
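    A minimal sketch of an analytical multi-service scheduler model in the spirit of the one described above (the paper's actual model is not reproduced here): PRBs are shared among active users, optionally weighted per service, and each user's throughput follows from its spectral efficiency. The service weights and numbers are illustrative assumptions.

```python
# Toy analytical scheduler: weighted PRB sharing per service, throughput from
# spectral efficiency. Not the paper's calibrated model.
PRB_BW_HZ = 180e3                       # LTE physical resource block bandwidth

def user_throughput_mbps(users, total_prbs=100, weights=None):
    """users: list of (service, spectral_efficiency_bps_per_hz)."""
    weights = weights or {"voip": 1.0, "video": 2.0, "web": 1.0}   # assumed weights
    total_w = sum(weights.get(svc, 1.0) for svc, _ in users)
    out = []
    for svc, se in users:
        prbs = total_prbs * weights.get(svc, 1.0) / total_w        # weighted PRB share
        out.append((svc, se * prbs * PRB_BW_HZ / 1e6))
    return out

print(user_throughput_mbps([("video", 3.0), ("web", 1.5), ("voip", 0.5)]))
```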

    Traffic Steering in B5G Sliced Radio Access Networks.

    In 5G and beyond wireless systems, the Network Slicing (NS) feature will enable the coexistence of extremely different services by splitting the physical infrastructure into several logical slices, each tailored for a specific tenant or application. In sliced Radio Access Networks (RANs), an optimal traffic sharing among cells is key to guaranteeing Service Level Agreement (SLA) compliance while minimizing operation costs. The configuration of network functions leading to that optimal point may depend on the slice, calling for slice-aware traffic steering strategies. This work presents the first data-driven algorithm for slice-aware traffic steering by tuning handover margins (a.k.a. mobility load balancing). The tuning process is driven by a novel indicator, derived from connection traces, showing the imbalance of SLA compliance among neighbor cells per slice. Performance assessment is carried out with a system-level simulator implementing a realistic sliced RAN offering services with different throughput, latency and reliability requirements. Results show that the proposed algorithm improves the overall SLA compliance by 9% in only 15 minutes of network activity compared to the case of not steering traffic, outperforming two legacy mobility load balancing approaches not driven by SLA.
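    A minimal sketch of a slice-aware mobility load balancing step of the kind described here: per-slice SLA compliance is computed per cell, and the handover margin of each (cell pair, slice) is shifted towards the cell with worse compliance. The compliance definition, step size and limits are assumptions.

```python
# Toy slice-aware traffic steering step driven by per-slice SLA compliance.
def sla_compliance(trace_rows):
    """Fraction of connections meeting their SLA; rows are (slice, met_sla: bool)."""
    met = sum(1 for _, ok in trace_rows if ok)
    return met / len(trace_rows) if trace_rows else 1.0

def steer(margins, compliance, step_db=1.0, lo=-6.0, hi=6.0):
    """margins keyed by ((cell_a, cell_b), slice); compliance keyed by (cell, slice)."""
    new = {}
    for ((a, b), s), m in margins.items():
        imbalance = compliance[(a, s)] - compliance[(b, s)]
        if imbalance < 0:          # slice s performs worse in cell a -> push its users to b
            m -= step_db
        elif imbalance > 0:
            m += step_db
        new[((a, b), s)] = max(lo, min(hi, m))
    return new

# Hypothetical values for one cell pair and two slices
margins = {(("c1", "c2"), "eMBB"): 3.0, (("c1", "c2"), "URLLC"): 3.0}
compliance = {("c1", "eMBB"): 0.80, ("c2", "eMBB"): 0.95,
              ("c1", "URLLC"): 0.99, ("c2", "URLLC"): 0.90}
print(steer(margins, compliance))
```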