
    LTE-advanced self-organizing network conflicts and coordination algorithms

    Self-organizing network (SON) functions have been introduced in the LTE and LTE-Advanced standards by the Third Generation Partnership Project as a solution that promises substantial improvements in network performance. However, the most challenging issue in implementing SON functions in practice is identifying the best possible interactions among simultaneously operating, and potentially conflicting, SON functions in order to guarantee robust, stable, and desired network operation. The first step in this direction is the comprehensive modeling of the various types of conflicts among SON functions, not only to acquire a detailed view of the problem, but also to pave the way for designing appropriate self-coordination mechanisms among SON functions. In this article we present a comprehensive classification of SON function conflicts, which leads the way toward designing suitable conflict resolution solutions among SON functions and implementing SON in practice. Identifying conflicting and interfering relations among autonomous network management functionalities is a tremendously complex task. We demonstrate how analysis of fundamental trade-offs among performance metrics can lead us to the identification of potential conflicts. Moreover, we present analytical models of these conflicts using reference signal received power plots in multi-cell environments, which help to dig into the complex relations among SON functions. We identify potential chain reactions among SON function conflicts that can affect the concurrent operation of multiple SON functions in practice. Finally, we propose a self-coordination framework for conflict resolution among multiple SON functions in LTE/LTE-Advanced networks, while highlighting a number of future research challenges for conflict-free operation of SON functions.
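The kind of parameter-level conflict the abstract classifies can be illustrated with a minimal sketch. The SON function names and parameter sets below are illustrative assumptions, not taken from the paper: two functions that write the same network parameter are in a configuration conflict, and a function that writes a parameter another function reads creates a measurement conflict.

```python
# Hypothetical sketch of SON conflict detection; function names and
# parameter sets are illustrative, not from the paper.

def find_conflicts(son_functions):
    """Return pairs of SON functions that touch the same network parameter.

    A 'configuration' conflict: two functions write the same parameter.
    A 'measurement' conflict: one function writes a parameter that
    another function reads as input.
    """
    conflicts = []
    names = list(son_functions)
    for i, a in enumerate(names):
        for b in names[i + 1:]:
            fa, fb = son_functions[a], son_functions[b]
            if fa["writes"] & fb["writes"]:
                conflicts.append((a, b, "configuration"))
            elif (fa["writes"] & fb["reads"]) or (fb["writes"] & fa["reads"]):
                conflicts.append((a, b, "measurement"))
    return conflicts

# Mobility Load Balancing and Mobility Robustness Optimization both tune
# handover offsets -- a classic configuration conflict.
son = {
    "MLB": {"writes": {"handover_offset"}, "reads": {"cell_load"}},
    "MRO": {"writes": {"handover_offset"}, "reads": {"ho_failures"}},
    "CCO": {"writes": {"antenna_tilt"}, "reads": {"rsrp"}},
}
print(find_conflicts(son))  # [('MLB', 'MRO', 'configuration')]
```

A real self-coordination framework would act on such detected pairs, e.g. by serializing or prioritizing the conflicting functions.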

    Improving mobile video quality through predictive channel quality based buffering

    Frequent variations in throughput make mobile networks a challenging environment for video streaming. Current video players deal with these variations by matching video quality to network throughput. However, this adaptation strategy results in frequent changes of video resolution and bitrate, which negatively impacts the users' streaming experience. Alternatively, keeping the video quality constant would improve the experience, but puts additional demand on the network. Downloading high quality content when channel quality is low requires additional resources, because data transfer efficiency is linked to channel quality. In this paper, we present a predictive Channel Quality based Buffering Strategy (CQBS) that lets the video buffer grow when channel quality is good, and relies on this buffer when channel quality decreases. Our strategy is the outcome of a Markov decision process. The underlying Markov chain is conditioned on 377 real-world LTE channel quality traces that we collected using an Android mobile application. With our strategy, mobile network providers can deliver constant quality video streams using fewer network resources.
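The core idea, downloading opportunistically when the channel is cheap and draining the buffer when it is not, can be sketched with a simple greedy rule. This is not the paper's MDP-derived policy; the CQI threshold and buffer limits below are illustrative assumptions.

```python
# Illustrative sketch (not the paper's MDP policy): prefetch video while the
# channel is good, ride out poor-channel periods on the accumulated buffer.

def buffering_decision(cqi, buffer_s, cqi_good=10, buffer_max_s=60):
    """Return how many seconds of video to download in the next interval.

    cqi      -- reported channel quality indicator (higher = better)
    buffer_s -- seconds of video currently buffered
    """
    if cqi >= cqi_good and buffer_s < buffer_max_s:
        return min(10, buffer_max_s - buffer_s)  # prefetch while cheap
    if buffer_s > 5:
        return 0   # channel is poor but the buffer can absorb it
    return 2       # minimal download to avoid a stall

print(buffering_decision(cqi=12, buffer_s=20))  # 10
print(buffering_decision(cqi=4, buffer_s=20))   # 0
```

An MDP-based policy would instead weigh the predicted CQI transition probabilities from the collected traces against the cost of each download decision.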

    Measurement and Optimization of LTE Performance

    The 4G Long Term Evolution (LTE) mobile system is the fourth generation communication system adopted worldwide to provide high-speed data connections and high-quality voice calls. Given its recent deployment by mobile service providers, LTE, unlike GSM and UMTS, can still be considered to be in its early stages, and many topics therefore raise great interest in the international scientific research community: network performance assessment, network optimization, selective scheduling, interference management and coexistence with other communication systems in the unlicensed band, and methods to evaluate human exposure to electromagnetic radiation are, as a matter of fact, still open issues. In this work, techniques adopted to increase LTE radio performance are investigated. One of the most widespread solutions proposed by the standard is to implement MIMO techniques, and within a few years, to overcome the scarcity of spectrum, LTE network operators will offload data traffic by accessing the unlicensed 5 GHz frequency band. Our research evaluates the 3GPP standard in a real test-bed scenario to assess network behavior and performance.

    Practical Commercial 5G Standalone (SA) Uplink Throughput Prediction

    While the 5G New Radio (NR) network promises a huge uplift of the uplink throughput, the improvement can only be seen when the User Equipment (UE) is connected to the high-frequency millimeter wave (mmWave) band. With the rise of uplink-intensive smartphone applications such as the real-time transmission of UHD 4K/8K videos and Virtual Reality (VR)/Augmented Reality (AR) content, uplink throughput prediction plays a huge role in maximizing the users' quality of experience (QoE). In this paper, we propose using a ConvLSTM-based neural network to predict future uplink throughput based on past uplink throughput and RF parameters. The network is trained on data from real-world drive tests on commercial 5G SA networks conducted while riding commuter trains, which accounted for various frequency bands, handovers, and blind spots. To make sure our model can be practically implemented, we then limited our model to only use the information available via the Android API, and evaluated it using data from both commuter trains and other methods of transportation. The results show that our model reaches an average prediction accuracy of 98.9% with an average RMSE of 1.80 Mbps across all unseen evaluation scenarios.
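The sequence-to-one setup implied by the abstract, predicting the next uplink throughput sample from a window of past throughput and RF metrics, can be sketched as a sliding-window data preparation step plus the RMSE metric used for evaluation. The window length and feature layout are assumptions, not from the paper, and the ConvLSTM model itself is omitted.

```python
import numpy as np

# Sketch of the sliding-window setup for throughput prediction; window
# length and the set of RF features are assumptions, not from the paper.

def make_windows(series, window=8):
    """series: (T, F) array of [throughput, rsrp, ...] samples per step.

    Returns (X, y): X holds windows of past feature vectors, y holds the
    throughput value (column 0) immediately after each window.
    """
    X, y = [], []
    for t in range(window, len(series)):
        X.append(series[t - window:t])
        y.append(series[t, 0])
    return np.array(X), np.array(y)

def rmse(pred, true):
    """Root mean squared error, the metric reported in the paper."""
    return float(np.sqrt(np.mean((np.asarray(pred) - np.asarray(true)) ** 2)))

# Synthetic stand-in for drive-test data: 100 steps, 3 features.
rng = np.random.default_rng(0)
data = rng.random((100, 3))
X, y = make_windows(data)
print(X.shape, y.shape)  # (92, 8, 3) (92,)
```

The resulting (samples, timesteps, features) tensors are the shape a recurrent predictor such as a ConvLSTM would consume after reshaping.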

    Energy Efficient Mobility Management for the Macrocell – Femtocell LTE Network

    Femtocells will play a key role in future deployments of the 3rd Generation Partnership Project (3GPP) Long Term Evolution (LTE) system, as they are expected to enhance system capacity and greatly improve energy efficiency in a cost-effective manner. Due to the short transmit-receive distance, femtocells prolong handset battery life and enhance the Quality of Service (QoS) perceived by the end users. However, large-scale femtocell deployment raises many technical challenges, mainly including security, interference, and mobility management. From the viewpoint of energy-efficient mobility management, this chapter discusses the key features of the femtocell technology and presents a novel energy-efficient handover decision policy for the macrocell-femtocell LTE network. The proposed handover decision policy aims at reducing the transmit power of LTE mobile terminals in a manner that is backwards compatible with the standard LTE handover decision procedure. Simulation results show that significantly lower energy and power consumption can be attained if the proposed approach is employed, at the cost of a moderately increased number of handover execution events.
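The energy argument can be made concrete with a hedged sketch of an energy-aware handover rule: hand over to the femtocell when doing so lowers the terminal's required uplink transmit power by more than a hysteresis margin. The target receive level and thresholds are illustrative assumptions, not the chapter's actual policy.

```python
# Illustrative sketch of an energy-aware handover rule; the receive target
# and hysteresis values are assumptions, not the chapter's policy.

def required_tx_power_dbm(path_loss_db, target_rx_dbm=-100.0):
    """Uplink power (dBm) needed so the serving cell receives target_rx_dbm."""
    return target_rx_dbm + path_loss_db

def handover_to_femto(pl_macro_db, pl_femto_db, hysteresis_db=3.0):
    """Hand over only if the femtocell saves more than the hysteresis margin.

    The hysteresis limits ping-pong handovers, which is the trade-off the
    abstract notes: lower power at the cost of more handover events.
    """
    p_macro = required_tx_power_dbm(pl_macro_db)
    p_femto = required_tx_power_dbm(pl_femto_db)
    return p_femto + hysteresis_db < p_macro

# Short transmit-receive distance -> much lower path loss to the femtocell.
print(handover_to_femto(pl_macro_db=120.0, pl_femto_db=80.0))  # True
print(handover_to_femto(pl_macro_db=90.0, pl_femto_db=88.0))   # False
```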

    Load-Based Traffic Steering in heterogeneous LTE Networks: A Journey from Release 8 to Release 12


    Mobility Management for Cellular Networks: From LTE Towards 5G


    Inter-cell Interference Coordination algorithms for 5G networks

    This thesis addresses Inter-Cell Interference Coordination (ICIC) applied to a 5G system. The system is modeled with the ns-3 simulation software. The approach combines Frequency Reuse algorithms, which represent a static approach to inter-cell interference coordination, with beamforming, a key feature introduced by the 5G standard, in order to optimize resource allocation to all users covered by the cellular system. The study systematically examines the specifications of the 5G standard, with particular attention to how the standard is implemented within the simulation software, so that modifications can be made with full awareness of the standard's characteristics. Since the starting scenario does not include ICIC algorithms, it was necessary to modify the initial network architecture already configured in ns-3 and to build an interface to the Frequency Reuse algorithms, changing the way the Base Station allocates resources. It was also necessary to introduce the signaling that users and the Base Station exchange to provide information useful for inter-cell interference coordination. In particular, the software models a starting scenario, a generic stadium, and evaluates system performance in terms of packets received over total packets transmitted. Applying inter-cell interference coordination yields significant results, leading to an increase in system performance. The final result shows how ICIC algorithms improve system performance by reducing interference, which allows more resources to be allocated with significantly lower packet loss.
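The static frequency reuse scheme the thesis builds on can be sketched in a few lines. This is a generic reuse-3 partition, assuming illustrative cell IDs and subband counts rather than the thesis's ns-3 configuration: each cell is restricted to one third of the subbands, so cells in adjacent reuse groups never transmit on the same resources.

```python
# Minimal sketch of static reuse-3 ICIC; cell IDs and the subband count
# are illustrative, not taken from the thesis's ns-3 scenario.

def reuse3_subbands(cell_id, n_subbands=12):
    """Return the subband indices a cell may use under frequency reuse 3.

    Cells are assigned to one of three groups by cell_id % 3; each group
    owns every third subband, so neighbouring groups never overlap.
    """
    group = cell_id % 3
    return [b for b in range(n_subbands) if b % 3 == group]

# Cells in different reuse groups get disjoint subband sets, which is
# what suppresses inter-cell interference at the cost of spectrum per cell.
print(reuse3_subbands(0))  # [0, 3, 6, 9]
print(reuse3_subbands(1))  # [1, 4, 7, 10]
```

A dynamic ICIC scheme would instead exchange load and interference indicators between base stations to adapt this partition at run time, which is where the signaling the thesis had to add comes in.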