99 research outputs found

    Packet scheduling under imperfect channel conditions in Long Term Evolution (LTE)

    Full text link
    University of Technology, Sydney. Faculty of Engineering and Information Technology.
    The growing demand for high-speed wireless data services, such as Voice over Internet Protocol (VoIP), web browsing, video streaming and gaming, with constraints on system capacity and delay requirements, poses new challenges for future mobile cellular systems. Orthogonal Frequency Division Multiple Access (OFDMA) is the access technology chosen for the downlink in Long Term Evolution (LTE) standardisation to address these challenges. As a network based on an all-IP packet-switched architecture, LTE employs packet scheduling to satisfy Quality of Service (QoS) requirements, so efficient design of the packet scheduler is a fundamental issue. The aim of this thesis is to propose a novel packet scheduling algorithm that improves system performance in a practical downlink LTE system. The thesis first focuses on time domain packet scheduling algorithms: a number of them are studied and several well-known time domain algorithms are compared in downlink LTE. A packet scheduling algorithm is identified that provides a good trade-off between maximising system performance and guaranteeing fairness. Thereafter, several frequency domain packet scheduling schemes are introduced and examples of QoS-aware packet scheduling algorithms employing these schemes are presented. To balance scheduling performance against computational complexity while remaining tolerant to the time-varying wireless channel, a novel scheduling scheme and packet scheduling algorithm are proposed; simulation results show that the proposed algorithm achieves reasonable overall system performance. Packet scheduling is further studied in a practical channel environment that assumes imperfect Channel Quality Information (CQI).
To alleviate the performance degradation caused by multiple simultaneous channel imperfections, a packet scheduling algorithm based on channel prediction and the proposed scheduling scheme is developed for GBR services in the downlink LTE system. Simulation results show that the Kalman filter based channel predictor can effectively recover the correct CQI from erroneous channel quality feedback, thereby significantly improving system performance.
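The Kalman filter based CQI recovery described above can be illustrated with a simple scalar filter. This is only a sketch under assumed parameters; the thesis's actual state model, noise variances and CQI range are not given here.

```python
# Minimal scalar Kalman filter that smooths noisy CQI feedback.
# The random-walk state model and the noise variances q and r are
# illustrative assumptions, not the parameters used in the thesis.

def kalman_cqi(reports, q=0.1, r=2.0):
    """Filter a sequence of noisy CQI reports.

    q: assumed process noise variance (channel drift)
    r: assumed measurement noise variance (feedback error)
    """
    x, p = reports[0], 1.0        # initial state estimate and covariance
    estimates = [x]
    for z in reports[1:]:
        p = p + q                 # predict: random-walk channel model
        k = p / (p + r)           # Kalman gain
        x = x + k * (z - x)       # correct with the noisy report
        p = (1 - k) * p
        estimates.append(x)
    return estimates

# Example: true CQI is 10, feedback is corrupted by reporting errors.
noisy = [10, 12, 7, 10, 14, 9, 10, 11]
smoothed = kalman_cqi(noisy)
```

With these settings the filtered estimates stay close to the true CQI of 10 despite the erroneous reports.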

    Optimization a Scheduling Algorithm of CA in LTE ADV

    Get PDF
    LTE-Advanced (LTE-ADV) is the evolution of Long Term Evolution (LTE), developed by 3GPP. LTE-ADV aims to offer a transmission bandwidth of 100 MHz by using Carrier Aggregation (CA) to aggregate LTE-ADV carriers and thereby increase system data capacity; resource allocation becomes a key tool, and scheduling across multiple Component Carriers (CCs) becomes a difficult optimization problem. This paper proposes a new scheduling algorithm and compares it with the traditional proportional fair and round robin schedulers under CA, in order to find the scheduler that provides the best throughput and fairness. It also evaluates two mapping model types: Mutual Information Effective SINR Mapping (MIESM) and Exponential Effective SINR Mapping (EESM). The results show that the proposed algorithm with MIESM outperforms the other mapping and scheduling combinations in throughput.
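The proportional fair baseline mentioned above can be sketched as follows. This is the generic single-carrier PF metric, not the paper's CA-specific variant; the rates and averaging factor are illustrative.

```python
# Generic proportional fair (PF) scheduler sketch: at each TTI the
# scheduler picks the user maximising instantaneous rate divided by
# exponentially averaged throughput. All values are illustrative.

def pf_schedule(inst_rates, avg_thr, alpha=0.1):
    """Pick a user index and update the average throughputs.

    inst_rates: achievable rate of each user this TTI
    avg_thr:    running average throughput of each user
    alpha:      averaging factor (1/T_pf), an assumed value
    """
    metrics = [r / max(t, 1e-9) for r, t in zip(inst_rates, avg_thr)]
    chosen = metrics.index(max(metrics))
    new_avg = [
        (1 - alpha) * t + (alpha * r if i == chosen else 0.0)
        for i, (r, t) in enumerate(zip(inst_rates, avg_thr))
    ]
    return chosen, new_avg

# A user with a good channel but high past throughput can lose to a
# starved user with a modest channel:
chosen, avg = pf_schedule([10.0, 6.0], [8.0, 1.0])
```

Here the second user wins (6.0/1.0 > 10.0/8.0), which is exactly the fairness behaviour the paper compares against round robin.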

    Sustainable scheduling policies for radio access networks based on LTE technology

    Get PDF
    A thesis submitted to the University of Bedfordshire in partial fulfilment of the requirements for the degree of Doctor of Philosophy.
    In LTE access networks, Radio Resource Management (RRM) is one of the most important modules, responsible for the overall management of radio resources. The packet scheduler is the sub-module that assigns the available radio resources to each user in order to deliver the requested services in the most efficient manner. Data packets are scheduled dynamically at every Transmission Time Interval (TTI), a time window used to take the users' requests and respond to them accordingly. The scheduling procedure is conducted using scheduling rules which select different users at each TTI based on priority metrics. Various scheduling rules exist, and they behave differently by balancing the scheduler's performance towards one of the following objectives: increasing the system throughput, maintaining user fairness, or respecting the Guaranteed Bit Rate (GBR), Head of Line (HoL) packet delay, packet loss rate and queue stability requirements. Most static scheduling rules follow sequential multi-objective optimization, in the sense that once the first targeted objective is satisfied, other objectives can be prioritized. When the targeted scheduling objective(s) can be satisfied at each TTI, the LTE scheduler is considered optimal or feasible; the scheduling performance thus depends on the exploited rule and the objectives it focuses on. This study aims to increase the percentage of feasible TTIs for a given downlink transmission by applying a mixture of scheduling rules instead of a single discipline adopted across the entire scheduling session.
Two types of optimization problems are proposed in this sense: Dynamic Scheduling Rule based Sequential Multi-Objective Optimization (DSR-SMOO), when the applied scheduling rules address the same objective, and Dynamic Scheduling Rule based Concurrent Multi-Objective Optimization (DSR-CMOO), when the pool of rules addresses different scheduling objectives. The best way of solving such complex optimization problems is to adapt and refine scheduling policies that are able to call different rules at each TTI based on the best-matching scheduler conditions (states). The idea is to develop a set of non-linear functions which map the scheduler state at each TTI into optimal probability distributions for selecting the best scheduling rule. Due to the multi-dimensional and continuous characteristics of the scheduler state space, these functions must be approximated, and the function approximations are learned through interaction with the RRM environment. Reinforcement Learning (RL) algorithms are used to evaluate and refine the scheduling policies for the considered DSR-SMOO/CMOO optimization problems. Neural networks are used to train the non-linear mapping functions based on the interaction among the intelligent controller, the LTE packet scheduler and the RRM environment. In order to enhance convergence to the feasible state and to reduce the dimension of the scheduler state space, meta-heuristic approaches are used for channel state aggregation. Simulation results show that the proposed aggregation scheme outperforms other heuristic methods.
When the channel state aggregation scheme is exploited, the proposed DSR-SMOO/CMOO problems, focusing on different objectives and solved using various RL approaches, are able to: increase the mean percentage of feasible TTIs, minimize the number of TTIs in which the RL approaches punish the actions taken TTI-by-TTI, and minimize the variation of the performance indicators when different simulations are launched in parallel. In this way, the obtained scheduling policies focused on the multi-objective criteria are sustainable. Keywords: LTE, packet scheduling, scheduling rules, multi-objective optimization, reinforcement learning, channel aggregation, scheduling policies, sustainable
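The core idea of a policy that picks a scheduling rule per TTI can be sketched with a simple epsilon-greedy selector. This is a deliberately simplified stand-in for the thesis's neural-network function approximation; the rule names, reward definition and lack of state handling are all illustrative assumptions.

```python
import random

# Toy epsilon-greedy controller choosing among named scheduling rules
# once per TTI. The thesis uses RL with neural-network function
# approximation over a continuous scheduler state; here we only track
# a running value per rule, which is an illustrative simplification.

RULES = ["PF", "MLWDF", "EXP-PF"]   # assumed rule pool

class RuleSelector:
    def __init__(self, epsilon=0.1):
        self.epsilon = epsilon
        self.value = {r: 0.0 for r in RULES}   # running value per rule
        self.count = {r: 0 for r in RULES}

    def pick(self):
        if random.random() < self.epsilon:
            return random.choice(RULES)                    # explore
        return max(RULES, key=lambda r: self.value[r])     # exploit

    def update(self, rule, reward):
        # Incremental mean update of the chosen rule's value estimate.
        self.count[rule] += 1
        self.value[rule] += (reward - self.value[rule]) / self.count[rule]

# One simulated TTI: the reward could be 1 if the TTI was feasible
# (all targeted QoS objectives met) and 0 otherwise.
sel = RuleSelector()
rule = sel.pick()
sel.update(rule, reward=1.0)
```

Over many TTIs the selector shifts probability mass toward rules that keep the scheduler feasible, which mirrors the thesis's goal of increasing the percentage of feasible TTIs.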

    Resource Allocation in Heterogeneous Networks

    Get PDF

    Age-Based Metrics for Joint Control and Communication in Cyber-Physical Industrial Systems

    Get PDF

    Link level performance evaluation and link abstraction for LTE/LTE-advanced downlink

    Get PDF
    The main objectives of this dissertation are the evaluation of link level performance and the study of link abstraction for the LTE/LTE-Advanced DL. An E-UTRA DL link level simulator has been developed based on MIMO-OFDM technology. We simulate channel estimation errors with a Gaussian additive noise error model called CEEM. The results of this simulator serve to evaluate the MIMO-OFDM LTE/LTE-Advanced DL link level performance in different environments. The basic idea of link abstraction methods is to map the vector of subcarrier SNRs to a single scalar, the ESNR, which is then used to predict the BLER. We propose a novel link abstraction method that can predict the BLER with good accuracy in multipath fading, including the effects of HARQ retransmissions. The proposed method is based on estimating the mutual information between the transmitted bits and the received LLRs.
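The effective SNR mapping idea can be illustrated with the classical exponential form (EESM). The thesis's proposed method is mutual-information based, so this simpler mapping is shown only for intuition, and the calibration factor beta here is an assumed value.

```python
import math

# Exponential Effective SNR Mapping (EESM): compress a vector of
# per-subcarrier SNRs into one effective SNR that can be looked up on
# an AWGN BLER curve. beta is a per-MCS calibration factor; the value
# used below is an illustrative assumption.

def eesm(snrs_db, beta=4.0):
    snrs = [10 ** (s / 10.0) for s in snrs_db]           # dB -> linear
    avg = sum(math.exp(-s / beta) for s in snrs) / len(snrs)
    esnr = -beta * math.log(avg)                         # linear ESNR
    return 10 * math.log10(esnr)                         # back to dB

# Frequency-selective channel: deep fades pull the effective SNR below
# the mean of the per-subcarrier SNRs, as a coded block's error rate
# is dominated by its worst subcarriers.
esnr_db = eesm([12.0, 3.0, 9.0, -2.0, 7.0])
```

The mutual-information based mapping (MIESM) the thesis builds on replaces the exponential kernel with a per-modulation mutual information function, which generally tracks BLER more accurately across MCSs.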

    Improving LTE network performance after a migration from CDMA2000 to LTE

    Get PDF
    CDMA2000 technology has been widely used on the 450 MHz band. Recently, the equipment availability and improved performance offered by LTE have started driving operators to migrate their networks from CDMA2000 to LTE. The migration may leave the network performance in a suboptimal state. This thesis presents four methods to positively influence LTE network performance after a CDMA2000-to-LTE migration, especially on the 450 MHz band, and evaluates three of them in a live network: cyclic prefix length, handover parameter optimization and uplink coordinated multipoint (CoMP) transmission. The objective was to determine the effectiveness of each method; the research methods included field measurements and network KPI collection. The results show that the normal cyclic prefix length is sufficient for LTE450 even though the cell radius may be up to 50 km. Only special cases require an extended cyclic prefix, and operators should solve such problems individually instead of implementing the extended cyclic prefix network-wide. Handover parameter optimization turned out to be an important point of attention after the migration: if the handover parameters are neglected, a significant number of unnecessary handovers may occur. About 50% of the handovers in the network were estimated to be unnecessary in the initial situation; by adjusting the handover parameter values, 47.28% of the handovers per user were removed and no negative effects were detected. Coordinated multipoint transmission is widely considered an effective way to improve LTE network performance, especially at the cell edges, but many challenges must be overcome before it can be applied to the downlink, and implementing it between cells belonging to different eNBs also involves challenges. Thus, only intra-site uplink CoMP transmission was tested.
The results show that the performance improvements at the cell edges were significant, as theory predicted.
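The handover parameter optimization discussed above revolves around the standard LTE A3 event (neighbour becomes better than serving by an offset). The sketch below shows a simplified entering condition; the offset and hysteresis values are illustrative, not the thesis's optimized settings, and the cell-specific offsets (Ofn/Ocn/Ofp/Ocp) and time-to-trigger are omitted.

```python
# Simplified LTE A3 event entering condition: the neighbour cell's
# RSRP must exceed the serving cell's RSRP by an offset, with
# hysteresis. Parameter values are illustrative assumptions.

def a3_triggered(rsrp_neigh_dbm, rsrp_serv_dbm,
                 a3_offset_db=3.0, hysteresis_db=1.0):
    """A3 entering condition (cell offsets and time-to-trigger omitted)."""
    return rsrp_neigh_dbm - hysteresis_db > rsrp_serv_dbm + a3_offset_db

# Raising the offset/hysteresis suppresses marginal handovers of the
# kind the thesis counts as unnecessary:
marginal = a3_triggered(-95.0, -97.0)   # neighbour only 2 dB better
clear = a3_triggered(-90.0, -97.0)      # neighbour 7 dB better
```

Tightening these parameters trades fewer ping-pong handovers against staying longer on a weakening serving cell, which is why the thesis evaluates the adjustment against live-network KPIs.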

    Multi-Service Radio Resource Management for 5G Networks

    Get PDF