55 research outputs found

    Radio resource management for OFDMA systems under practical considerations.

    Orthogonal frequency division multiple access (OFDMA) is used on the downlink of broadband wireless access (BWA) networks such as Worldwide Interoperability for Microwave Access (WiMAX) and Long Term Evolution (LTE), as it offers substantial advantages such as combating channel impairments and supporting higher data rates. By dynamically allocating subcarriers to users, frequency-domain diversity as well as multiuser diversity can be effectively exploited, greatly improving performance. The main focus of this thesis is the development of practical resource allocation schemes for the OFDMA downlink. Imperfect Channel State Information (CSI), the limited capacity of the dedicated link used for CSI feedback, and the presence of a Connection Admission Control (CAC) unit are the practical issues considered.

    The design of efficient resource allocation schemes depends heavily on the CSI reported from the users to the transmitter. When the CSI is imperfect, performance degrades, so the imperfection of the CSI must be accounted for when assigning radio resources to users. The first part of this thesis considers resource allocation strategies for OFDMA systems where the transmitter has only statistical knowledge of the CSI (SCSI). The approach shows that resources can be allocated optimally to achieve performance comparable to that achieved when instantaneous CSI (ICSI) is available. The results show that the performance gap between the SCSI- and ICSI-based resource allocation schemes depends on the number of active users in the cell, the Quality of Service (QoS) constraint, and the signal-to-noise ratio (SNR) per subcarrier.

    In practical systems, only SCSI, or CSI that is correlated to a certain extent with the true channel state, can be used to perform resource allocation. An approach to quantifying the performance degradation in both cases is presented for the setting where only a discrete number of modulation and coding levels are available for adaptive modulation and coding (AMC). Using the CSI estimates and the channel statistics, the approach can be used to perform resource allocation in both cases. It is shown that when a CAC unit is considered, CSI that is correlated with the present channel state leads to significantly higher system throughput, even under high user mobility. Motivated by the comparison between the correlation-based and statistics-based resource allocation schemes, a strategy is then proposed that achieves a good tradeoff between overhead consumption, fairness, and throughput in the presence of a CAC unit.

    In OFDMA networks, the design of efficient CAC schemes also relies on the user CSI. The presence of a CAC unit must be considered when designing practical resource allocation schemes for BWA networks that support multiple service classes, as it can guarantee fairness amongst them. This thesis develops a novel CAC mechanism based on the user channel gains and the cost of each service. The scheme divides the available bandwidth according to a complete partitioning structure, which allocates each service class a non-overlapping share of the bandwidth resource. In summary, the research results presented in this thesis contribute to the development of practical radio resource management schemes for BWA networks.
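    To make the SCSI idea concrete, here is a minimal Python sketch of one plausible statistical-CSI allocation loop. The Jensen-bound expected-rate proxy, the average-SNR input, and the deficit-first tie-break are illustrative assumptions; the thesis's actual optimization is not reproduced here.

```python
import math

def expected_rate(avg_snr_db: float) -> float:
    # Jensen upper bound log2(1 + E[SNR]) used as a simple stand-in
    # for the true expected spectral efficiency E[log2(1 + SNR)].
    return math.log2(1.0 + 10 ** (avg_snr_db / 10.0))

def scsi_allocation(avg_snr_db, min_rate):
    """Greedy SCSI-based subcarrier allocation (illustrative only).

    avg_snr_db[u][k]: average SNR (dB) of user u on subcarrier k.
    min_rate[u]: QoS rate target of user u (bit/s/Hz, summed over subcarriers).
    Returns a dict mapping subcarrier index -> assigned user index.
    """
    num_users = len(avg_snr_db)
    num_sc = len(avg_snr_db[0])
    achieved = [0.0] * num_users
    assignment = {}
    for k in range(num_sc):
        # Users still below their QoS target take priority;
        # ties are broken by the expected rate on this subcarrier.
        def priority(u):
            deficit = max(min_rate[u] - achieved[u], 0.0)
            return (deficit > 0, expected_rate(avg_snr_db[u][k]))
        u_best = max(range(num_users), key=priority)
        assignment[k] = u_best
        achieved[u_best] += expected_rate(avg_snr_db[u_best][k])
    return assignment
```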

    Robust scheduling algorithm for Guaranteed Bit Rate services

    This paper proposes a novel packet scheduling algorithm to overcome the detrimental effects of channel impairments on the quality of service of delay-sensitive Guaranteed Bit Rate (GBR) services. The proposed algorithm prioritises the retransmission packets of Hybrid Automatic Repeat Request (HARQ) users over the packets of new users. The packets of new users are scheduled according to the Channel Quality Information (CQI), average throughput, and packet delay information. Computer simulations demonstrate that the proposed algorithm achieves a 22.7% system capacity improvement over a well-known algorithm. It also tolerates CQI reporting delays of up to 200% and reduces the uplink signalling overhead by 150% compared to the well-known algorithm, without compromising the quality of service requirements of the GBR services. Copyright © 2013 Inderscience Enterprises Ltd.
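    The two-tier priority described above can be illustrated with a short sketch. The exact weighting of CQI, average throughput, and HoL delay is not given in the abstract, so the proportional-fair-style metric scaled by delay urgency below is an assumption.

```python
from dataclasses import dataclass

@dataclass
class UserState:
    cqi: float             # reported channel quality (higher is better)
    avg_throughput: float  # exponentially averaged served rate (bit/s)
    hol_delay: float       # head-of-line packet delay (s)
    delay_budget: float    # GBR delay budget (s)
    needs_harq_retx: bool  # pending HARQ retransmission

def schedule_order(users):
    """Return user indices in scheduling order: HARQ retransmissions first,
    then new transmissions ranked by a PF-style metric scaled by delay
    urgency. The metric shape is an illustrative assumption."""
    def metric(u: UserState) -> float:
        urgency = u.hol_delay / u.delay_budget  # approaches 1 near the deadline
        return (u.cqi / max(u.avg_throughput, 1e-9)) * (1.0 + urgency)
    retx = [i for i, u in enumerate(users) if u.needs_harq_retx]
    new = [i for i, u in enumerate(users) if not u.needs_harq_retx]
    new.sort(key=lambda i: metric(users[i]), reverse=True)
    return retx + new
```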

    Scheduling Policies in Time and Frequency Domains for LTE Downlink Channel: A Performance Comparison

    A key feature of the Long-Term Evolution (LTE) system is that the packet scheduler can make use of the channel quality information (CQI), which is periodically reported by user equipment either in an aggregate form for the whole downlink channel or separately for each available subchannel. This mechanism allows wide discretion in resource allocation, and has thus spurred a flourishing of scheduling algorithms with different purposes. It is therefore of great interest to compare the performance of such algorithms under different scenarios. Here, we carry out a thorough performance analysis of different scheduling algorithms for saturated User Datagram Protocol (UDP) and Transmission Control Protocol (TCP) traffic sources, considering both the time- and frequency-domain versions of the schedulers, over both flat and frequency-selective channels. The analysis makes it possible to appreciate the differences among the scheduling algorithms and to assess the performance gain, in terms of cell capacity, user fairness, and packet service time, obtained by exploiting the richer, but heavier, information carried by subchannel CQI. An important part of this analysis is a throughput guarantee scheduler, which we propose in this paper. The analysis reveals that the proposed scheduler provides a good tradeoff between cell capacity and fairness for both TCP and UDP traffic sources.
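    The difference between aggregate (wideband) CQI and per-subchannel CQI that the comparison hinges on can be sketched as follows; the proportional-fair metric and the rate values derived from CQI are illustrative stand-ins, not the actual schedulers evaluated in the paper.

```python
def pf_metric(rate, avg_thr):
    # Classic proportional-fair ratio; avg_thr would be updated
    # with an exponential moving average elsewhere in the scheduler.
    return rate / max(avg_thr, 1e-9)

def allocate(cqi_rate, avg_thr, per_subchannel=True):
    """cqi_rate[u][s]: rate user u could get on subchannel s, as derived
    from CQI reports. With per_subchannel=False only the wideband average
    is used, mimicking a time-domain scheduler that gives the whole TTI
    to a single user. Returns the assigned user per subchannel."""
    num_users = len(cqi_rate)
    num_sub = len(cqi_rate[0])
    if per_subchannel:  # frequency-domain: one decision per subchannel
        return [max(range(num_users),
                    key=lambda u: pf_metric(cqi_rate[u][s], avg_thr[u]))
                for s in range(num_sub)]
    wideband = [sum(rates) / num_sub for rates in cqi_rate]  # aggregate CQI
    best = max(range(num_users),
               key=lambda u: pf_metric(wideband[u], avg_thr[u]))
    return [best] * num_sub
```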

    LTE-Advanced radio access enhancements: A survey

    Long Term Evolution Advanced (LTE-Advanced) is the next step in LTE evolution and allows operators to improve network performance and service capabilities through the smooth deployment of new techniques and technologies. LTE-Advanced adds new features on top of the existing LTE standards to provide a better user experience and higher throughput. Some of the most significant features introduced in LTE-Advanced are carrier aggregation, enhancements in heterogeneous networks, coordinated multipoint transmission and reception, enhanced multiple-input multiple-output usage, and the deployment of relay nodes in the radio network. These features are mainly aimed at enhancing the radio access part of cellular networks. This survey article presents an overview of the key radio access features and functionalities of the LTE-Advanced radio access network, supported by simulation results. We also provide a detailed review of the literature, together with a rich list of references for each of the features. An LTE-Advanced roadmap and the latest updates and trends in LTE markets are also presented.

    Resource Allocation for the Long Term Evolution (LTE) of 3G


    Sustainable scheduling policies for radio access networks based on LTE technology

    Get PDF
    A thesis submitted to the University of Bedfordshire in partial fulfilment of the requirements for the degree of Doctor of Philosophy.

    In LTE access networks, Radio Resource Management (RRM) is one of the most important modules, responsible for the overall management of radio resources. The packet scheduler is the sub-module that assigns the existing radio resources to each user in order to deliver the requested services in the most efficient manner. Data packets are scheduled dynamically at every Transmission Time Interval (TTI), a time window used to take the users' requests and respond to them accordingly. The scheduling procedure is conducted using scheduling rules that select different users to be scheduled at each TTI based on priority metrics. Various scheduling rules exist, and they behave differently by balancing the scheduler performance in the direction imposed by one of the following objectives: increasing the system throughput, maintaining user fairness, and respecting the Guaranteed Bit Rate (GBR), Head of Line (HoL) packet delay, packet loss rate, and queue stability requirements. Most static scheduling rules follow sequential multi-objective optimization, in the sense that once the first targeted objective is satisfied, other objectives can be prioritized. When the targeted scheduling objective(s) can be satisfied at each TTI, the LTE scheduler is considered optimal or feasible. The scheduling performance thus depends on the exploited rule and the particular objectives it focuses on.

    This study aims to increase the percentage of feasible TTIs for a given downlink transmission by applying a mixture of scheduling rules instead of using one discipline across the entire scheduling session. Two types of optimization problems are proposed: Dynamic Scheduling Rule based Sequential Multi-Objective Optimization (DSR-SMOO), when the applied scheduling rules address the same objective, and Dynamic Scheduling Rule based Concurrent Multi-Objective Optimization (DSR-CMOO), when the pool of rules addresses different scheduling objectives. The best way to solve such complex optimization problems is to adapt and refine scheduling policies that can call different rules at each TTI based on the best-matching scheduler conditions (states). The idea is to develop a set of non-linear functions that map the scheduler state at each TTI to optimal probability distributions for selecting the best scheduling rule. Due to the multi-dimensional and continuous characteristics of the scheduler state space, these scheduling functions must be approximated, and the approximations are learned through interaction with the RRM environment. Reinforcement Learning (RL) algorithms are used to evaluate and refine the scheduling policies for the considered DSR-SMOO/CMOO optimization problems. Neural networks are used to train the non-linear mapping functions based on the interaction among the intelligent controller, the LTE packet scheduler, and the RRM environment. In order to enhance convergence to the feasible state and to reduce the dimension of the scheduler state space, meta-heuristic approaches are used for channel state aggregation. Simulation results show that the proposed aggregation scheme outperforms other heuristic methods. When the channel state aggregation scheme is exploited, the proposed DSR-SMOO/CMOO problems, focusing on different objectives and solved using various RL approaches, are able to: increase the mean percentage of feasible TTIs, minimize the number of TTIs in which the RL approaches punish the actions taken TTI-by-TTI, and minimize the variation of the performance indicators when different simulations are launched in parallel. In this way, the obtained scheduling policies, focused on the multi-objective criteria, are sustainable.

    Keywords: LTE, packet scheduling, scheduling rules, multi-objective optimization, reinforcement learning, channel aggregation, scheduling policies, sustainable
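    As a rough illustration of the per-TTI rule selection described in this abstract, the sketch below maps a (pre-aggregated) scheduler state to a softmax distribution over candidate rules and refines it with a REINFORCE-style update. The rule pool, the linear preference model (standing in for the thesis's neural networks), and the scalar feasibility reward are all assumptions.

```python
import math
import random

RULES = ["PF", "MLWDF", "EXP-PF"]  # illustrative candidate scheduling rules

def softmax(prefs):
    m = max(prefs)  # subtract the max for numerical stability
    exps = [math.exp(p - m) for p in prefs]
    z = sum(exps)
    return [e / z for e in exps]

class RuleSelector:
    """Maps a scheduler state vector to a distribution over scheduling rules
    and refines it from a feasibility reward, in the spirit of the
    DSR-SMOO/CMOO controllers. The linear preference model is a stand-in
    for the neural networks used in the thesis."""

    def __init__(self, state_dim, lr=0.05):
        self.w = [[0.0] * state_dim for _ in RULES]
        self.lr = lr

    def probs(self, state):
        prefs = [sum(wi * si for wi, si in zip(w, state)) for w in self.w]
        return softmax(prefs)

    def act(self, state):
        # Sample a rule for this TTI from the current policy.
        p = self.probs(state)
        return random.choices(range(len(RULES)), weights=p)[0]

    def update(self, state, action, reward):
        # REINFORCE-style update: reward > 0 when the TTI met its QoS
        # targets (feasible), reward < 0 otherwise (a punished TTI).
        p = self.probs(state)
        for a in range(len(RULES)):
            grad = (1.0 if a == action else 0.0) - p[a]
            for i, s in enumerate(state):
                self.w[a][i] += self.lr * reward * grad * s
```

    In use, the controller would call `act` once per TTI with the aggregated channel/queue state, run the chosen rule, and then call `update` with a reward encoding whether the TTI was feasible.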