
    Packet scheduling under imperfect channel conditions in Long Term Evolution (LTE)

    Full text link
    University of Technology, Sydney. Faculty of Engineering and Information Technology. The growing demand for high speed wireless data services, such as Voice over Internet Protocol (VoIP), web browsing, video streaming and gaming, together with constraints on system capacity and delay requirements, poses new challenges for future mobile cellular systems. Orthogonal Frequency Division Multiple Access (OFDMA) has been adopted as the downlink access technology in Long Term Evolution (LTE) standardisation to address these challenges. As a network based on an all-IP packet switched architecture, LTE relies on packet scheduling to satisfy Quality of Service (QoS) requirements, so the efficient design of packet scheduling becomes a fundamental issue. The aim of this thesis is to propose a novel packet scheduling algorithm that improves system performance in a practical downlink LTE system. The thesis first focuses on time domain packet scheduling: a number of time domain packet scheduling algorithms are studied, several well-known ones are compared in downlink LTE, and an algorithm is identified that provides a good trade-off between maximizing system performance and guaranteeing fairness. Thereafter, frequency domain packet scheduling schemes are introduced and examples of QoS-aware packet scheduling algorithms employing these schemes are presented. To balance scheduling performance against computational complexity while remaining tolerant of the time-varying wireless channel, a novel scheduling scheme and a packet scheduling algorithm are proposed; simulation results show that the proposed algorithm achieves reasonable overall system performance. Packet scheduling is then studied under practical channel conditions with imperfect Channel Quality Information (CQI). To alleviate the performance degradation caused by multiple simultaneous channel imperfections, a packet scheduling algorithm based on channel prediction and the proposed scheduling scheme is developed for Guaranteed Bit Rate (GBR) services in the downlink LTE system. Simulation results show that the Kalman filter based channel predictor can effectively recover the correct CQI from erroneous channel quality feedback, and thereby significantly improve system performance.
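
The thesis's Kalman filter based channel predictor is not specified in this abstract. As a rough illustration of the underlying idea only, the sketch below applies a scalar Kalman filter to noisy CQI feedback; the random-walk state model, the noise variances, and all names are assumptions made for this example, not the design proposed in the thesis.

```cpp
// Illustrative only: a scalar Kalman filter that smooths noisy CQI reports.
// The random-walk state model and the noise variances are assumptions for
// this sketch, not the predictor actually designed in the thesis.
#include <cstdio>

struct CqiKalmanFilter {
    double x = 7.0;   // state estimate: current CQI level (LTE CQI is 0..15)
    double p = 4.0;   // estimate variance
    double q = 0.5;   // process noise: how fast the channel is assumed to vary
    double r = 2.0;   // measurement noise: how unreliable the feedback is

    // Predict one step ahead (random-walk model: CQI expected to stay put,
    // but uncertainty grows), then correct with the reported CQI.
    double Update(double reportedCqi) {
        p += q;                       // predict: variance grows
        double k = p / (p + r);       // Kalman gain
        x += k * (reportedCqi - x);   // correct towards the measurement
        p *= (1.0 - k);               // shrink variance after the correction
        return x;                     // filtered CQI estimate
    }
};

int main() {
    CqiKalmanFilter filter;
    // Erroneous feedback sequence: true channel near CQI 10, with outliers.
    const double reports[] = {10, 9, 15, 10, 11, 2, 10, 10};
    for (double cqi : reports)
        std::printf("reported %4.1f -> filtered %5.2f\n", cqi, filter.Update(cqi));
    return 0;
}
```

With this kind of filter, isolated erroneous reports (the 15 and the 2 above) pull the estimate only partway, which is the sense in which prediction can recover a usable CQI from corrupted feedback.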

    A Novel Packet Scheduling Scheme for Downlink LTE System

    Get PDF
    Long Term Evolution (LTE) is the next-generation wireless system and uses Orthogonal Frequency Division Multiple Access (OFDMA) in the downlink, yet relatively little research has addressed LTE downlink scheduling. To date, the usual goal of an LTE scheduler has been to maximize system performance, but under limited bandwidth resources this can cause delay or even starvation for lower-priority connections. We therefore design an LTE downlink scheduling scheme and resource allocation strategy that not only achieve high system performance but also avoid the latency and starvation problems. (International conference, 14-16 October 2011, Dalian, China.)
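
The abstract does not give the authors' scheduling metric. As one generic way to trade throughput against starvation (a hedged illustration, not the scheme proposed in the paper), the sketch below scores each user with a proportional-fair term scaled by how close its head-of-line delay is to its delay budget, so long-waiting, low-priority connections eventually win resource blocks.

```cpp
// Illustration only: a generic throughput-vs-starvation trade-off metric,
// not the scheme proposed in the paper. Each user is scored per resource
// block by a proportional-fair term scaled by a delay-urgency factor.
#include <cstdio>
#include <vector>
#include <algorithm>

struct User {
    const char* name;
    double instantRate;   // achievable rate on this resource block (Mbps)
    double avgRate;       // long-term average served rate (Mbps)
    double holDelayMs;    // head-of-line packet delay (ms)
    double budgetMs;      // delay budget of the bearer (ms)
};

// Higher score wins the resource block. The urgency factor grows as a packet
// approaches its delay budget, which prevents indefinite starvation.
double Score(const User& u) {
    double pf = u.instantRate / std::max(u.avgRate, 1e-3);
    double urgency = 1.0 + u.holDelayMs / u.budgetMs;
    return pf * urgency;
}

int main() {
    std::vector<User> users = {
        {"video (high rate)", 40.0, 20.0,  5.0, 150.0},
        {"VoIP (starving)",    1.0,  0.2, 90.0, 100.0},
        {"web (fresh)",       10.0,  5.0,  2.0, 300.0},
    };
    const User* best = &users[0];
    for (const User& u : users) {
        std::printf("%-18s score %.2f\n", u.name, Score(u));
        if (Score(u) > Score(*best)) best = &u;
    }
    std::printf("resource block goes to: %s\n", best->name);
    return 0;
}
```

In this toy run the nearly starved VoIP connection outscores the high-rate video user, which is the behaviour a starvation-avoiding allocator needs.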

    Prediction-Based Energy Saving Mechanism in 3GPP NB-IoT Networks

    Get PDF
    The current expansion of the Internet of Things (IoT) demands improved communication platforms that cover a wide area with low energy consumption. The 3rd Generation Partnership Project (3GPP) introduced narrowband IoT (NB-IoT) as an IoT communication solution. NB-IoT devices should operate for over 10 years without requiring a battery replacement, so low energy consumption is essential for the successful deployment of this technology. Given that the power amplifier consumes a large amount of energy during radio transmission, reducing the uplink transmission time is key to ensuring a long lifespan for an IoT device. In this paper, we propose a prediction-based energy saving mechanism (PBESM) focused on enhanced uplink transmission. The mechanism consists of two parts: first, a network architecture that predicts uplink packet occurrence through deep packet inspection; second, an algorithm that predicts the processing delay and pre-assigns radio resources to enhance the scheduling request procedure. In this way, our mechanism reduces the number of random accesses and the energy consumed by radio transmission. Simulation results show that the energy consumption of the proposed PBESM is up to 34% lower than that of the conventional NB-IoT method.
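
The prediction algorithm itself is not detailed in the abstract. As a hedged sketch of the general idea (pre-assigning uplink resources when the next uplink packet is predicted, so the device can skip the random access and scheduling request exchange), the example below uses a simple smoothed inter-arrival estimate; the estimator, thresholds, and names are assumptions for illustration, not the PBESM design.

```cpp
// Sketch of the general idea behind prediction-based pre-allocation:
// estimate when a device's next uplink packet will appear and, if the
// prediction falls inside the next scheduling window, grant resources in
// advance so the device skips random access. The estimator, window, and
// names here are illustrative assumptions, not PBESM.
#include <cstdio>
#include <cmath>

struct UplinkPredictor {
    double avgIntervalMs = 0.0;   // smoothed inter-arrival time
    double lastArrivalMs = -1.0;  // time of previous uplink packet
    double alpha = 0.2;           // smoothing factor for the estimate

    // Called whenever deep packet inspection sees an uplink packet.
    void Observe(double nowMs) {
        if (lastArrivalMs >= 0.0) {
            double interval = nowMs - lastArrivalMs;
            avgIntervalMs = (avgIntervalMs == 0.0)
                ? interval
                : (1.0 - alpha) * avgIntervalMs + alpha * interval;
        }
        lastArrivalMs = nowMs;
    }

    // Pre-grant if the predicted next arrival is within the given window,
    // so no scheduling request (and no random access) is needed.
    bool ShouldPreGrant(double nowMs, double windowMs) const {
        if (avgIntervalMs <= 0.0) return false;
        double predicted = lastArrivalMs + avgIntervalMs;
        return std::fabs(predicted - nowMs) <= windowMs;
    }
};

int main() {
    UplinkPredictor p;
    const double arrivals[] = {0.0, 1000.0, 2000.0, 3000.0};  // periodic reports
    for (double t : arrivals) p.Observe(t);
    std::printf("pre-grant at t=4000ms? %s\n",
                p.ShouldPreGrant(4000.0, 50.0) ? "yes" : "no");
    return 0;
}
```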

    End-to-End Simulation of 5G mmWave Networks

    Full text link
    Due to its potential for multi-gigabit and low-latency wireless links, millimeter wave (mmWave) technology is expected to play a central role in 5th generation cellular systems. While there has been considerable progress in understanding the mmWave physical layer, innovations will be required at all layers of the protocol stack, in both the access and the core network. Discrete-event network simulation is essential for end-to-end, cross-layer research and development. This paper provides a tutorial on a recently developed full-stack mmWave module integrated into the widely used open-source ns-3 simulator. The module includes a number of detailed statistical channel models as well as the ability to incorporate real measurements or ray-tracing data. The Physical (PHY) and Medium Access Control (MAC) layers are modular and highly customizable, making it easy to integrate algorithms or compare Orthogonal Frequency Division Multiplexing (OFDM) numerologies, for example. The module is interfaced with the core network of the ns-3 Long Term Evolution (LTE) module for full-stack simulations of end-to-end connectivity, and advanced architectural features, such as dual connectivity, are also available. To facilitate the understanding of the module, and to verify its correct functioning, we provide several examples that show the performance of the custom mmWave stack as well as custom congestion control algorithms designed specifically for efficient utilization of the mmWave channel. Comment: 25 pages, 16 figures, submitted to IEEE Communications Surveys and Tutorials (revised Jan. 2018).
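
For orientation, a minimal end-to-end script built on this module generally follows the pattern of the example programs shipped with it: a helper installs eNB and UE devices, the UE attaches, and the simulator runs. The sketch below assumes that pattern; class names, namespaces, and headers differ between module versions, so every identifier here should be treated as an assumption to be checked against the installed module rather than as the module's definitive API.

```cpp
// Rough sketch of a minimal ns3-mmwave script, following the pattern of the
// module's own examples. Header paths, namespaces (older versions place
// MmWaveHelper directly in ns3::), and default attributes vary between
// module versions; verify each identifier against the installed module.
#include "ns3/core-module.h"
#include "ns3/network-module.h"
#include "ns3/mobility-module.h"
#include "ns3/mmwave-helper.h"

using namespace ns3;

int main(int argc, char* argv[]) {
    CommandLine cmd;
    cmd.Parse(argc, argv);

    // One mmWave base station and one UE.
    NodeContainer enbNodes, ueNodes;
    enbNodes.Create(1);
    ueNodes.Create(1);

    // Static positions; ns-3 requires a mobility model on every node.
    MobilityHelper mobility;
    mobility.SetMobilityModel("ns3::ConstantPositionMobilityModel");
    mobility.Install(enbNodes);
    mobility.Install(ueNodes);

    // The helper wires up the full mmWave PHY/MAC stack described in the paper.
    Ptr<mmwave::MmWaveHelper> mmwaveHelper = CreateObject<mmwave::MmWaveHelper>();
    NetDeviceContainer enbDevs = mmwaveHelper->InstallEnbDevice(enbNodes);
    NetDeviceContainer ueDevs = mmwaveHelper->InstallUeDevice(ueNodes);
    mmwaveHelper->AttachToClosestEnb(ueDevs, enbDevs);

    Simulator::Stop(Seconds(1.0));
    Simulator::Run();
    Simulator::Destroy();
    return 0;
}
```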

    Frame Structure Design and Analysis for Millimeter Wave Cellular Systems

    Full text link
    The millimeter-wave (mmWave) frequencies have attracted considerable attention for fifth generation (5G) cellular communication, as they offer orders of magnitude more bandwidth than current cellular systems. However, the medium access control (MAC) layer may need to be significantly redesigned to support the highly directional transmissions, ultra-low latencies and high peak rates expected in mmWave communication. To address these challenges, we present a novel mmWave MAC layer frame structure with a number of enhancements, including flexible, highly granular transmission times, dynamic control signal locations, extended messaging and the ability to efficiently multiplex directional control signals. Analytic formulae are derived for the utilization and control overhead as a function of control periodicity, number of users, traffic statistics, signal-to-noise ratio and antenna gains. Importantly, the analysis can incorporate various front-end MIMO capability assumptions -- a critical feature of mmWave. Under realistic system and traffic assumptions, the analysis reveals that the proposed flexible frame structure design offers significant benefits over designs with fixed frame structures similar to current 4G Long Term Evolution (LTE). It is also shown that fully digital beamforming architectures offer significantly lower overhead than analog and hybrid beamforming under equivalent power budgets. Comment: Submitted to IEEE Transactions on Wireless Communications.
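
The analytic formulae themselves are not reproduced in the abstract. As a back-of-envelope illustration of how control overhead can scale with the quantities listed (an assumption-laden sketch, not the paper's derivation), suppose each of $N_u$ scheduled users needs one control transmission of duration $T_{\mathrm{ctrl}}$ per control period $T_{\mathrm{per}}$, and an analog front end must repeat it over $N_{\mathrm{dir}}$ beam directions (with $N_{\mathrm{dir}} = 1$ for fully digital beamforming):

```latex
% Illustrative scaling only; not the formulae derived in the paper.
% N_u users, control duration T_ctrl each, control period T_per,
% N_dir beam directions swept by an analog front end (N_dir = 1 if digital).
\[
  \text{overhead} \;\approx\; \frac{N_{\mathrm{dir}}\, N_u\, T_{\mathrm{ctrl}}}{T_{\mathrm{per}}},
  \qquad
  \text{utilization} \;\approx\; 1 - \frac{N_{\mathrm{dir}}\, N_u\, T_{\mathrm{ctrl}}}{T_{\mathrm{per}}}.
\]
```

In this toy model the analog penalty enters only through $N_{\mathrm{dir}}$, which is consistent with the abstract's conclusion that fully digital beamforming yields lower control overhead than analog or hybrid architectures.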