
    Performance Analysis of PCFICH and PDCCH LTE Control Channels

    Control channels play a key role in overall mobile system performance. The purpose of our paper is to evaluate the performance of the control channel implementation in the Long Term Evolution (LTE) system. The paper deals with the simulation of the complete signal processing chain for the Physical Control Format Indicator Channel (PCFICH) and the Physical Downlink Control Channel (PDCCH) in the LTE system, Release 8. We implemented the complete signal processing chain for the downlink control channels as an extension of an existing MATLAB LTE downlink simulator. The paper presents results of a simulation-based performance analysis of the PCFICH and PDCCH control channels under various channel conditions; the results can be compared with the performance of the data channels.

    Performance of Well Known Packet Scheduling Algorithms in the Downlink 3GPP LTE System

    This paper investigates the performance of well-known packet scheduling algorithms, developed for single-carrier wireless systems, from a real-time video streaming perspective. The performance evaluation is conducted using the downlink third generation partnership project long term evolution (3GPP LTE) system as the simulation platform. This paper contributes to the identification of a suitable packet scheduling algorithm for use in the downlink 3GPP LTE system supporting video streaming services. Simulation results show that, in this setting, the maximum-largest weighted delay first (M-LWDF) algorithm outperforms the other packet scheduling algorithms by providing higher system throughput, supporting a larger number of users and guaranteeing fairness at a satisfactory level.
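For reference, the M-LWDF rule named in the abstract above prioritizes each user by the product of a QoS weight, the head-of-line packet delay, and the proportional-fair rate ratio. The sketch below illustrates this standard form of the metric; all numeric values (delays, rates, QoS targets) are illustrative placeholders, not taken from the paper:

```python
import math

def mlwdf_priority(hol_delay_s, inst_rate, avg_rate,
                   delay_target_s, max_loss_prob):
    """M-LWDF priority of one user for one resource block.

    Combines the head-of-line (HoL) packet delay with the channel-aware
    proportional-fair ratio (instantaneous rate / average served rate).
    """
    # a_i = -log(delta_i) / tau_i weights users by QoS strictness:
    # tighter delay targets or lower allowed loss raise the weight.
    a = -math.log(max_loss_prob) / delay_target_s
    return a * hol_delay_s * (inst_rate / avg_rate)

# Illustrative case: two video users with identical channel quality but
# different queueing delays -- the more delayed user wins the resource block.
p1 = mlwdf_priority(hol_delay_s=0.02, inst_rate=1e6, avg_rate=8e5,
                    delay_target_s=0.1, max_loss_prob=0.05)
p2 = mlwdf_priority(hol_delay_s=0.08, inst_rate=1e6, avg_rate=8e5,
                    delay_target_s=0.1, max_loss_prob=0.05)
scheduled = "user2" if p2 > p1 else "user1"
```

Because the delay term enters linearly, users approaching their delay budget quickly dominate the metric, which is why M-LWDF tends to support more video users at a given QoS level.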

    The Study and Analysis of Effect of Multi-Antenna Techniques on LTE network with Different Bandwidth Configurations in the Downlink

    The Long Term Evolution (LTE) system adopts advanced Multiple Input Multiple Output (MIMO) antenna techniques on both uplink and downlink to achieve high peak data rates and higher system throughput. This enables LTE to support multimedia applications beyond web browsing and voice, which demand higher bandwidth configurations. LTE employs Orthogonal Frequency Division Multiple Access (OFDMA) in the downlink to support spectrum flexibility, using up to 20 MHz of system bandwidth to improve throughput and robustness. The combined study of multi-antenna techniques and spectrum flexibility on the performance of the LTE system therefore becomes vital. Hence, in this paper, an attempt has been made to evaluate the performance of different multi-antenna techniques with various system bandwidth configurations from 1.4 MHz to 20 MHz using the QualNet 5.2 network simulator. The multi-antenna techniques considered for performance evaluation are Single Input Single Output (SISO), Multiple Input Single Output (MISO) and Multiple Input Multiple Output (MIMO). The performance metrics considered for the simulation study are aggregate bytes received, average throughput, average delay and average jitter.

    Performance analysis of packet scheduling algorithms for long term evolution (LTE)

    University of Technology, Sydney. Faculty of Engineering and Information Technology. The third generation partnership project long term evolution (3GPP LTE) system is proposed as a new radio access technology in order to support high-speed data and multimedia traffic. The 3GPP LTE system has a flat radio access network architecture consisting of only one node, known as the eNodeB, between the user and the core network. All radio resource management (RRM) functions are performed at the eNodeB. As one of the essential RRM functions, packet scheduling is responsible for the intelligent allocation of radio resources to active users. Since wireless systems carry a diversity of traffic types, active users may have different Quality of Service (QoS) requirements. In order to satisfy various QoS requirements and efficiently utilize the radio resources, a packet scheduler adopts a specific packet scheduling algorithm when making decisions. Several packet scheduling algorithms have been proposed in the literature. The objective of this thesis is to evaluate the performance of well-known and some recently proposed packet scheduling algorithms and to identify the suitability of these algorithms for the downlink LTE system. The performance evaluation of the packet scheduling algorithms, based on both computer simulation and theoretical analysis, is provided in this thesis. The performance is evaluated in three scenarios — 100% real-time (RT) traffic, 100% non-real-time (NRT) traffic, and 50% RT with 50% NRT traffic — under the downlink LTE simulation environment. The simulation results for well-known packet scheduling algorithms show that Maximum-Largest Weighted Delay First (M-LWDF) outperforms the other algorithms in the 100% RT scenario, while Exponential/Proportional Fair (EXP/PF) is comparatively more suitable in the 50% RT and 50% NRT scenario.
In the 100% NRT scenario, Proportional Fair (PF) and Maximum Rate (Max-Rate) achieve good throughput and resource block (RB) utilization, while Round Robin (RR) has the best fairness performance. Additionally, two recently proposed algorithms are evaluated as packet scheduling candidates. The simulation results show that Sun Qiaoyun's algorithm is more appropriate than Jeongsik Park's algorithm for the downlink LTE system supporting real-time traffic. A mathematical model for the performance evaluation of packet scheduling algorithms in the downlink LTE system is also discussed in this thesis. The theoretical delay analysis for the OFDMA system and the theoretical throughput analysis of the PF algorithm are studied and validated in detail. The thesis then extends the theoretical performance analysis to M-LWDF and obtains an analytical result for the expected throughput of M-LWDF.
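The Proportional Fair rule that performs well in the NRT scenario above schedules the user with the largest ratio of instantaneous achievable rate to average served rate, updating the average with an exponential moving window. A minimal sketch of this standard formulation (the rates and window length below are illustrative, not from the thesis):

```python
def pf_schedule(inst_rates, avg_rates):
    """Pick the user index maximizing r_i / R_i (Proportional Fair)."""
    return max(range(len(inst_rates)),
               key=lambda i: inst_rates[i] / avg_rates[i])

def update_avg(avg_rates, scheduled, inst_rates, tc=100.0):
    """Exponential moving average of served rate over a window of tc TTIs.

    The scheduled user's average rises toward its instantaneous rate;
    every other user's average decays, raising their future priority.
    """
    return [(1 - 1 / tc) * r
            + (1 / tc) * (inst_rates[i] if i == scheduled else 0.0)
            for i, r in enumerate(avg_rates)]

inst = [2.0e6, 1.0e6, 1.5e6]   # illustrative instantaneous rates (bit/s)
avg = [1.0e6, 1.0e6, 1.0e6]    # equal history: the best channel wins
chosen = pf_schedule(inst, avg)
avg = update_avg(avg, chosen, inst)
```

The decaying average is what produces PF's long-run fairness: a user starved for many TTIs sees its denominator shrink until it wins a resource block even on a mediocre channel.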

    Link and system level simulation of downlink LTE

    3GPP LTE is the evolution of UMTS, which will make it possible to deliver next generation high quality multimedia services in line with users’ expectations. Since Radio Resource Management (RRM) has been recognized as a key point in successfully accomplishing this target, the performance evaluation of a multi-cell resource allocation scheme applied to the LTE downlink (DL) is presented in this paper. A semi-distributed and a fully-distributed RRM framework are compared on the basis of the obtained system throughput. Detailed link level simulations have also been carried out to properly back up the system level results.

    Packet Scheduling Study for Heterogeneous Traffic in Downlink 3GPP LTE System

    Long Term Evolution (LTE) networks deploy Orthogonal Frequency Division Multiple Access (OFDMA) technology for downlink multi-carrier transmission. To meet the Quality of Service (QoS) requirements of LTE networks, packet scheduling is employed. Packet scheduling determines when and how users’ packets are transmitted to the receiver; the effective design of the packet scheduling algorithm is therefore an important topic. The aims of packet scheduling are maximizing system throughput, guaranteeing fairness among users, and minimizing Packet Loss Ratio (PLR) and/or packet delay. In this paper, the performance of two packet scheduling algorithms, namely Log Maximum-Largest Weighted Delay First (LOG-MLWDF) and Max Delay Unit (MDU), developed for OFDM (Orthogonal Frequency Division Multiplexing) networks, is investigated in LTE downlink networks, and a comparison of those algorithms with the well-known Maximum-Largest Weighted Delay First (MLWDF) scheduling algorithm is presented. The performance evaluation is in terms of system throughput, PLR and fairness index, covering both real-time (voice and video streaming) and non-real-time (best effort) traffic. Results show that for streaming flows, LOG-MLWDF achieves the best PLR performance among the considered scheduling schemes, and for best effort flows, it outperforms the other two algorithms in terms of packet delay and throughput.
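One common form of the LOG rule found in the scheduling literature (the exact LOG-MLWDF variant evaluated in the paper may differ) multiplies a logarithmic function of the head-of-line delay by the proportional-fair rate ratio; the sub-linear growth in delay spreads resources more smoothly across backlogged users than M-LWDF's linear delay term. A minimal sketch, with all parameter values chosen as illustrative placeholders:

```python
import math

def log_rule_priority(hol_delay_s, inst_rate, avg_rate,
                      a=5.0, b=1.0, c=1.1):
    """LOG-rule style metric: b * log(c + a*W) * (r / R_avg).

    W is the head-of-line delay, r the instantaneous achievable rate,
    and R_avg the user's average served rate; a, b, c are shape
    parameters. None of these values are taken from the paper.
    """
    return b * math.log(c + a * hol_delay_s) * (inst_rate / avg_rate)

# The logarithm grows slowly, so rising delay raises priority more
# gently than M-LWDF's linear delay term would for the same user.
low = log_rule_priority(hol_delay_s=0.01, inst_rate=1e6, avg_rate=1e6)
high = log_rule_priority(hol_delay_s=0.10, inst_rate=1e6, avg_rate=1e6)
```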

    Feasibility Study of Enabling V2X Communications by LTE-Uu Radio Interface

    Compared with legacy wireless networks, the next generation of wireless networks targets services with divergent QoS requirements, ranging from bandwidth-consuming video services to moderate and low data rate machine-type services, as well as services with strict latency requirements. One emerging service is the use of the wireless network to improve the efficiency of vehicular traffic and public safety. However, the stringent packet end-to-end (E2E) latency and ultra-low transmission failure rates pose challenging requirements on legacy networks. In other words, the next generation wireless network needs to support ultra-reliable low latency communications (URLLC), involving new key performance indicators (KPIs) beyond conventional metrics such as cell throughput. In this paper, a feasibility study on applying today's LTE network infrastructure and the LTE-Uu air interface to provide URLLC-type services is performed, where the communication takes place between two traffic participants (e.g., vehicle-to-vehicle and vehicle-to-pedestrian). To carry out this study, an evaluation methodology for cellular vehicle-to-everything (V2X) communication is proposed, with packet E2E latency and successful transmission rate as the KPIs. We then describe the simulation assumptions for the evaluation. Based on them, simulation results are presented that demonstrate the performance of the LTE network in fulfilling the new URLLC requirements. Moreover, a sensitivity analysis is conducted on how to further improve system performance, in order to enable new emerging URLLC services.
    Comment: Accepted by IEEE/CIC ICCC 201