
    Scheduling Policies in Time and Frequency Domains for LTE Downlink Channel: A Performance Comparison

    A key feature of the Long-Term Evolution (LTE) system is that the packet scheduler can make use of the channel quality information (CQI), which is periodically reported by user equipment either in an aggregate form for the whole downlink channel or separately for each available subchannel. This mechanism allows wide discretion in resource allocation, thus promoting the flourishing of several scheduling algorithms with different purposes. It is therefore of great interest to compare the performance of such algorithms under different scenarios. Here, we carry out a thorough performance analysis of different scheduling algorithms for saturated User Datagram Protocol (UDP) and Transmission Control Protocol (TCP) traffic sources, considering both the time- and frequency-domain versions of the schedulers, over both flat and frequency-selective channels. The analysis makes it possible to appreciate the differences among the scheduling algorithms and to assess the performance gain, in terms of cell capacity, users' fairness, and packet service time, obtained by exploiting the richer, but heavier, information carried by subchannel CQI. An important part of this analysis is a throughput guarantee scheduler, which we propose in this paper. The analysis reveals that the proposed scheduler provides a good tradeoff between cell capacity and fairness for both TCP and UDP traffic sources.
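
    As an illustration of how per-subchannel CQI can drive frequency-domain scheduling, the sketch below implements a generic proportional fair allocator (not the throughput guarantee scheduler proposed in the paper); the CQI-to-rate table, function names, and averaging constant are illustrative assumptions.

```python
# Illustrative sketch (not the paper's throughput guarantee scheduler):
# a frequency-domain proportional fair allocator that assigns each resource
# block to the user with the highest instantaneous-rate / average-throughput
# ratio, using per-subchannel CQI.

from typing import Dict, List

# Hypothetical mapping from CQI index to achievable rate per resource block (bits/TTI).
CQI_TO_RATE = {1: 152, 4: 616, 7: 1480, 10: 2731, 13: 4523, 15: 5554}

def schedule_tti(cqi: Dict[int, List[int]],
                 avg_tput: Dict[int, float],
                 n_rbs: int,
                 beta: float = 0.05) -> Dict[int, int]:
    """Assign each resource block to one user for the current TTI.

    cqi[u][rb]  -- per-subchannel CQI reported by user u for resource block rb
    avg_tput[u] -- exponentially averaged past throughput of user u
    Returns {rb: user} and updates avg_tput in place.
    """
    allocation = {}
    served_bits = {u: 0 for u in cqi}
    for rb in range(n_rbs):
        # Proportional fair metric: instantaneous rate over average throughput.
        best_user = max(
            cqi,
            key=lambda u: CQI_TO_RATE.get(cqi[u][rb], 0) / max(avg_tput[u], 1e-6),
        )
        allocation[rb] = best_user
        served_bits[best_user] += CQI_TO_RATE.get(cqi[best_user][rb], 0)
    # Update the throughput averages (exponential moving average).
    for u in cqi:
        avg_tput[u] = (1 - beta) * avg_tput[u] + beta * served_bits[u]
    return allocation
```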

    A low complexity resource allocation algorithm for multicast service delivery in OFDMA networks

    Allocating and managing radio resources for multicast transmissions in Orthogonal Frequency-Division Multiple Access (OFDMA) systems is the challenging research issue addressed by this paper. A subgrouping technique, which divides the subscribers into subgroups according to the experienced channel quality, is considered to overcome the throughput limitations of conventional multicast data delivery schemes. A low complexity algorithm, designed to work with different resource allocation strategies, is also proposed to reduce the computational complexity of the subgroup formation problem. Simulation results, obtained by considering the OFDMA-based Long Term Evolution (LTE) system, testify to the effectiveness of the proposed solution, which achieves near-optimal performance with a limited computational load for the system.
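
    As a minimal sketch of the subgrouping idea (not the paper's low complexity algorithm), the code below splits multicast subscribers at a CQI threshold into two subgroups, each served at the rate of its worst member, and keeps the threshold that maximizes aggregate throughput under an assumed equal resource split; the linear CQI-to-rate mapping is hypothetical.

```python
from typing import List, Optional, Tuple

def cqi_rate(cqi: int) -> float:
    """Hypothetical linear CQI-to-rate mapping (bits per TTI per user)."""
    return 370.0 * cqi

def group_rate(cqis: List[int]) -> float:
    # A multicast subgroup is limited by its worst-channel member.
    return len(cqis) * cqi_rate(min(cqis)) if cqis else 0.0

def best_two_subgroups(user_cqi: List[int]) -> Tuple[Optional[int], float]:
    """Return (threshold, aggregate_rate) for the best two-subgroup split."""
    best_threshold: Optional[int] = None
    best_rate = group_rate(user_cqi)  # baseline: a single multicast group
    for threshold in sorted(set(user_cqi)):
        low = [c for c in user_cqi if c < threshold]
        high = [c for c in user_cqi if c >= threshold]
        if not low or not high:
            continue
        # Each subgroup gets half of the radio resources (simplifying assumption).
        rate = 0.5 * group_rate(low) + 0.5 * group_rate(high)
        if rate > best_rate:
            best_threshold, best_rate = threshold, rate
    return best_threshold, best_rate

# Example: a few cell-edge users would otherwise drag the whole group down.
print(best_two_subgroups([2, 2, 9, 10, 11, 12, 13]))
```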

    Interference model and antenna parameters setting effects on 4G-LTE networks coverage

    The currently emerging Long Term Evolution 4G-LTE cellular networks are based on a new transmission technique called Orthogonal Frequency Division Multiple Access (OFDMA). This paper shows the interest of a robust approach, given the uncertainty of the traffic distribution. First, we develop and validate an interference model based on the SINR metric for the deployment of the LTE network, and then we use greedy algorithms to show how frequency and tilt parameter settings can impact the coverage performance metric. Two frequency schemes have been compared to validate our model: the frequency reuse 1 scheme, whereby the whole available bandwidth is used in each cell/sector, and the frequency reuse 3 scheme, in which the entire bandwidth is divided into 3 non-overlapping groups assigned to the 3 co-site sectors within each cell.
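
    A simplified stand-in for the kind of SINR-based interference model described above is sketched below; the path-loss exponent, transmit power, noise level, and the way reuse 3 thins the interferer set are assumed values for illustration, not the paper's model.

```python
# Toy downlink SINR computation: received power from the serving sector over
# the sum of co-channel interferers plus thermal noise. All parameters are
# assumed values chosen only to illustrate the reuse 1 vs. reuse 3 comparison.

import math

def rx_power_dbm(tx_power_dbm: float, distance_m: float, path_loss_exp: float = 3.5) -> float:
    """Received power with a simple log-distance path loss model."""
    return tx_power_dbm - 10 * path_loss_exp * math.log10(max(distance_m, 1.0))

def sinr_db(serving_dist_m, interferer_dists_m, reuse=1,
            tx_power_dbm=46.0, noise_dbm=-104.0):
    """SINR in dB; with reuse 3, only one third of the interferers are
    treated as co-channel (a crude approximation)."""
    signal_mw = 10 ** (rx_power_dbm(tx_power_dbm, serving_dist_m) / 10)
    co_channel = interferer_dists_m if reuse == 1 else interferer_dists_m[::3]
    interference_mw = sum(10 ** (rx_power_dbm(tx_power_dbm, d) / 10) for d in co_channel)
    noise_mw = 10 ** (noise_dbm / 10)
    return 10 * math.log10(signal_mw / (interference_mw + noise_mw))

# Example: reuse 3 trades bandwidth per cell for a higher SINR at the cell edge.
print(sinr_db(400, [700, 800, 900, 1000, 1100, 1200], reuse=1))
print(sinr_db(400, [700, 800, 900, 1000, 1100, 1200], reuse=3))
```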

    Enabling RAN Slicing Through Carrier Aggregation in mmWave Cellular Networks

    The ever-increasing number of connected devices and of new and heterogeneous mobile use cases implies that 5G cellular systems will face demanding technical challenges. For example, Ultra-Reliable Low-Latency Communication (URLLC) and enhanced Mobile Broadband (eMBB) scenarios present orthogonal Quality of Service (QoS) requirements that 5G aims to satisfy with a unified Radio Access Network (RAN) design. Network slicing and mmWave communications have been identified as possible enablers for 5G. They provide, respectively, the scalability and flexibility needed to adapt the network to each specific use case, and low-latency, multi-gigabit-per-second wireless links that tap into a vast, currently unused portion of the spectrum. The optimization and integration of these technologies is still an open research challenge, which requires innovations at different layers of the protocol stack. This paper proposes to combine them in a RAN slicing framework for mmWaves, based on carrier aggregation. Notably, we introduce MilliSlice, a cross-carrier scheduling policy that exploits the diversity of the carriers and maximizes their utilization, thus simultaneously guaranteeing high throughput for the eMBB slices and low latency and high reliability for the URLLC flows.
    Comment: 8 pages, 8 figures. Proc. of the 18th Mediterranean Communication and Computer Networking Conference (MedComNet 2020), Arona, Italy, 2020
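
    For illustration only, a toy cross-carrier dispatcher is sketched below (it is not the MilliSlice policy itself): URLLC packets are steered to the aggregated carrier with the shortest queue to bound latency, while eMBB packets go to the carrier currently offering the best spectral efficiency; the Carrier attributes and values are assumptions.

```python
# Toy per-slice dispatcher over aggregated mmWave carriers. Not the MilliSlice
# policy: only a sketch of steering URLLC and eMBB traffic differently.

from dataclasses import dataclass, field
from collections import deque
from typing import List

@dataclass
class Carrier:
    name: str
    spectral_efficiency: float          # bits/s/Hz currently achievable (assumed)
    queue: deque = field(default_factory=deque)

def dispatch(packet_slice: str, packet: bytes, carriers: List[Carrier]) -> Carrier:
    """Pick a carrier per slice type and enqueue the packet on it."""
    if packet_slice == "urllc":
        # Latency-sensitive: shortest queue first to bound waiting time.
        chosen = min(carriers, key=lambda c: len(c.queue))
    else:
        # Throughput-oriented eMBB: best current spectral efficiency.
        chosen = max(carriers, key=lambda c: c.spectral_efficiency)
    chosen.queue.append(packet)
    return chosen

carriers = [Carrier("cc0_28GHz", 4.2), Carrier("cc1_39GHz", 2.8)]
print(dispatch("urllc", b"sensor", carriers).name)
print(dispatch("embb", b"video", carriers).name)
```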