359 research outputs found

    Performance evaluation of video streaming on LTE with coexistence of Wi-Fi signal

    Get PDF
    The continuous growth in mobile data traffic and the limited licensed wireless spectrum have dramatically increased the demand for radio spectrum, and concern about the coexistence of long term evolution (LTE) and Wi-Fi in the unlicensed band is widespread. Several techniques have been proposed to enable the coexistence of LTE and Wi-Fi in the unlicensed band, but these works target the impact of LTE on Wi-Fi network performance. In this work, an experiment is carried out to evaluate the impact of a Wi-Fi signal on video streaming in the LTE network. The experimental testbed comprises a National Instruments (NI) universal software radio peripheral (USRP) 2953R controlled by the LabVIEW Communications LTE application framework. Extensive experiments are carried out under two scenarios, i.e. (1) coexistence of the LTE and Wi-Fi signals, and (2) the LTE signal only. Performance evaluations are carried out with different modulation and coding scheme (MCS) values and different modes of operation, i.e. frequency division duplex (FDD) and time division duplex (TDD). The results show that interference from the Wi-Fi signal degrades the performance of the LTE network in terms of throughput and the power received by the user equipment (UE)
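    The kind of comparison described above can be summarised with a few lines of post-processing. The sketch below is illustrative only: the MCS indices and throughput figures are hypothetical placeholders, not measurements from the USRP/LabVIEW testbed, and it simply computes the per-MCS throughput loss between the two scenarios.

```python
# Hypothetical post-processing of coexistence measurements: the numbers below are
# placeholders, not results from the testbed described in the abstract above.

# Downlink throughput in Mbit/s, indexed by MCS, for the two scenarios.
lte_only = {5: 4.2, 10: 8.9, 15: 14.1, 20: 19.6}
lte_with_wifi = {5: 3.1, 10: 6.2, 15: 9.0, 20: 11.8}

for mcs in sorted(lte_only):
    loss = 100.0 * (lte_only[mcs] - lte_with_wifi[mcs]) / lte_only[mcs]
    print(f"MCS {mcs:2d}: {lte_only[mcs]:5.1f} -> {lte_with_wifi[mcs]:5.1f} Mbit/s "
          f"({loss:.1f}% degradation under Wi-Fi interference)")
```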

    A Flexible 5G Frame Structure Design for Frequency-Division Duplex Cases

    Get PDF

    Optimize Power Allocation Scheme to Maximize Sum Rate in CoMP with Limited Channel State Information

    Get PDF
    Extensive use of mobile applications poses many challenges to cellular systems, such as cell-edge throughput, inter-cell interference and spectral efficiency. Many of these challenges have been resolved to a great extent by Coordinated Multi-Point (CoMP), developed in the Third Generation Partnership Project for LTE-Advanced. CoMP cooperatively processes, at transmission and reception, signals from base stations (BSs) connected to multiple terminals (user equipments, UEs). CoMP improves throughput, reduces or even removes inter-cell interference and increases spectral efficiency in the downlink of multi-antenna coordinated multipoint systems. Many researchers have addressed these issues assuming that the BSs know the common control channels dedicated to all UEs as well as the full or partial channel state information (CSI) of all the links. From the CSI available at the BSs, multi-user interference can be managed at the BSs. To make this feasible, UEs are responsible for collecting downlink CSI. However, CSI measurement (instantaneous and/or statistical) is imperfect in nature because the channels vary randomly at random times, and the resulting incorrect CSI values at the BSs may, in turn, create multi-user interference. There are many techniques to suppress multi-user interference, among which feedback schemes are gaining a lot of attention. In feedback schemes, CSI needs to be fed back from the UEs to the base station in the uplink, so the question naturally arises as to the type and amount of feedback to be used. Research has been progressing on this front and several feedback techniques have been proposed. Three basic CoMP feedback schemes are available: (i) explicit or statistical channel information feedback, in which channel information such as the channel's covariance matrix is shared between the transmitter and receiver; (ii) implicit channel information feedback, which contains information such as the channel quality indicator, precoding matrix indicator or rank indicator; and (iii) UE transmission of a sounding reference signal, which is applied to exploit channel reciprocity and to reduce inter-cell interference. The first scheme applies to a TDD LTE type of structure, the second can be applied in an FDD system, and the third can be applied in a TDD system. We have analyzed the scenario of an LTE TDD based system. Power optimization is also required, because users at the cell edge require more attention than users located at the centre of the cell. This work shows that the estimated power gives exponential diversity at high SNR as well as at low SNR. In this method, a compression feedback scheme is analyzed to provide multi-cell spatial channel information; it improves feedback efficiency and throughput. The rows and columns of the channel matrix are compressed using the eigenmode of the user and the codebook-based scheme specified in the LTE specification. The main drawback of this scheme is that the spectral efficiency is achieved at the cost of increased overheads for feedback and for the evolved NodeB (eNB). Another factor is the complexity of the eNodeB, which is to be addressed in future work
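    As a rough illustration of the eigenmode-plus-codebook compression idea mentioned above, the sketch below computes a user's dominant eigenmode from a toy MIMO channel matrix and quantizes it against a random unit-norm codebook. The channel dimensions, codebook and index name are assumptions for illustration; this is not the codebook or feedback signalling defined in the LTE specification.

```python
# Minimal sketch of eigenmode-based CSI compression, assuming a narrowband MIMO
# channel matrix H (rx_antennas x tx_antennas). Illustration of the general idea
# only, not the LTE-specified codebook or reporting format.
import numpy as np

def dominant_eigenmode(H):
    """Return the dominant right singular vector of H (the user's strongest eigenmode)."""
    _, _, Vh = np.linalg.svd(H)
    return Vh.conj().T[:, 0]               # first column of V

def quantize_to_codebook(v, codebook):
    """Pick the codebook entry with the largest |v^H c|; only its index is fed back."""
    gains = [abs(np.vdot(v, c)) for c in codebook]
    return int(np.argmax(gains))

# Toy example: 2x4 channel, random unit-norm codebook of 16 precoding vectors.
rng = np.random.default_rng(0)
H = (rng.standard_normal((2, 4)) + 1j * rng.standard_normal((2, 4))) / np.sqrt(2)
codebook = [c / np.linalg.norm(c) for c in
            (rng.standard_normal((16, 4)) + 1j * rng.standard_normal((16, 4)))]
pmi = quantize_to_codebook(dominant_eigenmode(H), codebook)
print("feedback index (PMI-like):", pmi)
```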

    Packet Scheduling Algorithms in LTE/LTE-A cellular Networks: Multi-agent Q-learning Approach

    Get PDF
    Spectrum utilization is vital for mobile operators: it ensures efficient use of spectrum bands, especially when obtaining their licenses is highly expensive. Long Term Evolution (LTE) and LTE-Advanced (LTE-A) spectrum band licenses were auctioned by the Federal Communications Commission (FCC) to mobile operators for hundreds of millions of dollars. In the first part of this dissertation, we study, analyze, and compare the QoS performance of QoS-aware/channel-aware packet scheduling algorithms using carrier aggregation (CA) over LTE and LTE-A heterogeneous cellular networks. This included a detailed study of the LTE/LTE-A cellular network and its features, and the modification of an open-source LTE simulator in order to perform these QoS performance tests. In the second part of this dissertation, we aim to address spectrum underutilization by proposing, implementing, and testing two novel multi-agent Q-learning-based packet scheduling algorithms for the LTE cellular network: the Collaborative Competitive scheduling algorithm and the Competitive Competitive scheduling algorithm. These algorithms schedule licensed users over the available radio resources and unlicensed users over spectrum holes. In conclusion, our results show that the spectrum band can be utilized by deploying efficient packet scheduling algorithms for licensed users, and can be further utilized by allowing unlicensed users to be scheduled on spectrum holes whenever they occur
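    For readers unfamiliar with the underlying technique, the sketch below shows a generic tabular Q-learning loop for choosing a resource block under a toy occupancy model. It is only a minimal illustration of Q-learning-based scheduling; it does not reproduce the Collaborative Competitive or Competitive Competitive algorithms proposed in the dissertation, and the state, reward and occupancy model are assumptions.

```python
# Generic tabular Q-learning loop for resource-block selection. This is only a
# sketch of the underlying technique, not the dissertation's proposed schedulers.
import random
from collections import defaultdict

N_RBS = 10                               # actions: which resource block to request
ALPHA, GAMMA, EPS = 0.1, 0.9, 0.1

Q = defaultdict(lambda: [0.0] * N_RBS)   # Q[state][action]

def choose_rb(state):
    """Epsilon-greedy action selection over resource blocks."""
    if random.random() < EPS:
        return random.randrange(N_RBS)
    q = Q[state]
    return q.index(max(q))

def update(state, rb, reward, next_state):
    """Standard one-step Q-learning update."""
    best_next = max(Q[next_state])
    Q[state][rb] += ALPHA * (reward + GAMMA * best_next - Q[state][rb])

# Toy interaction: reward +1 if the chosen RB is idle (a 'spectrum hole'), -1 on collision.
for t in range(1000):
    state = t % 4                        # e.g. a coarse traffic-load indicator
    rb = choose_rb(state)
    occupied = random.random() < 0.3     # placeholder channel-occupancy model
    reward = -1.0 if occupied else 1.0
    update(state, rb, reward, (t + 1) % 4)
```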

    LTE Advanced: Technology and Performance Analysis

    Get PDF
    Wireless data usage is increasing at a phenomenal rate and driving the need for continued innovations in wireless data technologies to provide more capacity and higher quality of service. In October 2009, the 3rd Generation Partnership Project (3GPP) submitted LTE-Advanced to the ITU as a proposed candidate IMT-Advanced technology, for which specifications could become available in 2011 through Release 10. The aim of “LTE-Advanced” is to further enhance LTE radio access in terms of system performance and capabilities compared to current cellular systems, including the first release of LTE, with the specific goal of ensuring that LTE fulfills and even surpasses the requirements of “IMT-Advanced” as defined by the International Telecommunication Union (ITU-R). This thesis offers an introduction to the mobile communication standard known as LTE-Advanced, depicting the evolution of the standard from its roots and discussing several important technologies that help it accomplish the IMT-Advanced requirements. A short history of the LTE standard is offered, along with a discussion of its standards and performance. The LTE-Advanced details include an analysis of the physical layer that investigates the performance of SC-FDMA and OFDMA. The investigation considers different modulation schemes (QPSK, 16QAM and 64QAM) on the basis of PAPR, BER, power spectral density (PSD) and error probability by simulating models of SC-FDMA and OFDMA. To evaluate the performance in the presence of noise, an Additive White Gaussian Noise (AWGN) channel was introduced. A set of conclusions is derived from our results describing the effect of higher-order modulation schemes on BER and error probability for both OFDMA and SC-FDMA. The power spectral densities of both multiple access techniques (OFDMA and SC-FDMA) are calculated, and the results show that OFDMA has the higher power spectral density
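    The PAPR comparison described above can be reproduced in miniature. The sketch below, under assumed parameters (QPSK, 64 data subcarriers on a 256-point FFT, localized mapping), contrasts plain OFDMA with DFT-spread OFDM (SC-FDMA-like) and typically shows the lower PAPR of the DFT-spread waveform; it is not the thesis's simulation model.

```python
# Quick PAPR comparison between plain OFDMA and DFT-spread OFDM (SC-FDMA-like).
# Parameters are illustrative assumptions, not the thesis simulation settings.
import numpy as np

rng = np.random.default_rng(1)
N_FFT, N_DATA, N_SYMBOLS = 256, 64, 2000

def papr_db(x):
    """Peak-to-average power ratio of a time-domain symbol, in dB."""
    p = np.abs(x) ** 2
    return 10 * np.log10(p.max() / p.mean())

def qpsk(n):
    """n unit-energy QPSK symbols."""
    bits = rng.integers(0, 2, size=(n, 2))
    return ((2 * bits[:, 0] - 1) + 1j * (2 * bits[:, 1] - 1)) / np.sqrt(2)

ofdma, scfdma = [], []
for _ in range(N_SYMBOLS):
    d = qpsk(N_DATA)

    grid = np.zeros(N_FFT, dtype=complex)
    grid[:N_DATA] = d                                  # OFDMA: data mapped straight onto subcarriers
    ofdma.append(papr_db(np.fft.ifft(grid)))

    grid = np.zeros(N_FFT, dtype=complex)
    grid[:N_DATA] = np.fft.fft(d) / np.sqrt(N_DATA)    # SC-FDMA: DFT-precode before mapping
    scfdma.append(papr_db(np.fft.ifft(grid)))

print(f"mean PAPR  OFDMA: {np.mean(ofdma):.2f} dB   SC-FDMA: {np.mean(scfdma):.2f} dB")
```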

    DESIGN AND DEVELOPMENT OF CARRIER ASSIGNMENT AND PACKET SCHEDULING IN LTE-A AND Wi-Fi

    Get PDF
    The highly competitive environment in today's wireless and cellular network industries is driving system management to seek better and more advanced techniques to keep masses of data, system complexity and deadline constraints under control at lower cost and higher efficiency. Management is therefore receiving significant attention from researchers aiming to increase the efficiency of resource usage and provide high-quality services. Two of the cornerstones of the management system in wireless and cellular networks are carrier assignment and packet scheduling. Therefore, this work focuses on the analysis and development of carrier assignment and packet scheduling methods in multi-band Wi-Fi and LTE-A networks. First, several existing carrier assignment methods, developed with different strategies for LTE and LTE-A, are analyzed. Secondly, a new technique for carrier assignment in LTE and LTE-A is developed to improve the efficiency of carrier assignment methods. Thirdly, a novel carrier assignment method is proposed that considers the behavior of mobile users in LTE and LTE-A. Then, a novel architecture with a packet scheduling scheme is proposed for next-generation mobile routers in a multi-band Wi-Fi environment, similar to LTE-A. Finally, the scheme is improved based on energy awareness. Results show that the developed methods improve system performance in comparison to existing methods. The proposed methods and related analysis should help network engineers and service providers build next-generation carrier assignment and packet scheduling methods to satisfy users in LTE, LTE-A and Wi-Fi

    Towards Massive Machine Type Communications in Ultra-Dense Cellular IoT Networks: Current Issues and Machine Learning-Assisted Solutions

    Get PDF
    The ever-increasing number of resource-constrained Machine-Type Communication (MTC) devices is leading to the critical challenge of fulfilling diverse communication requirements in dynamic and ultra-dense wireless environments. Among the different application scenarios that the upcoming 5G and beyond cellular networks are expected to support, such as eMBB, mMTC and URLLC, mMTC brings the unique technical challenge of supporting a huge number of MTC devices, which is the main focus of this paper. The related challenges include QoS provisioning, handling highly dynamic and sporadic MTC traffic, huge signalling overhead and Radio Access Network (RAN) congestion. In this regard, this paper aims to identify and analyze the involved technical issues, to review recent advances, to highlight potential solutions and to propose new research directions. First, starting with an overview of mMTC features and QoS provisioning issues, we present the key enablers for mMTC in cellular networks. Along with highlighting the inefficiency of the legacy Random Access (RA) procedure in the mMTC scenario, we then present the key features and channel access mechanisms of the emerging cellular IoT standards, namely LTE-M and NB-IoT. Subsequently, we present a framework for the performance analysis of transmission scheduling with QoS support, along with the issues involved in short data packet transmission. Next, we provide a detailed overview of existing and emerging solutions for addressing the RAN congestion problem, and then identify potential advantages, challenges and use cases for the application of emerging Machine Learning (ML) techniques in ultra-dense cellular networks. Out of several ML techniques, we focus on the application of the low-complexity Q-learning approach in mMTC scenarios. Finally, we discuss some open research challenges and promising future research directions
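    As a back-of-the-envelope illustration of why the legacy RA procedure congests under massive MTC, the sketch below estimates the preamble collision probability when N devices contend for M orthogonal preambles in one RA slot. The device counts are placeholders and M = 54 is only the typical number of contention-based preambles in an LTE cell; the figures are not taken from the paper.

```python
# Rough illustration of RAN congestion under massive MTC: probability that a given
# device's preamble collides when n_devices contend for M preambles in one RA slot.
# M = 54 is a typical (assumed) number of contention-based preambles per LTE cell.
M = 54

for n_devices in (10, 100, 1000, 10000):
    # Probability that at least one of the other n_devices - 1 devices picks
    # the same preamble as the device under consideration.
    p_collision = 1.0 - (1.0 - 1.0 / M) ** (n_devices - 1)
    print(f"{n_devices:6d} contending devices -> collision prob. ~ {p_collision:.3f}")
```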

    Packet scheduling algorithms in LTE systems

    Full text link
    University of Technology Sydney, Faculty of Engineering and Information Technology. There has been a huge increase in demand for improving the Quality of Service (QoS) of wireless services. Long Term Evolution (LTE) is a development of the Third-Generation Partnership Project (3GPP) with the aim of meeting the needs of the International Telecommunication Union (ITU). Some of its aspects are highlighted as follows: increased data rates, scalable bandwidth, reduced latency and increased coverage and capacity, which result in better quality of service in communication. LTE employs Orthogonal Frequency Division Multiple Access (OFDMA) to simultaneously deliver multimedia services at high speed, and uses packet switching to support different media services. To meet the QoS requirements of LTE networks, packet scheduling is employed. Packet scheduling decides when and how different packets are delivered to the receiver, and is responsible for smart selection of user packets so that radio resources are allocated appropriately. Therefore, packet scheduling should be cleverly designed to achieve QoS similar to fixed-line services. The eNodeB is the node in an LTE network responsible for radio resource management, which includes packet scheduling. There are two main categories of application in multimedia services: RT (Real Time) and NRT (Non-Real Time) services. RT services are either delay sensitive (e.g. voice over IP), loss sensitive (e.g. buffered video) or both delay and loss sensitive, for example video conferencing. Best-effort users are an example of NRT services that do not have exact requisites and are allocated spare resources. Reaching higher throughput has sometimes resulted in unfair allocation to users who are located far from the base station or who suffer from bad channel conditions; therefore, a sufficient trade-off between throughput and fairness is essential. The scarce bandwidth, fading radio channels and the QoS requirements of the users make resource allocation a demanding issue. Different scheduling approaches have been suggested for different service demands and are described briefly throughout the thesis. Initially, a comprehensive literature review of existing work on packet scheduling is carried out in this thesis to establish the characteristics of packet scheduling and resource allocation for the wireless network. Many packet scheduling algorithms have been developed to provide satisfactory QoS for multimedia services in downlink LTE systems; several of them, including time-domain and frequency-domain algorithms, are considered in this thesis and their approaches are investigated. The next objective of this thesis is to improve the performance of packet scheduling in LTE downlink systems, and a new packet scheduling algorithm is introduced. A study of VoLTE (Voice over LTE), video streaming and best-effort traffic under three different scheduling algorithms has been conducted, using heterogeneous traffic based on precise modelling of packets in the simulation. The main resource allocation and assignment technique used in this work, namely the Dynamic Subcarrier Allocation scheme, is shown to provide a solution to the cross-layer optimisation problem. It depends on Channel Quality Information (CQI) and has been broadly investigated for single-carrier and multicarrier wireless networks. The problem is based on the maximisation of average utility functions, where the different scheduling algorithms in this method are treated as utility functions. Throughput, fairness and Packet Loss Ratio (PLR) are considered as the criteria for examining the performance of the algorithms. Simulation results show that the proposed algorithm significantly increases the performance of streaming and best-effort users in terms of PLR and throughput. Fairness is also improved, with less computational complexity compared to the previous algorithms introduced in this thesis
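    The utility-function view described above can be sketched in a few lines: each resource block goes to the user with the largest marginal utility weighted by the CQI-implied instantaneous rate. The example below is a generic illustration under assumed rate and throughput values, not the thesis's Dynamic Subcarrier Allocation algorithm; swapping the marginal-utility function switches between max-throughput and proportional-fair behaviour.

```python
# Generic sketch of CQI-driven subcarrier allocation in which the scheduling
# algorithm is expressed as a utility function: each resource block (RB) goes to
# the user maximising U'(R_k) * r_k, with r_k the CQI-implied instantaneous rate
# and R_k the user's average throughput. Values and sizes are assumptions.
import numpy as np

def allocate(rates, avg_throughput, marginal_utility):
    """rates: (n_users, n_rbs); avg_throughput: (n_users,); returns RB -> user index."""
    allocation = np.empty(rates.shape[1], dtype=int)
    for rb in range(rates.shape[1]):
        metric = marginal_utility(avg_throughput) * rates[:, rb]
        allocation[rb] = int(np.argmax(metric))
    return allocation

max_rate = lambda R: np.ones_like(R)                       # U(R) = R   -> pure max-throughput
proportional_fair = lambda R: 1.0 / np.maximum(R, 1e-9)    # U(R) = log R

rng = np.random.default_rng(2)
rates = rng.uniform(0.1, 1.0, size=(5, 25))    # 5 users, 25 resource blocks
avg = rng.uniform(0.2, 0.8, size=5)            # past average throughput per user
print("max-rate  :", allocate(rates, avg, max_rate))
print("prop. fair:", allocate(rates, avg, proportional_fair))
```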

    Design And Analysis Of Modified-Proportional Fair Scheduler For LTE/LTE-Advanced

    Get PDF
    Nowadays, Long Term Evolution-Advanced (LTE-Advanced) is well known as a cellular network that can support very high data rates under diverse traffic conditions. Radio Resource Management (RRM), one of the key components of Orthogonal Frequency-Division Multiple Access (OFDMA), is critical to achieving the desired performance because it manages key components of both the PHY and MAC layers. One technique for achieving this is packet scheduling, the key RRM scheme for LTE traffic processing, whose function is to allocate resources in both the frequency and time dimensions. Packet scheduling for LTE-Advanced has been an active research area in recent years, because the increasing demand for data services and the growing number of users are likely to explode LTE system traffic. However, the existing scheduling system becomes increasingly congested as the number of users grows, and a new scheduling system is required to ensure more efficient data transmission. In the LTE system, the Round Robin (RR) scheduler has a problem in providing a high data rate to User Equipments (UEs): some resources are wasted because it schedules resources from/to UEs even while those UEs are suffering from severe deep fading and are below the required threshold. Meanwhile, for the Proportional Fair (PF) scheduler, a pure rate-maximising scheme could be very unfair, and a UE experiencing bad channel quality conditions could be starved; hence the mechanism applied in the PF scheduler is to weight the current data rate achievable by a UE by the average rate received by that UE. The main contribution of this study is the design of a new scheduling scheme, whose performance is compared with the PF and RR downlink schedulers for LTE using the LTE Downlink System Level Simulator. The proposed scheduling algorithm, namely the Modified-PF scheduler, divides a single sub-frame into multiple time slots and allocates resource blocks (RBs) to the targeted UE in all time slots of each sub-frame based on the instantaneous Channel Quality Indicator (CQI) feedback received from the UEs. Besides, the proposed scheduler is also capable of reallocating RBs cyclically in turn to the target UE within a time slot in order to ensure that packet data are distributed consistently. The simulation results showed that the Modified-PF scheduler provided the best throughput performance, with up to 90% improvement and almost 40% increment in spectral efficiency, with comparable fairness to the PF and RR schedulers. Although the PF scheduler had the best fairness index, the Modified-PF scheduler provided a better compromise between throughput/spectral efficiency and fairness. This shows that the newly proposed scheme improves LTE output performance while maintaining the minimum required fairness among the UEs
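    For context, the sketch below implements the baseline Proportional Fair metric that the Modified-PF scheduler builds on: each resource block goes to the UE with the highest ratio of instantaneous to average rate, with averages tracked by an exponential window. The window length, rate values and UE count are assumptions, and the sub-frame/time-slot subdivision of the proposed Modified-PF scheme is not reproduced here.

```python
# Baseline Proportional Fair scheduling metric: pick the UE with the highest
# instantaneous-to-average rate ratio, then update the exponential averages.
# Parameter values are illustrative assumptions only.
import numpy as np

T_C = 100.0  # averaging-window length in scheduling intervals (assumed value)

def pf_schedule(inst_rates, avg_rates):
    """inst_rates, avg_rates: (n_ues,). Returns the UE scheduled on this resource block."""
    return int(np.argmax(inst_rates / np.maximum(avg_rates, 1e-9)))

def update_averages(avg_rates, inst_rates, scheduled_ue):
    """Exponential moving-average update; only the scheduled UE adds new served rate."""
    served = np.zeros_like(inst_rates)
    served[scheduled_ue] = inst_rates[scheduled_ue]
    return (1.0 - 1.0 / T_C) * avg_rates + (1.0 / T_C) * served

rng = np.random.default_rng(3)
avg = np.full(4, 0.1)                      # 4 UEs, small initial average rates
for tti in range(5):
    inst = rng.uniform(0.1, 1.0, size=4)   # rates implied by the CQI feedback (toy values)
    ue = pf_schedule(inst, avg)
    avg = update_averages(avg, inst, ue)
    print(f"TTI {tti}: scheduled UE {ue}, averages {np.round(avg, 3)}")
```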