
    Review on Radio Resource Allocation Optimization in LTE/LTE-Advanced using Game Theory

    Recently, there has been a growing trend toward applying game theory (GT) to various engineering fields in order to solve optimization problems with different competing entities/contributors/players. Research in the fourth generation (4G) wireless network field has also exploited this theory to overcome long term evolution (LTE) challenges such as resource allocation, one of the most important research topics. In fact, an efficient design of resource allocation schemes is the key to higher performance. However, the standard does not specify the optimization approach for executing radio resource management, so it has been left open for study. This paper presents a survey of existing game-theory-based solutions for the 4G-LTE radio resource allocation problem and its optimization.
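    Many of the game-theoretic resource allocation schemes surveyed in this line of work reduce to iterated best responses. As a hedged sketch (the link gains, noise level, and SINR targets below are invented illustration values, not taken from any surveyed paper), a distributed power-control game with Foschini-Miljanic-style best responses, which converges to a Nash equilibrium when the targets are jointly feasible, might look like:

```python
NOISE = 0.1  # receiver noise power (illustrative value)

def best_response_power_control(G, targets, iters=100):
    """G[i][j]: channel gain from transmitter j to receiver i.
    Each user i repeatedly plays its best response:
    p_i <- target_i * (interference_i + NOISE) / G[i][i]."""
    n = len(G)
    p = [1.0] * n
    for _ in range(iters):
        # synchronous update: every user responds to the previous powers
        p = [targets[i]
             * (sum(G[i][j] * p[j] for j in range(n) if j != i) + NOISE)
             / G[i][i]
             for i in range(n)]
    return p

def sinr(G, p, i):
    """Signal-to-interference-plus-noise ratio of user i."""
    interference = sum(G[i][j] * p[j] for j in range(len(p)) if j != i)
    return G[i][i] * p[i] / (interference + NOISE)

# Two users with weak cross-links; both demand an SINR of 2.0.
G = [[1.0, 0.1],
     [0.2, 1.0]]
p = best_response_power_control(G, targets=[2.0, 2.0])
```

    At the fixed point each user meets its SINR target exactly, which is the Nash equilibrium of the underlying game.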

    LTE Optimization and Resource Management in Wireless Heterogeneous Networks

    Mobile communication technology is evolving at a great pace. The development of the Long Term Evolution (LTE) mobile system by 3GPP is one of the milestones in this direction. This work highlights a few areas in the LTE radio access network where the proposed innovative mechanisms can substantially improve overall LTE system performance. In order to further extend the capacity of LTE networks, an integration with non-3GPP networks (e.g., WLAN, WiMAX) is also proposed in this work. Moreover, it is discussed how bandwidth resources should be managed in such heterogeneous networks. The work proposes a comprehensive system architecture as an overlay of the 3GPP-defined SAE architecture, effective resource management mechanisms, and a Linear Programming based analytical solution for the optimal network resource allocation problem. In addition, alternative computationally efficient heuristic-based algorithms have been designed to achieve near-optimal performance.
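    The abstract does not reproduce the LP formulation itself, but the heuristic side of such a design can be sketched. The following is a hedged illustration of a greedy near-optimal assignment of users to overlay networks; the network names, capacities, demands, and utility values are all invented for the example:

```python
def greedy_allocate(demands, utilities, capacity):
    """demands: user -> bandwidth units needed
    utilities: user -> {network: utility if served by that network}
    capacity: network -> available bandwidth units
    Serve users in order of their best achievable utility (greedy)."""
    order = sorted(demands, key=lambda u: -max(utilities[u].values()))
    assignment = {}
    cap = dict(capacity)
    for u in order:
        # try networks in decreasing utility; take the first with room
        for n in sorted(utilities[u], key=lambda n: -utilities[u][n]):
            if cap[n] >= demands[u]:
                assignment[u] = n
                cap[n] -= demands[u]
                break
    return assignment

demands = {"u1": 5, "u2": 3, "u3": 4}
utilities = {"u1": {"LTE": 10, "WLAN": 6},
             "u2": {"LTE": 9,  "WLAN": 8},
             "u3": {"LTE": 4,  "WLAN": 7}}
assignment = greedy_allocate(demands, utilities, {"LTE": 6, "WLAN": 8})
```

    An exact LP solver would optimize all assignments jointly; the greedy pass trades a small optimality gap for linear-time operation, which is the motivation the abstract gives for its heuristic algorithms.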

    Towards UAV Assisted 5G Public Safety Network

    Ensuring ubiquitous mission-critical public safety communications (PSC) for all first responders in the public safety network is crucial at an emergency site. First responders rely heavily on mission-critical PSC to save lives, property, and national infrastructure during a natural or human-made emergency. The recent advancements in LTE/LTE-Advanced/5G mobile technologies supported by unmanned aerial vehicles (UAV) have great potential to revolutionize PSC. However, limited spectrum allocation for LTE-based PSC demands improved channel capacity and spectral efficiency. An additional challenge in designing an LTE-based PSC network is achieving at least 95% coverage of the geographical area and human population with broadband rates. The coverage requirement and efficient spectrum use in the PSC network can be realized through the dense deployment of small cells (both terrestrial and aerial). However, there are several challenges with the dense deployment of small cells in an air-ground heterogeneous network (AG-HetNet). The main challenges addressed in this research are integrating UAVs as both aerial users and aerial base stations, mitigating inter-cell interference, enhancing capacity and coverage, and optimizing the deployment locations of aerial base stations. First, LTE signals were investigated using NS-3 simulation and a software-defined radio experiment to gain insight into the quality of service experienced by the user equipment (UE). Using this understanding, a two-tier LTE-Advanced AG-HetNet with macro base stations and unmanned aerial base stations (UABS) is designed, while considering time-domain inter-cell interference coordination techniques. We maximize the capacity of this AG-HetNet in the case of damaged PSC infrastructure by jointly optimizing the inter-cell interference parameters and UABS locations using a meta-heuristic genetic algorithm (GA) and a brute-force technique.
Finally, considering the latest 3GPP specifications, a more realistic three-tier LTE-Advanced AG-HetNet is proposed with macro base stations, pico base stations, and ground UEs as terrestrial nodes, and UABS and aerial UEs as aerial nodes. Using meta-heuristic techniques such as the GA and an elitist harmony search algorithm based on the GA, critical network elements such as energy efficiency, inter-cell interference parameters, and UABS locations are jointly optimized to maximize the capacity and coverage of the AG-HetNet.
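    The genetic-algorithm step can be sketched in miniature. The chromosome, fitness function, user layout, and operator rates below are simplified stand-ins invented for illustration; the actual work jointly optimizes ICIC parameters and locations over a realistic HetNet, not the coverage-only toy shown here:

```python
import random

# Toy GA: place two UABS so that as many users as possible fall
# within a fixed coverage radius. Users and radius are invented.
USERS = [(1, 1), (2, 1), (8, 8), (9, 7), (5, 5)]
RADIUS = 2.0

def fitness(chromosome):
    """Number of users within RADIUS of at least one UABS."""
    return sum(
        1 for ux, uy in USERS
        if any((ux - x) ** 2 + (uy - y) ** 2 <= RADIUS ** 2
               for x, y in chromosome))

def evolve(pop_size=20, generations=40, seed=0):
    rng = random.Random(seed)
    rand_pos = lambda: (rng.uniform(0, 10), rng.uniform(0, 10))
    pop = [[rand_pos(), rand_pos()] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        elite = pop[: pop_size // 2]            # selection: keep top half
        children = []
        for _ in range(pop_size - len(elite)):
            a, b = rng.sample(elite, 2)
            child = [a[0], b[1]]                # one-point crossover
            if rng.random() < 0.3:              # mutation: move one UABS
                child[rng.randrange(2)] = rand_pos()
            children.append(child)
        pop = elite + children
    return max(pop, key=fitness)

best = evolve()
```

    Brute force over a discretized grid gives the exact optimum for comparison, which is how the abstract's GA results are validated.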

    Long Term Evolution-Advanced and Future Machine-to-Machine Communication

    Long Term Evolution (LTE) has adopted Orthogonal Frequency Division Multiple Access (OFDMA) and Single Carrier Frequency Division Multiple Access (SC-FDMA) as the downlink and uplink transmission schemes, respectively. Quality of Service (QoS) provisioning is one of the primary objectives of wireless network operators. In LTE-Advanced (LTE-A), several new features such as Carrier Aggregation (CA) and Relay Nodes (RNs) have been introduced by the 3rd Generation Partnership Project (3GPP). These features have been designed to deal with the ever-increasing demands for higher data rates and spectral efficiency. The RN is a low-power, low-cost device designed for extending coverage and enhancing spectral efficiency, especially at the cell edge. Wireless networks are facing a new challenge emerging on the horizon: the expected surge of Machine-to-Machine (M2M) traffic in cellular and mobile networks. The costs and sizes of M2M devices with integrated sensors, network interfaces and enhanced power capabilities have decreased significantly in recent years. Therefore, it is anticipated that M2M devices might outnumber conventional mobile devices in the near future. 3GPP standards like LTE-A have primarily been developed for broadband data services with mobility support. However, M2M applications are mostly based on narrowband traffic. These standards may not achieve overall spectrum and cost efficiency if they are utilized for serving M2M applications. The main goal of this thesis is to take advantage of the low cost, low power and small size of RNs for integrating M2M traffic into LTE-A networks. A new RN design is presented for aggregating and multiplexing M2M traffic at the RN before transmission over the air interface (Un interface) to the base station, called the eNodeB. The data packets of the M2M devices are sent to the RN over the Uu interface.
Packets from different devices are aggregated at the Packet Data Convergence Protocol (PDCP) layer of the RN into a single large IP packet instead of several small IP packets before transmission to the Donor eNodeB (DeNB). Therefore, the amount of overhead data can be significantly reduced. The proposed concept has been implemented in an LTE-A network simulator to illustrate the benefits and advantages of M2M traffic aggregation and multiplexing at the RN. The potential gains of RNs, such as coverage enhancement, multiplexing gain, and end-to-end delay performance, are illustrated with the help of simulation results. The results indicate that the proposed concept improves the performance of the LTE-A network with M2M traffic. The adverse impact of M2M traffic on regular LTE-A traffic such as voice and file transfer is minimized. Furthermore, the cell edge throughput and QoS performance are enhanced. Moreover, the results are validated with the help of an analytical model.
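    The overhead-reduction argument can be checked with back-of-the-envelope arithmetic. The header and sub-header sizes below are invented illustration values, not the actual PDCP/IP overheads from the thesis:

```python
IP_UDP_HEADER = 28   # bytes of per-packet header without aggregation (illustrative)
MUX_SUBHEADER = 2    # bytes per multiplexed payload inside the aggregate (illustrative)

def header_overhead(payloads, aggregated):
    """Total header bytes needed to carry the given payloads."""
    if aggregated:
        # one outer header plus a small length field per payload
        return IP_UDP_HEADER + MUX_SUBHEADER * len(payloads)
    # otherwise every small report pays the full header cost
    return IP_UDP_HEADER * len(payloads)

payloads = [20] * 50  # 50 narrowband M2M reports of 20 bytes each
plain = header_overhead(payloads, aggregated=False)
muxed = header_overhead(payloads, aggregated=True)
saving = 1 - muxed / plain
```

    With these assumed sizes, 50 small reports pay 1400 header bytes individually but only 128 when multiplexed into one aggregate, a saving of over 90%, which is the effect the RN design exploits on the Un interface.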

    Towards Massive Machine Type Communications in Ultra-Dense Cellular IoT Networks: Current Issues and Machine Learning-Assisted Solutions

    The ever-increasing number of resource-constrained Machine-Type Communication (MTC) devices is leading to the critical challenge of fulfilling diverse communication requirements in dynamic and ultra-dense wireless environments. Among the different application scenarios that the upcoming 5G and beyond cellular networks are expected to support, such as eMBB, mMTC and URLLC, mMTC brings the unique technical challenge of supporting a huge number of MTC devices, which is the main focus of this paper. The related challenges include QoS provisioning, handling highly dynamic and sporadic MTC traffic, huge signalling overhead and Radio Access Network (RAN) congestion. In this regard, this paper aims to identify and analyze the involved technical issues, to review recent advances, to highlight potential solutions and to propose new research directions. First, starting with an overview of mMTC features and QoS provisioning issues, we present the key enablers for mMTC in cellular networks. Along with the highlights on the inefficiency of the legacy Random Access (RA) procedure in the mMTC scenario, we then present the key features and channel access mechanisms in the emerging cellular IoT standards, namely, LTE-M and NB-IoT. Subsequently, we present a framework for the performance analysis of transmission scheduling with QoS support, along with the issues involved in short data packet transmission. Next, we provide a detailed overview of the existing and emerging solutions towards addressing the RAN congestion problem, and then identify potential advantages, challenges and use cases for the applications of emerging Machine Learning (ML) techniques in ultra-dense cellular networks. Out of several ML techniques, we focus on the application of the low-complexity Q-learning approach in mMTC scenarios.
Finally, we discuss some open research challenges and promising future research directions. (Comment: 37 pages, 8 figures, 7 tables; submitted for possible publication in IEEE Communications Surveys and Tutorials.)
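    The low-complexity Q-learning idea for random access can be sketched as follows. The device/slot counts, reward values, and stateless update rule are illustrative choices made for this example, not taken from the paper or any 3GPP standard:

```python
import random

# Each MTC device independently learns a preferred RA slot from
# collision feedback: +1 when it transmits alone, -1 on collision.
N_DEVICES, N_SLOTS = 4, 4
ALPHA, EPS = 0.1, 0.1   # learning rate and exploration probability

def train(episodes=3000, seed=1):
    rng = random.Random(seed)
    Q = [[0.0] * N_SLOTS for _ in range(N_DEVICES)]
    for _ in range(episodes):
        # every device picks a slot epsilon-greedily from its own Q-row
        choices = [rng.randrange(N_SLOTS) if rng.random() < EPS
                   else max(range(N_SLOTS), key=lambda s: Q[d][s])
                   for d in range(N_DEVICES)]
        for d, s in enumerate(choices):
            reward = 1.0 if choices.count(s) == 1 else -1.0  # solo vs collision
            Q[d][s] += ALPHA * (reward - Q[d][s])            # stateless update
    return Q

Q = train()
preferred = [max(range(N_SLOTS), key=lambda s: Q[d][s])
             for d in range(N_DEVICES)]
```

    After training, the devices' greedy slot choices separate, so repeated RA collisions largely disappear without any centralized coordination; this is the congestion-relief mechanism the paper examines at scale.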

    PROCESS FOR BREAKING DOWN THE LTE SIGNAL TO EXTRACT KEY INFORMATION

    The increasingly important role of Long Term Evolution (LTE) has raised security concerns among service providers and end users and made network security even more indispensable. The main thrust of this thesis is to investigate whether the LTE signal can be broken down in a methodical way to obtain information that would otherwise be private, e.g., the Global Positioning System (GPS) location of the user equipment/base station or the identity (ID) of the user. The study made use of signal simulators and software to analyze the LTE signal and to develop a method to remove noise, break down the LTE signal, and extract the desired information. From the simulation results, it was possible to extract key downlink information such as the Downlink Control Information (DCI), Cell-Radio Network Temporary Identifier (C-RNTI) and Physical Cell Identity (Cell-ID). This information can be modified to cause service disruptions in the network within a reasonable amount of time and with modest computing resources. (Defence Science and Technology Agency, Singapore. Approved for public release; distribution is unlimited.)

    LTE Advanced: Technology and Performance Analysis

    Wireless data usage is increasing at a phenomenal rate, driving the need for continued innovation in wireless data technologies to provide more capacity and higher quality of service. In October 2009, the 3rd Generation Partnership Project (3GPP) submitted LTE-Advanced to the ITU as a proposed candidate IMT-Advanced technology, for which specifications could become available in 2011 through Release 10. The aim of LTE-Advanced is to further enhance LTE radio access in terms of system performance and capabilities compared to current cellular systems, including the first release of LTE, with the specific goal of ensuring that LTE fulfills and even surpasses the requirements of IMT-Advanced as defined by the International Telecommunication Union (ITU-R). This thesis offers an introduction to the mobile communication standard known as LTE-Advanced, depicting the evolution of the standard from its roots and discussing several important technologies that helped it evolve to meet the IMT-Advanced requirements. A short history of the LTE standard is offered, along with a discussion of its standards and performance. The LTE-Advanced details include an analysis of the physical layer, investigating the performance of SC-FDMA and OFDMA in the LTE physical layer. The investigation considers different modulation schemes (QPSK, 16QAM and 64QAM) on the basis of PAPR, BER, power spectral density (PSD) and error probability, by simulating SC-FDMA and OFDMA models. To evaluate performance in the presence of noise, an Additive White Gaussian Noise (AWGN) channel was introduced. A set of conclusions is derived from our results describing the effect of higher-order modulation schemes on BER and error probability for both OFDMA and SC-FDMA.
The power spectral densities of both multiple access techniques (OFDMA and SC-FDMA) are calculated, and the results show that OFDMA has the higher power spectral density.
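    The PAPR difference between the two access schemes can be reproduced with a small pure-Python experiment. This is a simplified stand-in for the simulation model described above: it uses a fully allocated band of QPSK symbols, an O(N^2) DFT for clarity, and an arbitrary block size and seed:

```python
import cmath
import random

def dft(x):
    N = len(x)
    return [sum(x[n] * cmath.exp(-2j * cmath.pi * k * n / N)
                for n in range(N)) for k in range(N)]

def idft(X):
    N = len(X)
    return [sum(X[k] * cmath.exp(2j * cmath.pi * k * n / N)
                for k in range(N)) / N for n in range(N)]

def papr(signal):
    """Peak-to-average power ratio (linear, not dB)."""
    powers = [abs(s) ** 2 for s in signal]
    return max(powers) / (sum(powers) / len(powers))

rng = random.Random(0)
N = 64
qpsk = [complex(rng.choice((-1, 1)), rng.choice((-1, 1))) / 2 ** 0.5
        for _ in range(N)]

ofdma = idft(qpsk)        # symbols mapped straight onto subcarriers
scfdma = idft(dft(qpsk))  # DFT-spread before the IDFT (full allocation)
```

    With full-band allocation the DFT spreading cancels the IDFT, so the SC-FDMA signal keeps the constant envelope of QPSK (PAPR near 1), while summing 64 independent subcarriers gives OFDMA a much larger peak; this is the single-carrier property that motivates SC-FDMA in the LTE uplink.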

    Optimizations in Heterogeneous Mobile Networks


    Adaptive multimedia streaming control algorithm in wireless LANs and 4G networks

    E-learning has become an important service offered over the Internet. Lately, many users are accessing learning content via wireless networks and using mobile devices. Most content is rich-media-based and often puts significant pressure on existing wireless networks in order to support high-quality delivery. In this context, offering a solution for improving user quality of experience when multimedia content is delivered over wireless networks is already a challenging task; supporting this for mobile e-learning over wireless LANs is even more difficult. To increase the end-user perceived quality, we have to take into account each user's individual set of characteristics. The fact that users have subjective opinions on the quality of a multimedia application can be used to increase their QoE by setting a minimum quality threshold below which the connection is considered undesirable. In this way, the use of precious radio resources can be optimized in order to satisfy an increased number of users simultaneously. In this thesis, a new user-oriented adaptive algorithm based on QOAS was designed and developed in order to address the user satisfaction problem. Simulations have been carried out with different adaptation schemes to compare the performance and benefits of the DQOAS mechanism. The simulation results show that using a dynamic stream granularity with a minimum threshold for the transmission rate improves the overall quality of the multimedia delivery process, increasing the total number of satisfied users and the link utilization. The good results obtained by the algorithm in the IEEE 802.11 wireless environment motivated research into the utility of the newly proposed algorithm in another wireless environment, LTE. The study shows that the DQOAS algorithm can obtain good results in terms of application perceived quality when the considered application generates multiple streams.
These results can be improved by using a new QoS parameter mapping scheme able to modify the streams' priorities, thus preventing the algorithm's decisions from being overridden by the system's scheduler.
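    The adaptation loop with a minimum-quality threshold can be sketched as follows. This is an illustrative simplification in the spirit of QOAS/DQOAS, not the thesis algorithm itself: the quality levels, score thresholds, and feedback values are invented:

```python
LEVELS = [0.5, 1.0, 2.0, 4.0]   # stream bitrates in Mbps (illustrative)
MIN_LEVEL = 1                    # below this level the user counts as unsatisfied

def adapt(level, score):
    """score in [0, 1] summarises recent client-side loss/delay feedback."""
    if score < 0.4 and level > 0:
        return level - 1         # degrade gracefully under congestion
    if score > 0.8 and level < len(LEVELS) - 1:
        return level + 1         # probe upwards when conditions are good
    return level

def run(scores, start=2):
    """Play back a sequence of feedback scores; flag unsatisfied users."""
    level, unsatisfied = start, False
    for s in scores:
        level = adapt(level, s)
        if level < MIN_LEVEL:
            unsatisfied = True
    return level, unsatisfied

# Good conditions, a congestion dip, then recovery:
final, unsat = run([0.9, 0.3, 0.3, 0.9, 0.9])
```

    The threshold is what lets the server distinguish a user who can be kept satisfied at a lower stream level from one whose connection should be counted as undesired, which is how the algorithm maximizes the number of simultaneously satisfied users.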
