
    Radio Resource Management in LTE-Advanced Systems with Carrier Aggregation

    In order to meet the ever-increasing demand for wireless broadband services from fast-growing numbers of mobile users, the Long Term Evolution-Advanced (LTE-A) standard has been proposed to effectively improve the system capacity and spectral efficiency of fourth-generation (4G) wireless mobile communications. Many advanced techniques are incorporated into LTE-A systems to jointly improve system performance, among which Carrier Aggregation (CA) is considered one of the most promising, with profound significance even in the upcoming 5G era. Component carriers (CCs) from various portions of the spectrum are logically concatenated to form a much larger virtual band, remarkably boosting system capacity and user data throughput. However, the unique features of CA pose many emerging challenges, as well as brand-new opportunities, for Radio Resource Management (RRM) in LTE-A systems. First, although multi-CC transmission can bring higher throughput, it may incur more intensive interference on each CC and higher power consumption for users; the performance gain of CA under different conditions therefore needs to be fully evaluated. Besides, as CA offers flexible CC selection and cross-CC load balancing and scheduling, enhanced RRM strategies should be designed to further optimize overall resource utilization. In addition, CA enables frequency reuse at CC resolution, adding another dimension to inter-cell interference management in heterogeneous networks (HetNets); new interference management mechanisms should be designed to take advantage of CA. Last but not least, CA empowers LTE-A systems to aggregate licensed spectrum with unlicensed spectrum, thus offering a capacity surge. Yet how to balance traffic between licensed and unlicensed spectrum, and how to achieve harmonious coexistence with other unlicensed systems, are still open issues.
To this end, the dissertation focuses on the new functionalities introduced by CA to optimize RRM performance in LTE-A systems. The main objectives are four-fold: 1) to fully evaluate the benefits of CA from different perspectives under different conditions, via both theoretical analysis and simulations; 2) to design cross-layer CC selection, packet scheduling and power control strategies to optimize the target performance; 3) to analytically model the interference of HetNets with CA and propose dynamic interference mitigation strategies in a CA scenario; and 4) to investigate the impact of LTE transmissions on other unlicensed systems and develop enhanced RRM mechanisms for harmonious coexistence. To achieve these objectives, we first analyze the benefits of CA by investigating the user accommodation capabilities of the system in the downlink admission control process. Both LTE-A users with CA capabilities and legacy LTE users are considered. Analytical models are developed to derive the maximum number of users that can be admitted into the system given the user QoS requirements and traffic features. The results show that, with only slightly higher spectrum utilization, the system can admit as many as twice the number of LTE-A users as LTE users when the user traffic is bursty. Second, we study RRM in the single-tier LTE-A system and propose a cross-layer dynamic CC selection and power control strategy for uplink CA. Specifically, the uplink power offset effects caused by multi-CC transmission are considered. An estimation method for user bandwidth allocation is developed, and a combinatorial optimization problem is formulated to improve user throughput by maximizing user power utilization. Third, we explore the interference management problem in multi-tier HetNets considering CC-resolution frequency reuse. An analytical model is devised to capture the random behavior of the femtocells using stochastic geometry.
The interaction between the base stations of different tiers is formulated as a two-level Stackelberg game, and a backward induction method is used to obtain the Nash equilibrium. Last, we focus on mechanism design for licensed and unlicensed spectrum aggregation. An LTE MAC protocol for the unlicensed spectrum is developed considering coexistence with Wi-Fi systems. The protocol captures the asynchronous nature of Wi-Fi transmissions within the time-slotted LTE frame structure and strikes a tunable tradeoff between LTE and Wi-Fi performance. An analytical study is also presented to reveal the essential relations among the parameters of the two systems. In summary, the dissertation aims at fully evaluating the benefits of CA in different scenarios and exploiting those benefits to develop efficient and effective RRM strategies for better LTE-Advanced system performance.
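Backward induction in such a two-level game works by first solving the follower's problem for any fixed leader action, and then optimizing the leader's action against that best response. The sketch below illustrates only this general idea; the channel gains, utility functions, and cost coefficients are entirely hypothetical and do not reproduce the dissertation's actual game model:

```python
import math

def follower_best_response(p_leader, g_f=2.0, h_cross=1.0, noise=0.1, cost=0.5):
    """Analytic maximizer of a toy follower utility log(1 + p*g_f/I) - cost*p,
    where I = p_leader*h_cross + noise; the optimum is p* = max(0, 1/cost - I/g_f)."""
    interference = p_leader * h_cross + noise
    return max(0.0, 1.0 / cost - interference / g_f)

def leader_best_power(grid, g_l=3.0, h_cross=0.8, noise=0.1, cost=0.2):
    """Backward induction: evaluate each candidate leader power against the
    follower's best response and keep the one with the highest leader utility."""
    best_p, best_u = grid[0], float("-inf")
    for p in grid:
        p_f = follower_best_response(p)           # follower reacts optimally
        sinr = p * g_l / (p_f * h_cross + noise)  # leader's resulting SINR
        u = math.log(1.0 + sinr) - cost * p       # toy utility: rate minus power cost
        if u > best_u:
            best_p, best_u = p, u
    return best_p
```

At the operating point returned by `leader_best_power`, neither player can gain by deviating unilaterally on the grid, which is the kind of equilibrium the backward-induction method delivers.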

    AN EFFICIENT INTERFERENCE AVOIDANCE SCHEME FOR DEVICE-TO-DEVICE ENABLED FIFTH GENERATION NARROWBAND INTERNET OF THINGS NETWORKS

    Narrowband Internet of Things (NB-IoT) is a low-power wide-area (LPWA) technology built on Long Term Evolution (LTE) functionalities and standardized by the 3rd Generation Partnership Project (3GPP). Due to its support for massive machine-type communication (mMTC) and for different IoT use cases with rigorous requirements in terms of connectivity, energy efficiency, reachability, reliability, and latency, NB-IoT has attracted significant attention from the research community. However, as the capacity needs of various IoT use cases expand, many functionalities of the LTE evolved packet core (EPC) may become overburdened and suboptimal. Several research efforts are currently in progress to address these challenges. This thesis therefore provides an overview of these efforts, with a specific focus on the optimized architecture of the LTE EPC functionalities, the 5G architectural design for NB-IoT integration, the enabling technologies necessary for 5G NB-IoT, 5G new radio (NR) coexistence with NB-IoT, and feasible architectural schemes for deploying NB-IoT with cellular networks. The thesis also presents cloud-assisted relaying with backscatter communication as part of a detailed study of the technical performance attributes and channel communication characteristics of the physical (PHY) and medium access control (MAC) layers of NB-IoT, with a focus on 5G. The numerous drawbacks that come with simulating these systems are explored. The enabling market for NB-IoT, the benefits for a few use cases, and the potential critical challenges associated with their deployment are all highlighted. Fortunately, the cyclic prefix orthogonal frequency division multiplexing (CP-OFDM) based waveform adopted by 3GPP NR for enhanced mobile broadband (eMBB) services does not prohibit the use of other waveforms in other services, such as the NB-IoT service for mMTC.
As a result, the coexistence of 5G NR and NB-IoT must be kept orthogonal (or quasi-orthogonal) to minimize the mutual interference that constrains the overall waveform design. Hence, 5G coexistence with NB-IoT will introduce a new interference challenge, distinct from that of the legacy network, even though the NR's coexistence with NB-IoT is believed to improve network capacity, expand user data rate coverage, and improve communication robustness through frequency reuse. Interference may make channel estimation difficult for NB-IoT devices, limiting user performance and spectral efficiency. Various existing interference mitigation solutions either add to the network's overhead, computational complexity and delay, or are hampered by low data rates and coverage; such algorithms are unsuitable for an NB-IoT network owing to its low-complexity nature. A device-to-device (D2D) communication based interference-control technique therefore becomes an effective strategy for addressing this problem. This thesis uses D2D communication to reduce the network bottleneck in dense 5G NB-IoT networks prone to interference. For D2D-enabled 5G NB-IoT systems, the thesis presents an interference-avoidance resource allocation scheme that considers the less favourable cell-edge NB-IoT user equipments (NUEs). To reduce the algorithm's computational complexity and the interference power, the scheme divides the optimization problem into three sub-problems. First, in an orthogonal deployment using channel state information (CSI), the channel gain factor is leveraged by selecting a probable reuse channel with higher QoS control. Second, a bisection search is used to find the power control that maximizes the network sum rate, and third, the Hungarian algorithm is used to build a maximum bipartite matching that chooses the optimal pairing pattern between the sets of NUEs and the D2D pairs.
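The three-sub-problem decomposition above can be illustrated in miniature. The sketch below is a toy stand-in, not the thesis's algorithm: the bisection step searches a unimodal rate function for its power optimum, and the matching step exhaustively enumerates pairings (which the Hungarian algorithm does in polynomial time on realistic sizes); all gain and rate values are hypothetical.

```python
from itertools import permutations

def select_reuse_channel(gains):
    """Sub-problem 1 (toy): pick the candidate reuse channel with the best gain."""
    return max(range(len(gains)), key=gains.__getitem__)

def bisection_power(rate_fn, p_max, tol=1e-6):
    """Sub-problem 2 (toy): bisection search over [0, p_max] for the power that
    maximizes a unimodal rate function, by bisecting on its numerical slope."""
    lo, hi = 0.0, p_max
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if rate_fn(mid + tol) > rate_fn(mid - tol):  # rate still increasing
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

def best_matching(rate):
    """Sub-problem 3 (toy): maximum-weight bipartite matching by brute force,
    where rate[i][j] is the sum rate when D2D pair i reuses NUE j's channel.
    A stand-in for the Hungarian algorithm, feasible only at toy sizes."""
    n = len(rate)
    best = max(permutations(range(n)),
               key=lambda perm: sum(rate[i][perm[i]] for i in range(n)))
    return list(best)
```

On realistic problem sizes the brute-force matching would be replaced by a proper Hungarian implementation (e.g. SciPy's `linear_sum_assignment`), which is exactly the role the algorithm plays in the scheme described above.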
According to the numerical results, the proposed approach improves the D2D sum rate and the overall network SINR of the 5G NB-IoT system. The maximum power constraint of the D2D pair, the D2D pair's location, the pico base station (PBS) cell radius, the number of potential reuse channels, and the cluster distance all impact the D2D pair's performance. In the simulations, the scheme achieves SINR 28.35%, 31.33%, and 39% higher than the ARSAD, DCORA, and RRA algorithms when the number of NUEs is twice the number of D2D pairs, and 2.52%, 14.80%, and 39.89% higher than ARSAD, RRA, and DCORA when the numbers of NUEs and D2D pairs are equal. Likewise, it achieves a D2D sum rate 9.23%, 11.26%, and 13.92% higher than ARSAD, DCORA, and RRA when the number of NUEs is twice the number of D2D pairs, and 1.18%, 4.64% and 15.93% higher than ARSAD, RRA and DCORA, respectively, with equal numbers of NUEs and D2D pairs. These results demonstrate the efficacy of the proposed scheme. The thesis also addresses the setting in which the cell-edge NUE's QoS is degraded by challenges such as long-distance transmission, delay, low bandwidth utilization, and high system overhead, all of which affect 5G NB-IoT network performance. In this case, most cell-edge NUEs boost their transmit power to maximize network throughput. Integrating a cooperative D2D relaying technique into 5G NB-IoT heterogeneous network (HetNet) uplink spectrum sharing increases the system's spectral efficiency but also the interference power, further degrading the network. Using a max-max SINR (Max-SINR) approach, the thesis proposes an interference-aware D2D relaying strategy that improves the QoS of a cell-edge NUE in the 5G NB-IoT network to achieve optimum system performance. The Lagrangian dual technique is used to optimize the transmit power from the cell-edge NUE to the relay under an average interference power constraint, while the relay-to-NB-IoT base station (NBS) link employs a fixed transmit power.
To choose an optimal D2D relay node, the channel-to-interference-plus-noise ratio (CINR) of all available D2D relays is used to maximize the minimum cell-edge NUE data rate while ensuring the cellular NUEs' QoS requirements are satisfied. Best harmonic mean, best-worst, and half-duplex relay selection, as well as a direct D2D communication scheme, were among the other relay selection strategies studied. The simulation results reveal that the Max-SINR selection scheme outperforms all the other selection schemes except the direct D2D communication scheme, owing to the high channel gain between the two communicating devices. The proposed algorithm achieves 21.27% SINR performance, nearly identical to the half-duplex scheme, and outperforms the best-worst and harmonic selection techniques by 81.27% and 40.29%, respectively. As the number of D2D relays increases, the capacity increases by 14.10% and 47.19% over the harmonic and half-duplex techniques, respectively. Finally, the thesis presents future research directions on interference control, together with open research questions on PHY and MAC properties and a SWOT (Strengths, Weaknesses, Opportunities, and Threats) analysis, presented in Chapter 2, to encourage further study on 5G NB-IoT.
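The bottleneck idea behind CINR-based selection is that a two-hop relay link is only as good as its weaker hop, so the selector keeps the relay whose worst hop is best. The sketch below illustrates that rule next to a harmonic-mean comparator; the SINR values are illustrative only, and the thesis's full formulation (Lagrangian-dual power control, QoS constraints) is not reproduced:

```python
import math

def end_to_end_rate(s1, s2):
    """Half-duplex decode-and-forward rate is limited by the weaker hop
    (the 1/2 factor accounts for the two transmission phases)."""
    return 0.5 * math.log2(1 + min(s1, s2))

def max_sinr_relay(hop1, hop2):
    """Max-SINR selection: pick the relay with the largest bottleneck SINR,
    i.e. maximize the minimum of the two hop SINRs."""
    return max(range(len(hop1)), key=lambda i: min(hop1[i], hop2[i]))

def harmonic_relay(hop1, hop2):
    """Harmonic-mean selection, a common alternative heuristic that can favour
    a relay with one very strong and one weak hop."""
    return max(range(len(hop1)),
               key=lambda i: 2 * hop1[i] * hop2[i] / (hop1[i] + hop2[i]))
```

With hop SINRs of (100, 2) versus (3, 3), the harmonic rule prefers the skewed relay while Max-SINR prefers the balanced one, and the balanced relay indeed yields the higher end-to-end rate; this is why bottleneck-based selection tends to win in the comparisons described above.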

    Heterogeneous Wireless Networks QoE Framework

    With the appearance of small cells and the move of mobile networks towards an all-IP 4G network, their convergence with Wi-Fi becomes a possibility, which in turn opens the path to what will become 5G connectivity. This thesis describes the evolution of the mainstream wireless technologies deployed around the world and how they can interact, and provides tools to use this convergence to achieve the requirements expected of a 5G environment and the ideal user experience. Several topics were identified as needing attention: handover between heterogeneous networks; security of large numbers of small cells connected to the core networks via a variety of backhaul technologies; edge content distribution to improve latency; improvement of the service provided in challenging radio environments; and interference between licensed and unlicensed spectrum. Within these topics, contributions were made to improve the current status by analysing the unaddressed issues and proposing potential improvements that were tested in trials or in a lab environment. The main contributions from the study have been: 1. A patent in the wireless security domain that reuses the fact that overlapping coverage is and will be available, and protects against man-in-the-middle attacks (Section 5.3). 2. A patent in the content distribution domain that reduces the cost of delivering content within a mobile network by looking for the shortest path to the requested content (Section 6.3). 3. Improvements to, and interoperability testing of, the 802.21 standard, which improves the seamlessness of handovers (Section 4.2). 4. Two infill trials focusing on how to improve the user experience in challenging conditions (Sections 7.2 and 7.3). 5. An interference study between Wi-Fi at 2.4 GHz and the newly allocated 4G spectrum (Section 8.2).
This thesis demonstrates some of the improvements required for current wireless networks to evolve towards 5G and to achieve the coverage, service, user experience, latency and security requirements expected of the next-generation mobile technology.