87 research outputs found

    Mitigating the Signaling Resources Expended in 5G Location Management Procedures at Millimeter-Wave Frequencies

    The signaling resources expended and the power consumed by User Equipments (UEs) in the Location Management (LM) procedures are expected to be higher in Fifth Generation (5G) networks than in legacy wireless communications networks. To mitigate this challenge, this work proposes a hybrid scheme that reduces the signaling resources expended in the paging and RAN-based Notification Area Update (RNAU) procedures in 5G. The scheme embeds a UE Identifier (UEID) partitioning scheme, which directionally pages UEs, into a gNB-based UE Mobility Tracking (UEMT) scheme. The approach configures a gNB to beam sweep the last registered cell area of a UE in the RRC_Inactive state before directionally paging the UE. The approach is implemented on a modified network architecture to reduce the signaling resources expended on both paging and RNAU of UEs at higher frequencies, an enabling factor for mmWave systems. Simulation results of the total accumulated cost of paging showed a 65.13% and 8.69% reduction in signaling resources expended against the conventional approach and the existing gNB-based UEMT approach, respectively. Additionally, the total accumulated resources expended in both procedures over 24 hours showed that the modified gNB-based UEMT scheme outperformed the conventional scheme and the gNB-based UEMT scheme by 90.96% and 38.36%, respectively.
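The abstract does not specify how UEIDs are partitioned across beams, but the idea of directional paging driven by an identifier partition can be sketched as follows. The sector count, the modulo-based UEID-to-beam mapping, and all numbers are illustrative assumptions, not the scheme's actual specification.

```python
# Hypothetical sketch of UEID-partitioned directional paging: the UEID
# space is split across beam sectors so a gNB pages only the sector a
# UE's identifier maps to, instead of sweeping every beam per UE.
# The mapping and sector count are invented for illustration.

N_BEAMS = 8  # beams per gNB sweep (assumed)

def beam_for_ueid(ueid: int, n_beams: int = N_BEAMS) -> int:
    """Map a UEID to the beam sector that will page it (assumed modulo rule)."""
    return ueid % n_beams

def paging_messages(ueids, n_beams: int = N_BEAMS) -> int:
    """Directional paging cost: one message per distinct sector used."""
    return len({beam_for_ueid(u, n_beams) for u in ueids})

ueids = [1001, 1002, 1009]      # 1001 and 1009 map to the same sector
print(paging_messages(ueids))   # -> 2 (sectors actually paged)
print(N_BEAMS * len(ueids))     # -> 24 (exhaustive-sweep baseline)
```

Under these assumptions, paging cost scales with the number of distinct sectors occupied rather than with the full beam sweep, which is the qualitative saving the abstract reports.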

    User-oriented mobility management in cellular wireless networks

    2020 Spring. Includes bibliographical references. Mobility Management (MM) in wireless mobile networks is a vital process that keeps an individual User Equipment (UE) connected while it moves within the network coverage area—this is required to keep the network informed about the UE's mobility (i.e., location changes). The network must identify the exact serving cell of a specific UE for the purpose of data-packet delivery. The two MM procedures necessary to localize a specific UE and deliver data packets to it are known as Tracking Area Update (TAU) and Paging, which are burdensome not only to the network resources but also to the UE's battery—the UE and the network initiate the TAU and Paging, respectively. These two procedures are used in current Long Term Evolution (LTE) and next-generation (5G) networks despite the drawback that they consume bandwidth and energy. Because of potentially very high-volume traffic and the increasing density of high-mobility UEs, the TAU/Paging procedures incur significant costs in terms of signaling overhead and power consumption in the battery-limited UE. This problem will become even worse in 5G, which is expected to accommodate exceptional services, such as supporting mission-critical systems (close-to-zero latency) and extending battery lifetime (10 times longer). This dissertation examines and discusses a variety of solution schemes for both the TAU and Paging, emphasizing a new key design to accommodate 5G use cases. However, ongoing efforts are still developing new schemes to provide seamless connections to the ever-increasing density of high-mobility UEs. In this context, and toward achieving 5G use cases, we propose a novel solution to the MM issues, named gNB-based UE Mobility Tracking (gNB-based UeMT). This solution has four features aligned with achieving 5G goals.
First, mobile UEs will no longer trigger the TAU to report their location changes, giving much more power saving with no signaling overhead. Instead, second, the network elements, gNBs, take over the responsibility of tracking and locating these UEs, giving always-known UE locations. Third, our Paging procedure is markedly improved over the conventional one, providing very fast UE reachability with no Paging messages being sent simultaneously. Fourth, our solution guarantees lightweight signaling overhead with very low Paging delay; our simulation studies show that it achieves about 92% reduction in the corresponding signaling overhead. To realize these four features, this solution adds no implementation complexity. Instead, it exploits the already existing LTE/5G communication protocols, functions, and measurement reports. Our gNB-based UeMT solution by design has the potential to deal with mission-critical applications. In this context, we introduce a new approach for mission-critical and public-safety communications. Our approach targets emergency situations (e.g., natural disasters) in which the mobile wireless network becomes partially or completely dysfunctional. Specifically, this approach is intended to provide swift network recovery for Search-and-Rescue Operations (SAROs) to search for survivors after large-scale disasters, which we call UE-based SAROs. These SAROs are based on the fact that increasingly almost everyone carries wireless mobile devices (UEs), which serve as human-based wireless sensors on the ground. Our UE-based SAROs are aimed at accounting for limited UE battery power while providing critical information to first responders, as follows: 1) generate immediate crisis maps for the disaster-impacted areas, 2) provide vital information about where the majority of survivors are clustered/crowded, and 3) prioritize the impacted areas to identify regions that urgently need communication coverage.
UE-based SAROs offer first responders a vital tool to prioritize and manage SAROs efficiently and effectively in a timely manner.

    View on 5G Architecture: Version 2.0

    The 5G Architecture Working Group, as part of the 5GPPP Initiative, is looking at capturing novel trends and key technological enablers for the realization of the 5G architecture. It also aims to present, in a harmonized way, the architectural concepts developed in various projects and initiatives (not limited to 5GPPP projects) so as to provide a consolidated view of the technical directions for architecture design in the 5G era. The first version of the white paper was released in July 2016; it captured novel trends and key technological enablers for the realization of the 5G architecture vision, along with harmonized architectural concepts from 5GPPP Phase 1 projects and initiatives. Capitalizing on the architectural vision and framework set by the first version of the white paper, this Version 2.0 presents the latest findings and analyses, with a particular focus on concept evaluations, and accordingly presents the consolidated overall architecture design.

    Building upon NB-IoT networks: a roadmap towards 5G new radio networks

    Narrowband Internet of Things (NB-IoT) is a type of low-power wide-area (LPWA) technology standardized by the 3rd-Generation Partnership Project (3GPP) and based on long-term evolution (LTE) functionalities. NB-IoT has attracted significant interest from the research community due to its support for massive machine-type communication (mMTC) and various IoT use cases that have stringent specifications in terms of connectivity, energy efficiency, reachability, reliability, and latency. However, as the capacity requirements for different IoT use cases continue to grow, the various functionalities of the LTE evolved packet core (EPC) system may become overladen and inevitably suboptimal. Several research efforts are ongoing to meet these challenges; consequently, we present an overview of these efforts, mainly focusing on the Open System Interconnection (OSI) layer of the NB-IoT framework. We present an optimized architecture of the LTE EPC functionalities, as well as further discussion about the 3GPP NB-IoT standardization and its releases. Furthermore, the possible 5G architectural design for NB-IoT integration, the enabling technologies required for 5G NB-IoT, the 5G NR coexistence with NB-IoT, and the potential architectural deployment schemes of NB-IoT with cellular networks are introduced. In this article, a description of cloud-assisted relay with backscatter communication, a comprehensive review of the technical performance properties and channel communication characteristics from the perspective of the physical (PHY) and medium-access control (MAC) layer of NB-IoT, with a focus on 5G, are presented. The different limitations associated with simulating these systems are also discussed. The enabling market for NB-IoT, the benefits for a few use cases, and possible critical challenges related to their deployment are also included. 
Finally, present challenges and open research directions on the PHY and MAC properties, as well as a strengths, weaknesses, opportunities, and threats (SWOT) analysis of NB-IoT, are presented to foster prospective research activities.

    D2.2 Draft Overall 5G RAN Design

    This deliverable provides the consolidated preliminary view of the METIS-II partners on the 5th generation (5G) radio access network (RAN) design at the mid-point of the project. The overall 5G RAN is envisaged to operate over a wide range of spectrum bands comprising heterogeneous spectrum usage scenarios. More precisely, the 5G air interface (AI) is expected to be composed of multiple so-called AI variants (AIVs), which include evolved legacy technology such as Long Term Evolution Advanced (LTE-A) as well as novel AIVs, which may be tailored to particular services or frequency bands.
    Arnold, P.; Bayer, N.; Belschner, J.; Rosowski, T.; Zimmermann, G.; Ericson, M.; Da Silva, IL.... (2016). D2.2 Draft Overall 5G RAN Design. https://doi.org/10.13140/RG.2.2.17831.1424

    Improved handover decision scheme for 5G mm-wave communication: optimum base station selection using machine learning approach

    A Thesis Submitted in Partial Fulfillment of the Requirements for the Degree of Doctor of Philosophy in Information and Communication Science and Engineering of the Nelson Mandela African Institution of Science and Technology.
    The rapid growth in mobile and wireless devices has led to an exponential demand for data traffic and exacerbated the burden on conventional wireless networks. Fifth generation (5G) and beyond networks are expected not only to accommodate this growth in data demand but also to provide additional services beyond the capability of existing wireless networks, while maintaining a high quality-of-experience (QoE) for users. The need for a several-orders-of-magnitude increase in system capacity has necessitated the use of millimetre wave (mm-wave) frequencies as well as the proliferation of low-power small cells overlaying the existing macro-cell layer. These approaches offer a potential increase in throughput of several gigabits per second and a reduction in transmission latency, but they also present new challenges. For example, mm-wave frequencies have higher propagation losses and a limited coverage area, thereby escalating mobility challenges such as more frequent handovers (HOs). In addition, the advent of low-power small cells with smaller footprints also causes signal fluctuations across the network, resulting in repeated HOs (ping-pong) from one small cell (SC) to another. Therefore, efficient HO management is critical in future cellular networks, since frequent HOs pose multiple threats to the quality-of-service (QoS), such as a reduction in system throughput as well as service interruptions, which result in a poor QoE for the user. However, HO management is a significant challenge in 5G networks due to the use of mm-wave frequencies, which have much smaller footprints.
To address these challenges, this work investigates the HO performance of 5G mm-wave networks and proposes a novel method for achieving seamless user mobility in dense networks. The proposed model is based on a double deep reinforcement learning (DDRL) algorithm. To test the performance of the model, a comparative study was made between the proposed approach and benchmark solutions, including a benchmark developed as part of this thesis. The evaluation metrics considered include system throughput, execution time, ping-pong, and the scalability of the solutions. The results reveal that the developed DDRL-based solution vastly outperforms not only conventional methods but also other machine-learning-based benchmark techniques. The main contribution of this thesis is to provide an intelligent framework for mobility management in the connected state (i.e., HO management) in 5G. Though primarily developed for mm-wave links between UEs and BSs in ultra-dense heterogeneous networks (UDHNs), the proposed framework can also be applied to sub-6 GHz frequencies.

    A survey of machine learning applications to handover management in 5G and beyond

    Handover (HO) is one of the key aspects of next-generation (NG) cellular communication networks that need to be properly managed, since it poses multiple threats to quality-of-service (QoS), such as a reduction in average throughput as well as service interruptions. With the introduction of new enablers for fifth-generation (5G) networks, such as millimetre wave (mm-wave) communications, network densification, and the Internet of things (IoT), HO management is expected to be more challenging, as the number of base stations (BSs) per unit area and the number of connections have been rising dramatically. Considering the stringent requirements newly released in the standards of 5G networks, the level of the challenge is multiplied. To this end, intelligent HO management schemes have been proposed and tested in the literature, paving the way for tackling these challenges more efficiently and effectively. In this survey, we aim at revealing the current status of cellular networks and discussing mobility and HO management in 5G, alongside the general characteristics of 5G networks. We provide an extensive tutorial on HO management in 5G networks, accompanied by a discussion of machine learning (ML) applications to HO management. A novel taxonomy in terms of the source of data used to train ML algorithms is produced, in which two broad categories are considered, namely visual data and network data. The state of the art on ML-aided HO management in cellular networks under each category is extensively reviewed against the most recent studies, and the challenges as well as future research directions are detailed.

    AN EFFICIENT INTERFERENCE AVOIDANCE SCHEME FOR DEVICE-TO-DEVICE ENABLED FIFTH GENERATION NARROWBAND INTERNET OF THINGS NETWORKS

    Narrowband Internet of Things (NB-IoT) is a low-power wide-area (LPWA) technology built on long-term evolution (LTE) functionalities and standardized by the 3rd-Generation Partnership Project (3GPP). Due to its support for massive machine-type communication (mMTC) and different IoT use cases with rigorous standards in terms of connectivity, energy efficiency, reachability, reliability, and latency, NB-IoT has attracted the research community. However, as the capacity needs of various IoT use cases expand, the numerous functionalities of the LTE evolved packet core (EPC) system may become overburdened and suboptimal. Several research efforts are currently in progress to address these challenges. As a result, an overview of these efforts is discussed, with a specific focus on the optimized architecture of the LTE EPC functionalities, the 5G architectural design for NB-IoT integration, the enabling technologies necessary for 5G NB-IoT, 5G new radio (NR) coexistence with NB-IoT, and feasible architectural deployment schemes of NB-IoT with cellular networks. This thesis also presents cloud-assisted relay with backscatter communication as part of a detailed study of the technical performance attributes and channel communication characteristics of NB-IoT from the physical (PHY) and medium access control (MAC) layers, with a focus on 5G. The numerous drawbacks that come with simulating these systems are explored. The enabling market for NB-IoT, the benefits for a few use cases, and the potential critical challenges associated with their deployment are all highlighted. Fortunately, the cyclic prefix orthogonal frequency division multiplexing (CP-OFDM) based waveform adopted by 3GPP NR for enhanced mobile broadband (eMBB) services does not prohibit the use of other waveforms in other services, such as the NB-IoT service for mMTC.
    As a result, the coexistence of 5G NR and NB-IoT must be manageably orthogonal (or quasi-orthogonal) to minimize the mutual interference that limits the degrees of freedom in the waveform's overall design. Consequently, 5G coexistence with NB-IoT will introduce a new interference challenge, distinct from that of the legacy network, even though the NR's coexistence with NB-IoT is believed to improve network capacity, expand user data-rate coverage, and improve communication robustness through frequency reuse. Interference challenges may make channel estimation difficult for NB-IoT devices, limiting user performance and spectral efficiency. Various existing interference mitigation solutions either add to the network's overhead, computational complexity, and delay, or are hampered by low data rates and coverage. These algorithms are unsuitable for an NB-IoT network owing to the low-complexity nature of its devices. As a result, a D2D-communication-based interference-control technique becomes an effective strategy for addressing this problem. This thesis used D2D communication to relieve the network bottleneck in dense 5G NB-IoT networks prone to interference. For D2D-enabled 5G NB-IoT systems, the thesis presents an interference-avoidance resource allocation that considers the less favourable cell-edge NUEs. To reduce the algorithm's computational complexity and the interference power, the system divides the optimization problem into three sub-problems. First, in an orthogonal deployment technique using channel state information (CSI), the channel gain factor is leveraged by selecting a probable reuse channel with higher QoS control. Second, a bisection search approach is used to find the best power control that maximizes the network sum rate, and third, the Hungarian algorithm is used to build a maximum bipartite matching strategy to choose the optimal pairing pattern between the sets of NUEs and the D2D pairs.
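The second and third sub-problems can be sketched in miniature. The bisection step is shown as a search for the largest D2D transmit power that keeps interference at the cellular receiver under a cap (a simplified stand-in for the sum-rate-maximizing power control), and the matching step uses brute-force permutation search in place of the Hungarian algorithm, which returns the same optimum on small instances. All gains, caps, and sizes are made-up assumptions.

```python
from itertools import permutations

def max_power_bisection(g_int, i_cap, p_max, tol=1e-9):
    """Bisection on [0, p_max] for the largest power p with
    g_int * p <= i_cap (the constraint is monotone in p)."""
    lo, hi = 0.0, p_max
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if g_int * mid <= i_cap:
            lo = mid          # feasible: push the power up
        else:
            hi = mid          # infeasible: back off
    return lo

def best_pairing(rate):
    """Max-sum assignment of D2D pairs (columns) to reuse channels
    (rows); exhaustive search standing in for the Hungarian method."""
    n = len(rate)
    best = max(permutations(range(n)),
               key=lambda perm: sum(rate[i][perm[i]] for i in range(n)))
    return list(best)

p = max_power_bisection(g_int=0.5, i_cap=0.1, p_max=1.0)
print(round(p, 3))                 # -> 0.2 (= i_cap / g_int)
rate = [[3.0, 1.0], [1.0, 2.0]]    # invented rate matrix
print(best_pairing(rate))          # -> [0, 1], total rate 5.0
```

The decomposition keeps each step cheap: the bisection is logarithmic in the power resolution, and the Hungarian algorithm (unlike the brute force shown) solves the pairing in polynomial time.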
    According to the numerical results, the proposed approach improves the D2D sum rate and the overall network SINR of the 5G NB-IoT system. The maximum power constraint of the D2D pair, the D2D's location, the Pico base station (PBS) cell radius, the number of potential reuse channels, and the cluster distance all impact the D2D pair's performance. The simulation results achieve 28.35%, 31.33%, and 39% higher SINR performance than the ARSAD, DCORA, and RRA algorithms when the number of NUEs is twice the number of D2D pairs, and 2.52%, 14.80%, and 39.89% higher SINR performance than the ARSAD, RRA, and DCORA when the numbers of NUEs and D2D pairs are equal. Likewise, a D2D sum-rate increase of 9.23%, 11.26%, and 13.92% over the ARSAD, DCORA, and RRA is achieved when the number of NUEs is twice the number of D2D pairs, and of 1.18%, 4.64%, and 15.93% over the ARSAD, RRA, and DCORA, respectively, with equal numbers of NUEs and D2D pairs. The results demonstrate the efficacy of the proposed scheme. The thesis also addressed the case in which the cell-edge NUE's QoS is critical, facing challenges such as long-distance transmission, delays, low bandwidth utilization, and high system overhead that affect 5G NB-IoT network performance. In this case, most cell-edge NUEs boost their transmit power to maximize network throughput. Integrating a cooperative D2D relaying technique into 5G NB-IoT heterogeneous network (HetNet) uplink spectrum sharing increases the system's spectral efficiency but also the interference power, further degrading the network. Using a max-max SINR (Max-SINR) approach, this thesis proposed an interference-aware D2D relaying strategy to improve the QoS of a cell-edge NUE in 5G NB-IoT and achieve optimum system performance. The Lagrangian-dual technique is used to optimize the transmit power of the cell-edge NUE to the relay under an average interference power constraint, while the relay employs a fixed transmit power towards the NB-IoT base station (NBS).
    To choose an optimal D2D relay node, the channel-to-interference-plus-noise ratio (CINR) of all available D2D relays is used to maximize the minimum cell-edge NUE's data rate while ensuring that the cellular NUEs' QoS requirements are satisfied. Best-harmonic-mean, best-worst, half-duplex relay selection, and a D2D communication scheme were among the other relay selection strategies studied. The simulation results reveal that the Max-SINR selection scheme outperforms all other selection schemes, except the D2D communication scheme, due to the high channel gain between the two communicating devices. The proposed algorithm achieves 21.27% SINR performance, nearly identical to the half-duplex scheme, but outperforms the best-worst and harmonic selection techniques by 81.27% and 40.29%, respectively. As the number of D2D relays increases, the capacity likewise increases by 14.10% and 47.19% over the harmonic and half-duplex techniques, respectively. Finally, the thesis presents future research work on interference control, together with the open research directions on PHY and MAC properties and the SWOT (Strengths, Weaknesses, Opportunities, and Threats) analysis presented in Chapter 2, to encourage further study on 5G NB-IoT.
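The relay choice described above — pick the relay whose weaker hop has the best channel quality, so the bottleneck link is as strong as possible — can be sketched as a max-min selection. The candidate relays and their per-hop CINR values below are invented for illustration; the thesis's actual selection also checks the cellular NUEs' QoS constraints, which this sketch omits.

```python
# Minimal sketch of bottleneck-aware relay selection: among candidate
# D2D relays, each with a (cell-edge NUE -> relay, relay -> NBS) CINR
# pair, choose the relay that maximizes the minimum (bottleneck) hop.
# Candidate names and CINR values are hypothetical.

def select_relay(cinr_hops):
    """cinr_hops: {relay_id: (cinr_ue_to_relay, cinr_relay_to_nbs)}.
    Returns the relay id maximizing the bottleneck-hop CINR."""
    return max(cinr_hops, key=lambda r: min(cinr_hops[r]))

candidates = {
    "r1": (12.0, 4.0),   # strong first hop, weak second hop
    "r2": (7.0, 6.5),    # balanced hops
    "r3": (5.0, 9.0),    # weak first hop
}
print(select_relay(candidates))  # -> r2 (bottleneck 6.5 beats 4.0 and 5.0)
```

A two-hop relayed rate is limited by its weaker link, which is why the max-min criterion beats picking the relay with the single best hop (here, r1's 12.0 hop is wasted behind its 4.0 bottleneck).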