120 research outputs found

    A use case of low power wide area networks in future 5G healthcare applications

    Abstract. The trend throughout cellular evolution up to Long-Term Evolution (LTE) has been to offer users continuously increasing data rates. The next leap forward, towards the 5th Generation Mobile Networks (5G), will mainly address the needs of devices. Machines communicating with each other, sensors reporting to a server, and even machines communicating with humans are all aspects of the same technology: the Internet of Things (IoT). The key differentiator between Machine-to-Machine (M2M) communications and the IoT is the added feature of connecting devices and sensors not only to each other but also to the internet. An appropriate communications network is the key to enabling this connectivity. Local Area Networks (LANs) and Wide Area Networks (WANs) have been considered as enablers for the IoT, but since both suffer from limitations with respect to IoT requirements, the need for a new enabling technology became evident. Low Power Wide Area Networks (LPWANs) are networks dedicated to the needs of the IoT, such as providing low energy consumption for wireless devices. LPWANs can be categorized into proprietary LPWANs and cellular LPWANs. Proprietary LPWANs are created by alliances of companies working together on a communications standard operating in unlicensed frequency bands; an example is LoRa. Cellular LPWANs, in contrast, are standardized by the 3rd Generation Partnership Project (3GPP) and are essentially versions of the LTE standard designed specifically for machine communications; an example is Narrowband IoT (NB-IoT). This diploma thesis documents the use of LoRa and NB-IoT in a healthcare use case of the IoT. It describes the steps and challenges of deploying an LTE network at a target site, which is used by the LoRa and NB-IoT sensors to transmit data through the 5G test network (5GTN) to a desired server location for storage and later analysis. The LTE radio coverage was built by the 5GTN, owned by the University of Oulu, whose purpose is to support both research and the surrounding ecosystem in the development of future 5G.

    Towards Massive Machine Type Communications in Ultra-Dense Cellular IoT Networks: Current Issues and Machine Learning-Assisted Solutions

    The ever-increasing number of resource-constrained Machine-Type Communication (MTC) devices is leading to the critical challenge of fulfilling diverse communication requirements in dynamic and ultra-dense wireless environments. Among the different application scenarios that the upcoming 5G and beyond cellular networks are expected to support, such as eMBB, mMTC and URLLC, mMTC brings the unique technical challenge of supporting a huge number of MTC devices, which is the main focus of this paper. The related challenges include QoS provisioning, handling highly dynamic and sporadic MTC traffic, huge signalling overhead and Radio Access Network (RAN) congestion. In this regard, this paper aims to identify and analyze the involved technical issues, to review recent advances, to highlight potential solutions and to propose new research directions. First, starting with an overview of mMTC features and QoS provisioning issues, we present the key enablers for mMTC in cellular networks. Along with highlighting the inefficiency of the legacy Random Access (RA) procedure in the mMTC scenario, we then present the key features and channel access mechanisms in the emerging cellular IoT standards, namely LTE-M and NB-IoT. Subsequently, we present a framework for the performance analysis of transmission scheduling with QoS support, along with the issues involved in short data packet transmission. Next, we provide a detailed overview of the existing and emerging solutions towards addressing the RAN congestion problem, and then identify potential advantages, challenges and use cases for the application of emerging Machine Learning (ML) techniques in ultra-dense cellular networks. Out of several ML techniques, we focus on the application of a low-complexity Q-learning approach in the mMTC scenario. Finally, we discuss some open research challenges and promising future research directions. Comment: 37 pages, 8 figures, 7 tables; submitted for possible publication in IEEE Communications Surveys and Tutorials.
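
    The survey's focus on a low-complexity Q-learning approach for mMTC can be pictured with a minimal sketch. The snippet below is not the paper's algorithm; it assumes a toy setting in which each MTC device learns, via a stateless (bandit-style) Q-value update, which random-access slot to pick so as to avoid collisions. All names and parameters (NUM_DEVICES, NUM_SLOTS, ALPHA, EPSILON) are illustrative.

        import random
        from collections import defaultdict

        # Toy stateless Q-learning for random-access slot selection (illustrative only).
        # Each device keeps a Q-value per slot; reward is +1 for a collision-free
        # transmission and -1 for a collision.

        NUM_DEVICES, NUM_SLOTS, EPISODES = 20, 10, 2000
        ALPHA, EPSILON = 0.1, 0.1  # learning rate and exploration probability

        q = [defaultdict(float) for _ in range(NUM_DEVICES)]  # q[device][slot]

        for _ in range(EPISODES):
            # Each device picks a slot (epsilon-greedy).
            choices = []
            for d in range(NUM_DEVICES):
                if random.random() < EPSILON:
                    choices.append(random.randrange(NUM_SLOTS))
                else:
                    choices.append(max(range(NUM_SLOTS), key=lambda s: q[d][s]))
            # Reward: +1 if the device was alone in its slot, -1 otherwise.
            for d, slot in enumerate(choices):
                reward = 1.0 if choices.count(slot) == 1 else -1.0
                q[d][slot] += ALPHA * (reward - q[d][slot])  # bandit-style update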

    Radio resource allocation for overlay D2D-based vehicular communications in future wireless networks

    Next-generation cellular networks are envisioned to enable widespread Device-to-Device (D2D) communication. For many applications in the D2D domain, deterministic communication latency and high reliability are of central importance. The proximity service provided by D2D communication is a promising feature that can fulfil the reliability and latency requirements of emerging vertical applications. One of the prominent vertical applications is vehicular communication, in which vehicles disseminate safety-critical messages directly through D2D communication, thereby helping to reduce traffic accidents and road fatalities. Radio resource allocation techniques for D2D communication, through which valuable radio resources are allocated more efficiently, have recently gained much attention in industry and academia. In addition to resource allocation, energy efficiency is increasingly important and is usually considered in conjunction with the resource allocation approach. This dissertation studies different avenues of radio resource allocation and energy efficiency techniques in Long Term Evolution (LTE) and New Radio (NR) Vehicle-to-Everything (V2X) communications. In the following, we briefly describe the core ideas of this study. D2D applications are mostly characterized by relatively small traffic payloads, and in LTE, due to the coarse granularity of resource allocation, the radio resources cannot be utilized efficiently. Particularly with semi-persistent scheduling, where a radio resource is scheduled for a longer time in overlay D2D, the radio resources are underutilized for such applications.
To address this problem, a hierarchical radio resource management scheme, i.e., a sub-granting scheme, is proposed by which nearby cellular users, so-called beneficiary users, are allowed to reuse unused radio resources indicated by sub-granting signaling. The proposed scheme is evaluated and compared with shortened Transmission Time Interval (TTI) schemes in terms of cell throughput. Then, the beneficiary user selection problem is investigated and cast as a maximization of uplink cell throughput subject to reliability and latency requirements. A heuristic centralized algorithm, the Dedicated Sub-Granting Radio Resource (DSGRR) algorithm, is proposed to address the original beneficiary user selection problem. The simulation results and analysis show the superiority of the proposed DSGRR algorithm over random beneficiary user selection in terms of cell throughput in a scenario with stationary users. Further, the beneficiary user selection problem is investigated in a dynamic scenario where all users are moving. We evaluate the sub-granting signaling overhead due to mobility in the DSGRR, and then a distributed heuristic algorithm, the Open Sub-Granting Radio Resource (OSGRR) algorithm, is proposed and compared with the DSGRR algorithm and with the case of no sub-granting. Simulation results show improved cell throughput for the OSGRR compared with the other algorithms. Moreover, the overhead incurred by the OSGRR is lower than that of the DSGRR, while the achieved cell throughput remains close to the maximum achievable uplink cell throughput. In addition, joint resource allocation and energy efficiency in autonomous resource selection in NR Mode 2 is examined. The autonomous resource selection is formulated as a ratio of sum rate to energy consumption. The objective is to minimize the energy consumption of the power-saving users subject to reliability and latency requirements. A heuristic algorithm, Density of Traffic-based Resource Allocation (DeTRA), is proposed to solve the problem. The proposed algorithm splits the resource pool based on the traffic density per traffic type; random selection is then mandated to be performed on the dedicated resource pool upon arrival of aperiodic traffic. The simulation results show that the proposed algorithm achieves the same packet reception ratio (PRR) as the sensing-based algorithm. In addition, per-user power consumption is reduced, and consequently the energy efficiency is improved by applying the DeTRA algorithm. The research in this study leverages radio resource allocation techniques in LTE-based D2D communications to utilize radio resources more efficiently. In addition, the conducted research paves the way for further study of how power-saving users can optimally select radio resources with minimum energy consumption in NR V2X communications.
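
    As a rough illustration of the resource-pool splitting idea described for DeTRA, the sketch below divides a pool of candidate resources between periodic and aperiodic traffic in proportion to an assumed traffic density, and draws randomly from the dedicated sub-pool when aperiodic data arrives. It is a simplified reading of the abstract, not the dissertation's algorithm; all names and parameters are illustrative.

        import random

        def split_pool(resources, density_periodic, density_aperiodic):
            """Split a resource pool proportionally to per-traffic-type density (illustrative)."""
            total = density_periodic + density_aperiodic
            n_periodic = round(len(resources) * density_periodic / total)
            return resources[:n_periodic], resources[n_periodic:]

        def select_resource(periodic_pool, aperiodic_pool, is_aperiodic):
            # Aperiodic arrivals are forced onto their dedicated sub-pool and pick randomly;
            # periodic traffic would normally use sensing-based selection on its own sub-pool.
            pool = aperiodic_pool if is_aperiodic else periodic_pool
            return random.choice(pool)

        pool = list(range(20))  # 20 candidate resources
        periodic_pool, aperiodic_pool = split_pool(pool, density_periodic=3.0, density_aperiodic=1.0)
        print(select_resource(periodic_pool, aperiodic_pool, is_aperiodic=True))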

    Internet of Things and Sensors Networks in 5G Wireless Communications

    This book is a printed edition of the Special Issue Internet of Things and Sensors Networks in 5G Wireless Communications that was published in Sensors


    Internet of Things and Sensors Networks in 5G Wireless Communications

    The Internet of Things (IoT) has attracted much attention from society, industry and academia as a promising technology that can enhance day-to-day activities, enable the creation of new business models, products and services, and serve as a broad source of research topics and ideas. A future digital society is envisioned, composed of numerous wirelessly connected sensors and devices. Driven by huge demand, the massive IoT (mIoT) or massive machine-type communication (mMTC) has been identified as one of the three main communication scenarios for 5G. In addition to connectivity, computing, storage and data management are long-standing issues for low-cost devices and sensors. The book is a collection of outstanding technical research and industrial papers covering new research results, with a wide range of features within the 5G-and-beyond framework. It provides a range of discussions of the major research challenges and achievements within this topic.

    Ultra reliable low latency communication in MTC network

    Abstract. The Internet of Things is progressing towards building a smart society, and wireless networks are critical enablers for many of its use cases. In this thesis, we present the vital concepts of diversity and multi-connectivity for achieving ultra-reliability and low latency in machine-type wireless communication networks. Diversity is one of the critical factors in dealing with fading channel impairments, which in turn is crucial for achieving the targeted outage probabilities and approaching the five-nines reliability requirement defined by some standardization bodies. We evaluate an interference-limited network composed of multiple remote radio heads connected to the user equipment. Some of those links are allowed to cooperate, thus reducing interference, or to perform more elaborate strategies such as selection combining or maximal ratio combining. We derive closed-form analytical expressions for the respective outage probabilities. We provide extensive numerical analysis and discuss the gains of cooperation and multi-connectivity enabled by a centralized radio access network.
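
    For reference, the standard textbook outage probabilities for selection combining (SC) and maximal ratio combining (MRC) over N independent Rayleigh-faded branches with mean SNR \(\bar{\gamma}\) and outage threshold \(\gamma_{\mathrm{th}}\) are given below. These are only the idealized, noise-limited baseline forms, not the interference-limited closed-form expressions derived in the thesis.

        \[
        P_{\mathrm{out}}^{\mathrm{SC}}
          = \Pr\!\Big[\max_{n}\,\gamma_n < \gamma_{\mathrm{th}}\Big]
          = \left(1 - e^{-\gamma_{\mathrm{th}}/\bar{\gamma}}\right)^{N},
        \qquad
        P_{\mathrm{out}}^{\mathrm{MRC}}
          = \Pr\!\Big[\textstyle\sum_{n=1}^{N}\gamma_n < \gamma_{\mathrm{th}}\Big]
          = 1 - e^{-\gamma_{\mathrm{th}}/\bar{\gamma}}
            \sum_{k=0}^{N-1}\frac{1}{k!}\left(\frac{\gamma_{\mathrm{th}}}{\bar{\gamma}}\right)^{k}.
        \]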

    Building upon NB-IoT networks: a roadmap towards 5G new radio networks

    Narrowband Internet of Things (NB-IoT) is a type of low-power wide-area (LPWA) technology standardized by the 3rd-Generation Partnership Project (3GPP) and based on long-term evolution (LTE) functionalities. NB-IoT has attracted significant interest from the research community due to its support for massive machine-type communication (mMTC) and various IoT use cases that have stringent specifications in terms of connectivity, energy efficiency, reachability, reliability, and latency. However, as the capacity requirements for different IoT use cases continue to grow, the various functionalities of the LTE evolved packet core (EPC) system may become overloaded and inevitably suboptimal. Several research efforts are ongoing to meet these challenges; consequently, we present an overview of these efforts, mainly focusing on the Open Systems Interconnection (OSI) layers of the NB-IoT framework. We present an optimized architecture of the LTE EPC functionalities, as well as further discussion of the 3GPP NB-IoT standardization and its releases. Furthermore, the possible 5G architectural design for NB-IoT integration, the enabling technologies required for 5G NB-IoT, the coexistence of 5G NR with NB-IoT, and the potential architectural deployment schemes of NB-IoT with cellular networks are introduced. The article also describes cloud-assisted relaying with backscatter communication and provides a comprehensive review of the technical performance properties and channel communication characteristics from the perspective of the physical (PHY) and medium-access control (MAC) layers of NB-IoT, with a focus on 5G. The limitations associated with simulating these systems are discussed, as are the enabling market for NB-IoT, the benefits for selected use cases, and critical challenges related to their deployment. Finally, open research directions on the PHY and MAC properties, as well as a strengths, weaknesses, opportunities, and threats (SWOT) analysis of NB-IoT, are presented to foster prospective research activities.

    Enabling Technologies for Internet of Things: Licensed and Unlicensed Techniques

    The Internet of Things (IoT) is a novel paradigm that is shaping the evolution of the future Internet. According to the vision underlying the IoT, the next step in increasing the ubiquity of the Internet, after connecting people anytime and everywhere, is to connect inanimate objects. By providing objects with embedded communication capabilities and a common addressing scheme, a highly distributed and ubiquitous network of seamlessly connected heterogeneous devices is formed, which can be fully integrated into the current Internet and mobile networks, thus allowing for the development of new intelligent services available anytime, anywhere, to anyone and anything. Such a vision is also becoming known under the name of Machine-to-Machine (M2M) communication, where the absence of human interaction in the system dynamics is further emphasized. A massive number of wireless devices will have the ability to connect to the Internet through the IoT framework. With the accelerating pace at which this framework is being brought to market, the new wireless communications standards are studying and proposing solutions to incorporate the services needed for the IoT. However, with an estimated 30 billion connected devices, many challenges face current wireless technology. In our research, we address a variety of candidate technologies for enabling such a massive framework. Mainly, we focus on underlay cognitive radio networks as the unlicensed candidate for the IoT. On the other hand, we look into the current efforts of the standardization bodies to accommodate the requirements of the IoT in current cellular networks, surveying the new features and the new user equipment categories added to the physical layer of LTE-A. In particular, we study the performance of a dual-hop cognitive radio network sharing the spectrum of a primary network in an underlay fashion. The cognitive network consists of a source, a destination, and multiple nodes employed as amplify-and-forward relays. To improve spectral efficiency, all relays are allowed to transmit to the destination simultaneously over the same frequency band. We present the optimal power allocation that maximizes the received signal-to-noise ratio (SNR) at the destination while satisfying the interference constraints of the primary network. The optimal power allocation is obtained through an eigen-solution of a channel-dependent matrix and is shown to transform the transmission over the non-orthogonal relays into parallel channels. Furthermore, since the secondary destination is equipped with multiple antennas, we propose an antenna selection scheme to select the antenna with the highest SNR. To this end, we propose a clustering scheme to subgroup the available relays and use antenna selection at the receiver to extract the same diversity order. We show that random clustering causes the system to lose some of the available degrees of freedom. We provide analytical expressions for the outage probability of the system for random clustering and for the proposed maximum-SNR clustering scheme with antenna selection. In addition, we adapt our design to increase the energy efficiency of the overall network without significant loss in data rate. In the second part of this thesis, we look into the current efforts of the standardization bodies to accommodate the requirements of the IoT in current cellular networks.
Specifically, we present the new features and the new user equipment categories added to the physical layer of LTE-A and study some of the challenges facing LTE-A when dealing with Machine-Type Communications (MTC). In particular, the MTC Physical Downlink Control Channel (MPDCCH) is among the newly introduced features in LTE-A and carries the downlink control information (DCI) for MTC devices. Correctly decoding the MPDCCH depends mainly on the channel estimation used to compensate for channel errors during transmission, and the choice of estimation technique affects both the complexity and the performance of the user equipment. We propose and assess the performance of a simple channel estimation technique that relies on Least Squares (LS) estimates of the pilot signals and linear interpolation, for the low-Doppler channels associated with MTC applications.
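
    A minimal sketch of the kind of pilot-based estimator described above is given below: least-squares estimates at pilot subcarriers followed by linear interpolation across the remaining subcarriers. It assumes a simple OFDM-like grid with known pilot symbols and is illustrative only, not the thesis's implementation; all variable names are hypothetical.

        import numpy as np

        def ls_channel_estimate(rx_symbols, pilot_idx, pilot_symbols, num_subcarriers):
            """LS estimate at pilot positions, then linear interpolation over the full grid.

            rx_symbols    : received frequency-domain symbols (length num_subcarriers)
            pilot_idx     : indices of pilot subcarriers (sorted)
            pilot_symbols : known transmitted pilot symbols at those indices
            """
            h_ls = rx_symbols[pilot_idx] / pilot_symbols  # LS estimate: H = Y / X at pilots
            all_idx = np.arange(num_subcarriers)
            # Interpolate real and imaginary parts separately across subcarriers.
            h_real = np.interp(all_idx, pilot_idx, h_ls.real)
            h_imag = np.interp(all_idx, pilot_idx, h_ls.imag)
            return h_real + 1j * h_imag

        # Example with a toy 12-subcarrier grid and pilots on every 3rd subcarrier.
        rng = np.random.default_rng(0)
        h_true = rng.standard_normal(12) + 1j * rng.standard_normal(12)
        pilots_idx = np.arange(0, 12, 3)
        pilots = np.ones(len(pilots_idx), dtype=complex)  # known pilot symbols
        rx = h_true * 1.0                                  # noiseless toy reception of all-ones symbols
        print(ls_channel_estimate(rx, pilots_idx, pilots, 12))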

    AN EFFICIENT INTERFERENCE AVOIDANCE SCHEME FOR DEVICE-TO-DEVICE ENABLED FIFTH GENERATION NARROWBAND INTERNET OF THINGS NETWORKS

    Narrowband Internet of Things (NB-IoT) is a low-power wide-area (LPWA) technology built on long-term evolution (LTE) functionalities and standardized by the 3rd-Generation Partnership Project (3GPP). Due to its support for massive machine-type communication (mMTC) and different IoT use cases with rigorous requirements in terms of connectivity, energy efficiency, reachability, reliability, and latency, NB-IoT has attracted the attention of the research community. However, as the capacity needs of various IoT use cases expand, the numerous functionalities of the LTE evolved packet core (EPC) system may become overburdened and suboptimal. Several research efforts are currently in progress to address these challenges. As a result, an overview of these efforts is given, with a specific focus on the optimized architecture of the LTE EPC functionalities, the 5G architectural design for NB-IoT integration, the enabling technologies necessary for 5G NB-IoT, 5G new radio (NR) coexistence with NB-IoT, and feasible architectural deployment schemes of NB-IoT with cellular networks. This thesis also presents cloud-assisted relaying with backscatter communication as part of a detailed study of the technical performance attributes and channel communication characteristics from the physical (PHY) and medium access control (MAC) layers of NB-IoT, with a focus on 5G. The drawbacks that come with simulating these systems are explored, and the enabling market for NB-IoT, the benefits for a few use cases, and the potential critical challenges associated with their deployment are highlighted. The cyclic prefix orthogonal frequency division multiplexing (CP-OFDM) based waveform adopted by 3GPP NR for enhanced mobile broadband (eMBB) services does not prohibit the use of other waveforms in other services, such as the NB-IoT service for mMTC. Consequently, the coexistence of 5G NR and NB-IoT must be kept orthogonal (or quasi-orthogonal) to minimize the mutual interference that limits the degrees of freedom in the overall waveform design. 5G coexistence with NB-IoT will therefore introduce a new interference challenge, distinct from that of the legacy network, even though the NR's coexistence with NB-IoT is believed to improve network capacity, expand user data rate coverage, and improve communication robustness through frequency reuse. Interference may make channel estimation difficult for NB-IoT devices, limiting user performance and spectral efficiency. Various existing interference mitigation solutions either add to the network's overhead, computational complexity, and delay, or are hampered by low data rate and coverage; such algorithms are unsuitable for an NB-IoT network owing to the low-complexity nature of its devices. A D2D communication based interference-control technique therefore becomes an effective strategy for addressing this problem. This thesis uses D2D communication to reduce the network bottleneck in dense 5G NB-IoT networks prone to interference. For D2D-enabled 5G NB-IoT systems, the thesis presents an interference-avoidance resource allocation scheme that considers the less favourable cell-edge NUEs. To simplify the algorithm's computational complexity and reduce interference power, the optimization problem is divided into three sub-problems. First, in an orthogonal deployment technique using channel state information (CSI), the channel gain factor is leveraged by selecting a probable reuse channel with higher QoS control.
Second, a bisection search approach is used to find the power control that maximizes the network sum rate, and third, the Hungarian algorithm is used to build a maximum bipartite matching strategy that chooses the optimal pairing pattern between the sets of NUEs and the D2D pairs. According to the numerical results, the proposed approach improves the D2D sum rate and the overall network SINR of the 5G NB-IoT system. The maximum power constraint of the D2D pair, the D2D location, the pico base station (PBS) cell radius, the number of potential reuse channels, and the cluster distance all impact the D2D pair's performance. The simulation results show SINR gains of 28.35%, 31.33%, and 39% over the ARSAD, DCORA, and RRA algorithms when the number of NUEs is twice the number of D2D pairs, and gains of 2.52%, 14.80%, and 39.89% over the ARSAD, RRA, and DCORA when the numbers of NUEs and D2D pairs are equal. Likewise, a D2D sum-rate increase of 9.23%, 11.26%, and 13.92% over the ARSAD, DCORA, and RRA is achieved when the number of NUEs is twice the number of D2D pairs, and an increase of 1.18%, 4.64%, and 15.93% over the ARSAD, RRA, and DCORA, respectively, with an equal number of NUEs and D2D pairs. The results demonstrate the efficacy of the proposed scheme. The thesis also addresses the case where the cell-edge NUE's QoS suffers from challenges such as long-distance transmission, delays, low bandwidth utilization, and high system overhead that affect 5G NB-IoT network performance. In this case, most cell-edge NUEs boost their transmit power to maximize network throughput. Integrating a cooperative D2D relaying technique into 5G NB-IoT heterogeneous network (HetNet) uplink spectrum sharing increases the system's spectral efficiency but also the interference power, further degrading the network. Using a max-max SINR (Max-SINR) approach, this thesis proposes an interference-aware D2D relaying strategy that improves the QoS of a cell-edge NUE so as to achieve optimum system performance. The Lagrangian dual technique is used to optimize the transmit power of the cell-edge NUE to the relay based on an average interference power constraint, while the relay employs a fixed transmit power towards the NB-IoT base station (NBS). To choose an optimal D2D relay node, the channel-to-interference-plus-noise ratio (CINR) of all available D2D relays is used to maximize the minimum cell-edge NUE's data rate while ensuring that the cellular NUEs' QoS requirements are satisfied. Best harmonic mean, best-worst, half-duplex relay selection, and a D2D communication scheme were among the other relay selection strategies studied. The simulation results reveal that the Max-SINR selection scheme outperforms all other selection schemes, except the D2D communication scheme, due to the high channel gain between the two communicating devices. The proposed algorithm achieves 21.27% SINR performance, which is nearly identical to the half-duplex scheme, and outperforms the best-worst and harmonic selection techniques by 81.27% and 40.29%, respectively. As the number of D2D relays increases, the capacity increases by 14.10% and 47.19% over the harmonic and half-duplex techniques, respectively. Finally, the thesis presents future research work on interference control, together with the open research directions on PHY and MAC properties and the SWOT (strengths, weaknesses, opportunities, and threats) analysis presented in Chapter 2, to encourage further study on 5G NB-IoT.
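
    As a rough illustration of the second and third sub-problems described above, the sketch below combines a bisection search over a single transmit-power variable with the Hungarian algorithm (via scipy.optimize.linear_sum_assignment) for bipartite NUE-to-D2D-pair matching. It is a generic sketch under simplified assumptions, not the thesis's algorithm; the rate model and all parameters are illustrative.

        import numpy as np
        from scipy.optimize import linear_sum_assignment

        def bisect_power(rate_fn, p_min, p_max, tol=1e-6):
            """Bisection on the slope of a concave, unimodal rate function to find the
            power level that maximizes it (illustrative only)."""
            lo, hi = p_min, p_max
            while hi - lo > tol:
                mid = 0.5 * (lo + hi)
                if rate_fn(mid + tol) > rate_fn(mid - tol):  # still increasing -> move right
                    lo = mid
                else:
                    hi = mid
            return 0.5 * (lo + hi)

        # Toy concave rate model: log2(1 + g*p) - c*p (rate gain vs. interference cost).
        g, c = 2.0, 0.3
        p_star = bisect_power(lambda p: np.log2(1.0 + g * p) - c * p, 0.0, 10.0)

        # Hungarian matching between NUEs (rows) and D2D pairs (columns) on a toy sum-rate matrix.
        rng = np.random.default_rng(1)
        sum_rate = rng.uniform(0.1, 5.0, size=(4, 4))   # sum_rate[i, j]: rate if NUE i shares with D2D pair j
        rows, cols = linear_sum_assignment(-sum_rate)   # maximize by negating the cost matrix
        print(p_star, list(zip(rows, cols)))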