83 research outputs found

    Relaying in the Internet of Things (IoT): A Survey

    The deployment of relays between Internet of Things (IoT) end devices and gateways can improve link quality. In cellular-based IoT, relays have the potential to reduce base station overload. The energy expended in single-hop long-range communication can be reduced if relays listen to transmissions of end devices and forward these observations to gateways. However, incorporating relays into IoT networks faces several challenges. IoT end devices are designed primarily for uplink communication of small-sized observations toward the network; hence, opportunistically using end devices as relays requires a redesign of the medium access control (MAC) layer protocol of such end devices and possibly the addition of new communication interfaces. Additionally, the wake-up time of IoT end devices needs to be synchronized with that of the relays. For cellular-based IoT, infrastructure relays can be used, and noncellular IoT networks can leverage the presence of mobile devices for relaying, for example, in remote healthcare. However, the latter raises the problems of incentivizing relay participation and managing relay mobility. Furthermore, although relays can increase the lifetime of IoT networks, deploying relays implies the need for additional batteries to power them, which can erode the energy efficiency gain that relays offer. Therefore, designing relay-assisted IoT networks that provide acceptable trade-offs is key, and this goes beyond adding an extra transmit RF chain to a relay-enabled IoT end device. There has been increasing research interest in IoT relaying, as the available literature demonstrates. This paper surveys works that consider these issues to capture the state of the art, offer design insights for network designers, and motivate future research directions.

    AN EFFICIENT INTERFERENCE AVOIDANCE SCHEME FOR DEVICE-TO-DEVICE ENABLED FIFTH GENERATION NARROWBAND INTERNET OF THINGS NETWORKS

    Narrowband Internet of Things (NB-IoT) is a low-power wide-area (LPWA) technology built on long-term evolution (LTE) functionalities and standardized by the 3rd-Generation Partnership Project (3GPP). Owing to its support for massive machine-type communication (mMTC) and for diverse IoT use cases with rigorous requirements in terms of connectivity, energy efficiency, reachability, reliability, and latency, NB-IoT has attracted the research community. However, as the capacity needs of various IoT use cases expand, the numerous functionalities of the LTE evolved packet core (EPC) system may become overburdened and suboptimal. Several research efforts are currently in progress to address these challenges. Accordingly, an overview of these efforts is presented, with a specific focus on the optimized architecture of the LTE EPC functionalities, the 5G architectural design for NB-IoT integration, the enabling technologies necessary for 5G NB-IoT, 5G new radio (NR) coexistence with NB-IoT, and feasible architectural deployment schemes of NB-IoT with cellular networks. This thesis also presents cloud-assisted relay with backscatter communication as part of a detailed study of the technical performance attributes and channel communication characteristics of the NB-IoT physical (PHY) and medium access control (MAC) layers, with a focus on 5G. The numerous drawbacks that come with simulating these systems are explored. The enabling market for NB-IoT, the benefits for a few use cases, and the potential critical challenges associated with their deployment are all highlighted. Fortunately, the cyclic-prefix orthogonal frequency division multiplexing (CP-OFDM) waveform adopted by 3GPP NR for enhanced mobile broadband (eMBB) services does not prohibit the use of other waveforms in other services, such as the NB-IoT service for mMTC. Consequently, the coexistence of 5G NR and NB-IoT must be kept manageably orthogonal (or quasi-orthogonal) to minimize the mutual interference that limits the degrees of freedom in the waveform's overall design. Thus, 5G coexistence with NB-IoT will introduce a new interference challenge, distinct from that of the legacy network, even though the NR's coexistence with NB-IoT is believed to improve network capacity, expand user data rate coverage, and enable more robust communication through frequency reuse. Interference may make channel estimation difficult for NB-IoT devices, limiting user performance and spectral efficiency. Various existing interference mitigation solutions either add to the network's overhead, computational complexity, and delay, or are hampered by low data rate and coverage; such algorithms are unsuitable for an NB-IoT network owing to the low-complexity requirements of its devices. As a result, a D2D communication based interference-control technique becomes an effective strategy for addressing this problem. This thesis uses D2D communication to reduce the network bottleneck in dense, interference-prone 5G NB-IoT networks. For D2D-enabled 5G NB-IoT systems, the thesis presents an interference-avoidance resource allocation scheme that considers the less favourable cell-edge NUEs. To simplify the algorithm's computational complexity and reduce interference power, the scheme divides the optimization problem into three sub-problems. First, under an orthogonal deployment that uses channel state information (CSI), the channel gain factor is leveraged to select a probable reuse channel with higher QoS control.
    Second, a bisection search approach is used to find the best power control that maximizes the network sum rate, and third, the Hungarian algorithm is used to build a maximum bipartite matching strategy that chooses the optimal pairing pattern between the sets of NUEs and the D2D pairs. According to the numerical results, the proposed approach improves the D2D sum rate and the overall network SINR of the 5G NB-IoT system. The maximum power constraint of the D2D pair, the D2D pair's location, the pico base station (PBS) cell radius, the number of potential reuse channels, and the cluster distance all impact the D2D pair's performance. In the simulations, the proposed scheme achieves SINR performance 28.35%, 31.33%, and 39% higher than the ARSAD, DCORA, and RRA algorithms, respectively, when the number of NUEs is twice the number of D2D pairs, and 2.52%, 14.80%, and 39.89% higher than the ARSAD, RRA, and DCORA, respectively, when the numbers of NUEs and D2D pairs are equal. Similarly, a D2D sum rate increase of 9.23%, 11.26%, and 13.92% over the ARSAD, DCORA, and RRA, respectively, is achieved when the number of NUEs is twice the number of D2D pairs, and an increase of 1.18%, 4.64%, and 15.93% over the ARSAD, RRA, and DCORA, respectively, with an equal number of NUEs and D2D pairs. The results demonstrate the efficacy of the proposed scheme. The thesis also addresses the case in which the cell-edge NUE's QoS is threatened by challenges such as long-distance transmission, delay, low bandwidth utilization, and high system overhead, all of which affect 5G NB-IoT network performance. In this case, most cell-edge NUEs boost their transmit power to maximize network throughput. Integrating a cooperative D2D relaying technique into 5G NB-IoT heterogeneous network (HetNet) uplink spectrum sharing increases the system's spectral efficiency but also its interference power, further degrading the network. Using a max-max SINR (Max-SINR) approach, this thesis proposes an interference-aware D2D relaying strategy for 5G NB-IoT QoS improvement for a cell-edge NUE to achieve optimum system performance. The Lagrangian-dual technique is used to optimize the transmit power of the cell-edge NUE to the relay based on the average interference power constraint, while the relay to the NB-IoT base station (NBS) employs a fixed transmit power. To choose an optimal D2D relay node, the channel-to-interference-plus-noise ratio (CINR) of all available D2D relays is used to maximize the minimum cell-edge NUE's data rate while ensuring that the cellular NUEs' QoS requirements are satisfied. Best harmonic mean, best-worst, half-duplex relay selection, and a D2D communication scheme were among the other relay selection strategies studied. The simulation results reveal that the Max-SINR selection scheme outperforms all other selection schemes except the D2D communication scheme, owing to the high channel gain between the two communicating devices. The proposed algorithm achieves 21.27% SINR performance, which is nearly identical to the half-duplex scheme, but outperforms the best-worst and harmonic selection techniques by 81.27% and 40.29%, respectively. As a result, as the number of D2D relays increases, the capacity increases by 14.10% and 47.19% over the harmonic and half-duplex techniques, respectively. Finally, the thesis presents future research directions on interference control, together with the open research directions on PHY and MAC properties and the SWOT (strengths, weaknesses, opportunities, and threats) analysis presented in Chapter 2, to encourage further study of 5G NB-IoT.
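    The three-step decomposition above (channel reuse selection, bisection power control, Hungarian matching) can be illustrated in miniature. The following Python sketch covers only the second and third sub-problems, under assumptions of my own: Rayleigh-faded link gains and hypothetical network parameters stand in for the thesis's channel model and values. It bisects for the largest D2D transmit power that preserves a cellular NUE's SINR floor, then runs Hungarian matching (SciPy's linear_sum_assignment) between NUE channels and D2D pairs.

```python
# Toy sketch of sub-problems 2 and 3 described above. All gains, powers and
# network sizes are hypothetical stand-ins, not values from the thesis.
import numpy as np
from scipy.optimize import linear_sum_assignment

rng = np.random.default_rng(0)
N_NUE, N_D2D = 8, 4            # hypothetical numbers of NUEs and D2D pairs
P_NUE, P_MAX = 0.2, 0.2        # transmit powers (W), illustrative
NOISE = 1e-13                  # noise power (W)
SINR_MIN_NUE = 2.0             # cellular QoS floor (linear scale)

# Illustrative Rayleigh-faded link gains
g_nue = rng.exponential(1e-10, N_NUE)            # NUE -> base station
g_d2d = rng.exponential(1e-9, N_D2D)             # D2D transmitter -> receiver
g_n2d = rng.exponential(1e-12, (N_NUE, N_D2D))   # NUE interference at D2D rx
g_d2b = rng.exponential(1e-12, N_D2D)            # D2D interference at base station

def max_d2d_power(n, d, iters=40):
    """Bisection: largest D2D power keeping NUE n's uplink SINR above the floor."""
    nue_sinr = lambda p: P_NUE * g_nue[n] / (NOISE + p * g_d2b[d])
    if nue_sinr(P_MAX) >= SINR_MIN_NUE:
        return P_MAX                     # power constraint inactive
    lo, hi = 0.0, P_MAX
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if nue_sinr(mid) >= SINR_MIN_NUE else (lo, mid)
    return lo

# Achievable D2D rate for each (NUE channel, D2D pair) combination
rate = np.zeros((N_NUE, N_D2D))
for n in range(N_NUE):
    for d in range(N_D2D):
        p = max_d2d_power(n, d)
        rate[n, d] = np.log2(1 + p * g_d2d[d] / (NOISE + P_NUE * g_n2d[n, d]))

# Hungarian algorithm: maximum-weight bipartite matching of channels to pairs
rows, cols = linear_sum_assignment(-rate)
print("pairing:", list(zip(rows, cols)), "sum rate:", rate[rows, cols].sum())
```

    Because the NUE's SINR is monotone decreasing in the D2D power, the bisection converges to the boundary power; the matching step then maximizes the D2D sum rate given those per-combination power limits.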

    Building upon NB-IoT networks: a roadmap towards 5G new radio networks

    Narrowband Internet of Things (NB-IoT) is a type of low-power wide-area (LPWA) technology standardized by the 3rd-Generation Partnership Project (3GPP) and based on long-term evolution (LTE) functionalities. NB-IoT has attracted significant interest from the research community due to its support for massive machine-type communication (mMTC) and various IoT use cases that have stringent specifications in terms of connectivity, energy efficiency, reachability, reliability, and latency. However, as the capacity requirements for different IoT use cases continue to grow, the various functionalities of the LTE evolved packet core (EPC) system may become overloaded and inevitably suboptimal. Several research efforts are ongoing to meet these challenges; consequently, we present an overview of these efforts, mainly focusing on the Open Systems Interconnection (OSI) layers of the NB-IoT framework. We present an optimized architecture of the LTE EPC functionalities, as well as further discussion of the 3GPP NB-IoT standardization and its releases. Furthermore, the possible 5G architectural design for NB-IoT integration, the enabling technologies required for 5G NB-IoT, the 5G NR coexistence with NB-IoT, and the potential architectural deployment schemes of NB-IoT with cellular networks are introduced. In this article, a description of cloud-assisted relay with backscatter communication and a comprehensive review of the technical performance properties and channel communication characteristics from the perspective of the physical (PHY) and medium-access control (MAC) layers of NB-IoT, with a focus on 5G, are presented. The different limitations associated with simulating these systems are also discussed. The enabling market for NB-IoT, the benefits for a few use cases, and possible critical challenges related to their deployment are also included. Finally, the present challenges and open research directions on the PHY and MAC properties, as well as a strengths, weaknesses, opportunities, and threats (SWOT) analysis of NB-IoT, are presented to foster prospective research activities.

    Energy efficiency in short and wide-area IoT technologies—A survey

    In recent years, the Internet of Things (IoT) has emerged as a key application context in the design and evolution of technologies in the transition toward a 5G ecosystem. More and more IoT technologies have entered the market and represent important enablers in the deployment of networks of interconnected devices. As network and spatial device densities grow, energy efficiency and consumption are becoming important aspects in analyzing the performance and suitability of different technologies. In this framework, this survey presents an extensive review of IoT technologies, including both Low-Power Short-Area Networks (LPSANs) and Low-Power Wide-Area Networks (LPWANs), from the perspective of energy efficiency and power consumption. Existing consumption models and energy efficiency mechanisms are categorized, analyzed, and discussed in order to highlight the main trends proposed in the literature and standards toward achieving energy-efficient IoT networks. Current limitations and open challenges are also discussed, with the aim of highlighting new possible research directions.
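    To make the notion of a consumption model concrete, here is a minimal sketch of the duty-cycle style of model that such surveys categorize: a node's average current is the time-weighted sum of its sleep, receive, and transmit currents, and battery lifetime follows directly. All current draws, window lengths, and the battery capacity below are illustrative placeholders, not figures from any surveyed technology.

```python
# Toy duty-cycle consumption model: average current is the time-weighted sum
# of per-state currents; lifetime = battery capacity / average current.
# Every number below is illustrative, not taken from any specific standard.

def avg_current_mA(i_sleep, i_rx, i_tx, t_rx, t_tx, period):
    """Time-weighted average current (mA) over one reporting period (s)."""
    t_sleep = period - t_rx - t_tx
    return (i_sleep * t_sleep + i_rx * t_rx + i_tx * t_tx) / period

def lifetime_years(capacity_mAh, i_avg_mA):
    """Battery lifetime in years for a given average current draw."""
    return capacity_mAh / i_avg_mA / (24 * 365)

# Hypothetical LPWAN node: 15 min reporting, 1 s receive window, 2 s transmit
i_avg = avg_current_mA(i_sleep=0.002, i_rx=11.0, i_tx=28.0,
                       t_rx=1.0, t_tx=2.0, period=900.0)
print(f"average current {i_avg:.3f} mA, "
      f"lifetime {lifetime_years(2400, i_avg):.1f} years on 2400 mAh")
```

    Under these toy numbers the sleep state dominates the time budget while the transmit burst dominates the charge budget, which is exactly the trade-off that duty-cycling mechanisms target.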

    Direct communication radio interface for new radio multicasting and cooperative positioning

    Cotutelle thesis; defending university: Università Mediterranea di Reggio Calabria. Recently, the popularity of millimeter wave (mmWave) wireless networks has increased due to their capability to cope with the escalation of mobile data demands caused by the unprecedented proliferation of smart devices in the fifth generation (5G). The extremely-high-frequency, or mmWave, band is a fundamental pillar in the provision of the expected gigabit data rates. Hence, according to both the academic and industrial communities, mmWave technology, e.g., 5G New Radio (NR) and WiGig (60 GHz), is considered one of the main components of 5G and beyond networks. In particular, the 3rd Generation Partnership Project (3GPP) provides for the use of licensed mmWave sub-bands for 5G mmWave cellular networks, whereas IEEE actively explores the unlicensed band at 60 GHz for next-generation wireless local area networks. In this regard, mmWave has been envisaged as a new technology layout for real-time, heavy-traffic, and wearable applications. This work is devoted to solving the problems of mmWave band communication systems while enhancing their advantages through utilizing the direct communication radio interface for NR multicasting, cooperative positioning, and mission-critical applications. The main contributions presented in this work include: (i) a set of mathematical frameworks and simulation tools to characterize multicast traffic delivery in mmWave directional systems; (ii) exploitation of the sidelink relaying concept to deal with the channel condition deterioration of dynamic multicast systems and to ensure mission-critical and ultra-reliable low-latency communications; (iii) analysis of cooperative positioning techniques for enhancing cellular positioning accuracy for 5G+ emerging applications that require not only improved communication characteristics but also precise localization. Our study indicates the need for additional mechanisms/research that can be utilized: (i) to further improve multicasting performance in 5G/6G systems; (ii) to investigate sidelink aspects, including, but not limited to, the standardization perspective and relay selection strategies; and (iii) to design cooperative positioning systems based on Device-to-Device (D2D) technology.

    Enabling Technologies for Internet of Things: Licensed and Unlicensed Techniques

    The Internet of Things (IoT) is a novel paradigm which is shaping the evolution of the future Internet. According to the vision underlying the IoT, the next step in increasing the ubiquity of the Internet, after connecting people anytime and everywhere, is to connect inanimate objects. By providing objects with embedded communication capabilities and a common addressing scheme, a highly distributed and ubiquitous network of seamlessly connected heterogeneous devices is formed, which can be fully integrated into the current Internet and mobile networks, thus allowing for the development of new intelligent services available anytime, anywhere, by anyone and anything. Such a vision is also becoming known under the name of Machine-to-Machine (M2M), where the absence of human interaction in the system dynamics is further emphasized. A massive number of wireless devices will have the ability to connect to the Internet through the IoT framework. With the accelerating pace of marketing such a framework, the new wireless communications standards are studying and proposing solutions to incorporate the services needed for the IoT. However, with an estimated 30 billion connected devices, many challenges face current wireless technology. In our research, we address a variety of technology candidates for enabling such a massive framework. Mainly, we focus on underlay cognitive radio networks as the unlicensed candidate for the IoT. On the other hand, we look into the current efforts by the standardization bodies to accommodate the requirements of the IoT in current cellular networks. Specifically, we survey the new features and the new user equipment categories added to the physical layer of LTE-A. In particular, we study the performance of a dual-hop cognitive radio network sharing the spectrum of a primary network in an underlay fashion. The cognitive network consists of a source, a destination, and multiple nodes employed as amplify-and-forward relays. To improve the spectral efficiency, all relays are allowed to instantaneously transmit to the destination over the same frequency band. We present the optimal power allocation that maximizes the received signal-to-noise ratio (SNR) at the destination while satisfying the interference constraints of the primary network. The optimal power allocation is obtained through an eigen-solution of a channel-dependent matrix, and is shown to transform the transmission over the non-orthogonal relays into parallel channels. Furthermore, as the secondary destination is equipped with multiple antennas, we propose an antenna selection scheme to select the antenna with the highest SNR. To this end, we propose a clustering scheme to subgroup the available relays and use antenna selection at the receiver to extract the same diversity order. We show that random clustering causes the system to lose some of the available degrees of freedom. We provide analytical expressions for the outage probability of the system under both random clustering and the proposed maximum-SNR clustering scheme with antenna selection. In addition, we adapt our design to increase the energy efficiency of the overall network without significant loss in the data rate. In the second part of this thesis, we look into the current efforts by the standardization bodies to accommodate the requirements of the IoT in current cellular networks.
    Specifically, we present the new features and the new user equipment categories added to the physical layer of LTE-A. We study some of the challenges facing LTE-A when dealing with machine-type communications (MTC). Specifically, the MTC physical downlink control channel (MPDCCH) is among the newly introduced features in LTE-A; it carries the downlink control information (DCI) for MTC devices. Correctly decoding the MPDCCH depends mainly on the channel estimation used to compensate for channel errors during transmission, and the choice of estimation technique affects both the complexity and the performance of the user equipment. We propose and assess the performance of a simple channel estimation technique that depends in essence on least-squares (LS) estimates of the pilot signals and linear interpolation, targeting the low-Doppler channels associated with MTC applications.
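    The pilot-based estimation idea in the last sentence is easy to sketch: form LS estimates at the pilot subcarriers (dividing the received pilot by the known transmitted pilot) and linearly interpolate between them. The Python sketch below assumes a hypothetical comb pilot pattern and a generic frequency-selective channel, not the actual LTE-A MPDCCH reference-signal layout.

```python
# Minimal sketch of LS channel estimation at pilot positions (H = Y / X)
# followed by linear interpolation across subcarriers. The pilot spacing and
# channel model are illustrative, not the LTE-A MPDCCH pilot layout.
import numpy as np

rng = np.random.default_rng(1)
N_SC = 72                                  # subcarriers (example narrowband region)
pilot_idx = np.arange(0, N_SC, 6)          # hypothetical comb pilot pattern
pilots_tx = np.ones(pilot_idx.size, dtype=complex)   # known pilot symbols

# Illustrative frequency-selective channel (4-tap impulse response) plus noise
h_true = np.fft.fft(rng.normal(size=4) + 1j * rng.normal(size=4), N_SC) / 2
y_pilots = h_true[pilot_idx] * pilots_tx + 0.05 * (
    rng.normal(size=pilot_idx.size) + 1j * rng.normal(size=pilot_idx.size))

# LS estimate at the pilots, then linear interpolation over all subcarriers
h_ls = y_pilots / pilots_tx
h_hat = np.interp(np.arange(N_SC), pilot_idx, h_ls.real) \
      + 1j * np.interp(np.arange(N_SC), pilot_idx, h_ls.imag)

mse = np.mean(np.abs(h_hat - h_true) ** 2)
print(f"interpolated LS estimate MSE: {mse:.4f}")
```

    Linear interpolation is attractive here precisely because low-Doppler MTC channels change slowly, so the estimator's simplicity costs little accuracy.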

    Performance Comparison of Weak and Strong Learners in Detecting GPS Spoofing Attacks on Unmanned Aerial Vehicles (UAVs)

    Unmanned Aerial Vehicle (UAV) systems are widely used in civil and military applications. These systems rely on trustworthy connections with various nodes in their network to conduct their safe operations and return-to-home; these entities include other aircraft, ground control facilities, air traffic control facilities, and satellite navigation systems. Global positioning systems (GPS) play a significant role in a UAV's communication with different nodes, navigation, and positioning tasks. However, due to the unencrypted nature of GPS signals, these vehicles are prone to several cyberattacks, including GPS meaconing, GPS spoofing, and jamming. Therefore, this thesis aims at conducting a detailed comparison of two widely used classes of machine learning techniques, namely weak and strong learners, to investigate their performance in detecting GPS spoofing attacks that target UAVs. Real data are used to generate training datasets and test the effectiveness of the machine learning techniques, and various features are derived from these data. To evaluate the performance of the models, seven evaluation metrics are implemented: accuracy, probabilities of detection and misdetection, probability of false alarm, processing time, prediction time per sample, and memory size. The results show that both types of machine learning algorithms provide high detection and low false alarm probabilities. In addition, despite being structurally weaker than strong learners, weak learner classifiers also achieve a good detection rate. However, the strong learners slightly outperform the weak learner classifiers on multiple evaluation metrics, including accuracy and the probabilities of misdetection and false alarm, while the weak learner classifiers perform better on the time-related metrics.
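    As a loose illustration of the comparison methodology, the sketch below pits a depth-1 decision tree (a classic weak learner) against a random forest (a strong ensemble learner) and reports detection, misdetection, and false-alarm probabilities plus training time. It runs on synthetic stand-in features; the thesis derives its features from real GPS signal data, and its exact classifier roster is not reproduced here.

```python
# Hedged sketch of the weak-vs-strong learner comparison: a decision stump
# (weak) against a random forest (strong) on synthetic stand-in features.
import time
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import confusion_matrix
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Synthetic binary dataset standing in for spoofed/authentic GPS features
X, y = make_classification(n_samples=4000, n_features=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

for name, clf in [("weak (stump)", DecisionTreeClassifier(max_depth=1)),
                  ("strong (forest)", RandomForestClassifier(n_estimators=100))]:
    t0 = time.perf_counter()
    clf.fit(X_tr, y_tr)
    tn, fp, fn, tp = confusion_matrix(y_te, clf.predict(X_te)).ravel()
    p_d = tp / (tp + fn)           # probability of detection
    p_fa = fp / (fp + tn)          # probability of false alarm
    p_md = fn / (tp + fn)          # probability of misdetection
    print(f"{name}: P_D={p_d:.3f} P_FA={p_fa:.3f} P_MD={p_md:.3f} "
          f"train {time.perf_counter() - t0:.2f}s")
```

    The timing split in the printout mirrors the trade-off the abstract reports: the ensemble buys a few extra points of detection accuracy at a real cost in training and prediction time.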

    A Comprehensive Overview on 5G-and-Beyond Networks with UAVs: From Communications to Sensing and Intelligence

    Due to the advancements in cellular technologies and the dense deployment of cellular infrastructure, integrating unmanned aerial vehicles (UAVs) into the fifth-generation (5G) and beyond cellular networks is a promising solution to achieve safe UAV operation as well as to enable diversified applications with mission-specific payload data delivery. In particular, 5G networks need to support three typical usage scenarios, namely, enhanced mobile broadband (eMBB), ultra-reliable low-latency communications (URLLC), and massive machine-type communications (mMTC). On the one hand, UAVs can be leveraged as cost-effective aerial platforms to provide ground users with enhanced communication services by exploiting their high cruising altitude and controllable maneuverability in three-dimensional (3D) space. On the other hand, providing such communication services simultaneously for both UAV and ground users poses new challenges due to the need for ubiquitous 3D signal coverage as well as the strong air-ground network interference. Besides the requirement of high-performance wireless communications, the ability to support effective and efficient sensing as well as network intelligence is also essential for 5G-and-beyond 3D heterogeneous wireless networks with coexisting aerial and ground users. In this paper, we provide a comprehensive overview of the latest research efforts on integrating UAVs into cellular networks, with an emphasis on how to exploit advanced techniques (e.g., intelligent reflecting surface, short packet transmission, energy harvesting, joint communication and radar sensing, and edge intelligence) to meet the diversified service requirements of next-generation wireless systems. Moreover, we highlight important directions for further investigation in future work.

    Enabling Millimeter Wave Communications for Use Cases of 5G and Beyond Networks

    The wide bandwidth requirements of fifth generation (5G) and beyond networks are driving the move to millimeter wave (mmWave) bands, which can provide a huge increase in the available bandwidth. Increasing the bandwidth is an effective way to improve the channel capacity with limited power. Moreover, the short wavelengths of such bands enable a massive number of antennas to be integrated in small areas. With such a massive number of antennas, narrow beams can be obtained, which in turn can improve security. Furthermore, the massive number of antennas can help mitigate the severe path loss at mmWave frequencies and realize high-data-rate communication at reasonable distances. Nevertheless, one of the main bottlenecks of mmWave communications is signal blockage. This is due to the weak diffraction ability and severe penetration losses of many common building materials, such as brick and mortar, as well as the losses due to human bodies. Thus, user mobility and/or small movements of obstacles and reflectors cause rapid channel gain variations, which lead to unreliable communication links. The harsh propagation environment at such high frequencies makes it hard to provide a reliable service; hence, maintaining connectivity is one key design challenge in mmWave networks. Relays represent a promising approach to improve mmWave connectivity, as they can redirect the signal to avoid the obstacles in the propagation environment. However, routing in mmWave networks is known to be a very challenging problem due to the inherent propagation characteristics of mmWave frequencies. Furthermore, an inflexible routing technique may worsen network performance and increase scheduling overhead. As such, designing an appropriate transmission routing technique for each service is a crucial issue in mmWave networks. Indeed, multiple factors must be taken into account in the routing process, such as guaranteeing the robustness of network connectivity and providing high data rates. In this thesis, we propose an analytical framework to investigate the network reliability of mmWave relaying systems for multi-hop transmissions. We also propose a flexible routing technique for mmWave networks, namely the n-th best routing technique. The performance of the proposed routing technique is investigated using tools from stochastic geometry. The obtained results provide useful insights on adjusting the signal-to-noise ratio (SNR) threshold for the decode-and-forward (DF) relays according to the order of the best relay and the blockage and relay densities, in order to improve spectral efficiency. We also propose a novel mathematical framework to investigate the performance of two appropriate routing techniques for mmWave networks, namely minimum hop count (MHC) and nearest LoS relay to the destination with MHC (NLR-MHC), to support a wide range of use cases for 5G and beyond networks. Analytical models are provided to evaluate the performance of the proposed techniques using tools from stochastic geometry. In doing so, we model the distribution of the hop count using a phase-type distribution, and then use this distribution to derive analytical results for the coverage probability and spectral efficiency. Capitalizing on the derived results, we introduce a comprehensive study of the effects of different system parameters on the performance of multi-hop mmWave systems. These findings provide important insights for designing multi-hop mmWave networks with better performance.
    Furthermore, we adapt the proposed relay selection technique for IoT devices in mmWave relaying systems to prolong the IoT devices' battery life. The obtained results reveal the trade-off between network connectivity and the energy consumption of IoT devices. Lastly, we exploit the enormous bandwidth available in the mmWave band to support reliable fronthaul links for cell-free (CF) massive multiple-input multiple-output (mMIMO). We provide a comprehensive investigation of the effects of different system parameters on the uplink (UL) performance of mmWave fronthaul-based CF mMIMO systems. The results reveal that increasing the access point (AP) density beyond a certain limit does not achieve further improvement in the UL data rates; also, a higher number of antennas per AP may even degrade the UL data rates.
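    The n-th best relaying idea lends itself to a quick Monte Carlo check: drop relays as a Poisson point process, thin them by a distance-dependent LoS (blockage) probability, rank the candidate two-hop decode-and-forward SNRs, and test the n-th best against a threshold. The sketch below does exactly that with made-up density, blockage, and path-loss numbers; it is a toy experiment in the spirit of the stochastic-geometry analysis, not a reproduction of the thesis's derivations.

```python
# Monte Carlo toy of n-th best relay selection under blockage. Relay density,
# blockage rate, and path-loss parameters are illustrative placeholders.
import numpy as np

rng = np.random.default_rng(2)
LAM, HALF = 1e-3, 200.0        # relay density (per m^2), half-side of region (m)
BETA = 0.008                   # blockage rate: P(LoS) = exp(-BETA * path length)
ALPHA, SNR0 = 3.0, 1e8         # path-loss exponent, reference SNR at 1 m
D_SD = 150.0                   # source-to-destination distance (m)

def coverage(n_th, thr_db, trials=2000):
    """P(two-hop DF SNR through the n-th best LoS relay exceeds the threshold)."""
    thr = 10 ** (thr_db / 10)
    src, dst = np.array([-D_SD / 2, 0.0]), np.array([D_SD / 2, 0.0])
    hits = 0
    for _ in range(trials):
        k = rng.poisson(LAM * (2 * HALF) ** 2)       # relays in the square region
        xy = rng.uniform(-HALF, HALF, (k, 2))
        d1 = np.linalg.norm(xy - src, axis=1)        # source -> relay distances
        d2 = np.linalg.norm(xy - dst, axis=1)        # relay -> destination distances
        los = rng.random(k) < np.exp(-BETA * (d1 + d2))   # both hops unblocked
        snr = np.minimum(SNR0 * d1 ** -ALPHA, SNR0 * d2 ** -ALPHA)[los]
        snr = np.sort(snr)[::-1]                     # rank candidates, best first
        if snr.size >= n_th and snr[n_th - 1] >= thr:
            hits += 1
    return hits / trials

for n in (1, 2, 3):
    print(f"n-th best relay, n={n}: coverage {coverage(n, thr_db=10.0):.3f}")
```

    Sweeping n shows the expected trend: coverage degrades as lower-ranked relays are forced, which is the trade-off that an adjustable SNR threshold is meant to manage.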

    Power Beacon’s deployment optimization for wirelessly powering massive Internet of Things networks

    The fifth-generation (5G) and beyond wireless cellular networks promise native support for, among other use cases, the so-called Internet of Things (IoT). Different from human-centric cellular services, IoT networks implement a novel vision where ordinary machines possess the ability to autonomously sense, actuate, compute, and communicate throughout the Internet. However, as the number of connected devices grows larger, an urgent demand for energy-efficient communication technologies arises. A key challenge related to IoT devices is that their very small form factor allows them to carry only a tiny battery, which may not even be replaceable due to installation conditions, or may be too costly to replace in terms of maintenance because of the massiveness of the network. This issue limits the lifetime of the network and compromises its reliability. Wireless energy transfer (WET) has emerged as a potential candidate to replenish sensors' batteries or to sustain the operation of battery-free devices, as it provides a controllable source of energy over the air. WET therefore eliminates the need for regular maintenance, allows sensors' form factor to be reduced, and reduces the battery disposal that contributes to environmental pollution. In this thesis, we review some WET-enabled scenarios and state-of-the-art techniques for implementing WET in IoT networks. In particular, we focus our attention on the deployment optimization of so-called power beacons (PBs), the energy transmitters charging a massive IoT deployment subject to a network-wide probabilistic energy outage constraint. We assume that the IoT sensors' positions are unknown at the PBs, and hence we maximize the average incident power at the worst network location. We propose a linear-time-complexity algorithm for optimizing the PBs' positions that outperforms benchmark methods in terms of minimum average incident power and computation time. We also present some insights on the maximum coverage area under certain propagation conditions.
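    To make the max-min objective concrete, here is a small baseline sketch: PB positions are chosen greedily from random candidates so as to maximize the minimum average incident power over a grid of unknown sensor locations. This greedy heuristic and its path-loss constants are stand-ins for illustration only; the linear-time algorithm proposed in the thesis is not reproduced here.

```python
# Greedy max-min PB placement baseline. Deployment size, transmit power and
# path-loss constants are illustrative, not values from the thesis.
import numpy as np

rng = np.random.default_rng(3)
SIDE, N_PB = 100.0, 4            # deployment square side (m), number of PBs
P_TX, ALPHA, D0 = 3.0, 2.7, 1.0  # PB power (W), path-loss exponent, min distance

# Grid of possible (unknown) sensor locations whose worst point we protect
g = np.linspace(0.0, SIDE, 25)
grid = np.stack(np.meshgrid(g, g), axis=-1).reshape(-1, 2)

def incident_power(pbs):
    """Average incident power (W) at each grid point, summed over all PBs."""
    d = np.maximum(np.linalg.norm(grid[:, None, :] - pbs[None, :, :], axis=2), D0)
    return (P_TX * d ** -ALPHA).sum(axis=1)

# Greedy max-min placement: add the candidate that best lifts the worst point
pbs = np.empty((0, 2))
candidates = rng.uniform(0.0, SIDE, (500, 2))
for _ in range(N_PB):
    best = max(candidates,
               key=lambda c: incident_power(np.vstack([pbs, c])).min())
    pbs = np.vstack([pbs, best])

print("PB positions:\n", np.round(pbs, 1))
print(f"worst-point average incident power: {incident_power(pbs).min():.2e} W")
```

    Even this naive baseline exhibits the qualitative behavior the thesis optimizes for: each added PB is pulled toward the currently worst-served region, raising the network-wide power floor.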