
    AN EFFICIENT INTERFERENCE AVOIDANCE SCHEME FOR DEVICE-TO-DEVICE ENABLED FIFTH GENERATION NARROWBAND INTERNET OF THINGS NETWORKS

    Narrowband Internet of Things (NB-IoT) is a low-power wide-area (LPWA) technology built on long-term evolution (LTE) functionalities and standardized by the 3rd Generation Partnership Project (3GPP). Because it supports massive machine-type communication (mMTC) and diverse IoT use cases with rigorous requirements on connectivity, energy efficiency, reachability, reliability, and latency, NB-IoT has attracted considerable research attention. However, as the capacity demands of various IoT use cases grow, many functionalities of the LTE evolved packet core (EPC) may become overburdened and suboptimal. Several research efforts are under way to address these challenges. This thesis therefore surveys those efforts, focusing on the optimized architecture of the LTE EPC functionalities, the 5G architectural design for NB-IoT integration, the enabling technologies required for 5G NB-IoT, the coexistence of 5G new radio (NR) with NB-IoT, and feasible deployment schemes of NB-IoT with cellular networks. The thesis also presents cloud-assisted relaying with backscatter communication as part of a detailed study of the technical performance attributes and channel communication characteristics of the NB-IoT physical (PHY) and medium access control (MAC) layers, with a focus on 5G, and examines the drawbacks involved in simulating these systems. The enabling market for NB-IoT, the benefits for selected use cases, and the critical challenges associated with deployment are also highlighted.
    The cyclic prefix orthogonal frequency division multiplexing (CP-OFDM) waveform adopted by 3GPP NR for enhanced mobile broadband (eMBB) services does not preclude other waveforms for other services, such as the NB-IoT service for mMTC. Consequently, the coexistence of 5G NR and NB-IoT must be kept orthogonal (or quasi-orthogonal) to minimize the mutual interference that constrains the freedom of the overall waveform design. Although NR coexistence with NB-IoT is expected to improve network capacity, extend user data-rate coverage, and strengthen communication robustness through frequency reuse, it introduces a new interference challenge distinct from that of the legacy network. Interference can make channel estimation difficult for NB-IoT devices, limiting user performance and spectral efficiency. Existing interference mitigation solutions either add network overhead, computational complexity, and delay, or are hampered by low data rates and coverage, making them unsuitable for the low-complexity NB-IoT network. A D2D communication based interference-control technique therefore becomes an effective strategy for addressing this problem.
    This thesis uses D2D communication to relieve the network bottleneck in dense 5G NB-IoT networks prone to interference. For D2D-enabled 5G NB-IoT systems, it presents an interference-avoidance resource allocation scheme that considers the less favourable cell-edge NUEs. To reduce the algorithm's computational complexity and the interference power, the optimization problem is divided into three sub-problems. First, in an orthogonal deployment using channel state information (CSI), the channel gain factor is leveraged to select a probable reuse channel with higher QoS control. Second, a bisection search finds the power control that maximizes the network sum rate. Third, the Hungarian algorithm builds a maximum bipartite matching to choose the optimal pairing between the set of NUEs and the D2D pairs. Numerical results show that the proposed approach improves the D2D sum rate and the overall network SINR of the 5G NB-IoT system. The maximum power constraint of the D2D pair, the D2D location, the pico base station (PBS) cell radius, the number of potential reuse channels, and the cluster distance all affect the D2D pair's performance. The simulation results achieve 28.35%, 31.33%, and 39% higher SINR than the ARSAD, DCORA, and RRA algorithms when the number of NUEs is twice the number of D2D pairs, and 2.52%, 14.80%, and 39.89% higher SINR than ARSAD, RRA, and DCORA when the numbers of NUEs and D2D pairs are equal. Correspondingly, the D2D sum rate is 9.23%, 11.26%, and 13.92% higher than ARSAD, DCORA, and RRA when the number of NUEs is twice the number of D2D pairs, and 1.18%, 4.64%, and 15.93% higher than ARSAD, RRA, and DCORA, respectively, with equal numbers of NUEs and D2D pairs. These results demonstrate the efficacy of the proposed scheme.
    The thesis also addresses the case where the cell-edge NUE's QoS is critical because of long-distance transmission, delay, low bandwidth utilization, and high system overhead, all of which degrade 5G NB-IoT network performance. In this case, most cell-edge NUEs raise their transmit power to maximize network throughput, and integrating a cooperative D2D relaying technique into 5G NB-IoT heterogeneous network (HetNet) uplink spectrum sharing increases both the system's spectral efficiency and the interference power, further degrading the network. Using a max-max SINR (Max-SINR) approach, the thesis proposes an interference-aware D2D relaying strategy that improves the QoS of a cell-edge NUE in 5G NB-IoT and achieves optimum system performance. The Lagrangian dual technique optimizes the transmit power from the cell-edge NUE to the relay under an average interference power constraint, while the relay transmits to the NB-IoT base station (NBS) with fixed power. To choose an optimal D2D relay node, the channel-to-interference-plus-noise ratio (CINR) of all available D2D relays is used to maximize the minimum cell-edge NUE data rate while satisfying the QoS requirements of the cellular NUEs. Best harmonic mean, best-worst, half-duplex relay selection, and a D2D communication scheme are also studied for comparison. Simulation results reveal that the Max-SINR selection scheme outperforms all other selection schemes except the D2D communication scheme, owing to the high channel gain between the two communicating devices. The proposed algorithm achieves 21.27% SINR performance, nearly identical to the half-duplex scheme, and outperforms the best-worst and harmonic selection techniques by 81.27% and 40.29%, respectively. As the number of D2D relays increases, the capacity increases by 14.10% and 47.19% over the harmonic and half-duplex techniques, respectively. Finally, the thesis outlines future research on interference control, together with open research directions on PHY and MAC properties and a SWOT (strengths, weaknesses, opportunities, and threats) analysis presented in Chapter 2, to encourage further study of 5G NB-IoT.
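
    As a concrete illustration of the three-step allocation described above, the sketch below pairs D2D links with NUE reuse channels using a bisection power search followed by Hungarian matching. It is a minimal Python sketch under assumed parameters (random channel gains, power budgets, noise power, and QoS target), not the thesis's exact model or algorithm.

```python
# Minimal sketch (assumed parameter names and values, not the thesis's exact model):
# 1) for each (NUE channel, D2D pair) combination, bisect on the D2D transmit
#    power to find the largest power that keeps the NUE's SINR above its QoS
#    target, 2) evaluate the resulting pair sum rate, 3) use the Hungarian
#    algorithm to pick the channel-reuse pairing that maximizes the total rate.
import numpy as np
from scipy.optimize import linear_sum_assignment

rng = np.random.default_rng(0)
N_NUE, N_D2D = 8, 4           # assumed network size
P_NUE, P_D2D_MAX = 0.2, 0.1   # W, assumed power budgets
NOISE, SINR_MIN = 1e-13, 2.0  # assumed noise power and NUE QoS target

# Assumed (random) channel gains: g_nb = NUE->NBS, g_dd = D2D link,
# g_db = D2D Tx->NBS (interference), g_nd = NUE->D2D Rx (interference).
g_nb = rng.exponential(1e-10, N_NUE)
g_dd = rng.exponential(1e-9, N_D2D)
g_db = rng.exponential(1e-11, (N_NUE, N_D2D))
g_nd = rng.exponential(1e-11, (N_NUE, N_D2D))

def max_power_bisection(n, d, iters=50):
    """Largest D2D power on channel n that keeps the NUE's SINR >= SINR_MIN."""
    lo, hi = 0.0, P_D2D_MAX
    if P_NUE * g_nb[n] / (NOISE + hi * g_db[n, d]) >= SINR_MIN:
        return hi                      # full power already satisfies the QoS
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        sinr_nue = P_NUE * g_nb[n] / (NOISE + mid * g_db[n, d])
        lo, hi = (mid, hi) if sinr_nue >= SINR_MIN else (lo, mid)
    return lo

rate = np.zeros((N_NUE, N_D2D))        # sum rate of each (channel, D2D) reuse
for n in range(N_NUE):
    for d in range(N_D2D):
        p = max_power_bisection(n, d)
        sinr_nue = P_NUE * g_nb[n] / (NOISE + p * g_db[n, d])
        sinr_d2d = p * g_dd[d] / (NOISE + P_NUE * g_nd[n, d])
        rate[n, d] = np.log2(1 + sinr_nue) + np.log2(1 + sinr_d2d)

# Hungarian algorithm: maximum-weight bipartite matching of channels to D2D pairs.
row, col = linear_sum_assignment(rate, maximize=True)
for n, d in zip(row, col):
    print(f"D2D pair {d} reuses NUE channel {n}, pair sum rate {rate[n, d]:.2f} bit/s/Hz")
```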

    Cooperative Uplink Inter-Cell Interference (ICI) Mitigation in 5G Networks

    In order to support the new paradigm shift in fifth generation (5G) mobile communication, radically different network architectures, associated technologies, and network operation algorithms need to be developed compared to existing fourth generation (4G) cellular solutions. The evolution toward 5G mobile networks will be characterized by an increasing number of wireless devices, increasing device and service complexity, and the requirement to access mobile services ubiquitously. To realise the dramatic increase in data rates in particular, research has focused on improving the capacity of current Long Term Evolution (LTE)-based 4G network standards before more radical changes, which could include acquiring additional spectrum, are exploited. The LTE network has a reuse factor of one; neighbouring cells/sectors use the same spectrum, making cell-edge users vulnerable to heavy inter-cell interference in addition to other factors such as fading and path loss. In this direction, this thesis focuses on improving the performance of cell-edge users in LTE and LTE-Advanced networks by initially implementing a new Coordinated Multi-Point (CoMP) technique, using smart antennas to mitigate cell-edge user interference in the uplink, to support future 5G networks. Subsequently, a novel cooperative uplink inter-cell interference mitigation algorithm based on joint reception at the base station using receiver adaptive beamforming is investigated. Interference mitigation in a heterogeneous environment for Device-to-Device (D2D) communication underlaying the cellular network is then investigated as the enabling technology for maximising resource block (RB) utilisation in emerging 5G networks. Exploiting the proximity of users in the network to achieve higher data rates with maximum RB utilisation (as the technology reuses the cellular RBs simultaneously), while taking some load off the evolved Node B (eNodeB) through direct communication between User Equipments (UEs), has been explored. Simulation results show that the proximity and transmission power of D2D transmission yield high performance gains for D2D receivers, exceeding those of cellular UEs with better channel conditions or in close proximity to the eNodeB. It is finally demonstrated that, as an extension of the above, applying a novel receiver beamforming technique to reduce interference from D2D users can further enhance network performance. To develop the aforementioned technologies and evaluate the performance of new algorithms in emerging network scenarios, a beyond-state-of-the-art LTE system-level simulator (SLS) was implemented. The new simulator includes Multiple-Input Multiple-Output (MIMO) antenna functionalities, comprehensive channel models (such as the Wireless World Initiative New Radio II, i.e. WINNER II, model), and adaptive modulation and coding schemes to accurately emulate the LTE and LTE-A network standards.
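
    For readers unfamiliar with receiver adaptive beamforming, the short sketch below shows the basic idea behind suppressing an uplink inter-cell interferer at a multi-antenna base station with an MMSE (Wiener) receive beamformer. The array size, channel vectors, and powers are illustrative assumptions, not values from the thesis or its simulator.

```python
# Minimal sketch of an MMSE receive beamformer at a multi-antenna base station
# (assumed array size, channels, and powers; not the thesis's exact algorithm).
# The weight vector w = R^{-1} h steers a beam toward the desired cell-edge
# user while attenuating the interfering (neighbouring-cell) user.
import numpy as np

rng = np.random.default_rng(1)
M = 8                                   # assumed number of receive antennas
h_des = (rng.standard_normal(M) + 1j * rng.standard_normal(M)) / np.sqrt(2)
h_int = (rng.standard_normal(M) + 1j * rng.standard_normal(M)) / np.sqrt(2)
p_des, p_int, noise = 1.0, 1.0, 0.1     # assumed transmit powers / noise power

# Interference-plus-noise covariance and the MMSE (Wiener) receive weights.
R = p_int * np.outer(h_int, h_int.conj()) + noise * np.eye(M)
w = np.linalg.solve(R, h_des)

def sinr(w):
    """Post-combining SINR for receive weight vector w."""
    sig = p_des * np.abs(w.conj() @ h_des) ** 2
    intf = p_int * np.abs(w.conj() @ h_int) ** 2
    return sig / (intf + noise * np.linalg.norm(w) ** 2)

print(f"matched filter SINR : {10 * np.log10(sinr(h_des)):.1f} dB")
print(f"MMSE beamformer SINR: {10 * np.log10(sinr(w)):.1f} dB")
```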

    Enabling Technologies for Ultra-Reliable and Low Latency Communications: From PHY and MAC Layer Perspectives

    Future 5th generation (5G) networks are expected to enable three key services: enhanced mobile broadband, massive machine-type communications, and ultra-reliable and low-latency communications (URLLC). As per the 3rd Generation Partnership Project (3GPP) URLLC requirements, it is expected that the reliability of one transmission of a 32-byte packet will be at least 99.999% and the latency will be at most 1 ms. This unprecedented level of reliability and latency will enable various new applications, such as smart grids, industrial automation, and intelligent transport systems. In this survey, we present potential future URLLC applications and summarize the corresponding reliability and latency requirements. We provide a comprehensive discussion of the physical (PHY) and medium access control (MAC) layer techniques that enable URLLC, addressing both licensed and unlicensed bands. This paper evaluates the relevant PHY and MAC techniques for their ability to improve reliability and reduce latency. We identify that enabling long-term evolution (LTE) to coexist in the unlicensed spectrum is also a potential enabler of URLLC in the unlicensed band, and provide numerical evaluations. Lastly, this paper discusses potential future research directions and challenges in achieving the URLLC requirements.
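
    The 32-byte/99.999% figures above sit in the finite-blocklength regime, where the classical Shannon rate overestimates what a short packet can carry. The sketch below uses the well-known normal approximation R ≈ C − sqrt(V/n)·Q⁻¹(ε) for an AWGN channel to check which assumed blocklengths and SNRs can fit a 256-bit packet at ε = 10⁻⁵; the blocklengths and SNR values are illustrative assumptions, not numbers from the survey.

```python
# Minimal sketch (assumed complex AWGN model) of the short-packet rate penalty
# behind the URLLC numbers above: normal approximation R ~ C - sqrt(V/n) Q^{-1}(eps)
# for a 32-byte packet at a 1e-5 error target.
import numpy as np
from scipy.stats import norm

def achievable_bits(n, snr_db, eps):
    """Approximate max information bits over n AWGN channel uses at error prob eps."""
    snr = 10 ** (snr_db / 10)
    C = np.log2(1 + snr)                                   # Shannon capacity, bit/use
    V = (1 - 1 / (1 + snr) ** 2) * np.log2(np.e) ** 2      # channel dispersion
    return n * C - np.sqrt(n * V) * norm.isf(eps)          # normal approximation

payload_bits = 32 * 8          # 32-byte URLLC packet
eps = 1e-5                     # 99.999% reliability for a single transmission
for n in (128, 256, 512):      # assumed blocklengths fitting a sub-ms transmission
    for snr_db in (0, 5, 10):
        ok = achievable_bits(n, snr_db, eps) >= payload_bits
        print(f"n={n:4d}, SNR={snr_db:2d} dB -> "
              f"{'fits' if ok else 'does not fit'} a 256-bit packet")
```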

    20 Years of Evolution from Cognitive to Intelligent Communications

    It has been 20 years since the concept of cognitive radio (CR) was proposed as an efficient approach to provide more access opportunities for connecting massive numbers of wireless devices. To improve spectrum efficiency, CR enables unlicensed usage of licensed spectrum resources, and it has been regarded as a key enabler for intelligent communications. In this article, we provide an overview of intelligent communications over the past two decades to illustrate the evolution of their capability from cognition to artificial intelligence (AI). In particular, the article starts with a comprehensive review of typical spectrum sensing and sharing, followed by recent achievements in AI-enabled intelligent radio. Moreover, research challenges for future intelligent communications are discussed to show a path towards the real deployment of intelligent radio. After witnessing the glorious developments of CR in the past 20 years, we try to provide readers with a clear picture of how intelligent radio could be further developed to smartly utilize the limited spectrum resources and to optimally configure wireless devices in future communication systems. (The paper has been accepted by IEEE Transactions on Cognitive Communications and Networking.)
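
    As one example of the "typical spectrum sensing" reviewed in the article, the sketch below implements a classic energy detector: a secondary user declares the licensed channel busy when the average sample energy exceeds a threshold set from the noise power and a target false-alarm probability. The sample count, noise power, and primary-user signal model are illustrative assumptions.

```python
# Minimal sketch of the classic energy detector used for spectrum sensing in
# cognitive radio (assumed signal model and parameters, for illustration only).
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(2)
N = 1024                       # assumed number of sensing samples
noise_var = 1.0                # assumed (known) noise power
p_fa = 0.01                    # target false-alarm probability

# CLT-based threshold for the average-energy statistic under noise only.
threshold = noise_var * (1 + norm.isf(p_fa) / np.sqrt(N))

def sense(snr_db=None):
    """Return True if the channel is declared occupied."""
    y = np.sqrt(noise_var / 2) * (rng.standard_normal(N) + 1j * rng.standard_normal(N))
    if snr_db is not None:     # add a random-phase primary-user signal at the given SNR
        amp = np.sqrt(noise_var * 10 ** (snr_db / 10))
        y = y + amp * np.exp(2j * np.pi * rng.random(N))
    return np.mean(np.abs(y) ** 2) > threshold

print("idle channel  ->", "busy" if sense() else "free")
print("PU at 0 dB    ->", "busy" if sense(0) else "free")
print("PU at -10 dB  ->", "busy" if sense(-10) else "free")
```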

    Optimization Modeling and Machine Learning Techniques Towards Smarter Systems and Processes

    The continued penetration of technology into our daily lives has led to the emergence of the concept of Internet-of-Things (IoT) systems and networks. An increasing number of enterprises and businesses are adopting IoT-based initiatives, expecting a higher return on investment (ROI) [1]. However, adopting such technologies poses many challenges. One challenge is improving the performance and efficiency of such systems by properly allocating the available and scarce resources [2, 3]. A second challenge is making use of the massive amount of data generated to help make smarter and more informed decisions [4]. A third challenge is protecting such devices and systems given the surge in security breaches and attacks in recent times [5]. To that end, this thesis proposes the use of various optimization modeling and machine learning techniques in three different systems, namely wireless communication systems, learning management systems (LMSs), and computer network systems. In particular, the first part of the thesis proposes optimization modeling techniques to improve the aggregate throughput and power efficiency of a wireless communication network. The second part proposes the use of unsupervised machine learning clustering techniques, integrated into LMSs, to identify unengaged students based on their engagement with material in an e-learning environment. Lastly, the third part suggests the use of exploratory data analytics, unsupervised machine learning clustering, and supervised machine learning classification techniques to identify malicious/suspicious domain names in a computer network setting. The main contributions of this thesis fall into three broad parts. The first is developing optimal and heuristic scheduling algorithms that improve the performance of wireless systems in terms of throughput and power by combining wireless resource virtualization with device-to-device and machine-to-machine communications. The second is using unsupervised machine learning clustering and association algorithms to determine an appropriate engagement-level model for blended e-learning environments and to study the relationship between engagement and academic performance in such environments. The third is developing a supervised ensemble learning classifier that detects malicious/suspicious domain names with high accuracy and precision.
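
    As a small illustration of the third contribution, the sketch below trains a supervised ensemble (random forest) classifier on a few hand-crafted lexical features of domain names. The toy domain lists, the feature set, and the random-forest choice are illustrative assumptions rather than the thesis's actual dataset, features, or tuned model.

```python
# Minimal sketch of a supervised ensemble classifier for malicious/suspicious
# domain names (toy data and features; illustrative assumptions only).
import math
from sklearn.ensemble import RandomForestClassifier

def features(domain):
    """Simple lexical features: length, digit ratio, character entropy."""
    name = domain.split(".")[0]
    digits = sum(c.isdigit() for c in name)
    counts = {c: name.count(c) for c in set(name)}
    entropy = -sum(v / len(name) * math.log2(v / len(name)) for v in counts.values())
    return [len(name), digits / len(name), entropy]

benign = ["google.com", "wikipedia.org", "github.com", "python.org", "bbc.co.uk"]
malicious = ["xj3k9qzt1v.biz", "a1b2c3d4e5f6.info", "qwrtzplmnb.ru", "0x9f8e7d.net", "zzqk1903kk.xyz"]

X = [features(d) for d in benign + malicious]
y = [0] * len(benign) + [1] * len(malicious)

# Ensemble of decision trees trained on the lexical features.
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
for d in ["openstreetmap.org", "kq7z2m9x4p.top"]:
    label = "suspicious" if clf.predict([features(d)])[0] else "benign"
    print(f"{d}: {label}")
```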

    Thirty Years of Machine Learning: The Road to Pareto-Optimal Wireless Networks

    Future wireless networks have substantial potential to support a broad range of complex and compelling applications in both military and civilian fields, where users can enjoy high-rate, low-latency, low-cost, and reliable information services. Achieving this ambitious goal requires new radio techniques for adaptive learning and intelligent decision making because of the complex, heterogeneous nature of the network structures and wireless services. Machine learning (ML) algorithms have had great success in supporting big data analytics, efficient parameter estimation, and interactive decision making. Hence, in this article, we review the thirty-year history of ML by elaborating on supervised learning, unsupervised learning, reinforcement learning, and deep learning. Furthermore, we investigate their employment in the compelling applications of wireless networks, including heterogeneous networks (HetNets), cognitive radio (CR), the Internet of Things (IoT), machine-to-machine (M2M) networks, and so on. This article aims to assist readers in clarifying the motivation and methodology of the various ML algorithms, so as to invoke them for hitherto unexplored services and scenarios of future wireless networks. (46 pages, 22 figures.)
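
    As one concrete instance of the reinforcement-learning branch surveyed here, the sketch below uses a stateless (single-state) Q-learning rule for dynamic channel selection, a common toy example in ML-for-wireless tutorials. The per-channel interference probabilities and learning parameters are illustrative assumptions.

```python
# Minimal sketch of stateless tabular Q-learning for dynamic channel selection
# (assumed interference pattern and learning parameters; illustration only).
import numpy as np

rng = np.random.default_rng(3)
n_channels = 4
busy_prob = np.array([0.8, 0.6, 0.3, 0.1])   # assumed per-channel interference probability
Q = np.zeros(n_channels)                     # single-state action values
alpha, eps = 0.1, 0.1                        # learning rate, exploration rate

for t in range(5000):
    a = rng.integers(n_channels) if rng.random() < eps else int(np.argmax(Q))
    reward = 0.0 if rng.random() < busy_prob[a] else 1.0   # 1 = successful transmission
    Q[a] += alpha * (reward - Q[a])          # stateless Q-learning update

print("learned channel values:", np.round(Q, 2))
print("preferred channel     :", int(np.argmax(Q)))   # should be the least-busy channel
```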