1,217 research outputs found

    Spectrum Assignment in Hardware-Constrained Cognitive Radio IoT Networks Under Varying Channel-Quality Conditions

    The integration of cognitive radio (CR) technology with the future Internet-of-Things (IoT) architecture is expected to allow effective massive IoT deployment by providing huge spectrum opportunities to IoT devices. Several communication protocols have been proposed for CR networks while ignoring the adjacent-channel interference (ACI) problem by assuming sharp filters at the transmit and receive chains of each CR device. In practice, however, such an assumption is not feasible for low-cost, hardware-constrained CR-capable IoT (CR-IoT) devices. Specifically, when a large number of CR-IoT devices operate in the same vicinity, guard-band channels (GBs) are needed to mitigate the ACI problem; introducing GBs, in turn, constrains efficient spectrum use and protocol design. In this paper, we develop a channel assignment mechanism for hardware-constrained CR-IoT networks under time-varying channel conditions with GB-awareness. The objective of our assignment is to serve the largest possible number of CR-IoT devices by assigning the least number of idle channels to each device subject to rate-demand and interference constraints. The proposed channel assignment is conducted on a per-block basis for the contending CR-IoT devices while considering the time-varying channel conditions for each CR-IoT transmission over each idle channel, such that spectrum efficiency is improved. Specifically, our channel assignment problem is formulated as a binary linear programming (BLP) problem, which is NP-hard. Thus, we propose a polynomial-time solution using a sequential fixing algorithm that achieves a suboptimal solution. The simulation results demonstrate that our proposed assignment provides a significant increase in the number of served IoT devices over existing assignment mechanisms.

    This work was supported in part by the QR Global Challenges Research Fund, Staffordshire University, Staffordshire, U.K.

    Salameh, H. A. B.; Al-Masri, S.; Benkhelifa, E.; Lloret, J. (2019). Spectrum Assignment in Hardware-Constrained Cognitive Radio IoT Networks Under Varying Channel-Quality Conditions. IEEE Access, 7:42816-42825. https://doi.org/10.1109/ACCESS.2019.2901902
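    The abstract does not give the authors' exact BLP model, so the sketch below only illustrates the generic sequential-fixing idea it names: relax the binary variables to [0, 1], solve the LP relaxation, permanently fix the most decided variable, and repeat until all variables are integral. The objective, constraints, and toy instance are illustrative assumptions, not the paper's formulation.

```python
# Minimal sketch of sequential fixing for a binary linear program (BLP).
# The objective, constraints, and toy instance are illustrative
# assumptions, not the paper's actual channel-assignment model.
import numpy as np
from scipy.optimize import linprog

def sequential_fixing(c, A_ub, b_ub):
    """Maximize c @ x subject to A_ub @ x <= b_ub with x binary, by
    repeatedly solving the LP relaxation and permanently fixing the
    free variable whose relaxed value is closest to 0 or 1."""
    n = len(c)
    fixed = {}  # index -> 0 or 1
    while len(fixed) < n:
        bounds = [(fixed[i], fixed[i]) if i in fixed else (0.0, 1.0)
                  for i in range(n)]
        # linprog minimizes, so negate c to maximize
        res = linprog(-np.asarray(c), A_ub=A_ub, b_ub=b_ub, bounds=bounds)
        if not res.success:
            raise RuntimeError("LP relaxation became infeasible")
        free = [i for i in range(n) if i not in fixed]
        i_star = min(free, key=lambda i: min(res.x[i], 1.0 - res.x[i]))
        fixed[i_star] = int(round(res.x[i_star]))
    return np.array([fixed[i] for i in range(n)])

# Toy instance: 3 devices x 2 channels, flattened as (device, channel)
# pairs with utilities c; at most one device per channel.
c = [3, 2, 1, 1, 4, 2]
A = [[1, 0, 1, 0, 1, 0],   # channel-0 capacity
     [0, 1, 0, 1, 0, 1]]   # channel-1 capacity
print(sequential_fixing(c, A, [1, 1]))
```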

    Thirty Years of Machine Learning: The Road to Pareto-Optimal Wireless Networks

    Future wireless networks have substantial potential to support a broad range of complex, compelling applications in both military and civilian fields, where users can enjoy high-rate, low-latency, low-cost, and reliable information services. Achieving this ambitious goal requires new radio techniques for adaptive learning and intelligent decision making because of the complex, heterogeneous nature of the network structures and wireless services. Machine learning (ML) algorithms have achieved great success in supporting big-data analytics, efficient parameter estimation, and interactive decision making. Hence, in this article, we review the thirty-year history of ML by elaborating on supervised learning, unsupervised learning, reinforcement learning, and deep learning. Furthermore, we investigate their employment in compelling applications of wireless networks, including heterogeneous networks (HetNets), cognitive radios (CR), the Internet of Things (IoT), machine-to-machine (M2M) networks, and so on. This article aims to help readers clarify the motivation and methodology of the various ML algorithms, so that they can be invoked for hitherto unexplored services and scenarios in future wireless networks.

    Comment: 46 pages, 22 figures

    Versatility Of Low-Power Wide-Area Network Applications

    Low-Power Wide-Area Network (LPWAN) is regarded as the leading communication technology for wide-area Internet-of-Things (IoT) applications, offering low-power, long-range, and low-cost communication. With different communication requirements across IoT applications, many competing LPWAN technologies operating in both licensed (e.g., NB-IoT, LTE-M, and 5G) and unlicensed (e.g., LoRa and SigFox) bands have emerged. LPWANs are designed to support applications with low-power, low-data-rate operation; they are not well suited to applications that involve high mobility, high traffic, or real-time communication (e.g., volcano monitoring and control applications). Despite the increasing number of mobile devices in many IoT domains (e.g., agricultural IoT and smart cities), mobility support is not well addressed in LPWAN. Cellular-based (licensed) LPWANs rely on wired infrastructure to enable mobility, while most unlicensed LPWANs operate in the crowded ISM band or must duty-cycle, making mobility handling a challenge. In this dissertation, we first identify the key opportunities of LPWAN, highlight the challenges, and show potential directions for future research. We then enable the versatility of LPWAN applications by supporting applications that involve mobility over LPWAN. Specifically, we propose to handle mobility in LPWAN over white space, considering Sensor Network Over White Spaces (SNOW), a highly scalable and energy-efficient LPWAN operating over the TV white spaces, i.e., the allocated but locally unused TV channels (54-698 MHz in the US). We propose a dynamic Carrier Frequency Offset (CFO) estimation and compensation technique that accounts for the Doppler shift due to mobility, and we design energy-efficient, fast base-station (BS) discovery and association approaches, demonstrating the feasibility of our approach through experiments in different deployments. We also present a collision detection and recovery technique called RnR (Reverse & Replace Decoding) that applies to LPWANs. Finally, we discuss future work on handling burst transmission over LPWAN and localization in mobile LPWAN.
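    The dissertation's dynamic CFO scheme is not detailed in the abstract; the sketch below shows only the classic pilot-based building block such schemes extend: Moose-style estimation of the offset from the phase of the correlation between two repeated pilot halves, followed by counter-rotation. The pilot layout, sample rate, and offset value are illustrative assumptions.

```python
# Minimal sketch of pilot-based CFO estimation and compensation; pilot
# layout, sample rate, and offset are illustrative assumptions, not the
# dissertation's actual design.
import numpy as np

def estimate_cfo(rx, pilot_len, fs):
    """Estimate CFO (Hz) from two identical back-to-back pilot halves:
    the phase of their correlation equals 2*pi*f_off*pilot_len/fs."""
    first, second = rx[:pilot_len], rx[pilot_len:2 * pilot_len]
    phase = np.angle(np.sum(np.conj(first) * second))
    return phase * fs / (2 * np.pi * pilot_len)

def compensate_cfo(rx, f_off, fs):
    """Remove the estimated offset by counter-rotating the samples."""
    n = np.arange(len(rx))
    return rx * np.exp(-2j * np.pi * f_off * n / fs)

# Toy check: a repeated pilot hit by a 200 Hz offset at fs = 48 kHz
fs, L = 48_000, 256
pilot = np.exp(2j * np.pi * 0.1 * np.arange(L))   # arbitrary pilot half
tx = np.concatenate([pilot, pilot])
rx = tx * np.exp(2j * np.pi * 200 * np.arange(2 * L) / fs)
f_hat = estimate_cfo(rx, L, fs)
print(f"estimated CFO: {f_hat:.1f} Hz")           # ~200.0
rx_fixed = compensate_cfo(rx, f_hat, fs)
```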

    A Framework for Enhancing the Energy Efficiency of IoT Devices in 5G Network

    A wide range of services, such as enhanced mobile broadband, massive machine-type communication, and ultra-reliable low-latency communication, is anticipated to be delivered via the 5G network. The 5G network has developed as a multi-layer network that uses numerous technological advancements to provide a wide array of wireless services and fulfil such a diverse set of requirements. Several technologies, including software-defined networking, network function virtualization, edge computing, cloud computing, and small cells, are being integrated into 5G networks to meet these requirements. Because of the higher power consumption that arises from such a complicated network design, energy efficiency becomes crucial. Machine learning techniques have attracted considerable interest from the scientific community because of their potential to play a crucial role in achieving energy efficiency. The proposed scheme is studied using utilization factor, access latency, arrival rate, and other metrics; comparison on these metrics shows that our system outperforms the existing scheme.
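    The abstract does not specify the analytical model behind these metrics; purely as a hedged illustration of how arrival rate, utilization factor, and access latency interrelate, the snippet below uses a basic M/M/1 queueing calculation (an assumption for illustration, not the paper's model).

```python
# Hedged illustration of how the evaluation metrics relate; an M/M/1
# queue is assumed purely to show utilization factor vs. access latency
# as the arrival rate grows. Not the paper's actual model.
def mm1_metrics(arrival_rate, service_rate):
    """Return (utilization factor, mean sojourn time) for an M/M/1 queue."""
    rho = arrival_rate / service_rate               # utilization factor
    if rho >= 1.0:
        raise ValueError("queue is unstable (rho >= 1)")
    latency = 1.0 / (service_rate - arrival_rate)   # mean access latency
    return rho, latency

for lam in (10, 50, 90):                            # packets/s, mu = 100
    rho, w = mm1_metrics(lam, 100)
    print(f"lambda={lam:3d}/s  rho={rho:.2f}  latency={w * 1e3:.1f} ms")
```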

    Towards Massive Machine Type Communications in Ultra-Dense Cellular IoT Networks: Current Issues and Machine Learning-Assisted Solutions

    The ever-increasing number of resource-constrained Machine-Type Communication (MTC) devices is leading to the critical challenge of fulfilling diverse communication requirements in dynamic and ultra-dense wireless environments. Among the different application scenarios that the upcoming 5G-and-beyond cellular networks are expected to support, such as enhanced Mobile Broadband (eMBB), massive Machine-Type Communications (mMTC), and Ultra-Reliable Low-Latency Communications (URLLC), mMTC brings the unique technical challenge of supporting a huge number of MTC devices, which is the main focus of this paper. The related challenges include QoS provisioning, handling highly dynamic and sporadic MTC traffic, huge signalling overhead, and Radio Access Network (RAN) congestion. In this regard, this paper aims to identify and analyze the involved technical issues, review recent advances, highlight potential solutions, and propose new research directions. Starting with an overview of mMTC features and QoS provisioning issues, we first present the key enablers for mMTC in cellular networks. After highlighting the inefficiency of the legacy Random Access (RA) procedure in the mMTC scenario, we then present the key features and channel-access mechanisms in the emerging cellular IoT standards, namely LTE-M and NB-IoT. Subsequently, we present a framework for the performance analysis of transmission scheduling with QoS support, along with the issues involved in short-data-packet transmission. Next, we provide a detailed overview of existing and emerging solutions to the RAN congestion problem, and identify potential advantages, challenges, and use cases for applying emerging Machine Learning (ML) techniques in ultra-dense cellular networks. Among the various ML techniques, we focus on the application of a low-complexity Q-learning approach in mMTC scenarios. Finally, we discuss some open research challenges and promising future research directions.

    Comment: 37 pages, 8 figures, 7 tables, submitted for possible publication in IEEE Communications Surveys and Tutorials
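    The survey's Q-learning treatment is not reproduced in the abstract; as a minimal sketch of the low-complexity tabular idea in an mMTC-flavoured setting, the toy below has devices learn collision-free random-access slots with a stateless Q-update. The slot counts, rewards, and learning schedule are illustrative assumptions, not the paper's exact setup.

```python
# Toy sketch of stateless tabular Q-learning for random-access slot
# selection: each device learns a slot that avoids collisions. All
# parameters are illustrative assumptions.
import random

N_DEVICES, N_SLOTS, EPISODES = 8, 8, 3000
ALPHA, EPSILON = 0.1, 0.1
Q = [[0.0] * N_SLOTS for _ in range(N_DEVICES)]   # one row per device

for _ in range(EPISODES):
    # epsilon-greedy slot choice per device
    choices = [random.randrange(N_SLOTS) if random.random() < EPSILON
               else max(range(N_SLOTS), key=Q[d].__getitem__)
               for d in range(N_DEVICES)]
    for d, s in enumerate(choices):
        # reward +1 if the device had the slot to itself, -1 on collision
        r = 1.0 if choices.count(s) == 1 else -1.0
        Q[d][s] += ALPHA * (r - Q[d][s])          # stateless Q-update

learned = [max(range(N_SLOTS), key=Q[d].__getitem__) for d in range(N_DEVICES)]
print("learned slots:", learned,
      "collision-free:", len(set(learned)) == N_DEVICES)
```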