
    Goodbye, ALOHA!

    The vision of the Internet of Things (IoT) to interconnect and Internet-connect everyday people, objects, and machines poses new challenges in the design of wireless communication networks. The design of medium access control (MAC) protocols has traditionally been an intense area of research due to its high impact on the overall performance of wireless communications. The majority of research activities in this field deal with variations of protocols based in some way on ALOHA, either with or without listen before talk, i.e., carrier sense multiple access. These protocols operate well under low traffic loads and with a low number of simultaneous devices. However, they suffer from congestion as the traffic load and the number of devices increase. For this reason, unless revisited, the MAC layer can become a bottleneck for the success of the IoT. In this paper, we provide an overview of the existing MAC solutions for the IoT, describing current limitations and envisioned challenges for the near future. Motivated by these, we identify a family of simple algorithms based on distributed queueing (DQ), which can operate for an infinite number of devices generating any traffic load and pattern. A description of the DQ mechanism is provided, and the most relevant existing studies of DQ applied in different scenarios are described. In addition, we provide a novel performance evaluation of DQ when applied to the IoT. Finally, a description of the very first demo of DQ for use in the IoT is also included.
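
    As a quick illustration of the congestion behaviour described above (a sketch of plain slotted ALOHA, not of the paper's DQ scheme), the classic throughput relation S = G·exp(-G) shows how the fraction of useful slots peaks at an offered load of one frame per slot and then collapses; the load values below are arbitrary examples.

```python
# Slotted-ALOHA throughput sketch (illustrative only, not the DQ algorithm):
# S = G * exp(-G) peaks at G = 1 and collapses as the offered load grows,
# which is the congestion behaviour the survey attributes to ALOHA-based MACs.
import math

for g in [0.1, 0.5, 1.0, 2.0, 5.0, 10.0]:   # offered load G, in frames per slot
    s = g * math.exp(-g)                    # expected successful frames per slot
    print(f"offered load G = {g:4.1f} -> throughput S = {s:.3f}")
```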

    Adaptive load control for IoT based on satellite communications

    The Internet of Things (IoT) market is growing every year. Today, the number of IoT devices is estimated at around 8 billion, and forecasts announce 20 billion devices by 2020. Terrestrial and satellite communication systems are already deployed to answer this connectivity need. These systems rely on a Random Access CHannel (RACH) used either to send resource allocation requests or to send the useful message directly. Because of the number of IoT devices, overload on the RACH is an emerging issue, since it may cause a service outage. This is especially the case for IoT satellite systems because of the wide area covered by a single satellite. Access Class Barring (ACB) is the load control mechanism used within Narrowband IoT (NB-IoT). Unfortunately, no method has been specified to compute the load control parameters. In this paper, in the context of a satellite IoT system, we propose a method to dynamically compute ACB-based load control parameters. Thanks to our method, the load control mechanism reaches excellent results regarding transmission reliability and energy consumption for various traffic scenarios.
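
    For readers unfamiliar with the mechanism, the sketch below shows the basic 3GPP-style Access Class Barring check that such a method would parameterize: a device passes with probability equal to the barring factor and otherwise backs off for a randomized fraction of the barring time. The parameter values used here are illustrative assumptions, not the ones computed by the paper's method.

```python
# Hedged sketch of a 3GPP-style ACB check; barring_factor and barring_time_s are
# the kind of load-control parameters the paper proposes to compute dynamically.
# The values passed below are illustrative assumptions.
import random

def acb_attempt(barring_factor: float, barring_time_s: float) -> float:
    """Return 0.0 if access is allowed now, otherwise a back-off delay in seconds."""
    if random.random() < barring_factor:                 # passes the barring check
        return 0.0
    # Barred: wait (0.7 + 0.6 * rand) * barring_time before retrying (3GPP-style rule).
    return (0.7 + 0.6 * random.random()) * barring_time_s

delay = acb_attempt(barring_factor=0.5, barring_time_s=4.0)
print("access allowed now" if delay == 0.0 else f"barred, retry in {delay:.2f} s")
```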

    Towards Massive Machine Type Communications in Ultra-Dense Cellular IoT Networks: Current Issues and Machine Learning-Assisted Solutions

    The ever-increasing number of resource-constrained Machine-Type Communication (MTC) devices is leading to the critical challenge of fulfilling diverse communication requirements in dynamic and ultra-dense wireless environments. Among the different application scenarios that the upcoming 5G and beyond cellular networks are expected to support, such as eMBB, mMTC and URLLC, mMTC brings the unique technical challenge of supporting a huge number of MTC devices, which is the main focus of this paper. The related challenges include QoS provisioning, handling highly dynamic and sporadic MTC traffic, huge signalling overhead and Radio Access Network (RAN) congestion. In this regard, this paper aims to identify and analyze the involved technical issues, to review recent advances, to highlight potential solutions and to propose new research directions. First, starting with an overview of mMTC features and QoS provisioning issues, we present the key enablers for mMTC in cellular networks. Along with the highlights on the inefficiency of the legacy Random Access (RA) procedure in the mMTC scenario, we then present the key features and channel access mechanisms in the emerging cellular IoT standards, namely LTE-M and NB-IoT. Subsequently, we present a framework for the performance analysis of transmission scheduling with QoS support, along with the issues involved in short data packet transmission. Next, we provide a detailed overview of the existing and emerging solutions towards addressing the RAN congestion problem, and then identify potential advantages, challenges and use cases for the application of emerging Machine Learning (ML) techniques in ultra-dense cellular networks. Out of several ML techniques, we focus on the application of a low-complexity Q-learning approach in mMTC scenarios. Finally, we discuss some open research challenges and promising future research directions.
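
    As a simplified sketch of the low-complexity Q-learning idea highlighted above (an illustration, not the survey's exact formulation), each device can keep one Q-value per random-access slot, reinforce slots where its transmission went through alone and penalize slots where it collided. All names and parameters below are assumptions chosen for readability.

```python
# Stateless (bandit-style) Q-learning sketch for random-access slot selection:
# reward +1 for a collision-free pick, -1 for a collision; epsilon-greedy policy.
import random

N_SLOTS, N_DEVICES, ALPHA, EPS = 5, 20, 0.1, 0.1
Q = [[0.0] * N_SLOTS for _ in range(N_DEVICES)]      # one Q-value per device per slot

def choose_slot(q_row):
    if random.random() < EPS:                         # explore
        return random.randrange(N_SLOTS)
    return max(range(N_SLOTS), key=lambda s: q_row[s])  # exploit best-known slot

for frame in range(1000):
    picks = [choose_slot(Q[d]) for d in range(N_DEVICES)]
    for d, s in enumerate(picks):
        reward = 1.0 if picks.count(s) == 1 else -1.0    # success iff slot not shared
        Q[d][s] += ALPHA * (reward - Q[d][s])            # incremental Q-update
```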

    Performance Analysis of IoT Wireless Cellular Systems

    The Internet of Things (IoT) is becoming a reality, and with it comes the need to support more devices with better coverage and low power consumption on the wireless network. One of the Low-Power Wide Area (LPWA) technologies that aims to meet these requirements is Narrowband IoT (NB-IoT). NB-IoT is a 4G cellular technology particularly focused on IoT scenarios demanding low throughput and very low energy consumption. This dissertation investigates the capacity and performance of NB-IoT technology in real-world scenarios by comparing the results of measurements performed under different radio conditions around Lisbon’s metropolitan area. Inspired by related works presented in the dissertation, the approaches adopted in this work are explained and the metrics collected are described in detail. Through practical measurement campaigns, we characterize different metrics of NB-IoT performance for different propagation scenarios, identifying hypothetical causes for the observed performance.
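
    A minimal sketch of the per-scenario aggregation such a measurement campaign implies is given below; the metric names (RSRP, SINR), scenario labels and sample values are illustrative placeholders, not data from the dissertation.

```python
# Toy aggregation of NB-IoT radio metrics per propagation scenario.
# The samples are made-up placeholders used only to show the bookkeeping.
from collections import defaultdict
from statistics import mean

samples = [                      # (scenario, RSRP in dBm, SINR in dB)
    ("urban",  -95.0, 8.2), ("urban", -101.5, 5.1),
    ("rural", -110.3, 2.4), ("rural", -108.7, 3.0),
]
by_scenario = defaultdict(list)
for scenario, rsrp, sinr in samples:
    by_scenario[scenario].append((rsrp, sinr))

for scenario, values in by_scenario.items():
    print(scenario,
          f"avg RSRP = {mean(v[0] for v in values):.1f} dBm,",
          f"avg SINR = {mean(v[1] for v in values):.1f} dB")
```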

    Analysis of Random Access in NB-IoT Networks with Three Coverage Enhancement Groups: A Stochastic Geometry Approach

    NarrowBand Internet of Things (NB-IoT) is a new 3GPP radio access technology designed to provide better coverage for Low Power Wide Area (LPWA) networks. To provide reliable connections with extended coverage, a repetition transmission scheme and up to three Coverage Enhancement (CE) groups are introduced into NB-IoT during both the Random Access CHannel (RACH) procedure and the data transmission procedure, where each CE group is configured with different repetition values and transmission resources. To characterize the RACH performance of an NB-IoT network with three CE groups, this paper develops a novel traffic-aware spatio-temporal model to analyze the RACH success probability, where both the preamble transmission outage and the collision events of each CE group jointly determine the traffic evolution and the RACH success probability. Based on this analytical model, we derive the analytical expression for the RACH success probability of a randomly chosen IoT device in each CE group over multiple time slots with different RACH schemes, including the baseline, back-off (BO), access class barring (ACB), and hybrid ACB and BO (ACB&BO) schemes. Our results show that the RACH success probabilities of the devices in the three CE groups outperform those of a single-CE-group network, although not for all groups, which depends on the choice of the categorizing parameters. This mathematical model and analytical framework can be applied to evaluate the performance of multiple user groups in other networks with spatial separation.
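
    The decomposition described above, where success requires neither a preamble transmission outage nor a preamble collision within the device's CE group, can be illustrated with a small Monte Carlo sketch. The group sizes, preamble pool, repetition values and toy outage model below are assumptions, not the paper's stochastic-geometry derivation.

```python
# Monte Carlo illustration: per-CE-group RACH success = no outage AND no collision.
# All numbers and the outage model are illustrative assumptions.
import random

groups = {                       # CE group: (active devices, preambles, repetitions)
    "CE0": (30, 16, 1),
    "CE1": (15, 16, 2),
    "CE2": (5, 16, 8),
}

def outage_prob(repetitions, base=0.3):   # toy model: repetitions reduce outage
    return base ** repetitions

TRIALS = 20_000
for name, (n_devices, n_preambles, repetitions) in groups.items():
    successes = 0
    for _ in range(TRIALS):
        picks = [random.randrange(n_preambles) for _ in range(n_devices)]
        collided = picks.count(picks[0]) > 1           # tagged device shares a preamble
        outage = random.random() < outage_prob(repetitions)
        if not collided and not outage:
            successes += 1
    print(f"{name}: empirical RACH success probability ~ {successes / TRIALS:.3f}")
```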

    Enabling Technologies for Ultra-Reliable and Low Latency Communications: From PHY and MAC Layer Perspectives

    Future 5th generation (5G) networks are expected to enable three key services: enhanced mobile broadband, massive machine type communications, and ultra-reliable and low latency communications (URLLC). As per the 3rd Generation Partnership Project (3GPP) URLLC requirements, it is expected that the reliability of one transmission of a 32-byte packet will be at least 99.999% and the latency will be at most 1 ms. This unprecedented level of reliability and latency will yield various new applications, such as smart grids, industrial automation and intelligent transport systems. In this survey we present potential future URLLC applications, and summarize the corresponding reliability and latency requirements. We provide a comprehensive discussion of physical (PHY) and medium access control (MAC) layer techniques that enable URLLC, addressing both licensed and unlicensed bands. This paper evaluates the relevant PHY and MAC techniques for their ability to improve the reliability and reduce the latency. We identify that enabling long-term evolution (LTE) to coexist in the unlicensed spectrum is also a potential enabler of URLLC in the unlicensed band, and provide numerical evaluations. Lastly, this paper discusses the potential future research directions and challenges in achieving the URLLC requirements.
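
    A back-of-the-envelope check helps put the 99.999% figure in perspective: if each independent transmission attempt of the packet fails with probability eps, then k attempts leave a residual error of eps^k, while the 1 ms budget limits how many attempts can fit. The eps values below are illustrative assumptions, not 3GPP figures.

```python
# Residual error after k independent attempts, each failing with probability eps.
# With an assumed 10% per-attempt error rate, five attempts are needed to reach
# 99.999%, which is hard to fit inside a 1 ms latency budget.
for eps in (1e-1, 1e-2):
    for k in range(1, 6):
        reliability = 1.0 - eps ** k
        print(f"eps = {eps:g}, attempts = {k}: reliability ~ {reliability:.7f}")
```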

    Sub-GHz LPWAN network coexistence, management and virtualization: an overview and open research challenges

    The IoT domain is characterized by many applications that require low-bandwidth communications over a long range, at a low cost and at low power. Low power wide area networks (LPWANs) fulfill these requirements by using sub-GHz radio frequencies (typically 433 or 868 MHz) with typical transmission ranges on the order of 1 to 50 km. As a result, a single base station can cover large areas and can support high numbers of connected devices (> 1000 per base station). Notable initiatives in this domain are LoRa, Sigfox and the upcoming IEEE 802.11ah (or "HaLow") standard. Although these new technologies have the potential to significantly impact many IoT deployments, the current market is very fragmented and many challenges exist related to deployment, scalability, management and coexistence aspects, making adoption of these technologies difficult for many companies. To remedy this, this paper proposes a conceptual framework to improve the performance of LPWAN networks through in-network optimization, cross-technology coexistence and cooperation, and virtualization of management functions. In addition, the paper gives an overview of state-of-the-art solutions and identifies open challenges for each of these aspects.
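
    A rough link-budget sketch helps explain the kilometre-scale ranges quoted above: at 868 MHz, an assumed LoRa-like budget of about 151 dB (14 dBm transmit power, -137 dBm receiver sensitivity) leaves margin over free-space path loss even at 50 km, and real-world obstructions and fading consume that margin, which is why practical ranges end up in the 1-50 km bracket. The figures are illustrative assumptions, not taken from the paper.

```python
# Free-space path loss at 868 MHz versus an assumed 151 dB LoRa-like link budget.
import math

F_MHZ = 868.0
BUDGET_DB = 14.0 - (-137.0)          # assumed Tx power minus Rx sensitivity

def fspl_db(d_km: float) -> float:   # standard free-space path loss formula
    return 20 * math.log10(d_km) + 20 * math.log10(F_MHZ) + 32.44

for d in (1, 10, 50):
    margin = BUDGET_DB - fspl_db(d)
    print(f"{d:>2} km: FSPL = {fspl_db(d):6.1f} dB, remaining margin = {margin:5.1f} dB")
```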