
    Goodbye, ALOHA!

    The vision of the Internet of Things (IoT) to interconnect and Internet-connect everyday people, objects, and machines poses new challenges in the design of wireless communication networks. The design of medium access control (MAC) protocols has traditionally been an intense area of research due to their high impact on the overall performance of wireless communications. The majority of research activities in this field deal with variations of protocols based on ALOHA, either with or without listen-before-talk, i.e., carrier sense multiple access (CSMA). These protocols operate well under low traffic loads and a low number of simultaneous devices; however, they suffer from congestion as the traffic load and the number of devices increase. For this reason, unless revisited, the MAC layer can become a bottleneck for the success of the IoT. In this paper, we provide an overview of the existing MAC solutions for the IoT, describing current limitations and envisioned challenges for the near future. Motivated by those, we identify a family of simple algorithms based on distributed queueing (DQ) which can operate for an infinite number of devices generating any traffic load and pattern. We describe the DQ mechanism and the most relevant existing studies of DQ applied in different scenarios. In addition, we provide a novel performance evaluation of DQ when applied to the IoT. Finally, we also include a description of the very first demo of DQ for its use in the IoT.
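    The congestion contrast that motivates the paper is easy to reproduce numerically. Below is a minimal, illustrative Python simulation (our sketch, not the authors' code or the full DQ protocol): fixed-probability slotted ALOHA stalls as the population grows, while the binary tree-splitting rule that DQ organises into its collision-resolution queue clears any backlog in time roughly linear in the number of devices. All parameters are arbitrary.

```python
import random

def slotted_aloha(n, p=0.1, max_slots=50_000):
    """Toy slotted ALOHA with fixed transmit probability p: every
    backlogged device transmits independently each slot, and a slot
    succeeds only if exactly one device transmits. With fixed p the
    channel congests as n grows (the failure mode the paper highlights),
    so we cap the run and report unresolved devices."""
    backlog, slots = n, 0
    while backlog and slots < max_slots:
        slots += 1
        if sum(random.random() < p for _ in range(backlog)) == 1:
            backlog -= 1  # exactly one sender: success
    return slots, backlog  # (slots used, devices still unresolved)

def tree_splitting(n):
    """Toy binary tree splitting, the contention-resolution idea that DQ
    organises with distributed queues: a collided group splits randomly
    into two subgroups that are served in first-come first-served order."""
    queue, slots = [n], 0  # the whole population collides first
    while queue:
        slots += 1
        group = queue.pop(0)
        if group > 1:  # collision: split and requeue both halves
            left = sum(random.random() < 0.5 for _ in range(group))
            queue += [left, group - left]
    return slots

random.seed(1)
for n in (10, 200):
    print(n, slotted_aloha(n), tree_splitting(n))
```

    With n = 200 the fixed-p ALOHA run hits the slot cap with almost the whole backlog unresolved, while tree splitting finishes in a few hundred slots, which is the qualitative point of the paper.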

    Cellular networks for smart grid communication

    The next-generation electric power system, known as the smart grid, relies on a robust and reliable underlying communication infrastructure to improve the efficiency of electricity distribution. Cellular networks, e.g., LTE/LTE-A systems, appear as a promising technology to facilitate the smart grid evolution. Their inherent performance characteristics and well-established ecosystem could potentially unlock unprecedented use cases, enabling real-time and autonomous distribution grid operations. However, cellular technology was not originally intended for smart grid communication, which is associated with highly reliable message exchange and massive device connectivity requirements. The fundamental differences between smart grid and human-type communication challenge the classical design of cellular networks and introduce important research questions that have not been sufficiently addressed so far. Motivated by these challenges, this doctoral thesis investigates novel radio access network (RAN) design principles and performance analysis for the seamless integration of smart grid traffic in future cellular networks. Specifically, we focus on addressing the fundamental RAN problems of network scalability in massive smart grid deployments and radio resource management for smart grid and human-type traffic. The main objective of the thesis lies in the design, analysis, and performance evaluation of RAN mechanisms that would render cellular networks the key enabler for emerging smart grid applications. The first part of the thesis addresses the radio access limitations in LTE-based networks for reliable and scalable smart grid communication. We first identify the congestion problem in LTE random access that arises in large-scale smart grid deployments. To overcome this, a novel random access mechanism is proposed that can efficiently support real-time distribution automation services with negligible impact on the background traffic. Motivated by the stringent reliability requirements of various smart grid operations, we then develop an analytical model of the LTE random access procedure that allows us to assess the performance of event-based monitoring traffic under various load conditions and network configurations. We further extend our analysis to include the relation between the cell size and the availability of orthogonal random access resources, and we identify an additional challenge for reliable smart grid connectivity. To this end, we devise an interference- and load-aware cell planning mechanism that enhances reliability in substation automation services. Finally, we couple the problem of state estimation in wide-area monitoring systems with the reliability challenges in information acquisition. Using our developed analytical framework, we quantify the impact of imperfect communication reliability on the state estimation accuracy and provide useful insights for the design of reliability-aware state estimators. The second part of the thesis builds on the previous one and focuses on the RAN problem of resource scheduling and sharing for smart grid and human-type traffic. We introduce a novel scheduler that achieves low latency for distribution automation traffic while resource allocation is performed in a way that keeps the degradation of cellular users at a minimum level. In addition, we investigate the benefits of Device-to-Device (D2D) transmission mode for event-based message exchange in substation automation scenarios.
    We design a joint mode selection and resource allocation mechanism which results in higher data rates with respect to the conventional transmission mode via the base station. An orthogonal resource partition scheme between cellular and D2D links is further proposed to prevent the underutilization of the scarce cellular spectrum. The research findings of this thesis aim to deliver novel solutions to important RAN performance issues that arise when cellular networks support smart grid communication.
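    The congestion problem the thesis identifies in LTE random access can be illustrated with a standard first-order approximation (our sketch, not the thesis's far more detailed analytical model, which also covers retransmissions, power ramping, and load conditions): a device's preamble transmission succeeds only if no other contender in the same random access opportunity picks the same preamble.

```python
def preamble_success_prob(n_devices, n_preambles=54):
    """First-order LTE random access approximation: success iff no other
    contender picks the same preamble. 54 is a typical number of
    contention-based preambles (64 minus those reserved for
    contention-free access); treat it as an illustrative default."""
    return (1 - 1 / n_preambles) ** (n_devices - 1)

for n in (10, 54, 200, 1000):
    print(f"{n:5d} contenders -> P(no preamble collision) ~ "
          f"{preamble_success_prob(n):.3f}")
```

    Success probability collapses from about 0.85 at 10 contenders to below 0.03 at 200, which is why large-scale smart grid deployments need the congestion-aware mechanisms the thesis proposes.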

    D2D-Based Grouped Random Access to Mitigate Mobile Access Congestion in 5G Sensor Networks

    The Fifth Generation (5G) wireless service of sensor networks involves significant challenges when dealing with the coordination of an ever-increasing number of devices accessing shared resources. This has drawn major interest from the research community, as many existing works focus on radio access network congestion control to efficiently manage resources in the context of device-to-device (D2D) interaction in huge sensor networks. In this context, this paper pioneers a study on the impact of D2D link reliability in group-assisted random access protocols, shedding light on the performance benefits and potential limitations of approaches of this kind against tunable parameters such as group size, number of sensors, and reliability of D2D links. Additionally, we leverage the association with a Geolocation Database (GDB) to assist the grouping decisions, drawing parallels with recent regulatory-driven initiatives around GDBs and arguing the benefits of the suggested proposal. Finally, the proposed method is shown, by means of an exhaustive simulation campaign, to significantly reduce the delay over random access channels.
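    The interplay between group size and D2D link reliability that the paper studies can be sketched with a back-of-envelope contention model (our illustrative assumption, not the paper's analysis): each group elects one head that contends on the random access channel on behalf of its members, and a member whose D2D link to its head fails falls back to contending directly.

```python
def contending_devices(n_sensors, group_size, d2d_reliability):
    """Back-of-envelope model (illustrative, not from the paper): group
    heads contend for their members; members with failed D2D links to
    their head fall back to direct random access."""
    group_heads = n_sensors / group_size
    d2d_fallbacks = n_sensors * (1 - d2d_reliability)
    return group_heads + d2d_fallbacks

for r in (1.0, 0.9, 0.5):
    print(f"D2D reliability {r}: ~{contending_devices(10_000, 20, r):,.0f} "
          f"contenders instead of 10,000")
```

    Even this crude model shows the paper's central tension: perfect D2D links shrink the contending population twenty-fold, while a 50% reliable D2D layer gives back most of the grouping gain.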

    Random Access Analysis for Massive IoT Networks Under a New Spatio-Temporal Model: A Stochastic Geometry Approach

    Massive Internet of Things (mIoT) has provided an auspicious opportunity to build powerful and ubiquitous connections, but it faces a plethora of new challenges, where cellular networks are potential solutions due to their high scalability, reliability, and efficiency. The Random Access CHannel (RACH) procedure is the first step of connection establishment between IoT devices and Base Stations (BSs) in the cellular-based mIoT network, where modelling the interactions between the static properties of the physical layer network and the dynamic properties of the queue evolving in each IoT device is challenging. To tackle this, we provide a novel traffic-aware spatio-temporal model to analyze RACH in cellular-based mIoT networks, where the physical layer network is modelled and analyzed based on stochastic geometry in the spatial domain, and the queue evolution is analyzed based on probability theory in the time domain. For performance evaluation, we derive the exact expressions for the preamble transmission success probabilities of a randomly chosen IoT device with different RACH schemes in each time slot, which offer insights into the effectiveness of each RACH scheme. Our derived analytical results are verified by realistic simulations capturing the evolution of packets in each IoT device. This mathematical model and analytical framework can be applied to evaluate the performance of other types of RACH schemes in cellular-based networks by simply integrating their preamble transmission principles.
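    A stripped-down simulation conveys the temporal half of such a model (our simplification: only preamble collisions, with no SINR or stochastic-geometry component, so it omits exactly the spatial part the paper contributes). Bernoulli packet arrivals feed a queue per device; every backlogged device picks a random preamble per RACH slot and its head-of-line packet departs only if its preamble choice is unique.

```python
import random

def simulate_rach(n_dev=500, n_pre=54, p_arrival=0.02, n_slots=2000, seed=0):
    """Toy temporal-domain RACH simulation (illustrative parameters):
    per-device queues evolve under Bernoulli arrivals, and a contending
    device succeeds iff no other device picked the same preamble."""
    rng = random.Random(seed)
    queue = [0] * n_dev
    delivered = 0
    for _ in range(n_slots):
        for d in range(n_dev):                 # time domain: new arrivals
            queue[d] += rng.random() < p_arrival
        picks = {}
        for d in range(n_dev):                 # backlogged devices contend
            if queue[d]:
                picks.setdefault(rng.randrange(n_pre), []).append(d)
        for contenders in picks.values():
            if len(contenders) == 1:           # unique preamble: success
                queue[contenders[0]] -= 1
                delivered += 1
    return delivered, sum(queue)

print(simulate_rach())  # (packets served, backlog remaining)
```

    This is the kind of packet-evolution simulation the paper uses to verify its analysis, minus the spatial interference model that makes the full framework non-trivial.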

    Towards efficient support for massive Internet of Things over cellular networks

    The usage of Internet of Things (IoT) devices over cellular networks has seen tremendous growth in recent years, and that growth is only expected to increase in the near future. While existing 4G and 5G cellular networks offer several desirable features for this type of application, their design has historically focused on accommodating traditional mobile devices (e.g. smartphones). As IoT devices have very different characteristics and use cases, they create a range of problems for current networks, which often struggle to accommodate them at scale. Although newer cellular network technologies, such as Narrowband-IoT (NB-IoT), were designed to focus on the IoT characteristics, they were extensively based on 4G and 5G networks to preserve interoperability and decrease their deployment cost. As such, several inefficiencies of 4G/5G were also carried over to the newer technologies. This thesis focuses on identifying the core issues that hinder the large-scale deployment of IoT over cellular networks, and proposes novel protocols to largely alleviate them. We find that the most significant challenges arise mainly in three distinct areas: connection establishment, network resource utilisation, and device energy efficiency. Specifically, we make the following contributions. First, we focus on the connection establishment process and argue that the current procedures, when used by IoT devices, result in increased numbers of collisions, network outages, and a signalling overhead that is disproportionate to the size of the data transmitted and the connection duration of IoT devices. Therefore, we propose two mechanisms to alleviate these inefficiencies. Our first mechanism, named ASPIS, focuses on both the number of collisions and the signalling overhead simultaneously, and provides enhancements to increase the number of successful IoT connections without disrupting existing background traffic. Our second mechanism focuses specifically on the collisions at the connection establishment process, and uses a novel approach based on Reinforcement Learning to decrease their number and allow a larger number of IoT devices to access the network with fewer attempts. Second, we propose a new multicasting mechanism to reduce network resource utilisation in NB-IoT networks by delivering common content (e.g. firmware updates) to multiple similar devices simultaneously. Notably, our mechanism is not only more efficient during multicast data transmission, but also frees up resources that would otherwise be perpetually reserved for multicast signalling under the existing scheme. Finally, we focus on energy efficiency and propose novel protocols that are designed for the unique usage characteristics of NB-IoT devices, in order to reduce the device power consumption. Towards this end, we perform a detailed energy consumption analysis, which we use as a basis to develop an energy consumption model for realistic energy consumption assessment. We then take the insights from our analysis and propose optimisations to significantly reduce the energy consumption of IoT devices, and assess their performance.
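    The abstract does not spell out the Reinforcement Learning mechanism, so as a flavour of how learning can steer access attempts, here is a minimal epsilon-greedy bandit (entirely our sketch, not the thesis's algorithm; the arms, their number, and the reward probabilities are hypothetical stand-ins for access options such as backoff windows or preamble groups).

```python
import random

class EpsilonGreedyAccess:
    """Minimal epsilon-greedy bandit over hypothetical access options:
    explore with probability eps, otherwise pick the option with the
    highest estimated success rate."""
    def __init__(self, n_arms, eps=0.1):
        self.eps = eps
        self.counts = [0] * n_arms
        self.values = [0.0] * n_arms
    def choose(self):
        if random.random() < self.eps:                 # explore
            return random.randrange(len(self.values))
        return max(range(len(self.values)), key=self.values.__getitem__)
    def update(self, arm, reward):                     # incremental mean
        self.counts[arm] += 1
        self.values[arm] += (reward - self.values[arm]) / self.counts[arm]

random.seed(0)
agent = EpsilonGreedyAccess(n_arms=4)
success_prob = [0.2, 0.5, 0.8, 0.4]   # hypothetical per-option success odds
for _ in range(2000):
    arm = agent.choose()
    agent.update(arm, 1.0 if random.random() < success_prob[arm] else 0.0)
print("learned success estimates:", [round(v, 2) for v in agent.values])
```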

    Enhanced Mobile Networking using Multi-connectivity and Packet Duplication in Next-Generation Cellular Networks

    Modern cellular communication systems need to handle an enormous number of users and large amounts of data, including both user- and system-oriented data. 5G is the fifth-generation mobile network and a new global wireless standard that follows 4G/LTE networks. The uptake of 5G is expected to be faster than any previous cellular generation, with high expectations of its future impact on the global economy. The next-generation 5G networks are designed to be flexible enough to adapt to modern use cases and be highly modular, such that operators would have the flexibility to provide selective features based on user demand that could be implemented without investment in additional infrastructure. Thus, the underlying cellular network that is capable of delivering these expectations must be able to handle high data rates with low latency and ultra-reliability to fulfill these growing needs. Communication in the sub-6 GHz range cannot provide high throughputs due to the scarcity of spectrum in these bands. Using frequencies in the FR2 or millimeter wave (mmWave) range for communication can provide large data rates and cover densely populated areas, but only over short distances, as they are susceptible to blockages. This is why dense deployments of mmWave base stations are being considered to achieve very high data rates. However, such architectures lack the reliability needed to support many V2X applications, especially under mobility scenarios. 5G and beyond-5G networks must also account for the UE's mobility, as they are expected to maintain their level of performance under different mobility scenarios and perform better than traditional networks. Although 5G technology has developed significantly in recent years, there still exists a critical gap in understanding how all these technologies would perform under mobility. There is a need to analyze and identify issues that arise with mobility and come up with solutions to overcome these hurdles without compromising the performance of these networks. Multi-connectivity (MC) refers to simultaneous connectivity with multiple radio access technologies or bands and potentially represents an important solution for the ongoing 5G deployments towards improving their performance. To address the network issues that come with mobility and fill that gap, this dissertation investigates the impact of multi-connectivity on next-generation networks from three distinct perspectives: 1) mobility enhancement using multi-connectivity in 5G networks, 2) improving reliability in mobility scenarios using multi-connectivity with packet duplication, and 3) a single-grant multiple-uplink scheme for performance improvement in mobility scenarios. The traditional macro-cell architecture of cellular networks that cover large geographical areas will struggle to deliver the dense coverage, low latency, and high bandwidth required by some 5G applications. Thus, 5G networks must utilize ultra-dense deployments of access points operating at higher mmWave frequency bands. However, for such dense networks, user mobility could be particularly challenging, as it would reduce network efficiency and user-perceived service quality due to frequent handoffs. Multi-connectivity is seen as a key enabler in improving the performance of these next-generation networks. It enhances the system performance by providing multiple simultaneous links between the user equipment (UE) and the base stations (BS) for data transfer.
Also, it eliminates the time needed to deal with frequent handoffs, link establishment, etc. Balancing the trade-offs among handoff rate, service delay, and achievable coverage/data rate in heterogeneous, dense, and diverse 5G cellular networks is, therefore, an open challenge. Hence, in this dissertation, we analyze how mobility impacts the performance of the current ultra-dense mmWave network (UDN) architecture in a city environment and discuss improvements for reducing the impact of mobility to meet 5G specifications using multi-connectivity. Current handover protocols, by design, suffer from interruption even if they are successful and, at the same time, carry the risk of failures during execution. The next-generation wireless networks, like 5G New Radio, introduce even stricter requirements that cannot be fulfilled with the traditional hard handover concept. Another expectation from these services is extreme reliability that will not tolerate any mobility-related failures. Thus, in this dissertation, we explore a novel technique using packet duplication and evaluate its performance under various mobility scenarios. We study how packet duplication can be used to meet the stringent reliability and latency requirements of modern cellular networks, as data packets are duplicated and transmitted concurrently over two independent links. The idea is to generate multiple instances (duplicates) of a packet and transmit them simultaneously over different uncorrelated channels with the aim of reducing the packet failure probability. We also propose enhancements to the packet duplication feature to improve radio resource utilization. The wide variety of use cases in 5G greatly differs from the use cases considered during the design of third-generation (3G) and fourth-generation (4G) long-term evolution (LTE) networks. Applications like autonomous driving, IoT, and live video are much more uplink-intensive compared to traditional applications. However, the uplink performance is often, by design, lower than the downlink; hence, 5G must improve uplink performance. To meet the expected performance levels, there is a need to explore flexible network architectures for 5G networks. In this work, we propose a novel uplink scheme where the UE performs only a single transmission on a common channel, and every base station that can receive this signal accepts and processes it. In our proposed architecture, a UE is connected to multiple mmWave-capable distributed units (DUs), which are connected to a single gNB central unit. In an ultra-dense deployment with multiple mmWave base stations around the UE, this removes the need to perform frequent handovers and allows high mobility with reduced latency. We develop and evaluate the performance of such a system for high-throughput and reliable low-latency communication under various mobility scenarios. To study the impact of mobility on next-generation networks, this work develops and systematically analyzes the performance of 5G networks under mobility. We also look into the effect of increasing the number of users being served on the network. As a result, these studies are intended to better understand the network requirements for handling mobility and network load with multi-connectivity. This dissertation aims to achieve clarity and also proposes solutions for resolving these real-world network mobility issues.
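    The reliability argument for packet duplication reduces to a one-line probability calculation: if both copies travel over independent links, a packet is lost only when both links fail. The sketch below (ours, not the dissertation's model; the correlation knob rho is our own simplification) makes the gain and its sensitivity to link correlation concrete.

```python
def duplication_failure(p_fail_a, p_fail_b, rho=0.0):
    """Residual failure probability of a packet duplicated over two links.
    Independent links (rho = 0): both copies must be lost, giving the
    product p_a * p_b. rho interpolates linearly towards the fully
    correlated Frechet bound min(p_a, p_b) as a crude correlation model."""
    independent = p_fail_a * p_fail_b
    fully_correlated = min(p_fail_a, p_fail_b)
    return (1 - rho) * independent + rho * fully_correlated

print(duplication_failure(1e-2, 1e-2))        # 1e-04: two extra nines
print(duplication_failure(1e-2, 1e-2, 0.5))   # correlation erodes the gain
```

    This is why the abstract stresses transmitting over uncorrelated channels: two 99%-reliable but correlated links can deliver little more than one of them alone.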

    Cellular, Wide-Area, and Non-Terrestrial IoT: A Survey on 5G Advances and the Road Towards 6G

    The next wave of wireless technologies is proliferating in connecting things among themselves as well as to humans. In the era of the Internet of Things (IoT), billions of sensors, machines, vehicles, drones, and robots will be connected, making the world around us smarter. The IoT will encompass devices that must wirelessly communicate a diverse set of data gathered from the environment for myriad new applications. The ultimate goal is to extract insights from this data and develop solutions that improve quality of life and generate new revenue. Providing large-scale, long-lasting, reliable, and near real-time connectivity is the major challenge in enabling a smart connected world. This paper provides a comprehensive survey on existing and emerging communication solutions for serving IoT applications in the context of cellular, wide-area, as well as non-terrestrial networks. Specifically, wireless technology enhancements for providing IoT access in fifth-generation (5G) and beyond cellular networks, and communication networks over the unlicensed spectrum, are presented. Aligned with the main key performance indicators of 5G and beyond-5G networks, we investigate solutions and standards that enable energy efficiency, reliability, low latency, and scalability (connection density) of current and future IoT networks. The solutions include grant-free access and channel coding for short-packet communications, non-orthogonal multiple access, and on-device intelligence. Further, a vision of new paradigm shifts in communication networks in the 2030s is provided, and the integration of the associated new technologies like artificial intelligence, non-terrestrial networks, and new spectra is elaborated. Finally, future research directions toward beyond-5G IoT networks are pointed out.
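    Among the surveyed enablers, non-orthogonal multiple access (NOMA) admits a compact worked example. The textbook two-user power-domain sketch below is illustrative and not taken from the survey: the weak user receives the larger power share and decodes treating the strong user's signal as noise, while the strong user removes the weak user's signal via successive interference cancellation (SIC) before decoding its own. For simplicity both links share one SNR; real pairings exploit different channel gains.

```python
import math

def noma_pair_rates(snr_db, weak_power_share=0.8):
    """Textbook two-user power-domain NOMA spectral efficiencies
    (bit/s/Hz) under a shared SNR and ideal SIC; all values illustrative."""
    snr = 10 ** (snr_db / 10)
    weak = math.log2(1 + weak_power_share * snr
                     / ((1 - weak_power_share) * snr + 1))
    strong = math.log2(1 + (1 - weak_power_share) * snr)  # after ideal SIC
    return weak, strong

print(noma_pair_rates(20))   # per-user spectral efficiency at 20 dB SNR
```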