    Cellular networks for smart grid communication

    Get PDF
    The next-generation electric power system, known as smart grid, relies on a robust and reliable underlying communication infrastructure to improve the efficiency of electricity distribution. Cellular networks, e.g., LTE/LTE-A systems, appear as a promising technology to facilitate the smart grid evolution. Their inherent performance characteristics and well-established ecosystem could potentially unlock unprecedented use cases, enabling real-time and autonomous distribution grid operations. However, cellular technology was not originally intended for smart grid communication, which is associated with highly reliable message exchange and massive device connectivity requirements. The fundamental differences between smart grid and human-type communication challenge the classical design of cellular networks and introduce important research questions that have not been sufficiently addressed so far. Motivated by these challenges, this doctoral thesis investigates novel radio access network (RAN) design principles and performance analysis for the seamless integration of smart grid traffic in future cellular networks. Specifically, we focus on addressing the fundamental RAN problems of network scalability in massive smart grid deployments and radio resource management for smart grid and human-type traffic. The main objective of the thesis lies in the design, analysis and performance evaluation of RAN mechanisms that would render cellular networks the key enabler for emerging smart grid applications. The first part of the thesis addresses the radio access limitations of LTE-based networks for reliable and scalable smart grid communication. We first identify the congestion problem in LTE random access that arises in large-scale smart grid deployments. To overcome this, a novel random access mechanism is proposed that can efficiently support real-time distribution automation services with negligible impact on the background traffic. Motivated by the stringent reliability requirements of various smart grid operations, we then develop an analytical model of the LTE random access procedure that allows us to assess the performance of event-based monitoring traffic under various load conditions and network configurations. We further extend our analysis to include the relation between the cell size and the availability of orthogonal random access resources, and we identify an additional challenge for reliable smart grid connectivity. To this end, we devise an interference- and load-aware cell planning mechanism that enhances reliability in substation automation services. Finally, we couple the problem of state estimation in wide-area monitoring systems with the reliability challenges in information acquisition. Using our developed analytical framework, we quantify the impact of imperfect communication reliability on the state estimation accuracy and provide useful insights for the design of reliability-aware state estimators. The second part of the thesis builds on the previous one and focuses on the RAN problem of resource scheduling and sharing for smart grid and human-type traffic. We introduce a novel scheduler that achieves low latency for distribution automation traffic while performing resource allocation in a way that keeps the degradation of cellular users at a minimum level. In addition, we investigate the benefits of Device-to-Device (D2D) transmission mode for event-based message exchange in substation automation scenarios.
We design a joint mode selection and resource allocation mechanism which results in higher data rates with respect to the conventional transmission mode via the base station. An orthogonal resource partition scheme between cellular and D2D links is further proposed to prevent the underutilization of the scarce cellular spectrum. The research findings of this thesis aim to deliver novel solutions to important RAN performance issues that arise when cellular networks support smart grid communication.
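    To make the congestion effect discussed above concrete, the short sketch below (an illustration of the standard contention model, not code from the thesis) computes the probability that a device completes LTE random access without a preamble collision when N devices contend in the same RA opportunity; the value M = 54 contention-based preambles is a typical configuration and an assumption here.

```python
# Illustrative model (not from the thesis): N devices each pick one of M
# orthogonal preambles uniformly at random in the same RA opportunity;
# a device succeeds only if no other device picks its preamble.
def ra_success_probability(n_devices: int, n_preambles: int = 54) -> float:
    """P(tagged device succeeds) = (1 - 1/M)^(N-1)."""
    return (1.0 - 1.0 / n_preambles) ** (n_devices - 1)

def expected_successes(n_devices: int, n_preambles: int = 54) -> float:
    """Expected number of devices whose preamble is a singleton."""
    return n_devices * ra_success_probability(n_devices, n_preambles)

if __name__ == "__main__":
    for n in (10, 50, 100, 500):
        print(f"N={n:4d}  P(success)={ra_success_probability(n):.3f}  "
              f"E[successes]={expected_successes(n):5.1f}")
```

    Already at N = 100 the per-attempt success probability in this model falls below 0.16, which illustrates the scalability problem that the proposed random access mechanism targets.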

    Cellular, Wide-Area, and Non-Terrestrial IoT: A Survey on 5G Advances and the Road Towards 6G

    Full text link
    The next wave of wireless technologies is proliferating, connecting things among themselves as well as to humans. In the era of the Internet of Things (IoT), billions of sensors, machines, vehicles, drones, and robots will be connected, making the world around us smarter. The IoT will encompass devices that must wirelessly communicate a diverse set of data gathered from the environment for myriad new applications. The ultimate goal is to extract insights from this data and develop solutions that improve quality of life and generate new revenue. Providing large-scale, long-lasting, reliable, and near real-time connectivity is the major challenge in enabling a smart connected world. This paper provides a comprehensive survey of existing and emerging communication solutions for serving IoT applications in the context of cellular, wide-area, and non-terrestrial networks. Specifically, wireless technology enhancements for providing IoT access in fifth-generation (5G) and beyond cellular networks, and communication networks over the unlicensed spectrum, are presented. Aligned with the main key performance indicators of 5G and beyond-5G networks, we investigate solutions and standards that enable energy efficiency, reliability, low latency, and scalability (connection density) of current and future IoT networks. The solutions include grant-free access and channel coding for short-packet communications, non-orthogonal multiple access, and on-device intelligence. Further, a vision of new paradigm shifts in communication networks in the 2030s is provided, and the integration of the associated new technologies, such as artificial intelligence, non-terrestrial networks, and new spectra, is elaborated. Finally, future research directions toward beyond-5G IoT networks are pointed out.
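    As a worked example of the short-packet regime covered by the survey, the snippet below evaluates the well-known normal approximation of the maximal coding rate over an AWGN channel at finite blocklength (Polyanskiy, Poor and Verdú, 2010); the SNR, blocklength and error-probability values used are illustrative assumptions.

```python
# Sketch of the normal approximation for the maximal coding rate over an
# AWGN channel at blocklength n and error probability eps:
#   R ~ C - sqrt(V/n) * Qinv(eps) + log2(n) / (2n).
import math
from statistics import NormalDist

def short_packet_rate(snr: float, n: int, eps: float) -> float:
    """Approximate achievable rate in bits per channel use."""
    capacity = math.log2(1.0 + snr)
    # Channel dispersion V of the AWGN channel, in bits^2 per channel use.
    dispersion = (snr * (snr + 2.0)) / (2.0 * (snr + 1.0) ** 2) \
                 * math.log2(math.e) ** 2
    q_inv = NormalDist().inv_cdf(1.0 - eps)  # Q^{-1}(eps)
    return capacity - math.sqrt(dispersion / n) * q_inv \
           + math.log2(n) / (2.0 * n)

if __name__ == "__main__":
    # Example: 10 dB SNR, 100-symbol packet, 1e-5 target error probability.
    snr_linear = 10.0 ** (10.0 / 10.0)
    print(f"rate ~ {short_packet_rate(snr_linear, 100, 1e-5):.3f} bit/ch.use")
```

    At these example values the achievable rate sits roughly 0.4 bit per channel use below the Shannon capacity, quantifying the penalty that motivates dedicated channel coding for short packets.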

    Channel Access in Wireless Networks: Protocol Design of Energy-Aware Schemes for the IoT and Analysis of Existing Technologies

    Get PDF
    The design of channel access policies has been an object of study since the deployment of the first wireless networks, as the Medium Access Control (MAC) layer is responsible for coordinating transmissions to a shared channel and plays a key role in the network performance. While the original target was system throughput, over the years the focus has shifted to communication latency, Quality of Service (QoS) guarantees, energy consumption, spectrum efficiency, and combinations of such goals. The basic mechanisms for using a shared channel, such as ALOHA and TDMA- and FDMA-based policies, were introduced decades ago. Nonetheless, the continuous evolution of wireless networks and the emergence of new communication paradigms demand the development of new strategies that adapt and optimize the standard approaches so as to satisfy the requirements of applications and devices. This thesis proposes several channel access schemes for novel wireless technologies, in particular Internet of Things (IoT) networks, the Long-Term Evolution (LTE) cellular standard, and mmWave communication with the IEEE 802.11ad standard. The first part of the thesis concerns energy-aware channel access policies for IoT networks, which typically include several battery-powered sensors. In scenarios with energy restrictions, traditional protocols that do not consider the energy consumption may lead to the premature death of the network and unreliable performance expectations. The proposed schemes show the importance of accurately characterizing all the sources of energy consumption (and inflow, in the case of energy harvesting), which need to be included in the protocol design. In particular, the schemes presented in this thesis exploit data processing and compression techniques to trade off QoS for lifetime. We investigate contention-free and contention-based channel access policies for different scenarios and application requirements. While the energy-aware schemes proposed for IoT networks are based on a clean-slate approach that is agnostic of the communication technology used, the second part of the thesis focuses on the LTE and IEEE 802.11ad standards. As regards LTE, the study proposed in this thesis shows how to use machine-learning techniques to infer the collision multiplicity in the channel access phase, information that can be used to detect network congestion and improve the contention resolution mechanism. This is especially useful in massive access scenarios; in recent years, in fact, the research community has been investigating the use of LTE for Machine-Type Communication (MTC). The IEEE 802.11ad standard, in turn, provides a hybrid MAC layer with contention-based and contention-free scheduled allocations, and a dynamic channel time allocation mechanism built on top of such a schedule. Although this hybrid scheme is expected to meet heterogeneous requirements, it is still not clear how to build a schedule from the various traffic flows and their demands. A mathematical model is necessary to understand the performance and limits of the possible types of allocations and to guide the scheduling process. In this thesis, we propose a model for the contention-based access periods that is aware of the interleaving of the available channel time with contention-free allocations.
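    Since the thesis builds on classical contention analysis, the following self-contained Monte Carlo sketch (ours, not the thesis code) reproduces the textbook slotted-ALOHA behaviour: with Poisson-distributed attempts at offered load G, the fraction of singleton slots approaches G·e^(-G), while the remaining busy slots are the collision events whose multiplicity the ML-based estimator discussed above would infer.

```python
# Minimal Monte Carlo sketch (not from the thesis): slotted ALOHA with a
# Poisson number of attempts per slot at offered load G. Throughput is the
# fraction of slots with exactly one transmission and should approach the
# classical G * exp(-G); slots with multiplicity >= 2 are collisions.
import math
import random

def simulate_slotted_aloha(load: float, n_slots: int = 100_000, seed: int = 1):
    rng = random.Random(seed)
    singles = collisions = 0
    for _ in range(n_slots):
        # Poisson sample via Knuth's inversion (fine for small means).
        k, p, threshold = 0, 1.0, math.exp(-load)
        while True:
            p *= rng.random()
            if p <= threshold:
                break
            k += 1
        if k == 1:
            singles += 1
        elif k >= 2:
            collisions += 1
    return singles / n_slots, collisions / n_slots

if __name__ == "__main__":
    for g in (0.5, 1.0, 2.0):
        thr, col = simulate_slotted_aloha(g)
        print(f"G={g}: simulated S={thr:.3f}, theory={g * math.exp(-g):.3f}, "
              f"collision fraction={col:.3f}")
```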

    Towards Massive Machine Type Communications in Ultra-Dense Cellular IoT Networks: Current Issues and Machine Learning-Assisted Solutions

    Get PDF
    The ever-increasing number of resource-constrained Machine-Type Communication (MTC) devices is leading to the critical challenge of fulfilling diverse communication requirements in dynamic and ultra-dense wireless environments. Among the different application scenarios that the upcoming 5G and beyond cellular networks are expected to support, such as enhanced Mobile Broadband (eMBB), massive Machine Type Communications (mMTC) and Ultra-Reliable and Low Latency Communications (URLLC), mMTC brings the unique technical challenge of supporting a huge number of MTC devices in cellular networks, which is the main focus of this paper. The related challenges include Quality of Service (QoS) provisioning, handling highly dynamic and sporadic MTC traffic, huge signalling overhead and Radio Access Network (RAN) congestion. In this regard, this paper aims to identify and analyze the involved technical issues, to review recent advances, to highlight potential solutions and to propose new research directions. First, starting with an overview of mMTC features and QoS provisioning issues, we present the key enablers for mMTC in cellular networks. Along with highlights on the inefficiency of the legacy Random Access (RA) procedure in the mMTC scenario, we then present the key features and channel access mechanisms of the emerging cellular IoT standards, namely LTE-M and Narrowband IoT (NB-IoT). Subsequently, we present a framework for the performance analysis of transmission scheduling with QoS support, along with the issues involved in short data packet transmission. Next, we provide a detailed overview of existing and emerging solutions for addressing the RAN congestion problem, and then identify potential advantages, challenges and use cases for the application of emerging Machine Learning (ML) techniques in ultra-dense cellular networks. Out of several ML techniques, we focus on the application of a low-complexity Q-learning approach in the mMTC scenario, along with recent advances towards enhancing its learning performance and convergence. Finally, we discuss some open research challenges and promising future research directions.
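    The Q-learning direction highlighted above can be illustrated with a minimal stateless variant, sketched below under assumed toy dimensions (20 devices, 20 RA slots, epsilon-greedy exploration): each device learns a slot preference from binary collision feedback, so devices gradually spread over disjoint slots. This is an illustrative sketch of the general technique, not the scheme evaluated in the paper.

```python
# Hedged illustration, not the paper's exact scheme: stateless Q-learning
# for random-access slot selection. Each device keeps one Q-value per slot,
# chooses epsilon-greedily, and receives reward +1 when it is alone in the
# slot (success) and -1 on a collision.
import random
from collections import Counter

N_DEVICES, N_SLOTS = 20, 20              # assumed toy dimensions
ALPHA, EPSILON, ROUNDS = 0.1, 0.1, 2000  # assumed learning parameters

rng = random.Random(0)
q = [[0.0] * N_SLOTS for _ in range(N_DEVICES)]

def pick_slot(dev: int) -> int:
    """Epsilon-greedy slot choice from the device's Q-row."""
    if rng.random() < EPSILON:
        return rng.randrange(N_SLOTS)
    return max(range(N_SLOTS), key=q[dev].__getitem__)

for _ in range(ROUNDS):
    choices = [pick_slot(d) for d in range(N_DEVICES)]
    occupancy = Counter(choices)
    for d, s in enumerate(choices):
        reward = 1.0 if occupancy[s] == 1 else -1.0
        q[d][s] += ALPHA * (reward - q[d][s])  # stateless Q-update

# Greedy policy after training: devices should claim mostly distinct slots.
final = Counter(max(range(N_SLOTS), key=q[d].__getitem__)
                for d in range(N_DEVICES))
print(f"distinct slots claimed: {len(final)} of {N_DEVICES}")
```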

    Energy efficiency in short and wide-area IoT technologies—A survey

    Get PDF
    In recent years, the Internet of Things (IoT) has emerged as a key application context in the design and evolution of technologies in the transition toward a 5G ecosystem. More and more IoT technologies have entered the market and represent important enablers in the deployment of networks of interconnected devices. As network and spatial device densities grow, energy efficiency and consumption are becoming an important aspect in analyzing the performance and suitability of different technologies. In this framework, this survey presents an extensive review of IoT technologies, including both Low-Power Short-Area Networks (LPSANs) and Low-Power Wide-Area Networks (LPWANs), from the perspective of energy efficiency and power consumption. Existing consumption models and energy efficiency mechanisms are categorized, analyzed and discussed, in order to highlight the main trends proposed in the literature and in standards toward achieving energy-efficient IoT networks. Current limitations and open challenges are also discussed, aiming to highlight possible new research directions.
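    Many of the surveyed consumption models reduce to a state-based abstraction, E_cycle = Σ P_state · t_state, from which battery lifetime follows directly. The sketch below implements this abstraction with generic, assumed current draws and timings; the numbers do not come from the survey.

```python
# Back-of-the-envelope sketch with generic, assumed numbers (not taken from
# the survey): battery lifetime of a duty-cycled node from a state-based
# consumption model, E_cycle = sum(I_state * t_state) per reporting period.
def lifetime_years(battery_mah: float,
                   states: dict[str, tuple[float, float]],
                   period_s: float) -> float:
    """states maps name -> (current_mA, seconds_per_period)."""
    charge_per_cycle_mas = sum(i_ma * t for i_ma, t in states.values())
    cycles = battery_mah * 3600.0 / charge_per_cycle_mas  # mAh -> mA*s
    return cycles * period_s / (3600.0 * 24.0 * 365.0)

if __name__ == "__main__":
    period = 3600.0  # one uplink report per hour
    states = {
        "wakeup": (5.0, 0.1),             # MCU and radio start-up
        "tx":     (40.0, 0.2),            # one uplink frame
        "rx":     (12.0, 0.5),            # downlink/ack window
        "sleep":  (0.002, period - 0.8),  # deep sleep for the rest
    }
    print(f"~{lifetime_years(2000.0, states, period):.1f} years on 2000 mAh")
```

    In practice, battery self-discharge and regulator losses cap such estimates, which is precisely why the surveyed models refine the individual per-state terms.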

    Internet of Things and Sensors Networks in 5G Wireless Communications

    Get PDF
    This book is a printed edition of the Special Issue "Internet of Things and Sensors Networks in 5G Wireless Communications" that was published in Sensors.

    Internet of Things and Sensors Networks in 5G Wireless Communications

    Get PDF
    The Internet of Things (IoT) has attracted much attention from society, industry and academia as a promising technology that can enhance day-to-day activities, enable the creation of new business models, products and services, and serve as a broad source of research topics and ideas. A future digital society is envisioned, composed of numerous wireless connected sensors and devices. Driven by huge demand, massive IoT (mIoT), or massive machine type communication (mMTC), has been identified as one of the three main communication scenarios for 5G. In addition to connectivity, computing, storage and data management are also long-standing issues for low-cost devices and sensors. The book is a collection of outstanding technical research and industrial papers covering new research results, with a wide range of features within the 5G-and-beyond framework. It provides a range of discussions of the major research challenges and achievements within this topic.