21 research outputs found

    On the Fundamental Limits of Random Non-orthogonal Multiple Access in Cellular Massive IoT

    Machine-to-machine (M2M) communication constitutes the paradigm at the basis of the Internet of Things (IoT) vision. M2M solutions allow billions of multi-role devices to communicate with each other or with the underlying data transport infrastructure with minimal or no human intervention. Current wireless transmission solutions, originally designed for human-centric applications, thus require a substantial shift to cope with the capacity issues of managing a huge number of M2M devices. In this paper, we consider multiple access techniques as promising solutions to support a large number of devices in cellular systems with limited radio resources. We focus on non-orthogonal multiple access (NOMA), where, with the aim of increasing channel efficiency, devices share the same radio resources for their data transmissions. This has been shown to provide optimal throughput from an information-theoretic point of view. We consider a realistic system model and characterise the system performance in terms of throughput and energy efficiency in a NOMA scenario with a random packet arrival model, where we also derive the stability condition for the system to guarantee the performance. Comment: To appear in IEEE JSAC Special Issue on Non-Orthogonal Multiple Access for 5G Systems
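The information-theoretic optimality mentioned in the abstract can be illustrated with a toy two-device uplink sketch (the powers, gains and noise level below are illustrative placeholders, not values from the paper): with successive interference cancellation (SIC) at the base station, the two NOMA rates sum exactly to the multiple-access channel sum capacity.

```python
import math

def noma_uplink_rates(p1, p2, g1, g2, noise):
    """Achievable rates (bit/s/Hz) for two uplink NOMA devices sharing
    one resource. The base station decodes device 1 first, treating
    device 2 as interference, then cancels it (SIC) and decodes
    device 2 interference-free."""
    r1 = math.log2(1 + p1 * g1 / (p2 * g2 + noise))
    r2 = math.log2(1 + p2 * g2 / noise)
    return r1, r2

# Illustrative powers/gains: device 1 has the stronger received signal
r1, r2 = noma_uplink_rates(p1=0.2, p2=0.2, g1=1.0, g2=0.1, noise=0.01)
# r1 + r2 equals the multiple-access sum capacity
# log2(1 + (p1*g1 + p2*g2)/noise)
```

The sum-rate identity is what makes the "optimal throughput" claim precise: sharing the resource with SIC loses nothing relative to the multiple-access capacity region.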

    Versatility Of Low-Power Wide-Area Network Applications

    Low-Power Wide-Area Network (LPWAN) is regarded as the leading communication technology for wide-area Internet-of-Things (IoT) applications. It offers low-power, long-range, and low-cost communication. With different communication requirements for varying IoT applications, many competing LPWAN technologies operating in both licensed (e.g., NB-IoT, LTE-M, and 5G) and unlicensed (e.g., LoRa and SigFox) bands have emerged. LPWANs are designed to support applications with low-power and low-data-rate operations. They are not well suited to host applications that involve high mobility, high traffic, or real-time communication (e.g., volcano monitoring and control applications). Despite the increasing number of mobile devices in many IoT domains (e.g., agricultural IoT and smart cities), mobility support is not well addressed in LPWAN. Cellular-based/licensed LPWANs rely on wired infrastructure to enable mobility. On the other hand, most unlicensed LPWANs operate in the crowded ISM band or are required to duty-cycle, making handling mobility a challenge. In this dissertation, we first identify the key opportunities of LPWAN, highlight the challenges, and show potential directions for future research. We then enable the versatility of LPWAN applications, first by enabling applications involving mobility over LPWAN. Specifically, we propose to handle mobility in LPWAN over white space, considering Sensor Network Over White Space (SNOW). SNOW is a highly scalable and energy-efficient LPWAN operating over the TV white spaces, the allocated but locally unused TV channels (54 - 698 MHz in the US). We propose a dynamic Carrier Frequency Offset (CFO) estimation and compensation technique that considers the impact of the Doppler shift due to mobility. We also design energy-efficient and fast base station (BS) discovery and association approaches, and demonstrate the feasibility of our approach through experiments in different deployments. Finally, we present a collision detection and recovery technique called RnR (Reverse & Replace Decoding) that applies to LPWANs, and we discuss future work on handling burst transmissions over LPWAN and localization in mobile LPWAN.
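The Doppler shift that such a CFO compensation technique must track can be sized with a one-line formula (the speed and TV-band carrier frequency below are illustrative, not parameters from the dissertation):

```python
import math

def doppler_shift_hz(speed_mps, carrier_hz, angle_deg=0.0):
    """Doppler shift seen by a node moving at speed_mps relative to the
    base station; angle_deg is the angle between the velocity vector
    and the line of sight (0 = moving straight toward the BS)."""
    c = 3.0e8  # speed of light, m/s
    return (speed_mps / c) * carrier_hz * math.cos(math.radians(angle_deg))

# A vehicle at 30 m/s on a 500 MHz TV-band channel:
fd = doppler_shift_hz(30.0, 500e6)  # ~50 Hz
```

Even modest vehicular speeds thus produce shifts of tens of Hz, which is significant for narrowband LPWAN subcarriers and motivates dynamic, rather than one-shot, CFO estimation.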

    NB-IoT via non terrestrial networks

    Massive Internet of Things is expected to play a crucial role in Beyond 5G (B5G) wireless communication systems, offering seamless connectivity among heterogeneous devices without human intervention. However, the exponential proliferation of smart devices and IoT networks, relying solely on terrestrial networks, may not fully meet the demanding IoT requirements in terms of bandwidth and connectivity, especially in areas where terrestrial infrastructures are not economically viable. To unleash the full potential of 5G and B5G networks and enable seamless connectivity everywhere, the 3GPP envisions the integration of Non-Terrestrial Networks (NTNs) into the terrestrial ones starting from Release 17. However, this integration process requires modifications to the 5G standard to ensure reliable communications despite typical satellite channel impairments. In this framework, this thesis aims to propose techniques at the Physical and Medium Access Control layers that require minimal adaptations to the current NB-IoT standard via NTN. First, the satellite impairments are evaluated and a detailed link budget analysis is provided. Next, analyses at the link and system levels are conducted. In the former case, a novel algorithm leveraging time-frequency analysis is proposed to detect orthogonal preambles and estimate the signals' arrival time. In addition, the effects of collisions on the detection probability and Bit Error Rate are investigated, and Non-Orthogonal Multiple Access approaches are proposed for the random access and data phases. The system-level analysis evaluates the performance of random access in case of congestion. Various access parameters are tested in different satellite scenarios, and the performance is measured in terms of access probability and time required to complete the procedure. Finally, a heuristic algorithm is proposed to jointly design the access and data phases, determining the number of satellite passages, the Random Access Periodicity, and the number of uplink repetitions that maximize the system's spectral efficiency.
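A link budget analysis of the kind the thesis describes can be sketched in a few lines (all figures below — slant range, EIRP, G/T, carrier and bandwidth — are hypothetical placeholders, not the thesis's actual parameters):

```python
import math

BOLTZMANN_DBW_HZ_K = -228.6  # 10*log10(Boltzmann constant), dBW/Hz/K

def fspl_db(distance_m, freq_hz):
    """Free-space path loss in dB."""
    c = 3.0e8
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / c)

def carrier_to_noise_db(eirp_dbw, path_loss_db, gt_dbk, bandwidth_hz):
    """C/N in dB from EIRP, path loss, the receiver figure of merit G/T,
    and the noise bandwidth: C/N0 = EIRP - L + G/T - 10log10(k)."""
    cn0_dbhz = eirp_dbw - path_loss_db + gt_dbk - BOLTZMANN_DBW_HZ_K
    return cn0_dbhz - 10 * math.log10(bandwidth_hz)

# Hypothetical LEO downlink: 600 km slant range, S-band carrier,
# one NB-IoT carrier of 180 kHz
loss = fspl_db(600e3, 2e9)  # ~154 dB
cn = carrier_to_noise_db(eirp_dbw=30.0, path_loss_db=loss,
                         gt_dbk=-10.0, bandwidth_hz=180e3)
```

The free-space term alone shows why NB-IoT via NTN needs careful adaptation: a LEO path adds roughly 30-40 dB of loss compared to a typical terrestrial macro cell.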

    NB-IoT via LEO satellites: An efficient resource allocation strategy for uplink data transmission

    In this paper, we focus on the use of Low-Earth Orbit (LEO) satellites providing Narrowband Internet of Things (NB-IoT) connectivity to on-ground user equipment (UEs). Conventional resource allocation algorithms for NB-IoT systems are particularly designed for terrestrial infrastructures, where devices are under the coverage of a specific base station and the whole system varies very slowly in time. The existing methods in the literature cannot be applied over LEO satellite-based NB-IoT systems for several reasons. First, with the movement of the LEO satellite, the corresponding channel parameters for each user will quickly change over time. Delaying the scheduling of a certain user would result in a resource allocation based on outdated parameters. Second, the differential Doppler shift, which is a typical impairment in communications over LEO, directly depends on the relative distance among users. Scheduling users separated by more than a certain distance in the same radio frame would violate the differential Doppler limit supported by the NB-IoT standard. Third, the propagation delay over a LEO satellite channel is around 4-16 times higher than in a terrestrial system, imposing the need to minimize message exchange between the users and the base station. In this work, we propose a novel uplink resource allocation strategy that jointly incorporates the new design considerations previously mentioned together with the distinct channel conditions, satellite coverage times and data demands of various users on Earth. The novel methodology proposed in this paper can act as a framework for future works in the field. Comment: This work has been submitted to the IEEE IoT Journal for possible publication. Copyright may be transferred without notice, after which this version may no longer be accessible.
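The differential Doppler constraint on scheduling can be sketched as a pairwise feasibility check (a simplified geometric model; the satellite speed, carrier frequency and the 950 Hz limit are assumed illustrative values, not taken from the paper):

```python
import math

def doppler_hz(sat_speed_mps, elevation_deg, carrier_hz):
    """Doppler shift for a UE seeing the satellite at a given elevation
    angle; the radial velocity is approximated as the along-track speed
    projected onto the line of sight (simplified geometry)."""
    c = 3.0e8
    return sat_speed_mps * math.cos(math.radians(elevation_deg)) / c * carrier_hz

def schedulable_together(elev1_deg, elev2_deg, sat_speed_mps=7500.0,
                         carrier_hz=2e9, limit_hz=950.0):
    """Differential-Doppler feasibility check for placing two UEs in
    the same radio frame: their shifts must differ by at most limit_hz."""
    diff = abs(doppler_hz(sat_speed_mps, elev1_deg, carrier_hz)
               - doppler_hz(sat_speed_mps, elev2_deg, carrier_hz))
    return diff <= limit_hz

# UEs whose elevation angles to the satellite differ widely see very
# different Doppler shifts and cannot share a radio frame:
schedulable_together(45.0, 46.0)  # close UEs -> True
schedulable_together(45.0, 50.0)  # distant UEs -> False
```

This is the kind of pairwise constraint the proposed allocation strategy must respect on top of coverage-time and data-demand considerations.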

    On the Support of Massive Machine-to-Machine Traffic in Heterogeneous Networks and Fifth-Generation Cellular Networks

    The widespread availability of many emerging services enabled by the Internet of Things (IoT) paradigm hinges on the capability to provide long-range connectivity to a massive number of things, overcoming the well-known issues of ad-hoc, short-range networks. This scenario entails many challenges, ranging from concerns about radio access network efficiency to threats to the security of IoT networks. In this thesis, we focus on wireless communication standards for long-range IoT as well as on fundamental research outcomes about IoT networks. After investigating how Machine-Type Communication (MTC) is supported today, we provide innovative solutions that i) satisfy the requirements in terms of scalability and latency, ii) employ a combination of licensed and license-free frequency bands, and iii) assure energy efficiency and security.

    Towards efficient support for massive Internet of Things over cellular networks

    The usage of Internet of Things (IoT) devices over cellular networks has seen tremendous growth in recent years, and that growth is only expected to increase in the near future. While existing 4G and 5G cellular networks offer several desirable features for this type of application, their design has historically focused on accommodating traditional mobile devices (e.g. smartphones). As IoT devices have very different characteristics and use cases, they create a range of problems for current networks, which often struggle to accommodate them at scale. Although newer cellular network technologies, such as Narrowband-IoT (NB-IoT), were designed to focus on IoT characteristics, they were extensively based on 4G and 5G networks to preserve interoperability and decrease deployment cost. As such, several inefficiencies of 4G/5G were carried over to the newer technologies. This thesis focuses on identifying the core issues that hinder the large-scale deployment of IoT over cellular networks, and proposes novel protocols to largely alleviate them. We find that the most significant challenges arise mainly in three distinct areas: connection establishment, network resource utilisation and device energy efficiency. Specifically, we make the following contributions. First, we focus on the connection establishment process and argue that the current procedures, when used by IoT devices, result in increased numbers of collisions, network outages and a signalling overhead that is disproportionate to the size of the data transmitted and the connection duration of IoT devices. Therefore, we propose two mechanisms to alleviate these inefficiencies. Our first mechanism, named ASPIS, targets both the number of collisions and the signalling overhead simultaneously, and provides enhancements to increase the number of successful IoT connections without disrupting existing background traffic. Our second mechanism focuses specifically on the collisions at the connection establishment process, and uses a novel approach based on Reinforcement Learning to decrease their number and allow a larger number of IoT devices to access the network with fewer attempts. Second, we propose a new multicasting mechanism to reduce network resource utilisation in NB-IoT networks by delivering common content (e.g. firmware updates) to multiple similar devices simultaneously. Notably, our mechanism is not only more efficient during multicast data transmission, but also frees up resources that would otherwise be perpetually reserved for multicast signalling under the existing scheme. Finally, we focus on energy efficiency and propose novel protocols designed for the unique usage characteristics of NB-IoT devices, in order to reduce device power consumption. Towards this end, we perform a detailed energy consumption analysis, which we use as a basis to develop an energy consumption model for realistic energy consumption assessment. We then take the insights from our analysis, propose optimisations to significantly reduce the energy consumption of IoT devices, and assess their performance.
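The collision problem at connection establishment can be illustrated with the classic probability that a device's randomly chosen preamble is not picked by any other contender (54 contention preambles is a typical LTE configuration; the device counts are illustrative):

```python
def preamble_success_prob(n_devices, n_preambles=54):
    """Probability that a tagged device's uniformly chosen preamble is
    picked by no other contender in the same random-access opportunity:
    each of the other n_devices - 1 contenders must avoid it."""
    return (1.0 - 1.0 / n_preambles) ** (n_devices - 1)

# Per-device success probability degrades quickly with load:
p10 = preamble_success_prob(10)    # ~0.85
p100 = preamble_success_prob(100)  # ~0.16
```

This steep degradation under massive access is exactly why mechanisms such as the ones proposed here, which reduce contention rather than merely retry, pay off at IoT scale.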

    Cellular networks for smart grid communication

    The next-generation electric power system, known as smart grid, relies on a robust and reliable underlying communication infrastructure to improve the efficiency of electricity distribution. Cellular networks, e.g., LTE/LTE-A systems, appear as a promising technology to facilitate the smart grid evolution. Their inherent performance characteristics and well-established ecosystem could potentially unlock unprecedented use cases, enabling real-time and autonomous distribution grid operations. However, cellular technology was not originally intended for smart grid communication, associated with highly reliable message exchange and massive device connectivity requirements. The fundamental differences between smart grid and human-type communication challenge the classical design of cellular networks and introduce important research questions that have not been sufficiently addressed so far. Motivated by these challenges, this doctoral thesis investigates novel radio access network (RAN) design principles and performance analysis for the seamless integration of smart grid traffic in future cellular networks. Specifically, we focus on addressing the fundamental RAN problems of network scalability in massive smart grid deployments and radio resource management for smart grid and human-type traffic. The main objective of the thesis lies in the design, analysis and performance evaluation of RAN mechanisms that would render cellular networks the key enabler for emerging smart grid applications. The first part of the thesis addresses the radio access limitations in LTE-based networks for reliable and scalable smart grid communication. We first identify the congestion problem in LTE random access that arises in large-scale smart grid deployments. To overcome this, a novel random access mechanism is proposed that can efficiently support real-time distribution automation services with negligible impact on the background traffic.
Motivated by the stringent reliability requirements of various smart grid operations, we then develop an analytical model of the LTE random access procedure that allows us to assess the performance of event-based monitoring traffic under various load conditions and network configurations. We further extend our analysis to include the relation between the cell size and the availability of orthogonal random access resources, and we identify an additional challenge for reliable smart grid connectivity. To this end, we devise an interference- and load-aware cell planning mechanism that enhances reliability in substation automation services. Finally, we couple the problem of state estimation in wide-area monitoring systems with the reliability challenges in information acquisition. Using our developed analytical framework, we quantify the impact of imperfect communication reliability on the state estimation accuracy and provide useful insights for the design of reliability-aware state estimators. The second part of the thesis builds on the previous one and focuses on the RAN problem of resource scheduling and sharing for smart grid and human-type traffic. We introduce a novel scheduler that achieves low latency for distribution automation traffic while resource allocation is performed in a way that keeps the degradation of cellular users at a minimum level. In addition, we investigate the benefits of Device-to-Device (D2D) transmission mode for event-based message exchange in substation automation scenarios. We design a joint mode selection and resource allocation mechanism which results in higher data rates than the conventional transmission mode via the base station. An orthogonal resource partition scheme between cellular and D2D links is further proposed to prevent the underutilization of the scarce cellular spectrum.
The research findings of this thesis aim to deliver novel solutions to important RAN performance issues that arise when cellular networks support smart grid communication.
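The D2D mode-selection trade-off can be caricatured with a simple rate comparison (a deliberately simplified sketch, not the thesis's joint optimisation; the SNR values and the half-duplex two-hop penalty are assumptions):

```python
import math

def shannon_rate(snr_linear, bandwidth_hz):
    """Shannon capacity of a single link in bit/s."""
    return bandwidth_hz * math.log2(1.0 + snr_linear)

def select_mode(snr_d2d, snr_uplink, snr_downlink, bandwidth_hz=180e3):
    """Pick D2D when the direct link beats the two-hop cellular path;
    the cellular rate is bottlenecked by the weaker hop and halved
    because the two hops share the channel in time."""
    r_d2d = shannon_rate(snr_d2d, bandwidth_hz)
    r_cell = min(shannon_rate(snr_uplink, bandwidth_hz),
                 shannon_rate(snr_downlink, bandwidth_hz)) / 2.0
    return "D2D" if r_d2d > r_cell else "cellular"

# Nearby substation devices with a strong direct link favour D2D:
select_mode(snr_d2d=100.0, snr_uplink=10.0, snr_downlink=10.0)
```

The actual mechanism in the thesis selects the mode jointly with resource allocation, but even this toy comparison shows why D2D pays off for event-based exchanges between nearby devices.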