
    Enabling LTE RACH Collision Multiplicity Detection via Machine Learning

    The collision resolution mechanism in the Random Access Channel (RACH) procedure of the Long-Term Evolution (LTE) standard is known to represent a serious bottleneck in the case of machine-type traffic. Its main drawbacks are that Base Stations (eNBs) typically cannot infer the number of collided User Equipments (UEs) and that collided UEs learn about the collision only implicitly, through the lack of feedback in a later stage of the RACH procedure. The collided UEs then restart the procedure, thereby increasing the RACH load and making the system more prone to collisions. In this paper, we leverage machine learning techniques to design a system that outperforms the state-of-the-art schemes in preamble detection for the LTE RACH procedure. Most importantly, our scheme can also estimate the collision multiplicity, and thus gather information about how many devices chose the same preamble. This data can be used by the eNB to resolve collisions, increase the supported system load and reduce transmission latency. The presented approach is applicable to novel 3GPP standards that target massive IoT, e.g., LTE-M and NB-IoT.
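
    A minimal sketch of the idea (not the paper's actual system): superposed Zadoff-Chu preambles with random cyclic shifts are labelled by their collision multiplicity, correlation-peak features are extracted, and an off-the-shelf classifier is trained. The preamble length, the 16-tap feature vector, the SNR and the random-forest model are all illustrative assumptions.

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import train_test_split

        def zadoff_chu(u, N=839):
            # LTE PRACH root sequence: x_u(n) = exp(-j*pi*u*n*(n+1)/N)
            n = np.arange(N)
            return np.exp(-1j * np.pi * u * n * (n + 1) / N)

        def rx_signal(multiplicity, u=1, N=839, snr_db=10, max_shift=100):
            # Superpose `multiplicity` copies of one ZC preamble with random
            # cyclic shifts (different UE timing advances), plus complex AWGN.
            sig = np.zeros(N, dtype=complex)
            for _ in range(multiplicity):
                sig += np.roll(zadoff_chu(u, N), np.random.randint(0, max_shift))
            noise = (np.random.randn(N) + 1j * np.random.randn(N)) / np.sqrt(2)
            return sig + noise * 10 ** (-snr_db / 20)

        def features(y, u=1, N=839):
            # Power-delay profile of the circular correlation with the root
            # sequence; its peak structure carries the multiplicity information.
            pdp = np.abs(np.fft.ifft(np.fft.fft(y) *
                                     np.conj(np.fft.fft(zadoff_chu(u, N))))) ** 2
            return np.sort(pdp)[-16:]     # strongest 16 taps as a feature vector

        X, y = [], []
        for label in range(5):            # 0..4 simultaneous UEs on one preamble
            for _ in range(400):
                X.append(features(rx_signal(label)))
                y.append(label)
        Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.25, random_state=0)
        clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(Xtr, ytr)
        print("multiplicity estimation accuracy:", clf.score(Xte, yte))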

    Design and analysis of LTE and wi-fi schemes for communications of massive machine devices

    Existing communication technologies are designed with specific use cases in mind; however, extending these use cases usually throws up interesting challenges. For example, extending the use of existing cellular networks to emerging applications such as Internet of Things (IoT) devices raises the challenge of handling a massive number of devices. In this thesis, we investigate existing schemes used in LTE and Wi-Fi for supporting massive machine devices and improve on observed performance gaps by designing new schemes that outperform them. This thesis investigates the existing random access protocol in LTE and proposes three schemes to combat the massive device access challenge. The first is a root index reuse and allocation scheme, which uses link budget calculations to extract a safe distance for preamble reuse under variable cell sizes, and also proposes an index allocation algorithm. The second is a dynamic subframe optimization scheme that approaches the challenge from an optimization perspective. The third is the use of small cells for random access. Simulation and numerical analysis show performance improvements over existing schemes in terms of throughput, access delay and probability of collision; in some cases, over 20% increase in performance was observed. The proposed schemes provide quicker and more reliable opportunities for machine devices to communicate. In Wi-Fi networks, adapting transmission rates to dynamic channel conditions is a major challenge. Two algorithms are proposed to address this: the first uses contextual information to determine the network state and respond appropriately, whilst the second samples candidate transmission modes and uses the effective throughput to make a decision. The proposed algorithms were compared to several existing rate adaptation algorithms by simulation under various system and channel configurations, and they show significant performance improvements in terms of throughput, confirming their suitability for dynamic channel conditions.
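
    To illustrate the link-budget flavour of the first scheme, the toy calculation below derives how many cyclic-shifted preambles one Zadoff-Chu root can yield as a function of cell radius: the cyclic shift must exceed the round-trip delay plus delay spread, so large cells exhaust roots faster and force reuse planning. The delay-spread and guard values are illustrative assumptions, not the thesis's numbers.

        import math

        C = 3e8            # speed of light, m/s
        N_ZC = 839         # LTE PRACH preamble length (formats 0-3)
        T_SEQ = 800e-6     # preamble sequence duration, s (format 0)

        def min_ncs(radius_m, delay_spread_s=5e-6, guard=2):
            # Smallest cyclic-shift step covering round-trip delay plus delay
            # spread, so shifted preambles stay distinguishable cell-wide.
            rtt = 2 * radius_m / C
            return math.ceil((rtt + delay_spread_s) / T_SEQ * N_ZC) + guard

        def preambles_per_root(radius_m):
            return N_ZC // min_ncs(radius_m)

        for r_km in (1, 5, 15, 35):
            n = preambles_per_root(r_km * 1e3)
            roots = math.ceil(64 / n)       # each LTE cell needs 64 preambles
            print(f"radius {r_km:2d} km: {n:3d} preambles per root "
                  f"-> {roots} root indices per cell")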

    Random Access Analysis for Massive IoT Networks Under a New Spatio-Temporal Model: A Stochastic Geometry Approach

    Massive Internet of Things (mIoT) has provided an auspicious opportunity to build powerful and ubiquitous connections, and it faces a plethora of new challenges for which cellular networks are a potential solution due to their high scalability, reliability, and efficiency. The Random Access CHannel (RACH) procedure is the first step of connection establishment between IoT devices and Base Stations (BSs) in the cellular-based mIoT network, where modelling the interactions between the static properties of the physical-layer network and the dynamic properties of the queue evolving in each IoT device is challenging. To tackle this, we provide a novel traffic-aware spatio-temporal model to analyze RACH in cellular-based mIoT networks, where the physical-layer network is modelled and analyzed based on stochastic geometry in the spatial domain, and the queue evolution is analyzed based on probability theory in the time domain. For performance evaluation, we derive exact expressions for the preamble transmission success probabilities of a randomly chosen IoT device with different RACH schemes in each time slot, which offer insights into the effectiveness of each RACH scheme. Our derived analytical results are verified by realistic simulations capturing the evolution of packets in each IoT device. This mathematical model and analytical framework can be applied to evaluate the performance of other types of RACH schemes in cellular-based networks by simply integrating their preamble transmission principles.
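
    The spirit of the spatial part of the model can be conveyed with a tiny Monte Carlo experiment: BSs and active IoT devices are dropped as Poisson point processes, each device attaches to its nearest BS and picks one of the orthogonal preambles at random, and a preamble transmission succeeds if no other device in the same cell picked the same preamble. This collision-only baseline ignores the SINR-based detection and queue evolution the paper actually analyses; the densities, area, and 54-preamble pool are assumptions.

        import numpy as np

        rng = np.random.default_rng(0)

        def rach_success_prob(lam_bs=1.0, lam_ue=10.0, n_preambles=54,
                              area=100.0, trials=200):
            # BSs and active IoT devices as Poisson point processes on a square;
            # a device succeeds iff it is alone on its preamble in its cell.
            successes, attempts = 0, 0
            half = np.sqrt(area) / 2
            for _ in range(trials):
                n_bs = max(rng.poisson(lam_bs * area), 1)
                n_ue = rng.poisson(lam_ue * area)
                if n_ue == 0:
                    continue
                bs = rng.uniform(-half, half, (n_bs, 2))
                ue = rng.uniform(-half, half, (n_ue, 2))
                # nearest-BS association, uniform random preamble choice
                cell = np.argmin(((ue[:, None, :] - bs[None, :, :]) ** 2).sum(-1),
                                 axis=1)
                pre = rng.integers(0, n_preambles, n_ue)
                _, counts = np.unique(cell * n_preambles + pre, return_counts=True)
                successes += int((counts == 1).sum())
                attempts += n_ue
            return successes / attempts

        for lam_ue in (2, 10, 50):
            print(f"device density {lam_ue}/km^2: "
                  f"P(preamble success) = {rach_success_prob(lam_ue=lam_ue):.3f}")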

    Cellular networks for smart grid communication

    The next-generation electric power system, known as smart grid, relies on a robust and reliable underlying communication infrastructure to improve the efficiency of electricity distribution. Cellular networks, e.g., LTE/LTE-A systems, appear as a promising technology to facilitate the smart grid evolution. Their inherent performance characteristics and well-established ecosystem could potentially unlock unprecedented use cases, enabling real-time and autonomous distribution grid operations. However, cellular technology was not originally intended for smart grid communication, which is associated with highly reliable message exchange and massive device connectivity requirements. The fundamental differences between smart grid and human-type communication challenge the classical design of cellular networks and introduce important research questions that have not been sufficiently addressed so far. Motivated by these challenges, this doctoral thesis investigates novel radio access network (RAN) design principles and performance analysis for the seamless integration of smart grid traffic in future cellular networks. Specifically, we focus on addressing the fundamental RAN problems of network scalability in massive smart grid deployments and radio resource management for smart grid and human-type traffic. The main objective of the thesis lies in the design, analysis and performance evaluation of RAN mechanisms that would render cellular networks the key enabler for emerging smart grid applications.
    The first part of the thesis addresses the radio access limitations in LTE-based networks for reliable and scalable smart grid communication. We first identify the congestion problem in LTE random access that arises in large-scale smart grid deployments. To overcome this, a novel random access mechanism is proposed that can efficiently support real-time distribution automation services with negligible impact on the background traffic. Motivated by the stringent reliability requirements of various smart grid operations, we then develop an analytical model of the LTE random access procedure that allows us to assess the performance of event-based monitoring traffic under various load conditions and network configurations. We further extend our analysis to include the relation between the cell size and the availability of orthogonal random access resources, and we identify an additional challenge for reliable smart grid connectivity. To this end, we devise an interference- and load-aware cell planning mechanism that enhances reliability in substation automation services. Finally, we couple the problem of state estimation in wide-area monitoring systems with the reliability challenges in information acquisition. Using our developed analytical framework, we quantify the impact of imperfect communication reliability on the state estimation accuracy and provide useful insights for the design of reliability-aware state estimators.
    The second part of the thesis builds on the previous one and focuses on the RAN problem of resource scheduling and sharing for smart grid and human-type traffic. We introduce a novel scheduler that achieves low latency for distribution automation traffic while resource allocation is performed in a way that keeps the degradation of cellular users at a minimum level. In addition, we investigate the benefits of Device-to-Device (D2D) transmission mode for event-based message exchange in substation automation scenarios.
    We design a joint mode selection and resource allocation mechanism which results in higher data rates with respect to the conventional transmission mode via the base station. An orthogonal resource partition scheme between cellular and D2D links is further proposed to prevent the underutilization of the scarce cellular spectrum. The research findings of this thesis aim to deliver novel solutions to important RAN performance issues that arise when cellular networks support smart grid communication.
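
    As a flavour of the scheduling part, here is a toy priority scheduler (not the thesis design): per TTI, sporadic smart-grid packets are served first but capped at a fixed number of resource blocks, so background human-type traffic keeps a guaranteed share. All arrival rates, caps and packet sizes are assumptions.

        import random
        from collections import deque

        random.seed(1)
        N_RB, SG_CAP, TTIS = 50, 10, 10_000   # RBs per TTI and SG cap (assumed)

        sg_queue, delays, human_rbs = deque(), [], 0
        for tti in range(TTIS):
            # sporadic distribution-automation arrivals: Binomial(5, 0.2) per TTI
            for _ in range(sum(random.random() < 0.2 for _ in range(5))):
                sg_queue.append(tti)
            rb = N_RB
            # serve smart-grid packets first, but cap their share of the TTI so
            # human-type users always keep at least N_RB - SG_CAP blocks
            while sg_queue and N_RB - rb < SG_CAP:
                delays.append(tti - sg_queue.popleft())
                rb -= 1                     # one RB per small SG packet (assumed)
            human_rbs += rb                 # leftover capacity to cellular users

        print(f"mean SG access delay: {sum(delays) / len(delays):.2f} TTIs, "
              f"human-traffic share: {human_rbs / (TTIS * N_RB):.1%}")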

    Clock Error Impact on NB-IoT Radio Link Performance

    3GPP has recently addressed improvements in the Radio Access Network (RAN) and specified new technologies such as enhanced Machine Type Communication (eMTC) and Narrowband Internet of Things (NB-IoT) in its Release 13, also known as LTE-Advanced Pro. These new technologies mainly focus on the development and deployment of cellular IoT services. NB-IoT is less complex, is easily deployable through a software upgrade, and is compatible with legacy cellular networks such as GSM and 4G, which makes it a suitable candidate for IoT. NB-IoT will greatly support LPWAN and can thus be deployed for smart cities and other fields such as smart electricity, smart agriculture, smart health services and smart homes. NB-IoT targets low device cost, low power consumption, relaxed delay sensitivity and easy deployment, which will greatly support the above-mentioned fields. This thesis studies the impact of clock error on radio link performance for uplink transmission on an NB-IoT testbed based on Cloud-RAN, using Software Defined Radios (SDRs) on an LTE protocol stack. An external clock error is introduced to the network and the resulting performance issues in the radio link are analyzed through the study of received power, packet loss, retransmissions, BLER and SINR for different MCS indices. The major performance issues revealed by the analysis are packet loss of up to 51% and retransmission of packets up to 128 times at low SINR and high clock error. Clock errors also produce a CFO of up to 1.25 ppm, which results in poor synchronization between the UE and the eNodeB.
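
    The headline figure translates directly into a back-of-the-envelope CFO calculation; a short sketch in which the 1.25 ppm value comes from the thesis while the carrier frequency and subcarrier spacing are illustrative assumptions:

        # Effect of a 1.25 ppm clock error (reported figure) on an NB-IoT uplink;
        # carrier frequency and subcarrier spacing below are assumptions.
        f_c = 900e6                # assumed carrier frequency, Hz
        ppm = 1.25                 # clock error from the thesis
        scs = 15e3                 # NB-IoT subcarrier spacing, Hz (3.75 kHz also exists)
        t_sym = 1 / scs + 4.7e-6   # OFDM symbol duration incl. normal CP, s

        cfo_hz = f_c * ppm * 1e-6  # absolute carrier frequency offset
        print(f"CFO: {cfo_hz:.0f} Hz = {cfo_hz / scs:.1%} of a subcarrier spacing")
        print(f"phase drift per symbol: {360 * cfo_hz * t_sym:.0f} degrees")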

    FBMC-based random access signal design and detection for LEO base stations

    The integration of non-terrestrial networks into the 5G ecosystem is mainly driven by the possibility of provisioning service in remote areas. In this context, the advent of flying base stations in low Earth orbit (LEO) will enable anywhere, anytime connectivity. To materialize this vision, it is of utmost importance to improve radio protocols with the aim of allowing direct satellite access. Bearing this aspect in mind, we present a new random access signal, based on the filter bank multicarrier (FBMC) waveform, together with a computationally efficient detection scheme. The proposed solution outperforms the standardized access scheme based on single-carrier frequency division multiplexing (SC-FDM) by reducing out-of-band (OOB) emissions and the missed detection probability in the presence of the very high carrier frequency offset (CFO) that is inherent to LEO satellite systems. The improvement is related to the fine frequency resolution of the detector and the use of pulse shaping techniques. Interestingly, the FBMC-based random access signal achieves a high level of commonality with 5G New Radio, as the preamble generation method and the time-frequency allocation pattern can be kept unchanged. Concerning practical implementation aspects, the complexity of the detector is similar for both SC-FDM and FBMC. This paper is part of the R+D+i project PID2020-115323RB-C31, funded by MCIN/AEI/10.13039/501100011033, and is supported by a grant from the Spanish Ministry of Economic Affairs and Digital Transformation and the European Union - NextGenerationEU (UNICO-5G I+D/AROMA3D-Space, TSI-063000-2021-70).
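
    A stripped-down version of the detection idea (plain matched-filter correlation, not the paper's FBMC filter bank): the receiver correlates against replicas of the preamble shifted over a fine CFO grid, which is what restores detection under large LEO Doppler offsets. The sequence length, sampling rate, grid step and threshold are assumptions.

        import numpy as np

        def zadoff_chu(u, N=139):
            n = np.arange(N)
            return np.exp(-1j * np.pi * u * n * (n + 1) / N)

        def detect(rx, zc, fs, cfo_grid_hz, threshold=8.0):
            # Correlate against frequency-shifted replicas of the preamble; the
            # fine CFO grid is what restores detection under large Doppler.
            t = np.arange(len(zc)) / fs
            noise_floor = np.mean(np.abs(rx) ** 2) * len(zc)
            best = 0.0
            for f in cfo_grid_hz:
                replica = zc * np.exp(2j * np.pi * f * t)
                corr = np.abs(np.correlate(rx, replica, mode="valid")) ** 2
                best = max(best, corr.max())
            return best / noise_floor > threshold, best / noise_floor

        fs, N = 1.92e6, 139
        zc = zadoff_chu(u=1)
        rng = np.random.default_rng(0)
        rx = 0.05 * (rng.standard_normal(4 * N) + 1j * rng.standard_normal(4 * N))
        cfo_true = 40e3                        # LEO-like Doppler offset, Hz
        t = np.arange(4 * N) / fs
        rx[N:2 * N] += zc * np.exp(2j * np.pi * cfo_true * t[N:2 * N])
        hit, metric = detect(rx, zc, fs, np.arange(-50e3, 50e3, 2e3))
        print(f"preamble detected: {hit}, metric: {metric:.1f}")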

    On Novel Access and Scheduling Schemes for IoT Communications


    Cellular, Wide-Area, and Non-Terrestrial IoT: A Survey on 5G Advances and the Road Towards 6G

    The next wave of wireless technologies is proliferating, connecting things to one another as well as to humans. In the era of the Internet of Things (IoT), billions of sensors, machines, vehicles, drones, and robots will be connected, making the world around us smarter. The IoT will encompass devices that must wirelessly communicate a diverse set of data gathered from the environment for myriad new applications. The ultimate goal is to extract insights from this data and develop solutions that improve quality of life and generate new revenue. Providing large-scale, long-lasting, reliable, and near real-time connectivity is the major challenge in enabling a smart connected world. This paper provides a comprehensive survey of existing and emerging communication solutions for serving IoT applications in the context of cellular, wide-area, and non-terrestrial networks. Specifically, wireless technology enhancements for providing IoT access in fifth-generation (5G) and beyond cellular networks, and communication networks over the unlicensed spectrum, are presented. Aligned with the main key performance indicators of 5G and beyond-5G networks, we investigate solutions and standards that enable energy efficiency, reliability, low latency, and scalability (connection density) of current and future IoT networks. The solutions include grant-free access, channel coding for short-packet communications, non-orthogonal multiple access, and on-device intelligence. Further, a vision of new paradigm shifts in communication networks in the 2030s is provided, and the integration of the associated new technologies like artificial intelligence, non-terrestrial networks, and new spectra is elaborated. Finally, future research directions toward beyond-5G IoT networks are pointed out.
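
    Of the enablers listed, power-domain non-orthogonal multiple access (NOMA) is easy to demonstrate in a few lines: two users' BPSK symbols are superposed with unequal powers, and the receiver applies successive interference cancellation (SIC). The power split and noise level are illustrative assumptions.

        import numpy as np

        rng = np.random.default_rng(2)
        n = 100_000
        p_far, p_near = 0.8, 0.2                   # power split (assumption)
        bits_far, bits_near = rng.integers(0, 2, (2, n))
        # superposed BPSK: the far (cell-edge) user gets the larger power share
        x = (np.sqrt(p_far) * (2 * bits_far - 1) +
             np.sqrt(p_near) * (2 * bits_near - 1))
        y = x + 0.1 * rng.standard_normal(n)       # near user's received signal

        far_hat = (y > 0).astype(int)              # decode far user, near = noise
        y_sic = y - np.sqrt(p_far) * (2 * far_hat - 1)  # cancel its contribution
        near_hat = (y_sic > 0).astype(int)         # then decode the near user
        print("far-user BER:", np.mean(far_hat != bits_far),
              "near-user BER after SIC:", np.mean(near_hat != bits_near))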

    Non-Orthogonal Signal and System Design for Wireless Communications

    The thesis presents research in non-orthogonal multi-carrier signals, in which: (i) a new signal format termed truncated orthogonal frequency division multiplexing (TOFDM) is proposed to improve data rates in wireless communication systems, such as those used in mobile/cellular systems and wireless local area networks (LANs), and (ii) a new design and experimental implementation of a real-time spectrally efficient frequency division multiplexing (SEFDM) system are reported. This research proposes a modified version of the orthogonal frequency division multiplexing (OFDM) format, obtained by truncating OFDM symbols in the time domain. In TOFDM, subcarriers are no longer orthogonally packed in the frequency domain, as time samples are only partially transmitted, leading to improved spectral efficiency. In this work, (i) analytical expressions are derived for the newly proposed TOFDM signal, followed by (ii) interference analysis, (iii) system design for uncoded and coded schemes, (iv) experimental implementation and (v) performance evaluation of the proposed signal and system, with comparisons to conventional OFDM systems. Results indicate that signals can be recovered with truncated symbol transmission. Based on the TOFDM principle, a new receiving technique, termed partial symbol recovery (PSR), is designed and implemented in software defined radio (SDR), allowing efficient recovery of overlapping data from two users in wireless communication systems subject to collisions. The PSR technique is based on the recovery of collision-free partial OFDM symbols, followed by the reconstruction of complete symbols to progressively recover the frames of the two colliding users. The system is evaluated in a testbed of 12 nodes using SDR platforms. The thesis also proposes a channel estimation and equalization technique for non-orthogonal signals in 5G scenarios, using an orthogonal demodulator and zero padding. Finally, the implementation of complete SEFDM systems in real-time is investigated and described in detail.
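
    The core TOFDM operation is simple to reproduce in toy form: modulate one OFDM symbol, discard the tail of its time-domain samples, and demodulate with a plain OFDM FFT to expose the self-interference that truncation introduces (the thesis's interference analysis and receivers address this; the PSR technique is not modelled here). The symbol size and truncation ratio are assumptions.

        import numpy as np

        rng = np.random.default_rng(3)
        N, keep, trials = 64, 48, 2000    # 25% of each symbol truncated (assumed)

        err, sig_p, ici_p = 0, 0.0, 0.0
        for _ in range(trials):
            data = ((2 * rng.integers(0, 2, N) - 1) +
                    1j * (2 * rng.integers(0, 2, N) - 1)) / np.sqrt(2)  # QPSK
            x = np.fft.ifft(data)                       # OFDM modulation
            x_trunc = np.concatenate([x[:keep], np.zeros(N - keep)])  # TOFDM cut
            rx = np.fft.fft(x_trunc)       # plain OFDM demodulator at receiver
            scale = keep / N               # per-subcarrier amplitude after cut
            sig_p += np.mean(np.abs(scale * data) ** 2)
            ici_p += np.mean(np.abs(rx - scale * data) ** 2)
            dec = (np.sign(rx.real) + 1j * np.sign(rx.imag)) / np.sqrt(2)
            err += np.count_nonzero(dec != data)

        print(f"self-interference SIR: {10 * np.log10(sig_p / ici_p):.1f} dB, "
              f"hard-decision symbol error rate: {err / (trials * N):.3f}")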