
    On the Feasibility of Utilizing Commercial 4G LTE Systems for Mission-Critical IoT Applications

    Emerging Internet of Things (IoT) applications and services, including e-healthcare, intelligent transportation systems, the smart grid, and smart homes, cities, and workplaces, are poised to become part of every aspect of our daily lives. The IoT will enable billions of sensors, actuators, and smart devices to be interconnected and managed remotely via the Internet. Cellular-based Machine-to-Machine (M2M) communication is one of the key IoT enabling technologies, with huge market potential for cellular service providers deploying Long Term Evolution (LTE) networks. There is an emerging consensus that Fourth Generation (4G) and 5G cellular technologies will enable and support these applications, as they will provide global mobile connectivity to the anticipated tens of billions of things/devices attached to the Internet. Many vital utilities and service industries are considering commercially available LTE cellular networks to provide critical connections to users, sensors, and smart M2M devices on their networks, owing to LTE's low cost and availability. Many of these emerging IoT applications are mission-critical, with stringent requirements on reliability and end-to-end (E2E) delay bound. The delay bound specified for each application refers to the device-to-device latency, defined as the combined delay resulting from application-level processing time and communication latency. Each IoT application has its own distinct performance requirements in terms of latency, availability, and reliability. Typically, the uplink (UL) traffic of these IoT applications dominates the network traffic (it is much higher than the total downlink (DL) traffic), so efficient LTE UL scheduling algorithms at the base station ("Evolved NodeB (eNB)" per 3GPP standards) are especially critical for M2M applications.

    LTE, however, was not originally intended for IoT applications: traffic generated by M2M devices (running IoT applications) has totally different characteristics from traditional Human-to-Human (H2H) voice/video and data communications. In addition, given the anticipated massive deployment of M2M devices and the limited available radio spectrum, efficient radio resource management (RRM) and UL scheduling pose a serious challenge in adopting LTE for M2M communications. Existing LTE quality of service (QoS) standards and UL scheduling algorithms were mainly optimized for H2H services and cannot accommodate such a wide range of diverging performance requirements from M2M-based IoT applications. Though 4G LTE networks can support a very low Packet Loss Ratio (PLR) at the physical layer, this reliability comes at the expense of latency increases from tens to hundreds of milliseconds, due to the aggressive use of retransmission mechanisms. Current 4G LTE technologies may satisfy a single performance metric of these mission-critical applications, but not the simultaneous support of ultra-high reliability and low latency as well as high data rates. Numerous QoS-aware LTE UL scheduling algorithms for supporting M2M applications as well as H2H services have been reported in the literature. Most of these algorithms, however, were not intended for mission-critical IoT applications, as they are not latency-aware. In addition, these algorithms are simplified and do not fully conform to LTE's signaling and QoS standards. For instance, a common practice is to assume that the time-domain UL scheduler located at the eNB prioritizes user equipment (UE)/M2M device connection requests based on the head-of-line (HOL) packet waiting time in the UE/device transmission buffer. However, as detailed below, the LTE standard does not provide a mechanism for UEs/devices to inform the eNB uplink scheduler of the waiting time of uplink packets residing in their transmission buffers.

    The Ultra-Reliable Low-Latency Communication (URLLC) paradigm has recently emerged to enable a new range of mission-critical applications and services, including industrial automation, real-time operation and control of the smart grid, and inter-vehicular communications for improved safety and self-driving vehicles. URLLC is one of the most innovative 5G New Radio (NR) features. URLLC and its supporting 5G NR technologies might become a commercial reality in the future, but that future may be rather distant; deploying viable mission-critical IoT applications would thus have to be postponed until URLLC and 5G NR technologies are commercially feasible. Because IoT applications, especially mission-critical ones, will have a significant impact on the welfare of all humanity, their immediate or near-term deployment is of utmost importance. It is the purpose of this thesis to explore whether current commercial 4G LTE cellular networks have the potential to support some of the emerging mission-critical IoT applications. The smart grid is selected in this work as an illustrative IoT example because it is one of the most demanding IoT applications, with diverse use cases ranging from mission-critical applications that have stringent E2E latency and reliability requirements to those requiring support for a massive number of connected M2M devices with relaxed latency and reliability requirements.

    The purpose of the thesis is twofold. First, a user-friendly, MATLAB-based open-source software package to model commercial 4G LTE systems is developed. In contrast to mainstream commercial LTE software packages, the developed package is specifically tailored to accurately model mission-critical IoT applications and, above all, fully conforms to commercial 4G LTE signaling and QoS standards. Second, utilizing the developed software package, we present a detailed, realistic LTE UL performance analysis to assess the feasibility of commercial 4G LTE cellular networks when used to support such a diverse set of emerging IoT applications as well as typical H2H services.
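    The signaling gap described above can be made concrete with a small sketch. The snippet below contrasts a delay-aware ordering commonly assumed in the literature with an ordering that uses only what LTE signaling actually conveys to the eNB (QCI priorities and Buffer Status Report occupancy). The data model, field names, and the QCI-then-buffer tie-breaking rule are illustrative assumptions, not the thesis's scheduler.

```python
# Minimal sketch contrasting two UL scheduling orderings at the eNB.
# LTE Buffer Status Reports convey only buffer *size*, so a
# standard-conformant scheduler cannot observe HOL packet waiting times.
from dataclasses import dataclass

@dataclass
class UplinkRequest:
    ue_id: int
    qci: int              # QoS Class Identifier (lower = higher priority here)
    bsr_bytes: int        # buffer occupancy reported via BSR (visible to eNB)
    hol_delay_ms: float   # HOL waiting time (known only to the UE, NOT signaled)

def delay_aware_order(requests):
    """Ordering assumed by much of the literature: oldest HOL packet first.
    Not realizable over standard LTE signaling, since hol_delay_ms is
    never reported to the eNB."""
    return sorted(requests, key=lambda r: -r.hol_delay_ms)

def standard_conformant_order(requests):
    """Ordering using only information the eNB actually receives:
    QCI priority first, then reported buffer occupancy."""
    return sorted(requests, key=lambda r: (r.qci, -r.bsr_bytes))

reqs = [UplinkRequest(1, 9, 1200, 45.0),
        UplinkRequest(2, 3, 300, 5.0),
        UplinkRequest(3, 9, 80, 90.0)]

print([r.ue_id for r in delay_aware_order(reqs)])          # [3, 1, 2]
print([r.ue_id for r in standard_conformant_order(reqs)])  # [2, 1, 3]
```

    From identical queue state the two policies pick different winners, which is precisely why HOL-delay-based algorithms cannot be implemented over standard LTE signaling.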

    5G standalone network's reliability, one-way latency and packet loss rate analysis for URLLC implementation

    5G is the fifth-generation technology standard for cellular networks. It targets three main application demands: Enhanced Mobile Broadband (eMBB), Massive Machine-Type Communications (mMTC) and Ultra-Reliable Low-Latency Communications (URLLC). URLLC is a very challenging demand to implement, with strict reliability and latency requirements. Its specification was largely complete by 2022, and 5G vendors are expected to start implementing basic URLLC features in the near future. The motivation for this thesis is to find ways to measure how a 5G standalone (SA) network performs on key URLLC performance indicators, to analyse and visualize these measurements, to find reasons for particular network behaviour, and to estimate the impact different URLLC features will have once implemented. A further motivation is to find a way to detect packet loss and the reasons behind it, because packet loss impairs reliability significantly and should be minimized before URLLC features are deployed. To measure the 5G SA network's performance, four kinds of test cases were identified, in each of which URLLC-type network traffic is generated: static tests performed in good and in bad coverage from the 5G cell, a mobility test performed by moving from good to bad coverage while attached to the same 5G cell, and a handover test in which the 5G cell changes. All tests are performed in a 5G field verification environment, for both downlink and uplink. For downlink, coverage and mobility inside a cell had no meaningful impact on one-way latency, mainly because there was no need for packet retransmissions, which would have increased latency. This is promising especially for mobility URLLC use cases such as Vehicle-to-Everything (V2X) communications. Uplink performed much worse, mainly because of uplink resource scheduling and packet retransmissions. Handover was problematic for both downlink and uplink because of the brief but massive increase in latency caused by the cell change. All packet loss in the measurements occurred in the uplink direction, and this thesis includes a case study in which the different potential factors causing packet loss were systematically eliminated; in the end, the evidence points towards the 5G chipset used for the tests as the cause.
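    As a rough illustration of this kind of measurement, the sketch below generates URLLC-style periodic UDP traffic stamped with sequence numbers and send timestamps, from which the receiver derives one-way latency and packet loss. It assumes sender and receiver clocks are synchronized (e.g., via GPS or PTP, as any one-way latency measurement requires); the endpoint address, port, rate, and packet count are hypothetical, not the thesis's test configuration.

```python
# Minimal sketch of URLLC-style one-way latency and packet-loss measurement.
# Run receive() on the far end first, then send() on the near end.
import socket, struct, time

HOST, PORT, COUNT = "192.0.2.10", 5005, 1000  # hypothetical endpoint and count

def send():
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    for seq in range(COUNT):
        # 4-byte sequence number + 8-byte send timestamp in nanoseconds
        sock.sendto(struct.pack("!Iq", seq, time.time_ns()), (HOST, PORT))
        time.sleep(0.001)  # 1 ms inter-packet gap: periodic URLLC-like traffic

def receive():
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("", PORT))
    sock.settimeout(5.0)  # stop after 5 s of silence
    seen, latencies = set(), []
    try:
        while True:
            data, _ = sock.recvfrom(64)
            seq, t_send = struct.unpack("!Iq", data)
            seen.add(seq)  # duplicate-safe; gaps in `seen` reveal losses
            latencies.append((time.time_ns() - t_send) / 1e6)  # one-way, ms
    except socket.timeout:
        pass
    lost = COUNT - len(seen)
    print(f"loss: {lost}/{COUNT} ({100 * lost / COUNT:.2f}%)")
    if latencies:
        print(f"one-way latency ms: min={min(latencies):.3f} "
              f"max={max(latencies):.3f} avg={sum(latencies)/len(latencies):.3f}")
```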

    4G/5G cellular networks metrology and management

    The proliferation of sophisticated applications and services comes with diverse performance requirements as well as exponential traffic growth on both the uplink and the downlink. Cellular networks such as 4G and 5G are evolving to support this diverse and huge amount of data. This thesis targets the enforcement of advanced cellular-network supervision and management techniques, taking the traffic explosion and traffic diversity as two of the main challenges in these networks. The first contribution tackles the integration of intelligence into cellular networks through the estimation of users' instantaneous uplink throughput at small time granularities. A real-time 4G testbed is deployed for this purpose, providing an exhaustive benchmark of eNB metrics, and accurate estimations are achieved. The second contribution enforces real-time 5G slicing at the radio-resource level in a multi-cell system. To this end, two exact optimization models are proposed; because of their long execution times, heuristics are developed and evaluated against the optimal models. The results are promising, with both heuristics strongly supporting real-time RAN slicing.
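    The abstract does not reproduce the thesis's optimization models or heuristics; purely as an illustration of what a fast radio-resource slicing heuristic can look like, the sketch below greedily allocates physical resource blocks (PRBs) to slices cell by cell in priority order. The slice names, weights, demands, and per-cell PRB budget are all illustrative assumptions.

```python
# Minimal greedy sketch of per-cell radio-resource slicing.
# Demand figures, slice weights, and the PRB budget are illustrative.

PRBS_PER_CELL = 100  # e.g., a 20 MHz LTE carrier has 100 resource blocks

# (slice, cell) -> demanded PRBs, plus a per-slice priority weight
demands = {("URLLC", 0): 30, ("eMBB", 0): 90, ("mMTC", 0): 20,
           ("URLLC", 1): 10, ("eMBB", 1): 70, ("mMTC", 1): 40}
weights = {"URLLC": 3.0, "eMBB": 1.0, "mMTC": 0.5}

def greedy_slice(demands, weights, budget):
    """Allocate PRBs in each cell, serving higher-weight slices first."""
    alloc, remaining = {}, {}
    for (sl, cell), d in sorted(demands.items(),
                                key=lambda kv: -weights[kv[0][0]]):
        left = remaining.get(cell, budget)
        alloc[(sl, cell)] = min(d, left)       # grant demand up to what's left
        remaining[cell] = left - alloc[(sl, cell)]
    return alloc

for key, prbs in sorted(greedy_slice(demands, weights, PRBS_PER_CELL).items(),
                        key=lambda kv: kv[0][1]):
    print(key, prbs)
```

    A greedy pass like this runs in milliseconds, which is why such heuristics are attractive when exact models converge too slowly for real-time slicing decisions.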

    Cellular, Wide-Area, and Non-Terrestrial IoT: A Survey on 5G Advances and the Road Towards 6G

    The next wave of wireless technologies is proliferating in connecting things among themselves as well as to humans. In the era of the Internet of Things (IoT), billions of sensors, machines, vehicles, drones, and robots will be connected, making the world around us smarter. The IoT will encompass devices that must wirelessly communicate a diverse set of data gathered from the environment for myriad new applications. The ultimate goal is to extract insights from this data and develop solutions that improve quality of life and generate new revenue. Providing large-scale, long-lasting, reliable, and near real-time connectivity is the major challenge in enabling a smart connected world. This paper provides a comprehensive survey on existing and emerging communication solutions for serving IoT applications in the context of cellular, wide-area, as well as non-terrestrial networks. Specifically, wireless technology enhancements for providing IoT access in fifth-generation (5G) and beyond cellular networks, and communication networks over the unlicensed spectrum are presented. Aligned with the main key performance indicators of 5G and beyond 5G networks, we investigate solutions and standards that enable energy efficiency, reliability, low latency, and scalability (connection density) of current and future IoT networks. The solutions include grant-free access and channel coding for short-packet communications, non-orthogonal multiple access, and on-device intelligence. Further, a vision of new paradigm shifts in communication networks in the 2030s is provided, and the integration of the associated new technologies like artificial intelligence, non-terrestrial networks, and new spectra is elaborated. Finally, future research directions toward beyond 5G IoT networks are pointed out.

    Internet of Things-aided Smart Grid: Technologies, Architectures, Applications, Prototypes, and Future Research Directions

    Traditional power grids are being transformed into Smart Grids (SGs) to address the issues in existing power systems caused by uni-directional information flow, energy wastage, growing energy demand, and reliability and security concerns. SGs offer bi-directional energy flow between service providers and consumers, involving power generation, transmission, distribution and utilization systems. SGs employ a very large number of devices for the monitoring, analysis and control of the grid, deployed at power plants, at distribution centers and on consumers' premises. Hence, an SG requires connectivity, automation and tracking of such devices, which is achieved with the help of the Internet of Things (IoT). IoT helps SG systems support various network functions throughout the generation, transmission, distribution and consumption of energy by incorporating IoT devices (such as sensors, actuators and smart meters), as well as by providing the connectivity, automation and tracking for such devices. In this paper, we provide a comprehensive survey of IoT-aided SG systems, covering existing architectures, applications and prototypes. The survey also highlights open issues, challenges and future research directions for IoT-aided SG systems.

    Towards UAV Assisted 5G Public Safety Network

    Ensuring ubiquitous mission-critical public safety communications (PSC) for all first responders in the public safety network is crucial at an emergency site. First responders rely heavily on mission-critical PSC to save lives, property, and national infrastructure during a natural or human-made emergency. Recent advancements in LTE/LTE-Advanced/5G mobile technologies supported by unmanned aerial vehicles (UAVs) have great potential to revolutionize PSC. However, the limited spectrum allocation for LTE-based PSC demands improved channel capacity and spectral efficiency. An additional challenge in designing an LTE-based PSC network is achieving at least 95% coverage of the geographical area and human population with broadband rates. The coverage requirement and efficient spectrum use in the PSC network can be realized through dense deployment of small cells (both terrestrial and aerial). However, dense deployment of small cells in an air-ground heterogeneous network (AG-HetNet) raises several challenges. The main challenges addressed in this research are integrating UAVs as both aerial users and aerial base stations, mitigating inter-cell interference, enhancing capacity and coverage, and optimizing the deployment locations of aerial base stations. First, LTE signals were investigated using NS-3 simulations and a software-defined radio experiment to gain insight into the quality of service experienced by the user equipment (UE). Using this understanding, a two-tier LTE-Advanced AG-HetNet with macro base stations and unmanned aerial base stations (UABS) is designed, taking time-domain inter-cell interference coordination techniques into account. We maximize the capacity of this AG-HetNet in the case of damaged PSC infrastructure by jointly optimizing the inter-cell interference parameters and UABS locations using a meta-heuristic genetic algorithm (GA) and a brute-force technique. Finally, following the latest 3GPP specifications, a more realistic three-tier LTE-Advanced AG-HetNet is proposed, with macro base stations, pico base stations, and ground UEs as terrestrial nodes, and UABS and aerial UEs as aerial nodes. Using meta-heuristic techniques such as the GA and an elitist harmony search algorithm based on the GA, the critical network elements (energy efficiency, inter-cell interference parameters, and UABS locations) are jointly optimized to maximize the capacity and coverage of the AG-HetNet.
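    To give a flavor of the meta-heuristic approach mentioned above, the sketch below runs a small elitist genetic algorithm that places UABS so as to maximize the fraction of covered users. The fitness function, deployment area, coverage radius, and GA parameters are illustrative stand-ins, not the thesis's joint capacity/coverage/interference formulation.

```python
# Minimal elitist genetic-algorithm sketch for UABS placement.
# A genome is a list of (x, y) UABS positions on a 10x10 km area;
# fitness is the fraction of users within a fixed coverage radius.
import random

random.seed(1)
USERS = [(random.uniform(0, 10), random.uniform(0, 10)) for _ in range(200)]
N_UABS, RADIUS, POP, GENS = 4, 2.0, 30, 50  # illustrative GA parameters

def coverage(genome):
    """Fraction of users within RADIUS of at least one UABS."""
    covered = sum(any((ux - x) ** 2 + (uy - y) ** 2 <= RADIUS ** 2
                      for x, y in genome) for ux, uy in USERS)
    return covered / len(USERS)

def random_genome():
    return [(random.uniform(0, 10), random.uniform(0, 10)) for _ in range(N_UABS)]

def mutate(genome, sigma=0.5):
    # Gaussian jitter on every UABS position
    return [(x + random.gauss(0, sigma), y + random.gauss(0, sigma))
            for x, y in genome]

def crossover(a, b):
    cut = random.randrange(1, N_UABS)  # one-point crossover
    return a[:cut] + b[cut:]

pop = [random_genome() for _ in range(POP)]
for _ in range(GENS):
    pop.sort(key=coverage, reverse=True)
    elite = pop[: POP // 5]  # elitism: best 20% survive unchanged
    children = [mutate(crossover(random.choice(elite), random.choice(elite)))
                for _ in range(POP - len(elite))]
    pop = elite + children

best = max(pop, key=coverage)
print(f"best coverage: {coverage(best):.2%}")
```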

    A Comprehensive Overview on 5G-and-Beyond Networks with UAVs: From Communications to Sensing and Intelligence

    Due to the advancements in cellular technologies and the dense deployment of cellular infrastructure, integrating unmanned aerial vehicles (UAVs) into the fifth-generation (5G) and beyond cellular networks is a promising solution to achieve safe UAV operation as well as enabling diversified applications with mission-specific payload data delivery. In particular, 5G networks need to support three typical usage scenarios, namely, enhanced mobile broadband (eMBB), ultra-reliable low-latency communications (URLLC), and massive machine-type communications (mMTC). On the one hand, UAVs can be leveraged as cost-effective aerial platforms to provide ground users with enhanced communication services by exploiting their high cruising altitude and controllable maneuverability in three-dimensional (3D) space. On the other hand, providing such communication services simultaneously for both UAV and ground users poses new challenges due to the need for ubiquitous 3D signal coverage as well as the strong air-ground network interference. Besides the requirement of high-performance wireless communications, the ability to support effective and efficient sensing as well as network intelligence is also essential for 5G-and-beyond 3D heterogeneous wireless networks with coexisting aerial and ground users. In this paper, we provide a comprehensive overview of the latest research efforts on integrating UAVs into cellular networks, with an emphasis on how to exploit advanced techniques (e.g., intelligent reflecting surface, short packet transmission, energy harvesting, joint communication and radar sensing, and edge intelligence) to meet the diversified service requirements of next-generation wireless systems. Moreover, we highlight important directions for further investigation in future work.

    Study, Measurements and Characterisation of a 5G system using a Mobile Network Operator Testbed

    The goals for 5G are ambitious. It promises to deliver an enhanced end-user experience by offering new applications and services through gigabit speeds and significantly improved performance and reliability. The enhanced mobile broadband (eMBB) 5G use case, for instance, targets peak data rates as high as 20 Gbps in the downlink (DL) and 10 Gbps in the uplink (UL). While there are different ways to improve data rates, spectrum is at the core of enabling higher mobile-broadband data rates. 5G New Radio (NR) specifies new frequency bands below 6 GHz and also extends into mmWave frequencies, where more contiguous bandwidth is available for sending large amounts of data. However, at mmWave frequencies, signals are more susceptible to impairments; hence, extra consideration is needed to determine test approaches that provide the precision required to accurately evaluate 5G components and devices. The aim of this thesis is therefore to provide a deep dive into 5G technology, to explore its testing and validation, and to present the OTE (Hellenic Telecommunications Organisation) 5G testbed, including the measurement results obtained and the testbed's characterisation based on key performance indicators (KPIs).
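    The 20 Gbps DL eMBB target can be sanity-checked with the approximate peak-data-rate formula of 3GPP TS 38.306. The sketch below evaluates it for one illustrative FR2 carrier configuration; the parameter values are examples, not the OTE testbed's settings. Aggregating two to three such carriers approaches the 20 Gbps DL target.

```python
# Back-of-the-envelope 5G NR peak data rate, following the approximate
# formula in 3GPP TS 38.306. The chosen layers, modulation order, PRB
# count, and numerology are one illustrative FR2 configuration.

def nr_peak_rate_gbps(layers, mod_order, scaling, mu, n_prb, overhead):
    r_max = 948 / 1024                 # maximum LDPC code rate
    t_s = 1e-3 / (14 * 2 ** mu)        # average OFDM symbol duration (s)
    bits_per_s = (layers * mod_order * scaling * r_max
                  * (n_prb * 12) / t_s * (1 - overhead))
    return bits_per_s / 1e9

# Example: FR2, 120 kHz subcarrier spacing (mu=3), a 400 MHz carrier
# (264 PRBs), 256-QAM (8 bits/symbol), 4 MIMO layers, DL FR2 overhead 0.18.
rate = nr_peak_rate_gbps(layers=4, mod_order=8, scaling=1.0,
                         mu=3, n_prb=264, overhead=0.18)
print(f"single-carrier peak ~ {rate:.2f} Gbps")  # ~8.6 Gbps per carrier
```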