55 research outputs found

    AI-driven, Context-Aware Profiling for 5G and Beyond Networks

    In the era of the Industrial Internet of Things (IIoT) and Industry 4.0, an immense volume of heterogeneous network devices will coexist and contend for shared network resources in order to satisfy highly demanding IIoT applications that require ultra-reliable and ultra-low-latency communications. Although novel key enablers such as Network Slicing, Software Defined Networking (SDN) and Network Function Virtualization (NFV) already offer significant advantages towards more efficient and flexible network and resource management, the particular characteristics of IIoT applications pose additional burdens, mainly due to complex wireless environments and the high number of heterogeneous network devices, sensors, user equipment (UEs), etc., which may stochastically demand and contend for the (often scarce) computing and communication resources of industrial environments. To this end, this paper introduces PRIMATE, a novel Artificial Intelligence (AI)-driven framework for profiling the networking behavior of such UEs, devices, users and things, which can operate in conjunction with already standardized or forthcoming AI-based network resource management processes to achieve further gains. The novelty and potential of the proposed work lie in the fact that, instead of attempting either to predict raw network metrics reactively or to predict the behavior of specific network entities/devices in isolation, a big-data-driven classification approach is introduced that models the behavior of any network device/user from both a macroscopic and a service-specific perspective. The extensive evaluation in the final part of this work shows the validity and viability of the proposed framework. This work has been partially supported by the EC H2020 5GPPP 5Growth project (Grant 856709).
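    A minimal sketch of the general idea described above, profiling a device's networking behavior from observed traffic statistics with a supervised classifier, is given below. The feature set, class labels, and use of scikit-learn are illustrative assumptions, not taken from the paper.

```python
# Hypothetical sketch of behaviour-profile classification in the spirit of
# PRIMATE; the feature set and class labels below are invented for
# illustration, not taken from the paper.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Each row: [mean packet rate (pkt/s), mean inter-arrival time (ms),
#            burstiness index, uplink/downlink byte ratio]
X_train = np.array([
    [0.5,  2000.0, 0.1, 9.0],   # sparse periodic sensor
    [120.0,   8.0, 0.7, 0.2],   # video-streaming UE
    [15.0,   60.0, 3.2, 1.1],   # bursty actuator traffic
])
y_train = ["periodic_sensor", "streaming_ue", "bursty_actuator"]

clf = RandomForestClassifier(n_estimators=50, random_state=0)
clf.fit(X_train, y_train)

# A new device is profiled from its traffic statistics; the predicted class
# could then feed a slicing or resource-management policy.
new_device = np.array([[0.6, 1900.0, 0.2, 8.5]])
print(clf.predict(new_device))  # likely 'periodic_sensor' given the stats
```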

    On the Design of Future Communication Systems with Coded Transport, Storage, and Computing

    Communication systems are experiencing a fundamental change. Novel applications demand increased performance from these systems, not only in throughput but also in latency, reliability, security, and heterogeneity support. To fulfil these requirements, future systems understand communication not only as the transport of bits but also as their storage, processing, and interrelation. In these systems, every network node has transport, storage, and computing resources that the network operator and its users can exploit through virtualisation and softwarisation of the resources. It is within this context that this work presents its results. We propose distributed coded approaches to improve communication systems. Our results improve the reliability and latency performance of information transport. They also increase the reliability, flexibility, and throughput of storage applications. Furthermore, building on the lesson that coded approaches improve the transport and storage performance of communication systems, we propose a distributed coded approach for the computing of novel in-network applications such as the steering and control of cyber-physical systems. Our proposed approach can increase the reliability and latency performance of distributed in-network computing in the presence of errors, erasures, and attackers.
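    To make the coding idea concrete, the toy sketch below uses the simplest erasure code, a single XOR parity symbol over k equal-sized packets, which lets a receiver or storage reader recover any one erased packet without retransmission. The distributed coded approaches proposed in this work are more general (e.g., codes in the family of Reed-Solomon or random linear network codes); this example only illustrates the principle.

```python
# Toy (k+1, k) erasure code: one XOR parity packet protects k data packets
# against any single erasure. Illustrative only; production systems use
# stronger codes such as Reed-Solomon or random linear network coding.
from functools import reduce

def xor_bytes(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

packets = [b"data-block-1", b"data-block-2", b"data-block-3"]
parity = reduce(xor_bytes, packets)  # coded repair symbol

# Suppose packet 1 is erased in transit: XOR-ing the survivors with the
# parity reconstructs it, with no retransmission round trip.
received = [packets[0], packets[2], parity]
recovered = reduce(xor_bytes, received)
assert recovered == packets[1]
print("recovered:", recovered.decode())
```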

    Towards Massive Machine Type Communications in Ultra-Dense Cellular IoT Networks: Current Issues and Machine Learning-Assisted Solutions

    The ever-increasing number of resource-constrained Machine-Type Communication (MTC) devices is leading to the critical challenge of fulfilling diverse communication requirements in dynamic and ultra-dense wireless environments. Among the different application scenarios that the upcoming 5G and beyond cellular networks are expected to support, such as eMBB, mMTC and URLLC, mMTC brings the unique technical challenge of supporting a huge number of MTC devices, which is the main focus of this paper. The related challenges include QoS provisioning, handling highly dynamic and sporadic MTC traffic, huge signalling overhead and Radio Access Network (RAN) congestion. In this regard, this paper aims to identify and analyze the involved technical issues, to review recent advances, to highlight potential solutions and to propose new research directions. First, starting with an overview of mMTC features and QoS provisioning issues, we present the key enablers for mMTC in cellular networks. Along with highlighting the inefficiency of the legacy Random Access (RA) procedure in the mMTC scenario, we then present the key features and channel access mechanisms in the emerging cellular IoT standards, namely, LTE-M and NB-IoT. Subsequently, we present a framework for the performance analysis of transmission scheduling with QoS support, along with the issues involved in short data packet transmission. Next, we provide a detailed overview of the existing and emerging solutions towards addressing the RAN congestion problem, and then identify potential advantages, challenges and use cases for the application of emerging Machine Learning (ML) techniques in ultra-dense cellular networks. Among several ML techniques, we focus on the application of a low-complexity Q-learning approach in mMTC scenarios. Finally, we discuss some open research challenges and promising future research directions.
    Comment: 37 pages, 8 figures, 7 tables; submitted for possible publication in IEEE Communications Surveys and Tutorials
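    Since the survey singles out low-complexity Q-learning, the sketch below shows the core of that idea in a stateless (bandit-style) form: each MTC device keeps one Q-value per random-access slot and learns to pick slots that avoid collisions. The collision model, reward, and parameters here are illustrative assumptions, not the survey's exact formulation.

```python
# Stateless Q-learning for collision avoidance in random access: each
# device learns a Q-value per RA slot. Environment, reward, and constants
# are illustrative, not taken from the survey.
import random

N_SLOTS, N_DEVICES, ROUNDS = 4, 4, 2000
ALPHA, EPS = 0.1, 0.1                     # learning rate, exploration rate
Q = [[0.0] * N_SLOTS for _ in range(N_DEVICES)]

for _ in range(ROUNDS):
    choices = []
    for d in range(N_DEVICES):
        if random.random() < EPS:                       # explore
            choices.append(random.randrange(N_SLOTS))
        else:                                           # exploit best slot
            choices.append(max(range(N_SLOTS), key=Q[d].__getitem__))
    for d, slot in enumerate(choices):
        reward = 1.0 if choices.count(slot) == 1 else -1.0  # collision?
        Q[d][slot] += ALPHA * (reward - Q[d][slot])     # stateless update

# Devices typically learn to spread across distinct slots over time.
print([max(range(N_SLOTS), key=Q[d].__getitem__) for d in range(N_DEVICES)])
```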