45 research outputs found

    Intelligent RACH Access strategies for M2M Traffic over Cellular Networks

    This thesis investigates the coexistence of Machine-to-Machine (M2M) and Human-to-Human (H2H) traffic sharing the Random Access Channel (RACH) of an existing cellular network, and introduces Q-learning as a means of supporting the M2M traffic. The learning enables an intelligent slot selection strategy that avoids collisions amongst the M2M users during the RACH contest. It is also applied so that no central entity is involved in the slot selection process, avoiding changes to the existing network standards. The thesis also introduces a novel back-off scheme for RACH access which provides separate frames for M2M and conventional cellular (H2H) retransmissions and is capable of dynamically adapting the frame size in order to maximise channel throughput. A Frame ALOHA for a Q-learning RACH access scheme is developed to realise collision-free RACH access between the H2H and M2M user groups. The scheme introduces separate frames for H2H and M2M to use in both the first attempt and retransmissions. In addition, analytical models are developed to examine the interaction of H2H and M2M traffic on the RACH, and to evaluate the throughput performance of both slotted ALOHA and Q-learning based access schemes. In general it is shown that Q-learning can be effectively applied to M2M traffic, significantly increasing the throughput capability of the channel with respect to conventional slotted ALOHA access. Dynamic adaptation of the back-off frames is shown to offer further improvements relative to a fixed-frame scheme. The FA-QL-RACH scheme also offers better performance than the QL-RACH and FB-QL-RACH schemes.
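    The distributed slot-selection idea described above can be sketched in a few lines. This is a hedged illustration, not the thesis's exact algorithm: each simulated M2M device keeps one Q-value per RA slot, greedily picks its best slot, and updates from collision feedback alone, so no central entity coordinates the devices. The frame size and learning rate are assumed values.

    ```python
    import random

    # Hedged sketch of the QL-RACH slot-selection idea (assumed parameters,
    # not the thesis's exact algorithm): devices learn slot preferences
    # purely from local collision feedback, with no central coordinator.
    random.seed(0)

    NUM_SLOTS = 10   # assumed RA frame size
    ALPHA = 0.1      # assumed learning rate

    class M2MDevice:
        def __init__(self):
            self.q = [0.0] * NUM_SLOTS          # one Q-value per RA slot

        def pick_slot(self):
            best = max(self.q)                  # exploit; break ties at random
            return random.choice([i for i, v in enumerate(self.q) if v == best])

        def update(self, slot, success):
            reward = 1.0 if success else -1.0   # collision feedback only
            self.q[slot] += ALPHA * (reward - self.q[slot])

    devices = [M2MDevice() for _ in range(NUM_SLOTS)]
    for _ in range(300):                        # contention frames
        picks = [d.pick_slot() for d in devices]
        for d, s in zip(devices, picks):
            d.update(s, picks.count(s) == 1)    # success iff slot uncontested

    final = [d.pick_slot() for d in devices]
    ```

    Over repeated frames the negative rewards push contending devices apart, which is how the scheme steers towards collision-free slot assignments without any signalling changes.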

    5GAuRA. D3.3: RAN Analytics Mechanisms and Performance Benchmarking of Video, Time Critical, and Social Applications

    5GAuRA deliverable D3.3. This is the final deliverable of Work Package 3 (WP3) of the 5GAuRA project, providing a report on the project’s developments on the topics of Radio Access Network (RAN) analytics and application performance benchmarking. The focus of this deliverable is to extend and deepen the methods and results provided in the 5GAuRA deliverable D3.2 in the context of specific use scenarios of video, time critical, and social applications. In this respect, four major topics of WP3 of 5GAuRA – namely edge-cloud enhanced RAN architecture, machine learning assisted Random Access Channel (RACH) approach, Multi-access Edge Computing (MEC) content caching, and active queue management – are put forward. Specifically, this document provides a detailed discussion on the service level agreement between tenant and service provider in the context of network slicing in Fifth Generation (5G) communication networks. Network slicing is considered a key enabler of the 5G communication system. Legacy telecommunication networks have provided various services to all kinds of customers through a single network infrastructure. In contrast, by deploying network slicing, operators are now able to partition one network into individual slices, each with its own configuration and Quality of Service (QoS) requirements. There are many applications across industry that open new business opportunities with new business models. Every application instance requires an independent slice with its own network functions and features, whereby every single slice needs an individual Service Level Agreement (SLA). In D3.3, we propose a comprehensive end-to-end structure of SLA between the tenant and the service provider of a sliced 5G network, which balances the interests of both sides.
    The proposed SLA defines reliability, availability, and performance of delivered telecommunication services in order to ensure that the right information is delivered to the right destination at the right time, safely and securely. We also discuss the metrics of slice-based network SLA such as throughput, penalty, cost, revenue, profit, and QoS related metrics, which are, in the view of 5GAuRA, critical features of the agreement.

    Prioritised Random Access Channel Protocols for Delay Critical M2M Communication over Cellular Networks

    With the ever-increasing technological evolution, current and future generation communication systems are geared towards accommodating Machine to Machine (M2M) communication as a necessary prerequisite for the Internet of Things (IoT). Machine Type Communication (MTC) can sustain many promising applications through connecting a huge number of devices into one network. As current studies indicate, the number of devices is escalating at a high rate. Consequently, the network becomes congested because of its limited capacity when a massive number of devices attempts simultaneous connection through the Random Access Channel (RACH). This results in RACH resource shortage, which can lead to high collision probability and massive access delay. Hence, it is critical to upgrade conventional Random Access (RA) techniques to support a massive number of MTC devices, including Delay-Critical (DC) MTC. This thesis tackles this problem by modelling and optimising the access throughput and access delay performance of massive random access of M2M communications in Long-Term Evolution (LTE) networks. This thesis investigates the performance of different random access schemes in different scenarios. The study begins with the design and inspection of a group based 2-step Slotted-Aloha RACH (SA-RACH) scheme considering the coexistence of Human-to-Human (H2H) and M2M communication, the latter of which is categorised as: Delay-Critical user equipments (DC-UEs) and Non-Delay-Critical user equipments (NDC-UEs). Next, a novel RACH scheme termed the Priority-based Dynamic RACH (PD-RACH) model is proposed which utilises a coded preamble based collision probability model. Finally, being a key enabler of IoT, Machine Learning, i.e. a Q-learning based approach, has been adopted, and a learning assisted Prioritised RACH scheme has been developed and investigated to prioritise a specific user group.
    In this work, the performance analysis of these novel RACH schemes shows promising results compared to that of conventional RACH schemes.
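    The access-throughput modelling referred to above builds on the classical slotted-ALOHA result, which is easy to check numerically. The sketch below is a minimal illustration (not the thesis's model): with Poisson offered load G attempts per slot, throughput is S = G·e⁻ᴳ, peaking at S ≈ 0.368 when G = 1, and a Monte-Carlo run reproduces the formula.

    ```python
    import math
    import random

    # Classical slotted-ALOHA throughput: S = G * exp(-G), where G is the
    # mean number of transmission attempts per slot (Poisson arrivals).
    def analytic_throughput(G):
        return G * math.exp(-G)

    def simulated_throughput(G, slots=200_000, seed=1):
        rng = random.Random(seed)
        successes = 0
        for _ in range(slots):
            # draw attempts ~ Poisson(G) via Knuth's multiplication method
            attempts, p, limit = 0, rng.random(), math.exp(-G)
            while p > limit:
                attempts += 1
                p *= rng.random()
            if attempts == 1:        # exactly one attempt -> no collision
                successes += 1
        return successes / slots

    print(round(analytic_throughput(1.0), 3))   # → 0.368 at the optimum G = 1
    ```

    The simulated value agrees with the analytic curve to within sampling noise, which is why slotted ALOHA serves as the baseline against which learning-assisted RACH schemes are compared.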

    Towards Massive Machine Type Communications in Ultra-Dense Cellular IoT Networks: Current Issues and Machine Learning-Assisted Solutions

    The ever-increasing number of resource-constrained Machine-Type Communication (MTC) devices is leading to the critical challenge of fulfilling diverse communication requirements in dynamic and ultra-dense wireless environments. Among the different application scenarios that the upcoming 5G and beyond cellular networks are expected to support, such as eMBB, mMTC and URLLC, mMTC brings the unique technical challenge of supporting a huge number of MTC devices, which is the main focus of this paper. The related challenges include QoS provisioning, handling highly dynamic and sporadic MTC traffic, huge signalling overhead and Radio Access Network (RAN) congestion. In this regard, this paper aims to identify and analyze the involved technical issues, to review recent advances, to highlight potential solutions and to propose new research directions. First, starting with an overview of mMTC features and QoS provisioning issues, we present the key enablers for mMTC in cellular networks. Along with the highlights on the inefficiency of the legacy Random Access (RA) procedure in the mMTC scenario, we then present the key features and channel access mechanisms in the emerging cellular IoT standards, namely, LTE-M and NB-IoT. Subsequently, we present a framework for the performance analysis of transmission scheduling with QoS support, along with the issues involved in short data packet transmission. Next, we provide a detailed overview of the existing and emerging solutions towards addressing the RAN congestion problem, and then identify potential advantages, challenges and use cases for the applications of emerging Machine Learning (ML) techniques in ultra-dense cellular networks. Out of several ML techniques, we focus on the application of the low-complexity Q-learning approach in mMTC scenarios.
    Finally, we discuss some open research challenges and promising future research directions. Comment: 37 pages, 8 figures, 7 tables, submitted for possible publication in IEEE Communications Surveys and Tutorials.

    Reliable Radio Access for Massive Machine-to-Machine (M2M) Communication


    Energy efficiency in short and wide-area IoT technologies—A survey

    In recent years, the Internet of Things (IoT) has emerged as a key application context in the design and evolution of technologies in the transition toward a 5G ecosystem. More and more IoT technologies have entered the market and represent important enablers in the deployment of networks of interconnected devices. As network and spatial device densities grow, energy efficiency and consumption are becoming an important aspect in analyzing the performance and suitability of different technologies. In this framework, this survey presents an extensive review of IoT technologies, including both Low-Power Short-Area Networks (LPSANs) and Low-Power Wide-Area Networks (LPWANs), from the perspective of energy efficiency and power consumption. Existing consumption models and energy efficiency mechanisms are categorized, analyzed and discussed, in order to highlight the main trends proposed in the literature and standards toward achieving energy-efficient IoT networks. Current limitations and open challenges are also discussed, aiming at highlighting new possible research directions.

    Random Access Analysis for Massive IoT Networks Under a New Spatio-Temporal Model: A Stochastic Geometry Approach

    Massive Internet of Things (mIoT) has provided an auspicious opportunity to build powerful and ubiquitous connections, but it faces a plethora of new challenges, where cellular networks are potential solutions due to their high scalability, reliability, and efficiency. The Random Access CHannel (RACH) procedure is the first step of connection establishment between IoT devices and Base Stations (BSs) in the cellular-based mIoT network, where modelling the interactions between the static properties of the physical layer network and the dynamic properties of the queue evolving in each IoT device is challenging. To tackle this, we provide a novel traffic-aware spatio-temporal model to analyze RACH in cellular-based mIoT networks, where the physical layer network is modelled and analyzed based on stochastic geometry in the spatial domain, and the queue evolution is analyzed based on probability theory in the time domain. For performance evaluation, we derive the exact expressions for the preamble transmission success probabilities of a randomly chosen IoT device with different RACH schemes in each time slot, which offer insights into the effectiveness of each RACH scheme. Our derived analytical results are verified by realistic simulations capturing the evolution of packets in each IoT device. This mathematical model and analytical framework can be applied to evaluate the performance of other types of RACH schemes in cellular-based networks by simply integrating their preamble transmission principle.
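    The preamble-collision core of such success-probability derivations can be illustrated without the stochastic-geometry machinery. This is a hedged, simplified sketch (collision-limited only, ignoring the spatial interference the paper models): with n contending devices each picking uniformly among m preambles, a tagged device succeeds when no other device picks its preamble, so P(success) = (1 − 1/m)^(n−1); a Monte-Carlo run confirms the expression.

    ```python
    import random

    # Simplified collision-limited model (not the paper's full spatio-temporal
    # analysis): n devices pick uniformly among m contention preambles.
    def success_prob(n, m):
        # tagged device succeeds iff none of the other n-1 devices
        # chooses the same preamble
        return (1 - 1 / m) ** (n - 1)

    def simulate(n, m, trials=100_000, seed=7):
        rng = random.Random(seed)
        wins = 0
        for _ in range(trials):
            tagged = rng.randrange(m)
            others = (rng.randrange(m) for _ in range(n - 1))
            if all(p != tagged for p in others):
                wins += 1
        return wins / trials

    # 54 contention-based preambles is a commonly cited LTE configuration
    print(round(success_prob(30, 54), 3))
    ```

    The per-slot success probabilities derived in the paper refine this baseline by additionally conditioning on queue states and on the received-power geometry of the contending devices.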