
    Prediction-Based Energy Saving Mechanism in 3GPP NB-IoT Networks

    The current expansion of the Internet of Things (IoT) demands improved communication platforms that support a wide area with low energy consumption. The 3rd Generation Partnership Project introduced narrowband IoT (NB-IoT) as an IoT communication solution. NB-IoT devices should be available for over 10 years without requiring a battery replacement, so low energy consumption is essential for the successful deployment of this technology. Given that a large share of a device's energy is consumed by the power amplifier during radio transmission, reducing the uplink transmission time is key to ensuring a long lifespan for an IoT device. In this paper, we propose a prediction-based energy saving mechanism (PBESM) focused on enhanced uplink transmission. The mechanism consists of two parts: first, a network architecture that predicts uplink packet occurrence through deep packet inspection; second, an algorithm that predicts the processing delay and pre-assigns radio resources to enhance the scheduling request procedure. In this way, our mechanism reduces the number of random accesses and the energy consumed by radio transmission. Simulation results show that the energy consumption of the proposed PBESM is up to 34% lower than that of the conventional NB-IoT method.
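
    To make the pre-allocation idea concrete, below is a minimal, hypothetical Python sketch: a predictor estimates a device's next uplink packet arrival from recent inter-arrival times, and a scheduler issues a grant shortly before the predicted arrival so the device can skip the scheduling request procedure. The class names, the guard interval, and the mean-gap estimator are illustrative assumptions, not taken from the paper.

    # Minimal, hypothetical sketch of prediction-based uplink pre-allocation.
    # Not the paper's PBESM implementation; names and thresholds are assumptions.
    from collections import deque
    from statistics import mean

    class UplinkPredictor:
        """Tracks recent uplink inter-arrival times per device and predicts the next arrival."""
        def __init__(self, history=8):
            self.arrivals = {}              # device_id -> deque of past arrival times
            self.history = history

        def observe(self, device_id, t):
            q = self.arrivals.setdefault(device_id, deque(maxlen=self.history))
            q.append(t)

        def predict_next(self, device_id):
            q = self.arrivals.get(device_id)
            if not q or len(q) < 2:
                return None                 # not enough history to predict
            gaps = [b - a for a, b in zip(q, list(q)[1:])]
            return q[-1] + mean(gaps)       # simple mean inter-arrival estimate

    class PreAllocatingScheduler:
        """Pre-assigns an uplink grant shortly before the predicted packet arrival,
        so the device can skip the random access / scheduling request procedure."""
        def __init__(self, predictor, guard=0.5):
            self.predictor = predictor
            self.guard = guard              # how early (s) a grant may be issued

        def grants_for_tti(self, now, device_ids):
            grants = []
            for dev in device_ids:
                t_next = self.predictor.predict_next(dev)
                if t_next is not None and 0 <= t_next - now <= self.guard:
                    grants.append(dev)      # pre-assign resources for this device
            return grants

    # Usage with a roughly periodic sensor report pattern (made-up timestamps).
    predictor = UplinkPredictor()
    for t in (0.0, 10.1, 20.0, 29.9):
        predictor.observe("dev-1", t)
    scheduler = PreAllocatingScheduler(predictor, guard=0.5)
    print(scheduler.grants_for_tti(now=39.5, device_ids=["dev-1"]))  # -> ['dev-1']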

    Control-data separation architecture for cellular radio access networks: a survey and outlook

    Conventional cellular systems are designed to ensure ubiquitous coverage with an always-present wireless channel, irrespective of the spatial and temporal demand for service. This approach raises several problems due to the tight coupling between network and data access points, as well as the paradigm shift towards data-oriented services, heterogeneous deployments and network densification. A logical separation between control and data planes is seen as a promising solution that could overcome these issues by providing data services under the umbrella of a coverage layer. This article presents a holistic survey of the existing literature on the control-data separation architecture (CDSA) for cellular radio access networks. As a starting point, we discuss the fundamentals, concepts, and general structure of the CDSA. Then, we point out limitations of the conventional architecture in futuristic deployment scenarios. In addition, we present and critically discuss the work that has been done to investigate potential benefits of the CDSA, as well as its technical challenges and enabling technologies. Finally, an overview of standardisation proposals related to this research vision is provided.

    Towards Massive Machine Type Communications in Ultra-Dense Cellular IoT Networks: Current Issues and Machine Learning-Assisted Solutions

    The ever-increasing number of resource-constrained Machine-Type Communication (MTC) devices is leading to the critical challenge of fulfilling diverse communication requirements in dynamic and ultra-dense wireless environments. Among the application scenarios that the upcoming 5G and beyond cellular networks are expected to support, such as enhanced Mobile Broadband (eMBB), massive Machine-Type Communications (mMTC) and Ultra-Reliable Low-Latency Communications (URLLC), mMTC brings the unique technical challenge of supporting a huge number of MTC devices, which is the main focus of this paper. The related challenges include QoS provisioning, handling highly dynamic and sporadic MTC traffic, huge signalling overhead and Radio Access Network (RAN) congestion. In this regard, this paper aims to identify and analyze the involved technical issues, to review recent advances, to highlight potential solutions and to propose new research directions. First, starting with an overview of mMTC features and QoS provisioning issues, we present the key enablers for mMTC in cellular networks. Along with highlights on the inefficiency of the legacy Random Access (RA) procedure in the mMTC scenario, we then present the key features and channel access mechanisms in the emerging cellular IoT standards, namely, LTE-M and NB-IoT. Subsequently, we present a framework for the performance analysis of transmission scheduling with QoS support, along with the issues involved in short data packet transmission. Next, we provide a detailed overview of the existing and emerging solutions towards addressing the RAN congestion problem, and then identify potential advantages, challenges and use cases for the application of emerging Machine Learning (ML) techniques in ultra-dense cellular networks. Out of several ML techniques, we focus on the application of a low-complexity Q-learning approach in mMTC scenarios. Finally, we discuss some open research challenges and promising future research directions. Comment: 37 pages, 8 figures, 7 tables, submitted for possible future publication in IEEE Communications Surveys and Tutorials.
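
    The low-complexity Q-learning idea mentioned in the abstract can be illustrated with a hypothetical sketch in which each MTC device keeps one Q-value per random access slot and learns, from collision feedback, to settle on a slot not used by others. The device and slot counts, rewards, learning rate and exploration rate below are assumptions for illustration, not values from the paper.

    # Hypothetical sketch: stateless Q-learning for random access slot selection,
    # so contending MTC devices gradually spread over slots and collisions drop.
    import random
    from collections import Counter

    N_DEVICES, N_SLOTS = 20, 20
    ALPHA, EPSILON, FRAMES = 0.1, 0.05, 2000

    # One row of Q-values per device, one column per RA slot.
    Q = [[0.0] * N_SLOTS for _ in range(N_DEVICES)]

    def choose_slot(q_row):
        # epsilon-greedy slot selection
        if random.random() < EPSILON:
            return random.randrange(N_SLOTS)
        best = max(q_row)
        return random.choice([s for s, v in enumerate(q_row) if v == best])

    for _ in range(FRAMES):
        choices = [choose_slot(Q[d]) for d in range(N_DEVICES)]
        load = Counter(choices)
        for d, slot in enumerate(choices):
            reward = 1.0 if load[slot] == 1 else -1.0     # success iff no collision
            Q[d][slot] += ALPHA * (reward - Q[d][slot])   # stateless Q-update

    successes = sum(1 for d, s in enumerate(choices) if load[s] == 1)
    print(f"devices transmitting collision-free in the last frame: {successes}/{N_DEVICES}")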

    SymbioCity: Smart Cities for Smarter Networks

    The "Smart City" (SC) concept revolves around the idea of embodying cutting-edge ICT solutions in the very fabric of future cities, in order to offer new and better services to citizens while lowering the city management costs, both in monetary, social, and environmental terms. In this framework, communication technologies are perceived as subservient to the SC services, providing the means to collect and process the data needed to make the services function. In this paper, we propose a new vision in which technology and SC services are designed to take advantage of each other in a symbiotic manner. According to this new paradigm, which we call "SymbioCity", SC services can indeed be exploited to improve the performance of the same communication systems that provide them with data. Suggestive examples of this symbiotic ecosystem are discussed in the paper. The dissertation is then substantiated in a proof-of-concept case study, where we show how the traffic monitoring service provided by the London Smart City initiative can be used to predict the density of users in a certain zone and optimize the cellular service in that area.Comment: 14 pages, submitted for publication to ETT Transactions on Emerging Telecommunications Technologie

    Radio Resource Management in NB-IoT Systems: Empowered by Interference Prediction and Flexible Duplexing

    NB-IoT is a promising cellular technology for enabling low-cost, low-power, long-range connectivity to IoT devices. With a bandwidth requirement of only 180 kHz, it provides the flexibility to be deployed within the existing LTE band. However, this raises serious concerns about the performance of the technology due to severe interference from multi-tier 5G HetNets. Furthermore, as NB-IoT is based on half-duplex FDD (HD-FDD), the symmetric allocation of spectrum between the downlink and uplink results in underutilization of resources, particularly in the case of asymmetric traffic distribution. Therefore, an innovative radio resource management (RRM) strategy needs to be devised to improve spectrum efficiency and device connectivity. This article presents the detailed design challenges that need to be addressed for the RRM of NB-IoT and proposes a novel framework to devise an efficient resource allocation scheme by exploiting cooperative interference prediction and flexible duplexing techniques.
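
    The flexible-duplexing idea can be sketched, under illustrative assumptions, as apportioning the subframes of each frame between uplink and downlink in proportion to the queued traffic in each direction, instead of keeping a fixed symmetric split. The frame size, minimum allocation, and function below are assumptions, not values or methods from the article.

    # Hypothetical sketch: traffic-aware uplink/downlink subframe split for
    # asymmetric IoT traffic. Frame size and bounds are illustrative assumptions.
    def split_subframes(ul_backlog_bytes, dl_backlog_bytes,
                        subframes_per_frame=10, min_each=1):
        """Return (ul_subframes, dl_subframes) proportional to backlog, with a floor
        so neither direction is starved."""
        total = ul_backlog_bytes + dl_backlog_bytes
        if total == 0:
            half = subframes_per_frame // 2
            return half, subframes_per_frame - half      # fall back to symmetric split
        ul = round(subframes_per_frame * ul_backlog_bytes / total)
        ul = min(max(ul, min_each), subframes_per_frame - min_each)
        return ul, subframes_per_frame - ul

    # Asymmetric IoT traffic: mostly uplink sensor reports, little downlink.
    print(split_subframes(ul_backlog_bytes=9000, dl_backlog_bytes=1000))  # -> (9, 1)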