
    Evaluation, Modeling and Optimization of Coverage Enhancement Methods of NB-IoT

    Narrowband Internet of Things (NB-IoT) is a new Low Power Wide Area Network (LPWAN) technology released by 3GPP. The primary goals of NB-IoT are improved coverage, massive capacity, low cost, and long battery life. To improve coverage, NB-IoT offers promising solutions such as increasing the number of transmission repetitions, decreasing the bandwidth, and adapting the Modulation and Coding Scheme (MCS). In this paper, we present an implementation of the coverage enhancement features of NB-IoT in NS-3, an end-to-end network simulator. The resource allocation and link adaptation in NS-3 are modified to comply with the new features of NB-IoT. Using the developed simulation framework, the influence of the new features on network reliability and latency is evaluated. Furthermore, an optimal hybrid link adaptation strategy based on all three features is proposed. To this end, we formulate an optimization problem with an objective function based on latency and a constraint based on the Signal-to-Noise Ratio (SNR). We then propose several algorithms to minimize latency and compare them with respect to accuracy and speed. The best hybrid solution is chosen and implemented in the NS-3 simulator, by which the latency formulation is verified. The numerical results show that the proposed optimization algorithm for hybrid link adaptation is eight times faster than the exhaustive search approach while yielding similar latency.
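
    As a rough illustration of the hybrid search this abstract describes, the sketch below scans (repetition, MCS, tone) triples for the lowest-latency configuration whose effective SNR meets the MCS requirement. The latency and SNR models, the candidate sets, and all numeric constants are placeholder assumptions, not the paper's formulations; the paper's contribution is precisely a faster alternative to this exhaustive baseline.

```python
# Hypothetical sketch of hybrid link adaptation over the three NB-IoT
# coverage levers: repetitions, MCS, and bandwidth (number of tones).
# All models below are illustrative placeholders, not the paper's.
import itertools
import math

REPETITIONS = [1, 2, 4, 8, 16, 32, 64, 128]   # NB-IoT repetition factors
MCS_LEVELS = range(0, 11)                     # index into an MCS table
TONES = [1, 3, 6, 12]                         # subcarriers per allocation

def snr_required(mcs: int) -> float:
    """Placeholder: higher MCS needs a higher post-combining SNR (dB)."""
    return -6.0 + 1.5 * mcs

def effective_snr(snr_db: float, reps: int, tones: int) -> float:
    """Placeholder link model: repetition combining gain plus the SNR
    gain of concentrating power in fewer tones (ideal combining)."""
    return snr_db + 10 * math.log10(reps) - 10 * math.log10(tones / 12)

def latency(reps: int, mcs: int, tones: int, payload_bits: int) -> float:
    """Placeholder: airtime grows with repetitions and shrinks with
    spectral efficiency and bandwidth (in resource-unit durations)."""
    bits_per_ru = 24 * (mcs + 1)              # toy spectral-efficiency model
    rus = -(-payload_bits // bits_per_ru)     # ceiling division
    return rus * reps * (12 / tones)

def exhaustive_search(snr_db: float, payload_bits: int = 256):
    """Baseline: scan every triple meeting the SNR constraint."""
    best = None
    for reps, mcs, tones in itertools.product(REPETITIONS, MCS_LEVELS, TONES):
        if effective_snr(snr_db, reps, tones) >= snr_required(mcs):
            cand = (latency(reps, mcs, tones, payload_bits), reps, mcs, tones)
            if best is None or cand < best:
                best = cand
    return best

print(exhaustive_search(snr_db=-2.0))
```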

    Prediction-Based Energy Saving Mechanism in 3GPP NB-IoT Networks

    The current expansion of the Internet of Things (IoT) demands improved communication platforms that cover a wide area with low energy consumption. The 3rd Generation Partnership Project introduced narrowband IoT (NB-IoT) as an IoT communication solution. NB-IoT devices are expected to operate for over 10 years without a battery replacement, so low energy consumption is essential for the successful deployment of this technology. Given that the power amplifier consumes a large amount of energy during radio transmission, reducing the uplink transmission time is key to ensuring a long device lifespan. In this paper, we propose a prediction-based energy saving mechanism (PBESM) focused on enhanced uplink transmission. The mechanism consists of two parts: first, a network architecture that predicts uplink packet occurrence through deep packet inspection; second, an algorithm that predicts the processing delay and pre-assigns radio resources to enhance the scheduling request procedure. In this way, our mechanism reduces the number of random accesses and the energy consumed by radio transmission. Simulation results showed that the energy consumption of the proposed PBESM is up to 34% lower than that of the conventional NB-IoT method.
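
    To make the pre-assignment idea concrete, here is a minimal sketch in which a scheduler estimates a device's uplink inter-arrival time (a simple EWMA standing in for the paper's deep-packet-inspection-based predictor) and issues a grant slightly before the next expected packet, so the device can skip the random access and scheduling request exchange. The class name, smoothing factor, and guard interval are illustrative assumptions.

```python
# Hypothetical eNB-side scheduler: predict the next uplink occurrence and
# pre-assign resources so the device avoids random access. The EWMA predictor
# is a stand-in for the paper's DPI-based mechanism.

class PreAssignScheduler:
    def __init__(self, alpha: float = 0.2, guard_ms: float = 5.0):
        self.alpha = alpha          # EWMA smoothing factor (assumed)
        self.guard_ms = guard_ms    # grant slightly early to absorb jitter
        self.last_arrival_ms = None
        self.ewma_interval_ms = None

    def observe_uplink(self, t_ms: float) -> None:
        """Update the inter-arrival estimate from an observed uplink packet."""
        if self.last_arrival_ms is not None:
            interval = t_ms - self.last_arrival_ms
            if self.ewma_interval_ms is None:
                self.ewma_interval_ms = interval
            else:
                self.ewma_interval_ms = (self.alpha * interval
                                         + (1 - self.alpha) * self.ewma_interval_ms)
        self.last_arrival_ms = t_ms

    def next_grant_time(self):
        """Time at which to pre-assign uplink resources, if predictable."""
        if self.ewma_interval_ms is None:
            return None  # no history yet: fall back to normal random access
        return self.last_arrival_ms + self.ewma_interval_ms - self.guard_ms

# Periodic reports every ~60 s: after a few observations the scheduler
# pre-grants resources just before the next expected packet.
sched = PreAssignScheduler()
for t in [0.0, 60_050.0, 119_980.0, 180_020.0]:
    sched.observe_uplink(t)
print(sched.next_grant_time())
```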

    Architectures and Key Technical Challenges for 5G Systems Incorporating Satellites

    Satellite communication systems are a promising solution to extend and complement terrestrial networks in unserved or under-served areas, as reflected by recent commercial and standardisation endeavours. In particular, 3GPP recently initiated a Study Item for New Radio-based, i.e., 5G, Non-Terrestrial Networks aimed at deploying satellite systems either as a stand-alone solution or as an integration into terrestrial networks in mobile broadband and machine-type communication scenarios. However, typical satellite channel impairments, such as large path losses, delays, and Doppler shifts, pose severe challenges to the realisation of a satellite-based NR network. In this paper, based on the architecture options currently being discussed in the standardisation fora, we discuss and assess the impact of the satellite channel characteristics on the Physical and Medium Access Control layers, both in terms of transmitted waveforms and of procedures for enhanced Mobile BroadBand (eMBB) and NarrowBand-Internet of Things (NB-IoT) applications. The proposed analysis shows that the main technical challenges are related to the PHY/MAC procedures, in particular Random Access (RA), Timing Advance (TA), and Hybrid Automatic Repeat reQuest (HARQ); depending on the considered service and architecture, different solutions are proposed.
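
    The scale of these impairments is easy to sanity-check. The back-of-the-envelope sketch below computes the slant range, one-way propagation delay, and a rough upper bound on the Doppler shift for an assumed 600 km LEO orbit at S-band; the resulting multi-millisecond delays and tens-of-kHz shifts are what stress the RA, TA, and HARQ procedures. The altitude, carrier frequency, and elevation angle are assumptions chosen for illustration.

```python
# Back-of-the-envelope numbers behind the TA/HARQ challenge: propagation
# delay and a Doppler upper bound for an illustrative LEO satellite.
import math

C = 299_792_458.0          # speed of light, m/s
R_EARTH = 6_371e3          # mean Earth radius, m
MU = 3.986004418e14        # Earth's gravitational parameter, m^3/s^2

def slant_range(altitude_m: float, elevation_rad: float) -> float:
    """Distance from ground terminal to satellite at a given elevation angle."""
    r = R_EARTH + altitude_m
    return (math.sqrt(r**2 - (R_EARTH * math.cos(elevation_rad))**2)
            - R_EARTH * math.sin(elevation_rad))

altitude = 600e3                   # 600 km LEO orbit (assumed)
fc = 2e9                           # S-band carrier, 2 GHz (assumed)
elev = math.radians(10)            # low elevation: near worst-case range

d = slant_range(altitude, elev)
one_way_delay_ms = d / C * 1e3
v_sat = math.sqrt(MU / (R_EARTH + altitude))          # circular orbital speed
doppler_max_hz = fc * (v_sat / C) * math.cos(elev)    # rough upper bound

print(f"slant range: {d/1e3:.0f} km, one-way delay: {one_way_delay_ms:.2f} ms")
print(f"Doppler bound at {fc/1e9:.0f} GHz: {doppler_max_hz/1e3:.1f} kHz")
# Round-trip times of several milliseconds exceed the budgets assumed by the
# terrestrial NR random access and HARQ timing, motivating the extended TA
# and modified HARQ procedures the paper discusses.
```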

    Channel equalization and interference analysis for uplink Narrowband Internet of Things (NB-IoT)

    We derive the uplink system model for in-band and guard-band narrowband Internet of Things (NB-IoT) deployments. The results reveal that the actual channel frequency response (CFR) is not a simple Fourier transform of the channel impulse response, due to the sampling rate mismatch between the NB-IoT user and the Long Term Evolution (LTE) base station. Consequently, a new channel equalization algorithm is proposed based on the derived effective CFR. In addition, the interference is derived analytically to facilitate the co-existence of NB-IoT and LTE signals. This work provides an example and guidance for supporting network slicing and service multiplexing in the physical layer.
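
    A minimal numerical sketch of the effect, assuming a toy sinc-interpolation model of the rate mismatch rather than the paper's actual derivation: equalizing with the naive FFT-based CFR leaves residual distortion, while equalizing with the effective CFR (the FFT of the taps as seen at the mismatched rate) recovers the symbols. The mismatch ratio and channel taps are arbitrary illustrative values.

```python
# Toy demonstration: a sampling-rate mismatch makes the effective CFR differ
# from the naive FFT of the nominal channel taps, so one-tap equalization
# must use the effective CFR. The resampling model is an illustrative
# stand-in for the paper's derivation.
import numpy as np

rng = np.random.default_rng(0)
N = 128                                   # FFT size
h = np.array([1.0, 0.5, 0.25 + 0.1j])     # channel taps at the nominal rate

H_naive = np.fft.fft(h, N)                # naive CFR: FFT of nominal taps

# Effective taps: the same channel sampled at a mismatched rate,
# via fractional-delay (sinc) interpolation as a toy model.
ratio = 16 / 15                           # assumed UE/eNB rate mismatch
m = np.arange(4)
h_eff = np.array([np.sum(h * np.sinc(mi * ratio - np.arange(len(h))))
                  for mi in m])
H_eff = np.fft.fft(h_eff, N)

X = rng.choice([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j], N) / np.sqrt(2)  # QPSK
Y = H_eff * X                             # received symbols (noise-free)

def evm(X_hat):
    """Root-mean-square error between equalized and sent symbols."""
    return np.sqrt(np.mean(np.abs(X_hat - X) ** 2))

print("EVM with naive CFR:    ", evm(Y / H_naive))  # residual distortion
print("EVM with effective CFR:", evm(Y / H_eff))    # ~0: symbols recovered
```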

    Deep Reinforcement Learning for Real-Time Optimization in NB-IoT Networks

    NarrowBand-Internet of Things (NB-IoT) is an emerging cellular-based technology that offers a range of flexible configurations for massive IoT radio access from groups of devices with heterogeneous requirements. A configuration specifies the amount of radio resource allocated to each group of devices for random access and for data transmission. Assuming no knowledge of the traffic statistics, an important challenge is how to determine, in an online fashion, the configuration that maximizes the long-term average number of served IoT devices at each Transmission Time Interval (TTI). Given the complexity of searching for the optimal configuration, we first develop real-time configuration selection based on tabular Q-learning (tabular-Q), Linear Approximation based Q-learning (LA-Q), and Deep Neural Network based Q-learning (DQN) in the single-parameter single-group scenario. Our results show that the proposed reinforcement learning based approaches considerably outperform the conventional heuristic approaches based on load estimation (LE-URC) in terms of the number of served IoT devices. This result also indicates that LA-Q and DQN can be good alternatives to tabular-Q, achieving almost the same performance with much less training time. We further advance LA-Q and DQN via Actions Aggregation (AA-LA-Q and AA-DQN) and via Cooperative Multi-Agent learning (CMA-DQN) for the multi-parameter multi-group scenario, thereby solving the problem that Q-learning agents do not converge in high-dimensional configurations. In this scenario, the advantage of the proposed Q-learning approaches over the conventional LE-URC approach grows with the number of configuration dimensions, and the CMA-DQN approach outperforms the other approaches in both throughput and training efficiency.
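
    As a hedged illustration of the tabular-Q baseline for the single-parameter single-group case, the sketch below lets an agent pick a radio resource configuration each TTI and learn from the number of served devices as reward. The action set, reward model, and toy environment are assumptions made for the example, not the paper's simulator.

```python
# Minimal tabular Q-learning sketch: choose a configuration (here, a RACH
# resource allocation) per TTI, reward = served devices. The environment is
# an illustrative toy, not the paper's NB-IoT model.
import random
from collections import defaultdict

ACTIONS = [12, 24, 48]             # candidate resource allocations (assumed)
ALPHA, GAMMA, EPS = 0.1, 0.9, 0.1  # learning rate, discount, exploration

Q = defaultdict(float)             # Q[(state, action)] -> value estimate

def choose(state):
    """Epsilon-greedy action selection over the configuration set."""
    if random.random() < EPS:
        return random.choice(ACTIONS)
    return max(ACTIONS, key=lambda a: Q[(state, a)])

def env_step(action, load):
    """Toy environment: served devices rise with allocated resources until
    contention dominates; next state is a coarse bucket of the outcome."""
    served = min(load, action) * (1 - load / (load + action))
    next_state = min(int(served // 10), 5)
    return served, next_state

state, load = 0, 40                # unknown stationary traffic load (assumed)
for tti in range(10_000):
    a = choose(state)
    reward, next_state = env_step(a, load)
    best_next = max(Q[(next_state, b)] for b in ACTIONS)
    Q[(state, a)] += ALPHA * (reward + GAMMA * best_next - Q[(state, a)])
    state = next_state

print({a: round(Q[(state, a)], 1) for a in ACTIONS})
```

    The paper's LA-Q and DQN variants replace the table `Q` with a function approximator, which is what makes the high-dimensional multi-parameter multi-group case tractable.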

    Towards Massive Machine Type Communications in Ultra-Dense Cellular IoT Networks: Current Issues and Machine Learning-Assisted Solutions

    The ever-increasing number of resource-constrained Machine-Type Communication (MTC) devices is leading to the critical challenge of fulfilling diverse communication requirements in dynamic and ultra-dense wireless environments. Among the application scenarios that the upcoming 5G and beyond cellular networks are expected to support, such as eMBB, mMTC and URLLC, mMTC brings the unique technical challenge of supporting a huge number of MTC devices, which is the main focus of this paper. The related challenges include QoS provisioning, handling highly dynamic and sporadic MTC traffic, huge signalling overhead, and Radio Access Network (RAN) congestion. In this regard, this paper aims to identify and analyze the involved technical issues, to review recent advances, to highlight potential solutions, and to propose new research directions. First, starting with an overview of mMTC features and QoS provisioning issues, we present the key enablers for mMTC in cellular networks. Along with highlights on the inefficiency of the legacy Random Access (RA) procedure in the mMTC scenario, we then present the key features and channel access mechanisms in the emerging cellular IoT standards, namely LTE-M and NB-IoT. Subsequently, we present a framework for the performance analysis of transmission scheduling with QoS support, along with the issues involved in short data packet transmission. Next, we provide a detailed overview of the existing and emerging solutions towards addressing the RAN congestion problem, and then identify potential advantages, challenges and use cases for the application of emerging Machine Learning (ML) techniques in ultra-dense cellular networks. Out of several ML techniques, we focus on the application of the low-complexity Q-learning approach in mMTC scenarios. Finally, we discuss some open research challenges and promising future research directions.
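
    A one-line calculation shows why the legacy RA procedure congests under mMTC: with M contention preambles and n simultaneously active devices, the probability that a given device's preamble goes uncontested is (1 − 1/M)^(n−1), which collapses as n approaches and exceeds M. The sketch below makes this concrete with illustrative numbers.

```python
# Preamble collision under massive access: probability that no other device
# picks my preamble in the same RA opportunity. Numbers are illustrative.
M = 54           # typical contention-based preambles per LTE RA opportunity

for n in (10, 50, 100, 500):                 # simultaneously active devices
    p_success = (1 - 1 / M) ** (n - 1)       # all n-1 others avoid my preamble
    print(f"{n:4d} devices -> preamble success prob = {p_success:.4f}")
# Success probability collapses as n grows past M, which is why the survey
# covers RA overload control (e.g., access class barring) and learning-based
# tuning of the access parameters.
```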

    Radio Resource Management in NB-IoT Systems: Empowered by Interference Prediction and Flexible Duplexing

    NB-IoT is a promising cellular technology for enabling low-cost, low-power, long-range connectivity to IoT devices. With a bandwidth requirement of only 180 kHz, it offers the flexibility to deploy within the existing LTE band. However, this raises serious concerns about the performance of the technology due to severe interference from multi-tier 5G HetNets. Furthermore, as NB-IoT is based on half-duplex frequency division duplexing (HD-FDD), the symmetric allocation of spectrum between the downlink and uplink results in underutilization of resources, particularly in the case of asymmetric traffic distribution. Therefore, an innovative Radio Resource Management (RRM) strategy needs to be devised to improve spectrum efficiency and device connectivity. This article presents the design challenges that need to be addressed for the RRM of NB-IoT and proposes a novel framework for an efficient resource allocation scheme that exploits cooperative interference prediction and flexible duplexing techniques.
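
    As a sketch of what flexible duplexing can mean in practice (the split rule and traffic figures below are assumptions, not the article's scheme), one can apportion the subframes of a scheduling window between uplink and downlink in proportion to the queued traffic in each direction instead of keeping a fixed symmetric split.

```python
# Illustrative backlog-proportional duplexing: divide the subframes of one
# scheduling window between UL and DL according to queued traffic, keeping a
# minimum per direction for control signalling. Window size is an assumption.
def flexible_duplex_split(ul_backlog_bytes: int, dl_backlog_bytes: int,
                          window_subframes: int = 10,
                          min_each: int = 1):
    """Return (uplink, downlink) subframe counts for one scheduling window."""
    total = ul_backlog_bytes + dl_backlog_bytes
    if total == 0:
        half = window_subframes // 2
        return half, window_subframes - half   # idle: fall back to symmetric
    ul = round(window_subframes * ul_backlog_bytes / total)
    ul = max(min_each, min(window_subframes - min_each, ul))
    return ul, window_subframes - ul

# NB-IoT traffic is typically uplink-heavy (sensor reports), so the split
# shifts toward the uplink rather than staying at the symmetric 5/5.
print(flexible_duplex_split(ul_backlog_bytes=9000, dl_backlog_bytes=1000))
```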

    On the Latency-Energy Performance of NB-IoT Systems in Providing Wide-Area IoT Connectivity
