
    Prediction-Based Energy Saving Mechanism in 3GPP NB-IoT Networks

    The current expansion of the Internet of Things (IoT) demands improved communication platforms that cover a wide area with low energy consumption. The 3rd Generation Partnership Project introduced narrowband IoT (NB-IoT) as an IoT communication solution. NB-IoT devices should operate for over 10 years without requiring a battery replacement, so low energy consumption is essential for the successful deployment of this technology. Given that the power amplifier consumes a large amount of energy during radio transmission, reducing the uplink transmission time is key to ensuring a long device lifespan. In this paper, we propose a prediction-based energy saving mechanism (PBESM) focused on enhanced uplink transmission. The mechanism consists of two parts: first, a network architecture that predicts the occurrence of uplink packets through deep packet inspection; second, an algorithm that predicts the processing delay and pre-assigns radio resources to enhance the scheduling request procedure. In this way, our mechanism reduces the number of random accesses and the energy consumed by radio transmission. Simulation results showed that the energy consumption using the proposed PBESM is reduced by up to 34% compared with the conventional NB-IoT method.
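    To make the pre-assignment idea concrete, here is a minimal sketch assuming a simple inter-arrival-time predictor as a stand-in for the deep-packet-inspection stage; the class names, grant window, and traffic trace are illustrative assumptions, not the authors' PBESM implementation.

```python
# Illustrative sketch of prediction-based uplink grant pre-assignment (toy model,
# not the paper's PBESM). The predictor stands in for the DPI-derived traffic pattern.
from collections import deque
from statistics import mean

class UplinkPredictor:
    """Predicts the next uplink packet time from recent inter-arrival times."""
    def __init__(self, history=8):
        self.arrivals = deque(maxlen=history)

    def observe(self, t):
        self.arrivals.append(t)

    def predict_next(self):
        if len(self.arrivals) < 2:
            return None
        gaps = [b - a for a, b in zip(self.arrivals, list(self.arrivals)[1:])]
        return self.arrivals[-1] + mean(gaps)

class Scheduler:
    """Pre-assigns an uplink grant when a packet is predicted inside the window,
    otherwise falls back to the normal random-access + scheduling-request procedure."""
    def __init__(self, predictor, window=0.5):
        self.predictor = predictor
        self.window = window
        self.random_accesses = 0
        self.preassigned = 0

    def handle_packet(self, t):
        predicted = self.predictor.predict_next()
        if predicted is not None and abs(predicted - t) <= self.window:
            self.preassigned += 1      # grant was ready: no random access needed
        else:
            self.random_accesses += 1  # conventional NB-IoT procedure
        self.predictor.observe(t)

if __name__ == "__main__":
    sched = Scheduler(UplinkPredictor())
    for t in [0.0, 10.1, 20.0, 29.9, 40.2, 50.0]:  # periodic sensor reports
        sched.handle_packet(t)
    print(f"pre-assigned grants: {sched.preassigned}, random accesses: {sched.random_accesses}")
```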

    Deep Q-Learning for Self-Organizing Networks Fault Management and Radio Performance Improvement

    We propose an algorithm that automates fault management in an outdoor cellular network using deep reinforcement learning (RL) against wireless impairments. This algorithm enables the cellular network cluster to self-heal by allowing RL to learn how to improve the downlink signal-to-interference-plus-noise ratio through exploration and exploitation of various alarm corrective actions. The main contributions of this paper are to 1) introduce a deep RL-based fault handling algorithm which self-organizing networks can implement in polynomial runtime and 2) show that this fault management method can improve the radio link performance in a realistic network setup. Simulation results show that our proposed algorithm learns an action sequence that clears alarms and improves performance in the cellular cluster better than existing algorithms, even against the randomness of network fault occurrences and user movements.
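    The following is a minimal deep Q-learning sketch of the self-healing loop described above, with a toy cell environment whose state combines alarm flags and a normalized SINR; the environment, network size, and hyperparameters are assumptions for illustration, not the authors' setup.

```python
# Minimal deep Q-learning sketch: actions are candidate alarm corrective actions,
# reward is the resulting SINR improvement. Toy environment and hyperparameters
# are assumptions, not the paper's configuration.
import random
import torch
import torch.nn as nn

N_ALARMS, N_ACTIONS = 4, 5          # 4 alarm flags, 5 corrective actions (incl. no-op)
STATE_DIM = N_ALARMS + 1            # alarm flags + normalized SINR

class ToyCellEnv:
    """Toy cluster: clearing an active alarm raises SINR; other actions do nothing."""
    def reset(self):
        self.alarms = [1.0] * N_ALARMS
        self.sinr = 0.2
        return self._state()

    def _state(self):
        return torch.tensor(self.alarms + [self.sinr])

    def step(self, action):
        old_sinr = self.sinr
        if action < N_ALARMS and self.alarms[action] == 1.0:
            self.alarms[action] = 0.0
            self.sinr = min(1.0, self.sinr + 0.2)
        reward = self.sinr - old_sinr           # reward = SINR improvement
        done = sum(self.alarms) == 0.0          # all alarms cleared
        return self._state(), reward, done

q_net = nn.Sequential(nn.Linear(STATE_DIM, 32), nn.ReLU(), nn.Linear(32, N_ACTIONS))
optimizer = torch.optim.Adam(q_net.parameters(), lr=1e-2)
env, gamma, epsilon = ToyCellEnv(), 0.9, 0.2

for episode in range(200):
    state, done = env.reset(), False
    for _ in range(50):                         # cap episode length
        # epsilon-greedy exploration over corrective actions
        if random.random() < epsilon:
            action = random.randrange(N_ACTIONS)
        else:
            action = q_net(state).argmax().item()
        next_state, reward, done = env.step(action)
        # one-step temporal-difference target for Q-learning
        with torch.no_grad():
            target = reward + (0.0 if done else gamma * q_net(next_state).max().item())
        loss = (q_net(state)[action] - target) ** 2
        optimizer.zero_grad(); loss.backward(); optimizer.step()
        state = next_state
        if done:
            break
```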

    Intra-Cluster Autonomous Coverage Optimization For Dense LTE-A Networks

    Self-Organizing Networks (SONs) are considered vital for upcoming dense cellular networks. From a mobile carrier's point of view, continuous coverage optimization is critical for better user-perceived quality. The majority of SON contributions introduce novel algorithms that optimize specific performance metrics; however, these typically incur long processing delays and require prior knowledge of network statistics that may not be available. In this work, a progressive Autonomous Coverage Optimization (ACO) method combined with adaptive cell dimensioning is proposed. The proposed method emphasizes that the effective cell coverage varies with the actual user distribution. The ACO algorithm builds a generic space-time virtual coverage map per cell to detect coverage holes as well as limited or extended coverage conditions. Progressive levels of optimization are applied to resolve coverage issues in a timely manner while maintaining optimization stability. The proposed ACO is verified in both simulations and a practical deployment in a pilot cluster of a worldwide mobile carrier. Key Performance Indicators show that the proposed ACO method significantly enhances system coverage and performance.
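    A minimal sketch of the space-time coverage-map idea follows, assuming user measurement reports binned by location and hour and a fixed RSRP threshold for flagging coverage holes; the bin size, threshold, and report format are illustrative assumptions rather than the paper's ACO parameters.

```python
# Toy space-time coverage map: bin measurement reports by location and hour,
# flag bins whose average RSRP falls below an assumed threshold as coverage holes.
from collections import defaultdict

HOLE_RSRP_DBM = -110.0   # assumed threshold for declaring a coverage hole

def build_coverage_map(reports, grid_m=50.0):
    """reports: iterable of (x_m, y_m, hour, rsrp_dbm) tuples for one cell."""
    bins = defaultdict(list)
    for x, y, hour, rsrp in reports:
        key = (int(x // grid_m), int(y // grid_m), hour)
        bins[key].append(rsrp)
    return {k: sum(v) / len(v) for k, v in bins.items()}   # average RSRP per bin

def detect_coverage_holes(coverage_map):
    return [k for k, avg_rsrp in coverage_map.items() if avg_rsrp < HOLE_RSRP_DBM]

if __name__ == "__main__":
    reports = [(10, 20, 8, -95.0), (12, 22, 8, -97.0),
               (400, 410, 8, -115.0), (405, 400, 8, -118.0)]
    cmap = build_coverage_map(reports)
    print("coverage holes (x-bin, y-bin, hour):", detect_coverage_holes(cmap))
```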

    Evaluation, Modeling and Optimization of Coverage Enhancement Methods of NB-IoT

    Narrowband Internet of Things (NB-IoT) is a new Low Power Wide Area Network (LPWAN) technology released by 3GPP. The primary goals of NB-IoT are improved coverage, massive capacity, low cost, and long battery life. To improve coverage, NB-IoT offers promising solutions such as increasing transmission repetitions, decreasing the bandwidth, and adapting the Modulation and Coding Scheme (MCS). In this paper, we present an implementation of the coverage enhancement features of NB-IoT in NS-3, an end-to-end network simulator. The resource allocation and link adaptation in NS-3 are modified to comply with the new features of NB-IoT. Using the developed simulation framework, the influence of the new features on network reliability and latency is evaluated. Furthermore, an optimal hybrid link adaptation strategy based on all three features is proposed. To achieve this, we formulate an optimization problem with an objective function based on latency and a constraint based on the Signal-to-Noise Ratio (SNR). Then, we propose several algorithms to minimize latency and compare them with respect to accuracy and speed. The best hybrid solution is chosen and implemented in the NS-3 simulator, by which the latency formulation is verified. The numerical results show that the proposed optimization algorithm for hybrid link adaptation is eight times faster than the exhaustive search approach and yields similar latency.
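    The following toy sketch illustrates the structure of such a hybrid link-adaptation problem: exhaustively searching over MCS, repetitions, and subcarrier allocation to minimize latency subject to an SNR constraint. The latency and SNR-gain models are simplified assumptions, not the paper's formulation or its faster algorithm.

```python
# Toy hybrid link-adaptation search: choose (MCS, repetitions, tones) minimizing
# latency while the effective SNR meets the per-MCS requirement. Models are
# illustrative assumptions only.
from itertools import product
import math

MCS_LEVELS  = range(0, 11)                       # higher MCS: more bits per RU, higher SNR need
REPETITIONS = [1, 2, 4, 8, 16, 32]               # repetitions add combining gain
RU_MS       = {1: 8.0, 3: 4.0, 6: 2.0, 12: 1.0}  # resource-unit duration vs. number of tones

def required_snr_db(mcs):
    return -5.0 + 1.5 * mcs                      # assumed per-MCS SNR requirement

def effective_snr_db(snr_db, reps, tones):
    # repetition combining gain plus power-concentration gain from fewer tones
    return snr_db + 10 * math.log10(reps) + 10 * math.log10(12 / tones)

def latency_ms(mcs, reps, tones, payload_bits=256):
    bits_per_ru = 16 + 8 * mcs                   # toy spectral-efficiency model
    n_ru = math.ceil(payload_bits / bits_per_ru)
    return n_ru * reps * RU_MS[tones]

def best_config(snr_db):
    """Exhaustive search over the three coverage-enhancement features."""
    feasible = [(latency_ms(m, r, t), m, r, t)
                for m, r, t in product(MCS_LEVELS, REPETITIONS, RU_MS)
                if effective_snr_db(snr_db, r, t) >= required_snr_db(m)]
    return min(feasible) if feasible else None   # (latency_ms, mcs, repetitions, tones)

if __name__ == "__main__":
    print(best_config(-2.0))                     # best configuration at -2 dB channel SNR
```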