
    Performance Analysis of Unsupervised LTE Device-to-Device (D2D) Communication

    Device-to-device communication based on cellular network technology is attracting increasing attention for use cases such as the control of autonomous vehicles on the ground and in the air. LTE provides device-to-device communication options; however, the configuration options are manifold (leading to more than 150 possible combinations), so the ideal parameter combination is hard to find. Depending on the use case, either throughput, reliability, or latency constraints may be the primary concern of the service provider. In this work we analyze the impact of different configuration settings of unsupervised LTE device-to-device (sidelink) communication on the system performance. Using a simulative approach, we vary the length of the PSCCH period and the number of PSCCH subframes and determine the impact of different combinations of these parameters on the resulting latency, reliability, and interarrival times of the received packets. Furthermore, we examine the system limitations through a scalability analysis. In this context, we propose a modified HARQ process to mitigate scalability constraints. Our results show that the proposed reduced HARQ retransmission probability can improve the system performance regarding latency and interarrival times, as well as the packet transmission reliability, at higher channel utilization.
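    A toy Monte Carlo sketch (not the simulator used in the paper) of the kind of parameter study described above: each active UE autonomously picks one data subframe per PSCCH period, identical picks collide, and a collided packet is blindly retransmitted in the next period with a configurable probability. The parameter names, default values, and collision model are illustrative assumptions only.

```python
import random

def simulate_sidelink(pscch_period_ms=40, pssch_subframes=36, n_ues=20,
                      tx_prob=0.4, harq_retx_prob=1.0, n_periods=2000):
    """Toy sidelink model: per PSCCH period each active UE autonomously picks
    one 1 ms data subframe; identical picks collide.  A collided packet is
    retransmitted once in the next period with probability harq_retx_prob,
    otherwise it is dropped.  Returns (mean latency in ms, delivery ratio)."""
    pending_retx = set()
    delivered = lost = 0
    latencies = []
    for _ in range(n_periods):
        new_tx = {ue for ue in range(n_ues) if random.random() < tx_prob}
        active = new_tx | pending_retx
        picks = {ue: random.randrange(pssch_subframes) for ue in active}
        next_retx = set()
        for ue, sf in picks.items():
            collided = sum(1 for s in picks.values() if s == sf) > 1
            extra = pscch_period_ms if ue in pending_retx else 0
            if not collided:
                delivered += 1
                latencies.append(sf + extra)
            elif ue not in pending_retx and random.random() < harq_retx_prob:
                next_retx.add(ue)        # one blind retransmission allowed
            else:
                lost += 1
        pending_retx = next_retx
    pdr = delivered / (delivered + lost)
    return sum(latencies) / len(latencies), pdr

# Sweep the HARQ retransmission probability, the parameter the abstract proposes to reduce.
for p in (1.0, 0.5, 0.25):
    lat, pdr = simulate_sidelink(harq_retx_prob=p)
    print(f"HARQ retx prob {p:.2f}: mean latency {lat:5.1f} ms, PDR {pdr:.3f}")
```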

    Enhanced Machine Learning Techniques for Early HARQ Feedback Prediction in 5G

    We investigate Early Hybrid Automatic Repeat reQuest (E-HARQ) feedback schemes enhanced by machine learning techniques as a path towards ultra-reliable and low-latency communication (URLLC). To this end, we propose machine learning methods to predict the outcome of the decoding process ahead of the end of the transmission. We discuss different input features and classification algorithms ranging from traditional methods to newly developed supervised autoencoders. These methods are evaluated based on their prospects of complying with the URLLC requirements of effective block error rates below $10^{-5}$ at small latency overheads. We provide realistic performance estimates in a system model incorporating scheduling effects to demonstrate the feasibility of E-HARQ across different signal-to-noise ratios, subcode lengths, channel conditions and system loads, and show the benefit over regular HARQ and existing E-HARQ schemes without machine learning. Comment: 14 pages, 15 figures; accepted version
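    A minimal illustration of the E-HARQ prediction idea, using synthetic LLR statistics and a plain logistic-regression classifier in place of the paper's feature sets and supervised autoencoders; the data-generation model and the 0.9 decision threshold are assumptions made only for this sketch.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 20000
snr_db = rng.uniform(-2.0, 6.0, size=n)          # per-block channel quality
noise_std = 10 ** (-snr_db / 20)

# Synthetic subcode features: mean and variance of |LLR| over the first part
# of the codeword degrade as the noise grows.
mean_abs_llr = 4.0 / noise_std + rng.normal(0, 0.5, n)
var_abs_llr = noise_std + rng.normal(0, 0.1, n)
X = np.column_stack([mean_abs_llr, var_abs_llr])

# Synthetic ground truth: decoding succeeds when channel quality was good enough.
y = (snr_db + rng.normal(0, 1.0, n) > 1.0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = LogisticRegression().fit(X_tr, y_tr)

# Early feedback rule: send a NACK ahead of the end of the transmission when
# the predicted probability of successful decoding falls below a threshold.
p_success = clf.predict_proba(X_te)[:, 1]
early_nack = p_success < 0.9
print("classifier accuracy:", clf.score(X_te, y_te))
print("fraction of early NACKs:", early_nack.mean())
```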

    Green Communication via Power-optimized HARQ Protocols

    Recently, efficient use of energy has become an essential research topic for green communication. This paper studies the effect of optimal power controllers on the performance of delay-sensitive communication setups utilizing hybrid automatic repeat request (HARQ). The results are obtained for the repetition time diversity (RTD) and incremental redundancy (INR) HARQ protocols. In all cases, the optimal power allocation, minimizing the outage-limited average transmission power, is obtained under both continuous and bursting communication models. We also investigate the system throughput under different conditions. The results indicate that power efficiency increases substantially when adaptive power allocation is utilized. For instance, assume a Rayleigh fading channel, a maximum of two (re)transmission rounds with rates $\{1, \frac{1}{2}\}$ nats per channel use, and an outage probability constraint of $10^{-3}$. Then, compared to uniform power allocation, optimal power allocation in RTD reduces the average power by 9 and 11 dB in the bursting and continuous communication models, respectively. For INR, the corresponding reductions are 8 and 9 dB. Comment: Accepted for publication in IEEE Transactions on Vehicular Technology
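    A hedged numeric sketch of outage-limited power allocation for a two-round RTD scheme over quasi-static Rayleigh fading (gain the same in both rounds, rate 1 nat per channel use in the first round, outage constraint $10^{-3}$). The power-averaging rule below (energy divided by the channel uses actually spent) is one plausible definition and is not claimed to reproduce the paper's bursting or continuous models exactly.

```python
import numpy as np

R = 1.0                      # nats per channel use in the first round
eps = 1e-3                   # outage probability constraint
theta = np.exp(R) - 1.0      # SNR threshold after combining both rounds

# Outage after two combined (MRC) rounds: Pr[g * (P1 + P2) < theta] <= eps
# with g ~ Exp(1), which fixes the minimum total power P1 + P2.
sum_min = theta / (-np.log1p(-eps))

def avg_power(p1, p2):
    q1 = 1.0 - np.exp(-theta / p1)          # Pr[first round fails]
    return (p1 + q1 * p2) / (1.0 + q1)      # energy per used channel use

# Uniform allocation meeting the same outage constraint.
p_uniform = sum_min / 2.0
uniform_avg = avg_power(p_uniform, p_uniform)

# Grid search over the first-round power with P1 + P2 fixed at sum_min.
p1_grid = np.linspace(1.0, sum_min - 1.0, 20000)
avgs = avg_power(p1_grid, sum_min - p1_grid)
best = np.argmin(avgs)

gain_db = 10 * np.log10(uniform_avg / avgs[best])
print(f"uniform:  P = {p_uniform:.0f}, average power = {uniform_avg:.0f}")
print(f"optimal:  P1 = {p1_grid[best]:.0f}, P2 = {sum_min - p1_grid[best]:.0f}, "
      f"average power = {avgs[best]:.0f}  (gain = {gain_db:.1f} dB)")
```

    Under these simplified assumptions the grid search lands in the vicinity of a 9 dB gain over uniform allocation, i.e. the same order of magnitude as the RTD figures quoted in the abstract.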

    End-to-End Simulation of 5G mmWave Networks

    Due to its potential for multi-gigabit and low-latency wireless links, millimeter wave (mmWave) technology is expected to play a central role in 5th generation cellular systems. While there has been considerable progress in understanding the mmWave physical layer, innovations will be required at all layers of the protocol stack, in both the access and the core network. Discrete-event network simulation is essential for end-to-end, cross-layer research and development. This paper provides a tutorial on a recently developed full-stack mmWave module integrated into the widely used open-source ns-3 simulator. The module includes a number of detailed statistical channel models as well as the ability to incorporate real measurements or ray-tracing data. The Physical (PHY) and Medium Access Control (MAC) layers are modular and highly customizable, making it easy to integrate algorithms or compare Orthogonal Frequency Division Multiplexing (OFDM) numerologies, for example. The module is interfaced with the core network of the ns-3 Long Term Evolution (LTE) module for full-stack simulations of end-to-end connectivity, and advanced architectural features, such as dual connectivity, are also available. To facilitate understanding of the module and verify its correct functioning, we provide several examples that show the performance of the custom mmWave stack as well as custom congestion control algorithms designed specifically for efficient utilization of the mmWave channel. Comment: 25 pages, 16 figures, submitted to IEEE Communications Surveys and Tutorials (revised Jan. 2018)
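    As a small side illustration of the numerology comparison mentioned above, the snippet below tabulates the standardized 5G NR OFDM numerologies (subcarrier spacing $15 \cdot 2^{\mu}$ kHz, 14-symbol slots); it is not code from the ns-3 mmWave module, whose configurable frame structure is set through the module's own C++ helpers and may differ from these standardized values.

```python
# Tabulate the standardized 5G NR numerologies mu = 0..4.
for mu in range(5):
    scs_khz = 15 * 2 ** mu              # subcarrier spacing in kHz
    slots_per_subframe = 2 ** mu        # slots per 1 ms subframe
    slot_ms = 1.0 / slots_per_subframe  # slot duration in ms
    symbol_us = slot_ms * 1000 / 14     # average symbol duration incl. CP
    print(f"mu={mu}: SCS={scs_khz:>3} kHz, slot={slot_ms:.4f} ms, "
          f"symbol ~ {symbol_us:.2f} us")
```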
