
    Random Linear Network Coding for 5G Mobile Video Delivery

    An exponential increase in mobile video delivery will continue with the demand for higher-resolution, multi-view and large-scale multicast video services. The new fifth generation (5G) 3GPP New Radio (NR) standard will bring a number of new opportunities for optimizing video delivery across both 5G core and radio access networks. One of the promising approaches for video quality adaptation, throughput enhancement and erasure protection is the use of packet-level random linear network coding (RLNC). In this review paper, we discuss the integration of RLNC into the 5G NR standard, building upon the ideas and opportunities identified in 4G LTE. We explicitly identify and discuss in detail novel 5G NR features that provide support for RLNC-based video delivery in 5G, thus pointing to promising avenues for future research. Comment: Invited paper for the Special Issue "Network and Rateless Coding for Video Streaming", MDPI Information
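    As background for the RLNC approach discussed above, the sketch below shows the basic encoding step of packet-level random linear network coding. It is a minimal illustration only, assuming binary coefficients over GF(2) (practical coders often use GF(2^8)); it is not the specific scheme considered in the paper.

```python
# Illustrative sketch only: packet-level RLNC with binary (GF(2)) coefficients.
# Each coded packet is the XOR of a random subset of the K source packets; the
# coefficient vector travels with the payload so a receiver can decode after
# collecting K linearly independent coded packets (e.g. by Gaussian elimination).
import os
import random

def rlnc_encode(source_packets, num_coded):
    """Generate coded packets as random GF(2) combinations of the source packets."""
    k = len(source_packets)
    size = len(source_packets[0])
    coded = []
    for _ in range(num_coded):
        coeffs = [random.randint(0, 1) for _ in range(k)]
        if not any(coeffs):                    # avoid the useless all-zero combination
            coeffs[random.randrange(k)] = 1
        payload = bytearray(size)
        for i, c in enumerate(coeffs):
            if c:
                for j in range(size):
                    payload[j] ^= source_packets[i][j]
        coded.append((coeffs, bytes(payload)))
    return coded

# Example: 4 source packets of 8 bytes each, 6 coded packets (redundancy against erasures)
sources = [os.urandom(8) for _ in range(4)]
coded_packets = rlnc_encode(sources, num_coded=6)
```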

    LTE Optimization and Resource Management in Wireless Heterogeneous Networks

    Mobile communication technology is evolving at a great pace. The development of the Long Term Evolution (LTE) mobile system by 3GPP is one of the milestones in this direction. This work highlights a few areas in the LTE radio access network where the proposed innovative mechanisms can substantially improve overall LTE system performance. In order to further extend the capacity of LTE networks, an integration with non-3GPP networks (e.g., WLAN, WiMAX, etc.) is also proposed in this work. Moreover, it is discussed how bandwidth resources should be managed in such heterogeneous networks. The work proposes a comprehensive system architecture as an overlay of the 3GPP-defined SAE architecture, effective resource management mechanisms, and a Linear Programming-based analytical solution to the optimal network resource allocation problem. In addition, alternative, computationally efficient heuristic-based algorithms have also been designed to achieve near-optimal performance.
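    The Linear Programming formulation itself is not reproduced in the abstract, so the sketch below is only a toy illustration of the general idea: splitting user traffic across heterogeneous access networks (e.g., LTE and WLAN) to maximize weighted throughput under per-network capacity limits. All numbers, the objective and the constraints are assumptions for illustration, not the thesis' actual model.

```python
# A minimal sketch of a resource allocation LP over two access networks.
# Capacities, demands and weights below are purely illustrative.
import numpy as np
from scipy.optimize import linprog

capacities = [50.0, 30.0]          # Mbps available on network 0 (LTE) and 1 (WLAN)
demands = [20.0, 35.0, 40.0]       # per-user traffic demand in Mbps
weights = [1.0, 2.0, 1.5]          # per-user priority / utility weight

U, N = len(demands), len(capacities)
# Decision variable x[u, n] = rate given to user u on network n, flattened row-major.
c = -np.repeat(weights, N)         # linprog minimises, so negate the weighted sum

A_ub, b_ub = [], []
for n in range(N):                 # capacity constraint of each network
    row = np.zeros(U * N)
    row[n::N] = 1.0
    A_ub.append(row)
    b_ub.append(capacities[n])
for u in range(U):                 # a user never receives more than it demands
    row = np.zeros(U * N)
    row[u * N:(u + 1) * N] = 1.0
    A_ub.append(row)
    b_ub.append(demands[u])

res = linprog(c, A_ub=np.array(A_ub), b_ub=b_ub, bounds=(0, None), method="highs")
allocation = res.x.reshape(U, N)
print("allocation (Mbps):\n", allocation.round(2))
```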

    A Data-Driven Traffic Steering Algorithm for Optimizing User Experience in Multi-Tier LTE Networks

    Multi-tier cellular networks are a cost-effective solution for capacity enhancement in urban scenarios. In these networks, effective mobility strategies are required to assign users to the most adequate layer. In this paper, a data-driven self-tuning algorithm for traffic steering is proposed to improve the overall Quality of Experience (QoE) in multi-carrier Long Term Evolution (LTE) networks. Traffic steering is achieved by changing Reference Signal Received Quality (RSRQ)-based inter-frequency handover margins. Unlike classical approaches that consider cell-aggregated counters to drive the tuning process, the proposed algorithm relies on a novel indicator, derived from connection traces, showing the impact of handovers on user QoE. Method assessment is carried out in a dynamic system-level simulator implementing a real multi-carrier LTE scenario. Results show that the proposed algorithm significantly improves the QoE figures obtained with classical load balancing techniques. This work was supported in part by the Spanish Ministry of Economy and Competitiveness under Grant TEC2015-69982-R, in part by the Spanish Ministry of Education, Culture and Sports under FPU Grant FPU17/04286, and in part by the Horizon 2020 Project ONE5G under Grant ICT-76080
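    To make the self-tuning idea more concrete, the snippet below sketches one possible tuning loop that nudges each adjacency's RSRQ-based inter-frequency handover margin according to a trace-derived QoE-impact indicator. The indicator sign convention, step size and margin bounds are assumptions for illustration; they are not the rule defined in the paper.

```python
# A minimal sketch of a per-adjacency self-tuning loop for handover margins.
# Assumed conventions: qoe_impact > 0 means recent handovers on this adjacency
# degraded user QoE (so make handover harder by raising the margin);
# qoe_impact < 0 means they helped (so make handover easier by lowering it).

MARGIN_MIN_DB, MARGIN_MAX_DB = -6.0, 6.0   # assumed allowed margin range
STEP_DB = 0.5                              # assumed tuning step per iteration

def tune_margin(current_margin_db, qoe_impact):
    """Move the handover margin one step in the direction that favours QoE."""
    if qoe_impact > 0:
        current_margin_db += STEP_DB
    elif qoe_impact < 0:
        current_margin_db -= STEP_DB
    return max(MARGIN_MIN_DB, min(MARGIN_MAX_DB, current_margin_db))

# One tuning iteration across adjacencies, given indicators computed from traces
margins = {("macro_1", "small_7"): 2.0, ("macro_1", "small_9"): 0.0}
qoe_indicators = {("macro_1", "small_7"): 0.8, ("macro_1", "small_9"): -0.3}
margins = {adj: tune_margin(m, qoe_indicators[adj]) for adj, m in margins.items()}
```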

    Spectrum Sharing, Latency, and Security in 5G Networks with Application to IoT and Smart Grid

    The surge of mobile devices, such as smartphones and tablets, demands additional capacity. On the other hand, the Internet-of-Things (IoT) and the smart grid, which connect numerous sensors, devices, and machines, require ubiquitous connectivity and data security. Additionally, some use cases, such as automated manufacturing processes, automated transportation, and the smart grid, require latency as low as 1 ms and reliability as high as 99.99%. To enhance throughput and support massive connectivity, sharing of the unlicensed spectrum (3.5 GHz, 5 GHz, and mmWave) is a potential solution. On the other hand, to address the latency, drastic changes in the network architecture are required. The fifth generation (5G) cellular networks will embrace spectrum sharing and network architecture modifications to address throughput enhancement, massive connectivity, and low latency. To utilize the unlicensed spectrum, we propose a fixed duty-cycle-based coexistence of LTE and WiFi, in which the duty cycle of LTE transmission can be adjusted based on the amount of data. In the second approach, a multi-armed bandit learning-based coexistence of LTE and WiFi has been developed. The duty cycle of transmission and the downlink power are adapted through exploration and exploitation. This approach improves the aggregated capacity by 33%, along with cell-edge and energy-efficiency enhancements. We also investigate the performance of LTE and ZigBee coexistence using the smart grid as a scenario. Regarding low latency, we summarize the existing works into three domains in the context of 5G networks: core, radio, and caching networks. Along with this, fundamental constraints for achieving low latency are identified, followed by a general overview of exemplary 5G networks. Besides that, a loop-free, low-latency, local-decision-based routing protocol is derived in the context of the smart grid. This approach ensures low-latency and reliable data communication for stationary devices. To address data security in wireless communication, we introduce geo-location-based data encryption, along with node authentication using the k-nearest neighbor algorithm. In the second approach, node authentication by a support vector machine, along with public-private key management, is proposed. Both approaches ensure data security without increasing packet overhead compared to the existing approaches.
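    As an illustration of the learning-based coexistence idea, the sketch below shows an epsilon-greedy multi-armed bandit in which each arm is a (duty cycle, downlink power) pair for the LTE cell sharing an unlicensed channel with WiFi. The arm set, epsilon value, and the placeholder reward are illustrative assumptions, not the dissertation's algorithm or parameters.

```python
# A minimal epsilon-greedy multi-armed bandit sketch for LTE/WiFi coexistence.
# Each arm is an assumed (duty cycle, downlink power in dBm) pair; the reward
# here is a placeholder standing in for a measured aggregate throughput.
import random

ARMS = [(dc, p) for dc in (0.2, 0.4, 0.6, 0.8) for p in (10, 20, 30)]
EPSILON = 0.1

counts = [0] * len(ARMS)
values = [0.0] * len(ARMS)          # running mean reward per arm

def select_arm():
    """Explore a random arm with probability EPSILON, otherwise exploit the best."""
    if random.random() < EPSILON:
        return random.randrange(len(ARMS))
    return max(range(len(ARMS)), key=lambda i: values[i])

def update(arm, reward):
    """Incremental mean update of the chosen arm's value estimate."""
    counts[arm] += 1
    values[arm] += (reward - values[arm]) / counts[arm]

def measure_reward(duty_cycle, power_dbm):
    # Placeholder: in a real system this would be the measured aggregate
    # LTE + WiFi throughput (and/or energy efficiency) for the chosen arm.
    return random.random()

for _ in range(1000):
    i = select_arm()
    update(i, measure_reward(*ARMS[i]))
```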