A deep reinforcement learning-based resource management scheme for SDN-MEC-supported XR applications

Abstract

The Multi-Access Edge Computing (MEC) paradigm provides a promising solution for efficient computing services at edge nodes, such as base stations (BSs) and access points (APs). By offloading computationally intensive tasks to MEC servers, critical benefits can be achieved in terms of reduced energy consumption at mobile devices and lower processing latency, supporting high Quality of Service (QoS) for many applications. Among the services that would benefit from MEC deployments are eXtended Reality (XR) applications, which are receiving increasing attention from both academia and industry. XR applications have high resource requirements, mostly in terms of network bandwidth, computation, and storage. These resources are often not available in classic network architectures, especially when XR applications run on mobile devices. This paper leverages the concepts of Software Defined Networking (SDN) and Network Function Virtualization (NFV) to propose an innovative resource management scheme that considers heterogeneous QoS requirements at the MEC server level. The resource assignment is formulated by employing a Deep Reinforcement Learning (DRL) technique to support high-quality XR services. The simulation results show how the proposed solution outperforms other state-of-the-art resource management schemes.