
Resource Allocation in Multi-access Edge Computing (MEC) Systems: Optimization and Machine Learning Algorithms

With the rapid proliferation of diverse wireless applications, next-generation wireless networks are required to meet widely varying quality of service (QoS) requirements. Existing one-size-fits-all resource allocation algorithms cannot sustain this diversity of QoS demands. In this context, radio access network (RAN) slicing has recently emerged as a promising approach to virtualize network resources and create multiple logical network slices on a common physical infrastructure. Each slice can then be tailored to a specific application with distinct QoS requirements, considerably reducing costs for infrastructure providers. However, efficient virtualized network slicing is only feasible if network resources are efficiently monitored and allocated.

In the first part of this thesis, leveraging tools from fractional programming and the augmented Lagrangian method, I propose an efficient algorithm to jointly optimize users' offloading decisions and the allocation of communication and computing resources in a sliced multi-cell multi-access edge computing (MEC) network in the presence of interference. The objective is to minimize the weighted sum of the delay deviation observed at each slice from its corresponding delay requirement. The considered problem enables slice prioritization, cooperation among MEC servers, and partial offloading to multiple MEC servers.

Due to their high computational and time complexity, traditional centralized optimization solutions are often impractical and non-scalable for real-time resource allocation. Machine learning algorithms have therefore become more vital than ever. To address this issue, in the second part of this thesis, exploiting the power of federated learning (FDL) and optimization theory, I develop a federated deep reinforcement learning framework for joint offloading decision and resource allocation that minimizes a combined delay and energy consumption objective in a MEC-enabled internet-of-things (IoT) network with QoS constraints. The proposed algorithm is applied to an IoT network, since IoT devices suffer significantly from limited computation and battery capacity. The proposed algorithm is distributed in nature, exploits cooperation among devices, preserves privacy, and is executable on resource-limited cellular or IoT devices.
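The abstract does not state the optimization problem of the first part explicitly. A minimal sketch of what a weighted delay-deviation objective of this kind may look like, assuming slices s in a set S with priority weights w_s, a slice delay D_s that depends on offloading decisions x, communication resources b, and computing resources f, and a per-slice delay requirement D_s^req (all symbols here are illustrative, not taken from the thesis):

    \min_{\mathbf{x},\,\mathbf{b},\,\mathbf{f}} \; \sum_{s \in \mathcal{S}} w_s \,\big( D_s(\mathbf{x},\mathbf{b},\mathbf{f}) - D_s^{\mathrm{req}} \big)^{+}
    \quad \text{s.t.} \quad \text{per-cell bandwidth and per-server computing budgets},\; \mathbf{x} \in [0,1]^{N},

where (\cdot)^{+} = \max(\cdot, 0) penalizes only requirement violations; the thesis may instead use an absolute or signed deviation, and partial offloading to multiple MEC servers would make x a fractional split rather than a binary choice.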
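For the second part, the abstract does not spell out the training procedure. Below is a minimal FedAvg-style parameter-averaging sketch, assuming each IoT device refines a shared policy on its own locally collected experience and only model parameters (never raw data) are sent for aggregation; all names (local_update, federated_round) and the fake gradient are illustrative placeholders, not the thesis implementation.

    # Illustrative sketch of federated aggregation of local policy parameters.
    import numpy as np

    def local_update(weights, lr=0.01):
        # A real device would compute a reinforcement-learning gradient from its
        # own (state, action, reward) offloading experience; a random perturbation
        # keeps this sketch self-contained and runnable.
        fake_grad = np.random.normal(scale=0.1, size=weights.shape)
        return weights - lr * fake_grad

    def federated_round(global_weights, num_devices=10, local_steps=5):
        # Each device trains locally, then the server averages the parameters.
        local_models = []
        for _ in range(num_devices):
            w = global_weights.copy()
            for _ in range(local_steps):
                w = local_update(w)
            local_models.append(w)
        return np.mean(local_models, axis=0)  # FedAvg aggregation step

    if __name__ == "__main__":
        w = np.zeros(16)  # toy policy parameter vector
        for rnd in range(3):
            w = federated_round(w)
            print(f"round {rnd}: ||w|| = {np.linalg.norm(w):.4f}")

Averaging parameters rather than exchanging experience is what keeps the scheme privacy-preserving and light enough for resource-limited devices, consistent with the properties claimed in the abstract.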