2,896 research outputs found

    Energy Efficient Virtual Machines Placement Over Cloud-Fog Network Architecture

    Fog computing is an emerging paradigm that aims to improve the efficiency and QoS of cloud computing by extending the cloud to the edge of the network. This paper develops a comprehensive energy efficiency analysis framework, based on mathematical modeling and heuristics, to study the offloading of virtual machine (VM) services from the cloud to the fog. The analysis addresses the impact of several factors, including the traffic between the VM and its users, the VM workload, the workload-versus-number-of-users profile, and the proximity of fog nodes to users. Overall, the power consumption can be reduced if the VM users’ traffic is high and/or the VMs have a linear power profile. In the linear-profile case, creating multiple VM replicas does not increase the computing power consumption significantly (there may be a slight increase due to idle/baseline power consumption) if the number of users remains constant; the replicas can, however, be brought closer to the end users, thus reducing the transport network power consumption. In our scenario, the optimum placement of VMs over a cloud-fog architecture decreased the total power consumption under high user data rates by 56% compared to an optimized distributed-cloud placement and by 64% compared to placement in the existing AT&T network cloud locations.
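    The trade-off described in this abstract (a linear VM power profile, extra idle power per replica, and transport savings from shorter paths) can be illustrated with a small back-of-the-envelope model. The Python sketch below is not the paper's optimization framework; the power coefficients, hop counts, workload and traffic figures are illustrative assumptions chosen only to show why high user traffic favors fog placement.

        # Minimal sketch (illustrative assumptions, not the paper's model) of why
        # replicating a VM closer to users mainly adds idle power while cutting
        # transport power, when the VM has a linear power profile.

        def computing_power(num_replicas, total_workload_ghz,
                            idle_w=10.0, w_per_ghz=4.0):
            """Linear power profile: each replica pays an idle cost; the
            load-dependent part depends only on the total workload, which is
            fixed by the (constant) number of users."""
            return num_replicas * idle_w + w_per_ghz * total_workload_ghz

        def transport_power(total_traffic_gbps, hops, w_per_gbps_per_hop=8.0):
            """Transport power grows with the traffic volume and the number of
            network hops between the users and the VM hosting location."""
            return total_traffic_gbps * hops * w_per_gbps_per_hop

        total_workload = 20.0   # GHz of CPU demand from all users (assumed)
        total_traffic = 5.0     # Gb/s of user-to-VM traffic (assumed)

        # Centralized cloud: one VM instance, traffic crosses many hops.
        cloud = (computing_power(1, total_workload)
                 + transport_power(total_traffic, hops=6))

        # Fog: several replicas at the edge, traffic stays within one hop.
        fog = (computing_power(4, total_workload)
               + transport_power(total_traffic, hops=1))

        print(f"cloud placement : {cloud:.0f} W")
        print(f"fog placement   : {fog:.0f} W")
        # With high user data rates the transport savings dominate the extra
        # idle power of the replicas, mirroring the trend reported above.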

    A Taxonomy for Management and Optimization of Multiple Resources in Edge Computing

    Edge computing is promoted to meet the increasing performance needs of data-driven services by using computational and storage resources close to the end devices, at the edge of the current network. To achieve higher performance in this new paradigm, one has to consider how to use resources efficiently at all three layers of the architecture: end devices, edge devices, and the cloud. While cloud capacity is elastically extendable, end devices and edge devices are resource-constrained to varying degrees. Hence, efficient resource management is essential to make edge computing a reality. In this work, we first present terminology and architectures to characterize current works within the field of edge computing. We then review a wide range of recent articles and categorize relevant aspects in terms of four perspectives: resource type, resource management objective, resource location, and resource use. This taxonomy and the ensuing analysis are used to identify gaps in the existing research. Among several research gaps, we found that research is less prevalent on data, storage, and energy as resources, and less extensive towards the estimation, discovery, and sharing objectives. As for resource types, the most well-studied resources are computation and communication. Our analysis shows that resource management at the edge requires a deeper understanding of how methods applied at different levels and geared towards different resource types interact. Specifically, the impact of mobility and of collaboration schemes requiring incentives is expected to differ in edge architectures compared to classic cloud solutions. Finally, we find that fewer works are dedicated to the study of non-functional properties or to quantifying the footprint of resource management techniques, including edge-specific means of migrating data and services.
    Comment: Accepted in the Special Issue Mobile Edge Computing of the Wireless Communications and Mobile Computing journal
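    The four survey perspectives (resource type, resource management objective, resource location, and resource use) lend themselves to a simple tagging scheme. The Python sketch below is a hypothetical illustration of how a reviewed work might be classified along those perspectives; the enum members and the example entry are assumptions drawn only from the categories named in the abstract, not the paper's full taxonomy.

        # Hypothetical encoding of the survey's four perspectives as a small
        # data structure; categories are examples taken from the abstract.
        from dataclasses import dataclass
        from enum import Enum, auto

        class ResourceType(Enum):
            COMPUTATION = auto()
            COMMUNICATION = auto()
            DATA = auto()
            STORAGE = auto()
            ENERGY = auto()

        class Objective(Enum):
            ALLOCATION = auto()
            ESTIMATION = auto()
            DISCOVERY = auto()
            SHARING = auto()

        class Location(Enum):
            END_DEVICE = auto()
            EDGE_DEVICE = auto()
            CLOUD = auto()

        @dataclass
        class ReviewedWork:
            title: str
            resource_types: set[ResourceType]
            objectives: set[Objective]
            locations: set[Location]
            resource_use: str  # free-text note on how the resource is used

        # Example classification of a hypothetical offloading paper:
        work = ReviewedWork(
            title="Latency-aware task offloading at the edge",
            resource_types={ResourceType.COMPUTATION, ResourceType.COMMUNICATION},
            objectives={Objective.ALLOCATION},
            locations={Location.END_DEVICE, Location.EDGE_DEVICE},
            resource_use="tasks offloaded from end devices to edge servers",
        )
        print(work.title, "->", sorted(r.name for r in work.resource_types))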