32,584 research outputs found

    Energy-Efficient Resource Allocation in Cloud and Fog Radio Access Networks

    PhD Thesis
    With the development of cloud computing, radio access networks (RANs) are migrating to fully or partially centralised architectures such as the Cloud RAN (C-RAN) and the Fog RAN (F-RAN). These novel architectures can support new applications with higher throughput, higher energy efficiency and better spectral efficiency. However, the more complex energy consumption characteristics they introduce are challenging. In addition, the use of Energy Harvesting (EH) technology and of computation offloading in these architectures calls for new resource allocation designs. This thesis focuses on energy-efficient resource allocation for Cloud and Fog RAN networks.
    Firstly, a joint user association (UA) and power allocation scheme is proposed for Heterogeneous Cloud Radio Access Networks with hybrid energy sources, where Energy Harvesting technology is utilised. The optimisation problem is designed to maximise the utilisation of the renewable energy source. By solving it, the user association and power allocation policies are derived jointly to minimise grid power consumption. Compared with the conventional UA schemes adopted in RANs, the green power harvested from the renewable energy source is better utilised, so grid power consumption is greatly reduced under the proposed scheme.
    Secondly, a delay-aware, energy-efficient computation offloading scheme is proposed for EH-enabled F-RANs, where fog access points (F-APs) are powered by renewable energy sources. The uneven distribution of the harvested energy introduces dynamics into the offloading design and affects the delay experienced by users. A grid power minimisation problem is formulated and, based on the solutions derived, an energy-efficient offloading decision algorithm is designed. Compared with an SINR-based offloading scheme, the proposed algorithm significantly reduces the total grid power consumption of all F-APs while meeting the latency constraint.
    Thirdly, energy-efficient computation offloading for mobile applications with shared data is investigated in a multi-user fog computing network. Taking advantage of the shared-data property of latency-critical applications such as virtual reality (VR) and augmented reality (AR), the energy minimisation problem is formulated, and an optimal computation offloading and communication resource allocation policy is proposed that minimises the overall energy consumption of the mobile users and the cloudlet server. Performance analysis indicates that the proposed policy outperforms other offloading schemes in terms of energy efficiency.
    The research conducted in this thesis and the accompanying performance analysis reveal insights into energy-efficient resource allocation design in Cloud and Fog RANs.
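    The delay-aware offloading idea above lends itself to a simple greedy illustration. The Python sketch below is not the thesis' algorithm; it is a minimal example, under assumed per-user power and latency figures, of how an association rule can prefer F-APs with spare harvested energy while respecting a latency bound.

        # Illustrative sketch only: a greedy, grid-power-aware association rule in the
        # spirit of the thesis (not its actual optimisation). All parameters are assumed.
        from dataclasses import dataclass

        @dataclass
        class FogAP:
            """Hypothetical F-AP model: a harvested-power budget and fixed per-user costs."""
            name: str
            harvested_budget: float   # harvested power available in the current slot (W)
            power_per_user: float     # power needed to serve one user (W)
            latency: float            # latency this F-AP offers a new user (ms)

        def associate_users(num_users, f_aps, latency_limit):
            """Each user goes to the F-AP that meets the latency limit and draws
            the least extra power from the grid."""
            assignment = []
            for _ in range(num_users):
                best, best_grid = None, float("inf")
                for ap in f_aps:
                    if ap.latency > latency_limit:      # delay-aware constraint
                        continue
                    # Grid power is whatever the harvested budget cannot cover.
                    grid = max(0.0, ap.power_per_user - ap.harvested_budget)
                    if grid < best_grid:
                        best, best_grid = ap, grid
                if best is None:                        # no F-AP meets the delay bound
                    assignment.append(None)
                    continue
                best.harvested_budget = max(0.0, best.harvested_budget - best.power_per_user)
                assignment.append(best.name)
            return assignment

        aps = [FogAP("AP1", harvested_budget=3.0, power_per_user=1.0, latency=8.0),
               FogAP("AP2", harvested_budget=0.5, power_per_user=1.0, latency=5.0)]
        print(associate_users(5, aps, latency_limit=10.0))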

    Designing Parametric Constraint Based Power Aware Scheduling System in a Virtualized Cloud Environment

    The increasing demand for computational resources has led to the production of large-scale data centers, which consume huge amounts of electrical power, resulting in high operational costs and carbon dioxide emissions. Power-related costs have become one of the major economic factors in IT data centers, and companies and the research community are currently working on new, efficient power-aware resource management strategies, also known as “Green IT”. Here we propose a framework for the autonomic scheduling of tasks based on parametric constraints. In this paper we analyse the critical factors affecting the energy consumption of cloud servers in cloud computing and consider how performance can be kept fast by using the Sigar API to address speed problems. In PCBPAS we impose parametric constraints during task allocation to a server; these constraints can be adjusted dynamically to balance server workloads efficiently, so that CPU utilisation is improved and energy savings are achieved.
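    As a rough illustration of the parametric-constraint idea described above, the following Python sketch (not the paper's PCBPAS implementation, which gathers live metrics with the Sigar API) admits a task onto a server only while adjustable CPU and memory thresholds would still hold, preferring the server that ends up least loaded. The figures and names are assumptions.

        # Illustrative sketch only: parametric-constraint task allocation.
        from dataclasses import dataclass

        @dataclass
        class Server:
            name: str
            cpu_used: float   # fraction of CPU currently in use, 0..1
            mem_used: float   # fraction of memory currently in use, 0..1

        def allocate(task_cpu, task_mem, servers, cpu_limit=0.8, mem_limit=0.8):
            """Return the server that can host the task within the parametric limits,
            choosing the one with the lowest resulting CPU load; None if none fits."""
            candidates = [s for s in servers
                          if s.cpu_used + task_cpu <= cpu_limit
                          and s.mem_used + task_mem <= mem_limit]
            if not candidates:
                return None   # a real scheduler would queue the task or scale out
            best = min(candidates, key=lambda s: s.cpu_used + task_cpu)
            best.cpu_used += task_cpu
            best.mem_used += task_mem
            return best.name

        servers = [Server("s1", 0.70, 0.40), Server("s2", 0.30, 0.50)]
        print(allocate(0.2, 0.1, servers))   # -> 's2'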

    Energy-Efficient Management of Data Center Resources for Cloud Computing: A Vision, Architectural Elements, and Open Challenges

    Cloud computing is offering utility-oriented IT services to users worldwide. Based on a pay-as-you-go model, it enables the hosting of pervasive applications from the consumer, scientific, and business domains. However, the data centers hosting Cloud applications consume huge amounts of energy, contributing to high operational costs and carbon footprints. Therefore, we need Green Cloud computing solutions that not only save energy for the environment but also reduce operational costs. This paper presents the vision, challenges, and architectural elements for energy-efficient management of Cloud computing environments. We focus on the development of dynamic resource provisioning and allocation algorithms that consider the synergy between the various data center infrastructures (i.e., hardware, power units, cooling and software) and work holistically to boost data center energy efficiency and performance. In particular, this paper proposes (a) architectural principles for energy-efficient management of Clouds; (b) energy-efficient resource allocation policies and scheduling algorithms that consider quality-of-service expectations and devices' power usage characteristics; and (c) a novel software technology for energy-efficient management of Clouds. We have validated our approach by conducting a set of rigorous performance evaluation studies using the CloudSim toolkit. The results demonstrate that the Cloud computing model has immense potential, offering significant gains in response time and cost savings under dynamic workload scenarios.
    Comment: 12 pages, 5 figures. Proceedings of the 2010 International Conference on Parallel and Distributed Processing Techniques and Applications (PDPTA 2010), Las Vegas, USA, July 12-15, 2010.
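    To make the flavour of such energy-efficient allocation policies concrete, here is a small Python sketch. It is not the paper's CloudSim-based policy (CloudSim itself is a Java toolkit); it simply places each VM on the host whose estimated power draw rises least under a linear idle-to-peak power model. Host parameters are illustrative assumptions.

        # Illustrative sketch only: power-aware VM placement with a linear power model.
        from dataclasses import dataclass

        @dataclass
        class Host:
            name: str
            cpu_capacity: float   # total CPU capacity (e.g. MIPS)
            cpu_used: float       # CPU already allocated
            p_idle: float         # watts at 0% utilisation
            p_peak: float         # watts at 100% utilisation

            def power(self, extra=0.0):
                """Estimated power draw if 'extra' CPU were also allocated."""
                util = min(1.0, (self.cpu_used + extra) / self.cpu_capacity)
                return self.p_idle + (self.p_peak - self.p_idle) * util

        def place_vm(vm_cpu, hosts):
            """Pick the host with the smallest power increase that still fits the VM."""
            fitting = [h for h in hosts if h.cpu_used + vm_cpu <= h.cpu_capacity]
            if not fitting:
                return None
            best = min(fitting, key=lambda h: h.power(vm_cpu) - h.power())
            best.cpu_used += vm_cpu
            return best.name

        hosts = [Host("h1", 4000, 3000, 100, 300), Host("h2", 4000, 500, 70, 210)]
        print(place_vm(800, hosts))   # -> 'h2', the more power-efficient host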

    A Taxonomy for Management and Optimization of Multiple Resources in Edge Computing

    Edge computing is promoted to meet the increasing performance needs of data-driven services, using computational and storage resources close to the end devices at the edge of the current network. To achieve higher performance in this new paradigm, one has to consider how to use resources efficiently across all three layers of the architecture: end devices, edge devices, and the cloud. While cloud capacity is elastically extendable, end devices and edge devices are resource-constrained to varying degrees. Hence, efficient resource management is essential to make edge computing a reality. In this work, we first present terminology and architectures to characterize current works within the field of edge computing. Then, we review a wide range of recent articles and categorize relevant aspects in terms of four perspectives: resource type, resource management objective, resource location, and resource use. This taxonomy and the ensuing analysis are used to identify gaps in the existing research. Among several research gaps, we found that research is less prevalent on data, storage, and energy as resources, and less extensive on the estimation, discovery and sharing objectives. As for resource types, the most well-studied resources are computation and communication resources. Our analysis shows that resource management at the edge requires a deeper understanding of how methods applied at different levels and geared towards different resource types interact. Specifically, the impact of mobility and of collaboration schemes requiring incentives is expected to differ in edge architectures compared to classic cloud solutions. Finally, we find that fewer works are dedicated to the study of non-functional properties or to quantifying the footprint of resource management techniques, including edge-specific means of migrating data and services.
    Comment: Accepted in the Special Issue Mobile Edge Computing of the Wireless Communications and Mobile Computing journal.
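    The four classification perspectives could be captured in code roughly as follows. This is only an illustrative Python sketch: the enum members are limited to values named in the abstract, and the paper's full taxonomy contains more categories than shown here.

        # Illustrative sketch only: the survey's four perspectives as simple types.
        from dataclasses import dataclass
        from enum import Enum

        class ResourceType(Enum):
            COMPUTATION = "computation"
            COMMUNICATION = "communication"
            DATA = "data"
            STORAGE = "storage"
            ENERGY = "energy"

        class Objective(Enum):            # only the objectives named in the abstract
            ESTIMATION = "estimation"
            DISCOVERY = "discovery"
            SHARING = "sharing"

        class Location(Enum):
            END_DEVICE = "end device"
            EDGE_DEVICE = "edge device"
            CLOUD = "cloud"

        @dataclass
        class ReviewedWork:
            """One surveyed article, tagged along the four perspectives."""
            title: str
            resource_types: list
            objective: Objective
            location: Location
            resource_use: str             # kept as free text in this sketch

        work = ReviewedWork("example offloading paper",
                            [ResourceType.COMPUTATION, ResourceType.ENERGY],
                            Objective.SHARING, Location.EDGE_DEVICE,
                            "task offloading")
        print(work.objective.value)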

    Energy-aware dynamic virtual machine consolidation for cloud datacenters

    • …