2,933 research outputs found

    Optimization Of Operational Costs In Data Centers

    The electricity cost of cloud computing data centers, dominated by server power and cooling power, is growing rapidly. To tackle this problem, moderate-temperature inlet air and server consolidation are widely adopted. However, the benefit of these two methods is limited because conventional air cooling systems are rendered ineffective by re-circulation and the low heat capacity of air. To address this problem, hybrid air and liquid cooling has been introduced as a practical and inexpensive approach. In this work, we quantitatively analyze the impact of server consolidation and cooling water temperature on the total electricity and server maintenance costs in hybrid cooling data centers. To minimize the total cost, we propose maintaining a sweet temperature and an ASTT (available sleeping time threshold), by which a joint cost optimization can be achieved. Using real-world traces, the potential savings of the sweet temperature and ASTT are estimated to average 18% of the total cost while satisfying 99% of requests, compared to a strategy that only reduces electricity cost. The co-optimization is extended to increase the benefit of renewable energy, and its profit grows as more wind power is supplied.
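
    The abstract does not reproduce the underlying cost model, but the joint optimization it describes can be illustrated with a minimal sketch. The functional forms, coefficients, and names below (cooling_power_kw, idle_power_kw, wear_cost, the candidate grids) are illustrative assumptions, not the authors' model.

```python
import itertools

# Illustrative coefficients; these are assumptions, not values from the paper.
IT_LOAD_KW = 500.0      # average IT load
ELEC_PRICE = 0.10       # $/kWh
HOURS = 24.0

def cooling_power_kw(coolant_temp_c):
    # Warmer coolant needs less chiller work (assumed quadratic in the temperature gap).
    return 0.1 * (45.0 - coolant_temp_c) ** 2

def idle_power_kw(astt_min):
    # A larger sleeping-time threshold keeps more servers awake and idling.
    return 150.0 * astt_min / (astt_min + 30.0)

def wear_cost(coolant_temp_c, astt_min):
    # Hotter servers and frequent sleep/wake cycles both raise maintenance cost ($/day).
    thermal_wear = 0.2 * max(0.0, coolant_temp_c - 25.0) ** 2
    cycling_wear = 3000.0 / (astt_min + 10.0)
    return thermal_wear + cycling_wear

def total_daily_cost(coolant_temp_c, astt_min):
    power_kw = IT_LOAD_KW + idle_power_kw(astt_min) + cooling_power_kw(coolant_temp_c)
    return power_kw * HOURS * ELEC_PRICE + wear_cost(coolant_temp_c, astt_min)

# Grid search for the cost-minimizing ("sweet") temperature and ASTT.
candidates = itertools.product(range(20, 41), range(0, 121, 5))
best = min(candidates, key=lambda c: total_daily_cost(*c))
print("sweet temperature (C), ASTT (min):", best,
      "-> daily cost $", round(total_daily_cost(*best), 2))
```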

    Techno-Economic Study on The Alternative Power and Cooling Systems Design for Cost & Energy-Efficient Edge Cloud Data Center(s)

    5G technology has enabled performance-sensitive applications with low latency and high bandwidth requirements, placing stricter latency demands on computing services. To answer this need, small-scale data centers called edge clouds are predicted to grow rapidly in the future. Because edge clouds are located close to end-users, their growth in populated areas may strain the existing power system. Besides this power system challenge, an edge cloud also incurs a higher resource cost than a hyperscale data center because it lacks economies of scale. In this thesis, four viable alternative power and cooling technologies are introduced to address those challenges: solar PV, the Vertical Axis Wind Turbine (VAWT), the Rear Door Heat Exchanger (RDHx), and immersion cooling. Detailed data on edge clouds are required to understand the contribution of these four technologies. However, because edge clouds are still in their infancy, such data are unavailable, and assumptions are made instead. In addition, a cost model for an edge cloud is required to show how significant the contribution of these alternative technologies is relative to the total cost of ownership. In this thesis, the cost model for the edge cloud is extended to cover the alternative power and cooling system scenarios. Together with the assumed edge cloud data, a sensitivity analysis is performed to determine whether the alternative power and cooling technologies can bring down the cost of edge cloud resources. The cost modeling shows that VAWT and immersion cooling are not feasible for the assumed data center. On the other hand, solar PV can save 4.55% of data center electricity consumption (equal to a 0.21% reduction of the total expense when calculated using the current electricity price). Furthermore, RDHx performs better, saving 22.73% of data center electricity expenses (equivalent to an 8.35% saving from the total cost when calculated using the current electricity price).
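
    As a rough illustration of how such electricity savings map to the total cost of ownership, the sketch below multiplies an assumed electricity share of TCO by a technology's fractional electricity saving. The 0.37 share is a placeholder, not a figure from the thesis; only the 22.73% saving quoted above comes from the abstract.

```python
def tco_saving_fraction(electricity_share_of_tco: float,
                        electricity_saving_fraction: float) -> float:
    """Fraction of total cost of ownership saved when a technology cuts the
    electricity bill by a given fraction, all else held equal."""
    return electricity_share_of_tco * electricity_saving_fraction

# Placeholder inputs: an edge cloud whose electricity bill is 37% of TCO,
# with a cooling retrofit that trims that bill by 22.73%.
saving = tco_saving_fraction(0.37, 0.2273)
print(f"Total-cost saving: {saving:.2%}")   # ~8.4% of TCO under these assumptions
```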

    Sustainability in astroparticle physics

    The topic of sustainability is becoming increasingly important in research activities in astroparticle physics, both for existing and for future instruments. At this year's International Cosmic Ray Conference (ICRC 2021), one session was dedicated to this topic. This publication summarises the findings of this well-attended online session.

    Economic Analysis of a Data Center Virtual Power Plant Participating in Demand Response

    Data centers consume a significant amount of energy from the grid, and the number of data centers is increasing at a high rate. As the demand on the transmission system increases, network congestion reduces the economic efficiency of the grid and begins to risk failure. Data centers have underutilized energy resources, such as backup generators and battery storage, which can be used for demand response (DR) to benefit both the electric power system and the data center. Therefore, data center energy resources, including renewable energy, are aggregated and controlled using an energy management system (EMS) to operate as a virtual power plant (VPP). The data center as a VPP participates in a day-ahead DR program to relieve network congestion and improve market efficiency. Data centers mostly use lead-acid batteries as the energy reserve in Uninterruptible Power Supply (UPS) systems that ride through power fluctuations and short-term power outages. These batteries are sized according to the power requirement of the data center and the backup power duration required for its reliable operation. Most of the time, these batteries remain on float charge, with only occasional charging and discharging cycles. Batteries have a limited float life, at the end of which the battery is assumed dead and requires replacement. Therefore, the battery's unused energy can be utilized for DR by allocating a daily energy budget that does not affect the battery's overall float life. This is incorporated as a soft constraint in the EMS model, and any use of battery energy beyond the daily budget incurs the battery's wear cost. A case study is conducted in which the data center is placed on a modified version of the IEEE 30-bus test system to evaluate the potential economic savings from participating in the DR program, coordinated by the Independent System Operator (ISO). We show that the savings of the data center operating as a VPP and participating in the DR program far outweigh the additional expense of operating its own generators and batteries.
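
    The EMS formulation is not given in the abstract; the sketch below only illustrates the soft-constraint idea, in which battery energy used within the daily budget carries no wear cost and any overuse is penalised. The function name, prices, and wear-cost coefficient are assumptions for illustration.

```python
def battery_dispatch_cost(discharge_kwh: float,
                          daily_budget_kwh: float,
                          dr_price: float,
                          wear_cost_per_kwh: float) -> float:
    """Net daily cost of discharging the UPS battery for demand response.

    Energy within the daily budget is treated as free (it does not shorten the
    battery's float life); energy beyond the budget is penalised with a wear
    cost, i.e. the budget acts as a soft constraint. A negative return value
    means a net benefit to the data center.
    """
    revenue = dr_price * discharge_kwh
    overuse = max(0.0, discharge_kwh - daily_budget_kwh)
    return wear_cost_per_kwh * overuse - revenue

# Illustrative numbers: 200 kWh budget, $0.30/kWh DR payment, $0.25/kWh wear cost.
for discharge in (150.0, 200.0, 300.0):
    net = battery_dispatch_cost(discharge, 200.0, 0.30, 0.25)
    print(f"{discharge:.0f} kWh discharged -> net cost ${net:.2f}")
```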

    Potential value of waste heat energy from data center in Norway

    The European Union (EU) has set ambitious climate goals through the European Green Deal, aiming for climate neutrality by 2050. As a result, utilization of waste heat has gained momentum due to its potential value to society. Data centers are known for their significant power consumption and cooling needs and present an opportunity as waste heat emitters. This thesis investigates the potential value of waste heat energy from a data center in Norway, with a focus on the Green Edge Compute data center in Stavanger. The assessment considers the context of infrastructure and energy grid distribution, aiming to identify sustainable, integrated waste heat solutions. The technical feasibility of integrating waste heat from data centers with district heating systems is examined, considering the low output temperature from the data center and the seasonal variations in supply and demand. Key factors such as environmental impact, community development, and economy are evaluated to determine the potential value of waste heat utilization. Because liquid-medium waste heat recovery from data centers is relatively new, the limited availability of data and prior research in this specific field leads to broad and general findings. To assess the potential value of waste heat energy from data centers, a comprehensive literature review was conducted, with a relevant case study from Norway selected as the primary source. Collecting and analyzing data, evaluating value factors, identifying limitations, and proposing a high-value solution were the main steps toward understanding and implementing waste heat utilization. Quantitative research was the main method applied. To allow a more nuanced analysis in the absence of test data, a small-scale anonymous qualitative study was performed, in which industry experts were interviewed to gather professional insights. The thesis highlights the significance of waste heat utilization for energy efficiency, environmental sustainability, and economic benefit. By harnessing waste heat, data centers can reduce their reliance on conventional heating systems, resulting in energy conservation and operational efficiency. Implementing waste heat recovery systems offers advantages such as cost savings, economic growth, and employment possibilities. Integrating waste heat into energy grid systems and promoting collaborative initiatives enhances the value of utilizing waste heat from data centers.
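
    As a back-of-the-envelope illustration of the quantities involved, the recoverable thermal power in a liquid cooling loop follows Q = m_dot * c_p * (T_return - T_supply). The flow rate, temperatures, and utilisation hours below are placeholders, not Green Edge Compute measurements.

```python
# Recoverable thermal power from a coolant loop: Q = m_dot * c_p * dT
CP_WATER = 4.186   # specific heat of water, kJ/(kg*K)

def waste_heat_kw(mass_flow_kg_s: float, supply_c: float, return_c: float) -> float:
    """Thermal power (kW) carried by the coolant between supply and return."""
    return mass_flow_kg_s * CP_WATER * (return_c - supply_c)

# Illustrative: a 10 kg/s loop heated from 35 C to 45 C by the servers.
q_kw = waste_heat_kw(10.0, 35.0, 45.0)
annual_mwh = q_kw * 8760 / 1000   # upper bound, if harvested year-round
print(f"{q_kw:.0f} kW thermal, up to {annual_mwh:.0f} MWh/year")
```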

    Electronics Thermal Management in Information and Communications Technologies: Challenges and Future Directions

    This paper reviews thermal management challenges encountered in a wide range of electronics cooling applications, from large-scale (data center and telecommunication) to small-scale systems (personal, portable/wearable, and automotive). It identifies drivers for progress as well as immediate and future challenges based on discussions at the 3rd Workshop on Thermal Management in Telecommunication Systems and Data Centers held in Redwood City, CA, USA, on November 4–5, 2015. Participants in this workshop represented industry and academia, with backgrounds ranging from data center thermal management and energy efficiency to high-performance computing and liquid cooling, thermal management in wearable and mobile devices, and acoustic noise management. By considering a wide range of electronics cooling applications with different length and time scales, this paper identifies both common themes and diverging views in the thermal management community.

    Examining Future Data Center Power Supply Infrastructures

    The rapid expansion of data processing in the past few years has created a massive demand for data center installations worldwide, and energy conservation strategies have become crucial. The enormous increase in data center installations and their significant contribution to global energy consumption require implementing energy-saving techniques and supporting the power grid. This thesis presents an architecture-level review of power distribution systems in data centers, examining AC, DC, and hybrid architectures with a focus on enhancing efficiency and reliability. One of the key areas that can be enhanced to improve the overall energy efficiency of data centers and the provision of ancillary services to the grid is the Uninterruptible Power Supply (UPS). This thesis reviews the state-of-the-art power supply systems and topologies mainly used in data centers and aims to identify ways to increase the overall energy efficiency of data center power supply systems. Moreover, this work presents a detailed analysis of power supply losses and proposes systems that can improve the conversion efficiency of UPS systems under various loading conditions. Performance metrics in the data center business also need to become more accurate; therefore, a variety of performance metrics covering energy efficiency, sustainability, reliability, and cost are analysed in the thesis. The conclusion wraps up the findings and provides guidelines for planning power supply infrastructure under various conditions.
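
    To illustrate why UPS conversion efficiency depends so strongly on loading, the sketch below models a double-conversion UPS with a fixed no-load loss plus a loss proportional to delivered power. The coefficients are assumed for illustration and do not come from the thesis.

```python
def ups_efficiency(load_fraction: float) -> float:
    """Illustrative double-conversion UPS efficiency at a given load fraction:
    a fixed no-load loss plus a loss proportional to delivered power
    (both coefficients are assumptions)."""
    fixed_loss = 0.02   # 2% of rated power drawn even at no load
    prop_loss = 0.03    # 3% of delivered power lost in conversion
    if load_fraction <= 0.0:
        return 0.0
    output = load_fraction
    input_power = output * (1.0 + prop_loss) + fixed_loss
    return output / input_power

# Efficiency drops sharply at light load, which is common in redundant UPS setups.
for load in (0.1, 0.25, 0.5, 0.75, 1.0):
    print(f"{load:.0%} load -> {ups_efficiency(load):.1%} efficient")
```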