
    Energy-Efficient Management of Data Center Resources for Cloud Computing: A Vision, Architectural Elements, and Open Challenges

    Cloud computing offers utility-oriented IT services to users worldwide. Based on a pay-as-you-go model, it enables the hosting of pervasive applications from consumer, scientific, and business domains. However, data centers hosting Cloud applications consume huge amounts of energy, contributing to high operational costs and a large carbon footprint. We therefore need Green Cloud computing solutions that not only save energy but also reduce operational costs. This paper presents the vision, challenges, and architectural elements for energy-efficient management of Cloud computing environments. We focus on the development of dynamic resource provisioning and allocation algorithms that consider the synergy between the various data center infrastructures (i.e., hardware, power units, cooling, and software) and work holistically to boost data center energy efficiency and performance. In particular, this paper proposes (a) architectural principles for energy-efficient management of Clouds; (b) energy-efficient resource allocation policies and scheduling algorithms that consider quality-of-service expectations and devices' power usage characteristics; and (c) a novel software technology for energy-efficient management of Clouds. We have validated our approach through a set of rigorous performance evaluation studies using the CloudSim toolkit. The results demonstrate that the Cloud computing model has immense potential, offering significant gains in response time and cost savings under dynamic workload scenarios.
    Comment: 12 pages, 5 figures. Proceedings of the 2010 International Conference on Parallel and Distributed Processing Techniques and Applications (PDPTA 2010), Las Vegas, USA, July 12-15, 2010
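
    As a concrete illustration of the kind of energy-efficient allocation policy this abstract describes, the Python sketch below assigns each incoming virtual machine to the host whose power draw would grow the least, under the commonly assumed linear power-versus-utilisation model. It is a minimal standalone sketch with hypothetical Host fields, not the authors' CloudSim implementation.

        # Minimal sketch of an energy-aware VM placement heuristic
        # (illustrative only, not the paper's CloudSim code).
        from dataclasses import dataclass

        @dataclass
        class Host:
            capacity: float    # total CPU capacity, e.g. MIPS (hypothetical unit)
            idle_power: float  # watts when idle
            max_power: float   # watts at full utilisation
            used: float = 0.0  # capacity already allocated

            def power(self, extra: float = 0.0) -> float:
                # Linear power model: a common assumption, not from the paper.
                u = (self.used + extra) / self.capacity
                return self.idle_power + (self.max_power - self.idle_power) * u

        def place_vm(hosts, demand):
            """Place a VM on the feasible host with the smallest power increase."""
            best, best_delta = None, float("inf")
            for h in hosts:
                if h.used + demand <= h.capacity:
                    delta = h.power(demand) - h.power()
                    if delta < best_delta:
                        best, best_delta = h, delta
            if best is not None:
                best.used += demand
            return best

        # Example: the host with the smaller marginal power cost receives the VM.
        hosts = [Host(1000, 100, 250), Host(1000, 150, 200)]
        chosen = place_vm(hosts, demand=300)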

    The AEI 10 m prototype interferometer

    A 10 m prototype interferometer facility is currently being set up at the AEI in Hannover, Germany. The prototype interferometer will be housed inside a 100 m^3 ultra-high vacuum envelope. Seismically isolated optical tables inside the vacuum system will be interferometrically interconnected via a suspension platform interferometer. Advanced isolation techniques will be used, such as inverted pendulums and geometrical anti-spring filters in combination with multiple-cascaded pendulum suspensions containing an all-silica monolithic last stage. The light source is a 35 W Nd:YAG laser, geometrically filtered by passing it through a photonic crystal fibre and a rigid pre-modecleaner cavity. Laser frequency stabilisation will be achieved with the aid of a high-finesse suspended reference cavity in conjunction with a molecular iodine reference. Coating thermal noise will be reduced by the use of Khalili cavities as compound end mirrors. Data acquisition and control of the experiments are based on the AdvLIGO digital control and data system. The aim of the project is to test advanced techniques for GEO 600 as well as to conduct experiments in macroscopic quantum mechanics. The most prominent of these experiments is to reach standard quantum-limit sensitivity with an interferometer using 100 g mirrors and subsequently to breach this limit. In this paper we present the layout and current status of the AEI 10 m Prototype Interferometer project.
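
    For context, the standard quantum limit mentioned above has a well-known textbook form (general background, not an expression taken from this paper; numerical prefactors depend on the mirror-mass convention). For a Michelson interferometer with mirror mass m, arm length L, and angular measurement frequency Ω, the free-mass SQL on strain is

        \[ h_{\mathrm{SQL}}(\Omega) = \sqrt{\frac{8\hbar}{m\,\Omega^{2}L^{2}}} \]

    The deliberately small 100 g mirror mass makes the SQL comparatively large, which is what brings quantum-limited operation within reach for a 10 m instrument.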

    Magneto-optic contact for application in an amplifying waveguide optical isolator

    We present the development of a metal-semiconductor contact for a TM-mode amplifying waveguide optical isolator and show that its design is a compromise between good (magneto-)optical performance and good electrical behavior.

    GEO 600 and the GEO-HF upgrade program: successes and challenges

    The German-British laser-interferometric gravitational wave detector GEO 600 is in its 14th year of operation since its first lock in 2001. After GEO 600 participated in science runs with other first-generation detectors, a program known as GEO-HF began in 2009. Its goal was to improve the detector sensitivity at high frequencies, around 1 kHz and above, with technologically advanced yet minimally invasive upgrades, while the detector continued to record science-quality data in between commissioning activities. As of early 2014, all of the planned upgrades have been carried out, and sensitivity improvements of up to a factor of four at the high-frequency end of the observation band have been achieved. Besides science data collection, an experimental program is ongoing with the goal of further improving the sensitivity and evaluating future detector technologies. We summarize the results of the GEO-HF program to date and discuss its successes and challenges.

    Self-organizing maps versus growing neural gas in detecting anomalies in data centers

    Reliability is one of the key performance factors in data centres. The enormous energy costs of these facilities lead data centre operators to increase the ambient temperature of the data room in order to decrease cooling costs. However, increasing the ambient temperature reduces safety margins and can result in a higher number of anomalous events. Anomalies in the data centre need to be detected as soon as possible to optimize cooling efficiency and mitigate their harmful effects on servers. This article proposes the use of clustering-based outlier detection techniques coupled with a trust and reputation system engine to detect anomalies in data centres. We show how self-organizing maps and growing neural gas can be applied to detect cooling and workload anomalies, respectively, in a real data centre scenario with very good detection and isolation rates, in a way that is robust to malfunctions of the sensors that gather server and environmental information.
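
    The sketch below illustrates the general clustering-based approach this abstract describes: train a small self-organizing map on readings taken during normal operation, then flag a sample as anomalous when its quantization error (distance to the nearest map unit) greatly exceeds what was seen in training. It is a toy numpy implementation with made-up temperature data and a hypothetical 3-sigma threshold, not the authors' trust-and-reputation system.

        # Toy SOM-based anomaly detector (illustrative sketch only).
        import numpy as np

        def train_som(data, rows=5, cols=5, epochs=20, lr0=0.5, sigma0=2.0):
            rng = np.random.default_rng(0)
            w = rng.normal(size=(rows, cols, data.shape[1]))  # unit weights
            grid = np.stack(np.meshgrid(np.arange(rows), np.arange(cols),
                                        indexing="ij"), axis=-1)
            n_steps, t = epochs * len(data), 0
            for _ in range(epochs):
                for x in rng.permutation(data):
                    frac = t / n_steps
                    lr = lr0 * (1.0 - frac)            # decaying learning rate
                    sigma = sigma0 * (1.0 - frac) + 0.5  # shrinking neighbourhood
                    d = np.linalg.norm(w - x, axis=-1)
                    bmu = np.array(np.unravel_index(np.argmin(d), d.shape))
                    g = np.exp(-np.sum((grid - bmu) ** 2, axis=-1)
                               / (2.0 * sigma ** 2))
                    w += lr * g[..., None] * (x - w)   # pull units towards x
                    t += 1
            return w

        def quantization_error(w, x):
            # Distance from a sample to its best-matching unit.
            return np.linalg.norm(w - x, axis=-1).min()

        rng = np.random.default_rng(1)
        normal = rng.normal(22.0, 0.5, size=(500, 3))  # e.g. inlet temps, degC
        som = train_som(normal)
        errors = np.array([quantization_error(som, x) for x in normal])
        threshold = errors.mean() + 3.0 * errors.std()  # hypothetical cut-off
        reading = np.array([30.0, 22.0, 22.0])          # one overheating sensor
        print(quantization_error(som, reading) > threshold)  # -> True (anomalous)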