
    Supporting high penetrations of renewable generation via implementation of real-time electricity pricing and demand response

    The rollout of smart meters raises the prospect that domestic customer electrical demand can be responsive to changes in supply capacity. Such responsive demand will become increasingly relevant in electrical power systems, as the proportion of weather-dependent renewable generation increases, due to the difficulty and expense of storing electrical energy. One method of providing response is to allow direct control of customer devices by network operators, as in the UK 'Economy 7' and 'White Meter' schemes used to control domestic electrical heating. However, such direct control is much less acceptable for loads such as washing machines, lighting and televisions. This study instead examines the use of real-time pricing of electricity in the domestic sector. This allows customers to be flexible but, importantly, to retain overall control. A simulation methodology for highlighting the potential effects of, and possible problems with, a national implementation of real-time pricing in the UK domestic electricity market is presented. This is done by disaggregating domestic load profiles and then simulating price-based elastic and load-shifting responses. Analysis of a future UK scenario with 15 GW wind penetration shows that during low-wind events, UK peak demand could be reduced by 8-11 GW. This could remove the requirement for 8-11 GW of standby generation with a capital cost of £2.6 to £3.6 billion. Recommended further work is the investigation of improved demand forecasting and of price-setting strategies. Price setting is a fine balance between giving customers access to plentiful, cheap energy when it is available and increasing prices just enough to reduce demand to the available supply capacity when that capacity is limited.
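    As a rough illustration of the simulation approach described above, the sketch below applies a price-elastic scaling and a simple load-shifting rule to a demand profile. The profile values, elasticity and shiftable fraction are hypothetical placeholders, not figures from the study.

    ```python
    import numpy as np

    # Hypothetical half-hourly demand (GW) and real-time price (p/kWh) profiles.
    # Illustrative values only; the study disaggregates measured UK load profiles.
    demand = np.array([28, 26, 25, 27, 33, 40, 44, 46, 48, 45, 41, 36], dtype=float)
    price = np.array([8, 7, 6, 6, 9, 14, 18, 22, 25, 20, 15, 10], dtype=float)

    ELASTICITY = -0.1          # assumed short-run price elasticity of domestic demand
    SHIFTABLE_FRACTION = 0.15  # assumed share of peak load (e.g. wet appliances) that can defer

    def elastic_response(demand, price, elasticity):
        """Scale each period's demand by its price deviation from the mean price."""
        relative_price = (price - price.mean()) / price.mean()
        return demand * (1.0 + elasticity * relative_price)

    def shift_load(demand, price, fraction):
        """Move a fraction of demand from the most expensive period to the cheapest one."""
        shifted = demand.copy()
        moved = fraction * shifted[price.argmax()]
        shifted[price.argmax()] -= moved
        shifted[price.argmin()] += moved
        return shifted

    responsive = shift_load(elastic_response(demand, price, ELASTICITY), price, SHIFTABLE_FRACTION)
    print(f"Peak demand reduced from {demand.max():.1f} GW to {responsive.max():.1f} GW")
    ```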

    Models in the Cloud: Exploring Next Generation Environmental Software Systems

    There is growing interest in the application of the latest trends in computing and data science methods to improve environmental science. However, we found the penetration of best practice from computing domains such as software engineering and cloud computing into everyday environmental science to be poor. We take from this work a real need to re-evaluate the complexity of software tools and to bring them to the right level of abstraction so that environmental scientists can leverage the latest developments in computing. In the Models in the Cloud project, we look at the role of model-driven engineering, software frameworks and cloud computing in achieving this abstraction. As a case study, we deployed a complex weather model to the cloud and developed a collaborative notebook interface for orchestrating the deployment and analysis of results. We navigated relatively poor support for complex high-performance computing in the cloud to develop abstractions over the complexity of cloud deployment and model configuration. We found great potential in cloud computing to transform science by enabling models to leverage elastic, flexible computing infrastructure and to support new ways of delivering collaborative and open science.
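    The kind of abstraction argued for here can be sketched as a declarative run description that hides provider-specific provisioning behind a single call. The class, function and field names below are hypothetical and are not the project's actual framework or notebook interface.

    ```python
    from dataclasses import dataclass

    @dataclass
    class ModelRun:
        """Declarative description of a model run; cloud-specific detail is hidden behind it."""
        model: str      # model identifier (hypothetical name used below)
        cores: int      # compute requirement, not a provider-specific VM type
        region: str     # where to run
        inputs: dict    # model configuration

    def deploy(run: ModelRun) -> str:
        """Translate the abstract run description into provider-specific provisioning.

        A real framework would call a cloud SDK or an orchestrator here; this stub
        only reports what it would provision.
        """
        return (f"Provisioning {run.cores} cores in {run.region} for model "
                f"'{run.model}' with inputs {sorted(run.inputs)}")

    # Usage, e.g. from a notebook cell: the scientist states *what* to run, not *how*.
    print(deploy(ModelRun(model="weather-model", cores=64, region="eu-west-1",
                          inputs={"domain": "uk", "start": "2016-01-01"})))
    ```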

    Data analytics 2016: proceedings of the fifth international conference on data analytics


    InterCloud: Utility-Oriented Federation of Cloud Computing Environments for Scaling of Application Services

    Cloud computing providers have set up several data centers at different geographical locations over the Internet in order to optimally serve the needs of their customers around the world. However, existing systems do not support mechanisms and policies for dynamically coordinating load distribution among different Cloud-based data centers so as to determine the optimal location for hosting application services and achieve reasonable QoS levels. Further, Cloud computing providers are unable to predict the geographic distribution of users consuming their services; hence load coordination must happen automatically, and the distribution of services must change in response to changes in load. To counter this problem, we advocate the creation of a federated Cloud computing environment (InterCloud) that facilitates just-in-time, opportunistic, and scalable provisioning of application services, consistently achieving QoS targets under variable workload, resource and network conditions. The overall goal is to create a computing environment that supports dynamic expansion or contraction of capabilities (VMs, services, storage, and database) for handling sudden variations in service demands. This paper presents the vision, challenges, and architectural elements of InterCloud for utility-oriented federation of Cloud computing environments. The proposed InterCloud environment supports scaling of applications across multiple vendor clouds. We have validated our approach by conducting a set of rigorous performance evaluation studies using the CloudSim toolkit. The results demonstrate that the federated Cloud computing model has immense potential, as it offers significant performance gains in response time and cost savings under dynamic workload scenarios.
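    The load-coordination idea can be illustrated with a minimal placement sketch: a federation broker picks the data center with spare capacity and the lowest estimated latency for an incoming request. The names and figures below are hypothetical and do not reflect the actual InterCloud architecture or the CloudSim API.

    ```python
    from dataclasses import dataclass

    @dataclass
    class DataCenter:
        name: str
        capacity: int      # VM slots available in total
        load: int          # VM slots currently in use
        latency_ms: float  # estimated network latency to the requesting user

    def place_request(datacenters, vms_needed):
        """Pick the data center that can host the request with the lowest latency.

        A federation broker would refresh load and latency estimates continuously and
        renegotiate placement as demand shifts; this sketch makes a single decision.
        """
        candidates = [dc for dc in datacenters
                      if dc.capacity - dc.load >= vms_needed]
        if not candidates:
            raise RuntimeError("no federated data center has spare capacity")
        best = min(candidates, key=lambda dc: dc.latency_ms)
        best.load += vms_needed
        return best.name

    federation = [DataCenter("eu-1", capacity=100, load=95, latency_ms=20),
                  DataCenter("us-1", capacity=100, load=40, latency_ms=90),
                  DataCenter("ap-1", capacity=100, load=10, latency_ms=160)]
    print(place_request(federation, vms_needed=10))  # -> "us-1": eu-1 is full, us-1 is closer than ap-1
    ```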