
    Multisite Weather Generators Using Bayesian Networks: An Illustrative Case Study for Precipitation Occurrence

    Many existing approaches for multisite weather generation try to capture several statistics of the observed data (e.g. pairwise correlations) in order to generate spatially and temporally consistent series. In this work we analyse the application of Bayesian networks to this problem, focusing on precipitation occurrence and considering a simple case study to illustrate the potential of this new approach. We use Bayesian networks to approximate the multivariate (multisite) probability distribution of observed gauge data, which is factorized according to the relevant (marginal and conditional) dependencies. This factorization allows the simulation of synthetic samples from the multivariate distribution, thus providing a sound and promising methodology for multisite precipitation series generation. We acknowledge funding provided by the project MULTI‐SDM (CGL2015‐66583‐R, MINECO/FEDER).
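
    A minimal sketch of the factorize-then-sample idea behind this approach (the three-gauge chain structure and all probabilities below are invented for illustration, not the paper's fitted model):

```python
import random

# Toy Bayesian network over three rain gauges (occurrence: 1 = wet, 0 = dry).
# Factorization: P(A, B, C) = P(A) * P(B | A) * P(C | B)
P_A = 0.3                       # P(gauge A wet)
P_B_given_A = {1: 0.8, 0: 0.1}  # P(gauge B wet | state of A)
P_C_given_B = {1: 0.7, 0: 0.2}  # P(gauge C wet | state of B)

def sample_day(rng=random):
    """Ancestral sampling: draw each node after its parents."""
    a = int(rng.random() < P_A)
    b = int(rng.random() < P_B_given_A[a])
    c = int(rng.random() < P_C_given_B[b])
    return a, b, c

# Generate a short synthetic multisite occurrence series.
series = [sample_day() for _ in range(10)]
print(series)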

    Advances in statistical downscaling of climate change scenarios for precipitation based on machine learning techniques

    Even though they are the main tool for studying climate change, global climate models (GCMs) still have a limited spatial resolution and exhibit considerable systematic errors with respect to the observed climate. Statistical downscaling aims to solve this issue by learning empirical relationships between large-scale variables that are well reproduced by GCMs (such as synoptic winds or geopotential) and local observations of the target surface variable, such as precipitation, the focus of this thesis. We propose a series of novel developments which improve the consistency of the downscaled fields and produce plausible local-to-regional climate change scenarios. The results of this thesis have important implications for the different sectors in need of reliable precipitation information to undertake their impact assessments.
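
    The downscaling idea described above can be sketched in a few lines: learn a transfer function from large-scale predictors to local precipitation using observations, then drive it with GCM predictors to build scenarios. All data below are synthetic stand-ins, not the thesis's actual predictors or models:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
n_days, n_predictors = 1000, 4

# Synthetic large-scale predictors (playing the role of, e.g., geopotential
# and synoptic winds from reanalysis) and local precipitation observations.
X = rng.normal(size=(n_days, n_predictors))
true_w = np.array([1.5, -0.7, 0.3, 0.0])
y = np.maximum(X @ true_w + rng.normal(scale=0.5, size=n_days), 0.0)

# Learn the empirical transfer function on the observational period.
model = LinearRegression().fit(X, y)

# In application, the fitted model is driven by GCM predictors instead.
X_gcm = rng.normal(size=(30, n_predictors))   # stand-in for GCM predictor fields
local_scenario = np.maximum(model.predict(X_gcm), 0.0)
print(local_scenario[:5].round(2))
```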

    Managing Expertise in a Distributed Environment

    Expertise is the primary resource and product of professional service and technical firms. These firms often organize around project teams that advise and work under contract for clients. A key problem for management is to deploy expertise in project teams so as to meet the expertise requirements of projects and clients. Because expertise may be geographically distributed across multiple sites, many of these firms create virtual or distributed teams. Doing so gives these firms access to a larger pool of knowledge resources than would be available at one site and helps leverage expertise across the organization. However, geographically distributed collaboration in teams incurs coordination and other costs that local work does not. Is a distributed team worth these costs? We studied a professional service firm with distributed and collocated project teams. In this firm, domain expertise tended to be concentrated within geographic sites, whereas methodological expertise was distributed across the firm. We examined whether a better match of domain and methodological expertise to the needs of projects resulted in more profitable projects, and whether distributed teams matched these two types of expertise to the requirements of projects as well as or better than collocated teams did. We found that most projects were collocated, with members drawn from one site whose domain expertise matched project requirements as well as that of members drawn from other sites. The profits of projects were unrelated to the match of domain expertise with project requirements. However, project profits were significantly and positively related to the match of methodological expertise with project requirements. Furthermore, distributed projects showed a stronger match of methodological expertise with project requirements than did collocated projects, and predicted disproportionately more profits. We conclude that an appropriate utilization of organizationally distributed expertise has a positive impact on project performance.

    Dynamic workflow management for large scale scientific applications

    The increasing computational and data requirements of scientific applications have made the usage of large clustered systems as well as distributed resources inevitable. Although executing large applications in these environments brings increased performance, the automation of the process becomes more and more challenging. The use of complex workflow management systems has been a viable solution for this automation process. In this thesis, we study a broad range of workflow management tools and compare their capabilities, especially in terms of the dynamic and conditional structures they support, which are crucial for the automation of complex applications. We then apply some of these tools to two real-life scientific applications: i) simulation of DNA folding, and ii) reservoir uncertainty analysis. Our implementation is based on the Pegasus workflow planning tool, the DAGMan workflow execution system, the Condor-G computational scheduler, and the Stork data scheduler. The designed abstract workflows are converted to concrete workflows using Pegasus, where jobs are matched to resources; DAGMan makes sure these jobs execute reliably and in the correct order on the remote resources; Condor-G performs the scheduling for the computational tasks; and Stork optimizes the data movement between different components. The integrated solution built on these tools allows the automation of large-scale applications and provides reliability and efficiency in executing complex workflows. We have also developed a new site selection mechanism on top of these systems, which can choose the most available computing resources for the submission of the tasks. The details of our design and implementation, as well as experimental results, are presented.
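
    The following toy sketch illustrates the two ideas combined here: executing a workflow's jobs in dependency order (the role DAGMan plays) and selecting the most available site for each submission. The job names, sites, and queue-length availability metric are assumptions for illustration, not the actual mechanism built on Condor-G:

```python
from collections import deque

# Toy DAG workflow: job -> set of jobs it depends on.
dag = {
    "prepare": set(),
    "simulate": {"prepare"},
    "analyze": {"simulate"},
    "stage_out": {"analyze"},
}
# Hypothetical availability metric: current queue length per site.
site_queue_len = {"siteA": 12, "siteB": 3, "siteC": 7}

def topological_order(dag):
    """Return jobs in an order that respects all dependencies."""
    indeg = {job: len(deps) for job, deps in dag.items()}
    ready = deque(job for job, d in indeg.items() if d == 0)
    order = []
    while ready:
        job = ready.popleft()
        order.append(job)
        for other, deps in dag.items():
            if job in deps:
                indeg[other] -= 1
                if indeg[other] == 0:
                    ready.append(other)
    return order

for job in topological_order(dag):
    site = min(site_queue_len, key=site_queue_len.get)  # least-loaded site wins
    site_queue_len[site] += 1                           # account for this submission
    print(f"submit {job} -> {site}")
```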

    A multi-site methodology for understanding dependencies in flood risk exposure in the UK

    Recent large-scale flood events in the UK and the continued threat of a major North Sea surge have motivated a re-appraisal of how flood risk is modelled. A new generation of flood risk models is starting to consider the spatial and temporal dependencies in flood events. This is important for a wide range of risk-based decision making, with one of its most significant applications being the understanding of insurance exposure. The aim of this thesis is to increase understanding of flood risk exposure in the UK and identify areas where existing modelling capabilities and data limitations contribute to large uncertainties in the estimation of risk. Illustrating a successful collaboration between academia and the insurance industry, a case study of one company's exposure from static caravans is used to develop a methodology for flood risk assessment at multiple sites nested within a national framework. This novel nested approach allows greater detail to be included at sites of interest, resulting in an increased understanding of the risk-driving processes while retaining the large-scale dependence structure. This is demonstrated at high-risk locations on the Lincolnshire and North Wales coastlines and inland on the Rivers Severn and Thames. The proposed methodology takes a flexible, component-based approach and can be adapted to different receptors and end users. A systems-based model is used which explicitly considers all key components of risk. Extreme fluvial and coastal events are modelled statistically using the conditional dependence model of Heffernan and Tawn (2004). Coastal flood defences are essential for the protection of static caravan sites; however, their inclusion in existing risk models contributes significant uncertainties. The quality of data available on flood defence heights is reviewed and a methodology to incorporate spatial variations is proposed. The failure of flood defences is modelled using fragility curves, and inundation modelling is used to route water on the floodplain. Finally, the damage to the static caravans is modelled using depth-damage curves, with reference to the impact of limited observed data on flood damage for caravans. One of the biggest challenges of considering dependencies across multiple scales within a systems model is matching the data requirements across each component. To address this problem, this thesis investigates the relationship between skew surge and wave height to estimate the total inshore water level, and develops a UK-specific method to transform daily mean flow to peak flow. The modular structure of the proposed methodology means different component models can be used to suit the available data; here the integration of both 1D and 2D floodplain inundation models is demonstrated.
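
    The component chain described above (extreme load, defence fragility, inundation, depth-damage) can be sketched as a simple Monte Carlo loop for a single coastal caravan site; every curve and parameter below is invented for illustration and is not taken from the thesis:

```python
import numpy as np

# Illustrative risk chain: extreme water level -> defence failure (fragility
# curve) -> inundation depth -> damage (depth-damage curve).
rng = np.random.default_rng(1)

def fragility(level, crest=4.0, spread=0.3):
    """P(defence failure | water level): logistic curve around crest height."""
    return 1.0 / (1.0 + np.exp(-(level - crest) / spread))

def depth_damage(depth, max_loss=25_000.0):
    """Damage per caravan (GBP), saturating with inundation depth (m)."""
    return max_loss * np.clip(depth / 1.5, 0.0, 1.0)

levels = 3.0 + rng.gumbel(0.4, 0.3, size=10_000)       # synthetic extreme sea levels (m)
fails = rng.random(10_000) < fragility(levels)         # sample defence failure per event
depths = np.where(fails, np.maximum(levels - 3.2, 0.0), 0.0)  # crude post-failure depth
print(f"mean damage per event: £{depth_damage(depths).mean():,.0f}")
```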

    Managing software development information in global configuration management activities

    Software Configuration Management (SCM) techniques have been considered the entry point to rigorous software engineering, where multiple organizations cooperate in a decentralized mode to save resources, ensure the quality of a diverse range of software products, and manage corporate information to obtain a better return on investment. The momentum of Global Software Development (GSD) and the complexity of implementing a correct SCM solution keep growing, not only because of changing circumstances but also because of the interactions and forces related to GSD activities. This paper addresses the role SCM plays in the development of commercial products and systems, and introduces an SCM reference model that describes the relationships between the different technical, organizational, and product concerns any software development company should support in the global market.

    Multisite adaptive computation offloading for mobile cloud applications

    The sheer number of mobile devices and their fast adaptability have contributed to the proliferation of modern advanced mobile applications. These applications are latency-critical and demand high availability. They also often require intensive computation resources and incur excessive energy consumption, while a mobile device has limited computation and energy capacity because of its physical size constraints. The heterogeneous mobile cloud environment consists of different computing resources, such as remote cloud servers in faraway data centres, cloudlets whose goal is to bring the cloud closer to the users, and nearby mobile devices that can be utilised to offload mobile tasks. Heterogeneity in mobile devices and the different sites includes software, hardware, and technology variations. Resource-constrained mobile devices can leverage the shared resource environment to offload their intensive tasks, conserving battery life and improving overall application performance. However, in such a loosely coupled network dominated by mobile devices, new challenges arise: how to seamlessly integrate mobile devices with all the offloading sites, how to simplify the deployment of runtime environments for serving offloading requests from mobile devices, how to identify which parts of the mobile application to offload, how to decide whether to offload them, and how to select the most suitable candidate offloading site, among others. To overcome these challenges, this research work contributes the design and implementation of MAMoC, a loosely coupled end-to-end mobile computation offloading framework. Mobile applications can be adapted to the client library of the framework, while the server components are deployed to the offloading sites to serve offloading requests. The evaluation of the offloading decision engine demonstrates the viability of the proposed solution for managing seamless and transparent offloading in distributed and dynamic mobile cloud environments. All the implemented components of this work are publicly available at the following URL: https://github.com/mamoc-repo
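
    A minimal sketch of the kind of offloading decision described above: estimate completion time locally and at each candidate site (transfer plus remote compute) and pick the cheapest. The cost model and all numbers are illustrative assumptions, not MAMoC's actual decision engine:

```python
# Decide where to run one task: locally, or at whichever site finishes sooner.
def remote_time(input_mb, cycles, bandwidth_mbps, site_speed_ghz):
    transfer = input_mb * 8 / bandwidth_mbps   # seconds to ship the input data
    compute = cycles / (site_speed_ghz * 1e9)  # seconds of remote computation
    return transfer + compute

def decide(task_cycles, input_mb, local_speed_ghz, sites):
    best_choice = "local"
    best_time = task_cycles / (local_speed_ghz * 1e9)  # local execution time
    for name, (bandwidth_mbps, speed_ghz) in sites.items():
        t = remote_time(input_mb, task_cycles, bandwidth_mbps, speed_ghz)
        if t < best_time:
            best_choice, best_time = name, t
    return best_choice, best_time

# Hypothetical sites: (bandwidth in Mbps, CPU speed in GHz).
sites = {"cloudlet": (50.0, 3.0), "cloud": (10.0, 4.5), "peer": (100.0, 1.8)}
choice, seconds = decide(task_cycles=8e9, input_mb=5, local_speed_ghz=1.5, sites=sites)
print(choice, round(seconds, 2))   # the nearby cloudlet wins for this task
```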

    A multi‐scale framework for flood risk analysis at spatially distributed locations

    This paper presents a multi‐scale framework for flood risk analysis from fluvial and coastal sources at broad (including national) scales. The framework combines an extreme value spatial model of fluvial and coastal flood hazards, based on the Heffernan and Tawn conditional dependence model, with a new Markov approach to representing the spatial variability of flood defences. The nested multi‐scale structure enables spatial and temporal dependence at a national scale to be combined with detailed local analysis of inundation and damage. By explicitly considering each stage of the process, potential uncertainties in the risk estimate are identified and can be communicated to end users to encourage informed decision making. The framework is demonstrated by application to an insurance portfolio of static caravan sites across the UK worth over £2bn. In the case study, the largest uncertainties are shown to derive from the spatial structure used in the statistical model and from limited data on flood defences and receptor vulnerability.
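
    The Markov idea for defence variability can be sketched as follows: each successive defence segment's condition depends on its neighbour's, so good and poor stretches cluster spatially rather than varying independently. The states and transition probabilities below are invented for illustration:

```python
import numpy as np

# Markov chain along a defence line: segment i+1's condition depends on
# segment i's, producing spatially correlated stretches of defence quality.
rng = np.random.default_rng(2)
states = ["good", "fair", "poor"]
transition = np.array([        # rows: current state; cols: next segment's state
    [0.85, 0.10, 0.05],
    [0.15, 0.70, 0.15],
    [0.05, 0.15, 0.80],
])

def sample_defence_line(n_segments, start=0):
    """Walk the chain along the coastline, one defence segment at a time."""
    s, line = start, []
    for _ in range(n_segments):
        line.append(states[s])
        s = rng.choice(3, p=transition[s])
    return line

print(sample_defence_line(12))
```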

    Ground Systems Development Environment (GSDE) interface requirements and prototyping plan

    This report describes the data collection and requirements analysis effort of the Ground Systems Development Environment (GSDE) Interface Requirements study. It identifies potential problems in the interfaces among applications and processors in the heterogeneous systems that comprise the GSDE, and describes possible strategies for addressing those problems. It also identifies areas for further research and prototyping to demonstrate the capabilities and feasibility of those strategies, and defines a plan for building the necessary software prototypes.