9 research outputs found

    LFRic: meeting the challenges of scalability and performance portability in weather and climate models

    Get PDF
    This paper describes LFRic: the new weather and climate modelling system being developed by the UK Met Office to replace the existing Unified Model in preparation for exascale computing in the 2020s. LFRic uses the GungHo dynamical core and runs on a semi-structured cubed-sphere mesh. The design of the supporting infrastructure follows object-oriented principles to facilitate modularity and the use of external libraries where possible. In particular, a 'separation of concerns' between the science code and parallel code is imposed to promote performance portability. An application called PSyclone, developed at the STFC Hartree Centre, can generate the parallel code, enabling deployment of a single-source science code onto different machine architectures. This paper provides an overview of the scientific requirement, the design of the software infrastructure, and examples of PSyclone usage. Preliminary performance results show strong scaling and an indication that hybrid MPI/OpenMP performs better than pure MPI.
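
    The 'separation of concerns' described in this abstract can be pictured with a small sketch: the science kernel is written once, while the surrounding parallel layer (the part PSyclone would generate) can be swapped without touching the science code. The Python below is purely illustrative; PSyclone itself operates on Fortran source, and every name in the sketch (increment_column, apply_kernel_serial, apply_kernel_threaded) is hypothetical rather than LFRic or PSyclone API.

        # Illustrative sketch only: the "separation of concerns" idea, not PSyclone's
        # actual code generation. The science kernel is written once; the parallel
        # driver around it can be swapped without touching it.
        from concurrent.futures import ThreadPoolExecutor

        def increment_column(column):
            """Science kernel: operates on a single mesh column, no parallelism here."""
            return [value + 1.0 for value in column]

        def apply_kernel_serial(columns):
            """One possible parallel layer: plain serial loop over mesh columns."""
            return [increment_column(col) for col in columns]

        def apply_kernel_threaded(columns, workers=4):
            """Alternative parallel layer: same kernel, thread-parallel loop."""
            with ThreadPoolExecutor(max_workers=workers) as pool:
                return list(pool.map(increment_column, columns))

        if __name__ == "__main__":
            mesh = [[float(i)] * 3 for i in range(8)]   # toy stand-in for mesh columns
            assert apply_kernel_serial(mesh) == apply_kernel_threaded(mesh)

    The single-source claim in the abstract rests on this kind of swap: only the generated parallel layer changes between, say, pure MPI and hybrid MPI/OpenMP deployments.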

    Assessing the impacts of biodiversity offset policies

    No full text
    In response to the increasing loss of native vegetation and biodiversity, a growing number of countries have adopted 'offsetting' policies that seek to balance local habitat destruction by restoring, enhancing and/or protecting similar but separate habitat. Although these policies often have a stated aim of producing a 'net gain' or 'no net loss' in environmental benefits, it is challenging to determine the potential impacts of a policy and if, or when, it will achieve its objectives. In this paper we address these questions with a general approach that uses predictive modelling under uncertainty to quantify the ecological impacts of different offset policies. This is demonstrated with a case study to the west of Melbourne, Australia, where a proposed expansion of Melbourne's urban growth boundary would result in a loss of endangered native grassland, requiring offsets to be implemented as compensation. Three different offset policies were modelled: i) no restrictions on offset location, ii) offset locations spatially restricted to a strategically defined area, and iii) offset locations spatially and temporally restricted, requiring all offsets to be implemented before commencing development. The ecological impact of the policies was determined with a system model that predicts future changes in the extent and condition of native grassland.
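
    As a rough illustration of the 'predictive modelling under uncertainty' approach the abstract describes, the sketch below scores hypothetical offset policies with a Monte Carlo loop. All parameter values, distributions and policy names are invented for illustration; they are not the system model or data from the study.

        # Minimal Monte Carlo sketch of comparing offset policies under uncertainty.
        # Every number and distribution here is hypothetical.
        import random

        def simulate_policy(loss_ha, gain_mean, gain_sd, delay_years, decline_rate, runs=10_000):
            """Return the fraction of simulations achieving 'no net loss' of grassland."""
            successes = 0
            for _ in range(runs):
                gain = random.gauss(gain_mean, gain_sd)               # uncertain offset gain (ha)
                interim_loss = loss_ha * decline_rate * delay_years   # decline while offsets mature
                if gain - interim_loss >= loss_ha:
                    successes += 1
            return successes / runs

        policies = {
            "unrestricted offset locations": dict(gain_mean=110, gain_sd=30, delay_years=10),
            "strategic area only":           dict(gain_mean=120, gain_sd=20, delay_years=10),
            "offsets before development":    dict(gain_mean=120, gain_sd=20, delay_years=0),
        }
        for name, params in policies.items():
            prob = simulate_policy(loss_ha=100, decline_rate=0.01, **params)
            print(f"{name}: P(no net loss) = {prob:.2f}")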

    Improved survival in esophageal cancer in the period 1978 to 1983.

    No full text

    Crossing the chasm: how to develop weather and climate models for next generation computers?

    Get PDF
    Weather and climate models are complex pieces of software which include many individual components, each of which is evolving under pressure to exploit advances in computing to enhance some combination of a range of possible improvements (higher spatio-temporal resolution, increased fidelity in terms of resolved processes, more quantification of uncertainty, etc.). However, after many years of a relatively stable computing environment with little choice in processing architecture or programming paradigm (basically X86 processors using MPI for parallelism), the existing menu of processor choices includes significant diversity, and more is on the horizon. This computational diversity, coupled with ever increasing software complexity, leads to the very real possibility that weather and climate modelling will arrive at a chasm which will separate scientific aspiration from our ability to develop and/or rapidly adapt codes to the available hardware. In this paper we review the hardware and software trends which are leading us towards this chasm, before describing current progress in addressing some of the tools which we may be able to use to bridge the chasm. This brief introduction to current tools and plans is followed by a discussion outlining the scientific requirements for quality model codes which have satisfactory performance and portability, while simultaneously supporting productive scientific evolution. We assert that the existing method of incremental model improvements employing small steps which adjust to the changing hardware environment is likely to be inadequate for crossing the chasm between aspiration and hardware at a satisfactory pace, in part because institutions cannot have all the relevant expertise in house. Instead, we outline a methodology based on large community efforts in engineering and standardisation, which will depend on identifying a taxonomy of key activities – perhaps based on existing efforts to develop domain-specific languages, identify common patterns in weather and climate codes, and develop community approaches to commonly needed tools and libraries – and then collaboratively building up those key components. Such a collaborative approach will depend on institutions, projects, and individuals adopting new interdependencies and ways of working.
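
    One concrete form the 'common patterns' and domain-specific-language ideas mentioned above can take is separating what a kernel computes from how it is executed. The sketch below is a hypothetical Python illustration of that pattern, not a tool or API from the paper: a stencil is specified once and executed by two interchangeable backends.

        # Sketch of the domain-specific-language idea: the science is captured once as a
        # backend-independent stencil specification, and separate backends decide how to
        # execute it. Names and structure are illustrative only.
        import numpy as np

        STENCIL = {-1: 0.25, 0: 0.5, +1: 0.25}   # 1-D smoothing stencil: the "specification"

        def run_loop_backend(field, stencil):
            """Reference backend: explicit Python loop (a portable baseline)."""
            out = field.copy()
            for i in range(1, len(field) - 1):
                out[i] = sum(weight * field[i + offset] for offset, weight in stencil.items())
            return out

        def run_numpy_backend(field, stencil):
            """Vectorised backend: same specification, different execution strategy."""
            out = field.copy()
            out[1:-1] = sum(weight * field[1 + offset : len(field) - 1 + offset]
                            for offset, weight in stencil.items())
            return out

        field = np.sin(np.linspace(0.0, 3.14, 64))
        assert np.allclose(run_loop_backend(field, STENCIL), run_numpy_backend(field, STENCIL))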