
    Adaptation of NEMO-LIM3 model for multigrid high resolution Arctic simulation

    High-resolution regional hindcasting of ocean and sea ice plays an important role in the assessment of shipping and operational risks in the Arctic Ocean. The ice-ocean model NEMO-LIM3 was modified to improve its simulation quality at the targeted spatio-temporal resolutions. A multigrid model setup with connected coarse-resolution (14 km) and fine-resolution (5 km) configurations was devised; the two configurations were implemented and run separately, at a lower computational cost than that of the built-in AGRIF nesting system. Ice and tracer boundary-condition schemes were modified to achieve the correct interaction between the coarse and fine grids across a long, ice-covered open boundary. An ice-restoring scheme was implemented to reduce spin-up time. The NEMO-LIM3 configuration described in this article provides more flexible and customisable tools for high-resolution regional Arctic simulations.
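    The one-way nesting scheme summarised above can be sketched as follows: at each coupling step, a coarse-grid field is interpolated onto the open-boundary points of the fine grid. This is a minimal illustration with hypothetical grids and variable names, not the actual NEMO-LIM3 code.

    ```python
    # Sketch of one-way nesting: interpolate a coarse (14 km) field onto the
    # open-boundary points of a fine (5 km) regional grid.
    # All grids, extents and names are hypothetical placeholders.
    import numpy as np
    from scipy.interpolate import RegularGridInterpolator

    # Coarse-grid temperature snapshot on a regular lat/lon grid (toy sizes)
    coarse_lat = np.linspace(70.0, 80.0, 80)
    coarse_lon = np.linspace(20.0, 60.0, 320)
    coarse_temp = np.random.rand(80, 320)  # placeholder for model output

    interp = RegularGridInterpolator((coarse_lat, coarse_lon), coarse_temp)

    # Fine-grid open-boundary points (e.g. the southern edge of the nest)
    fine_lon = np.linspace(25.0, 55.0, 900)
    boundary_pts = np.column_stack([np.full_like(fine_lon, 72.0), fine_lon])

    # Boundary values handed to the fine-grid run at this coupling step
    boundary_temp = interp(boundary_pts)
    ```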

    Experiences with ESM Multi-model Ensembles for Educational Purposes: A report from the use of D3.1 for the 3rd E2SCMS (D3.4)

    Summary: This document describes experiences from three summer schools, most recently the 2016 school, on modelling the Earth system with a multi-model ensemble. The models were run by the student participants at different HPC sites to solve scientific assignments. The experiences of the technical teams preparing the models for these experiments shed light on the usability of the models from different perspectives. One aspect of particular importance for ESiWACE is performance portability, which will be a focus for the WP in the next reporting period.

    Special Aspects of Wind Wave Simulations for Surge Flood Forecasting and Prevention

    Abstract: The paper focuses on several issues of wind wave simulation with the SWAN model for tasks related to the prevention of surge floods in St. Petersburg. It introduces the main objectives pursued through the use of the model and covers the problems of computational mesh generation and model parameter calibration. We also examine several assumptions about the necessity of taking ice fraction and sea-level rise into account in wind wave simulations.
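    A minimal sketch of the parameter calibration mentioned above: a brute-force search over one model coefficient against observed significant wave height. The run_swan wrapper and the coefficient are hypothetical stand-ins for an actual SWAN run and its tunable parameters.

    ```python
    # Toy calibration loop: choose the coefficient value that minimises the
    # RMSE between simulated and observed significant wave height (Hs).
    import numpy as np

    def run_swan(coeff):
        """Hypothetical wrapper around a SWAN run; returns simulated Hs (m)."""
        return 2.0 + 1.0e4 * (coeff - 2.4e-5)  # stand-in response curve

    hs_observed = np.array([2.1])                 # observed Hs at a buoy, m
    candidates = np.linspace(1.0e-5, 4.0e-5, 31)  # candidate coefficient values

    rmse = [np.sqrt(np.mean((run_swan(c) - hs_observed) ** 2)) for c in candidates]
    best = candidates[int(np.argmin(rmse))]
    print(f"best coefficient: {best:.2e}")
    ```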

    ISAAC - In Situ Animation of Accelerated Computations

    Many computations, such as physics or biology simulations, nowadays run on accelerator hardware such as CUDA GPUs or Intel Xeon Phi, which are themselves distributed across a large compute cluster communicating over MPI. The goal of ISAAC is to visualize these data without downloading them to the host, while exploiting the high computation speed of the accelerator.
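    The core idea, rendering on the accelerator so that only a small image rather than the full field leaves the device, can be sketched generically as follows. This is an illustration of the concept, not ISAAC's actual API.

    ```python
    # Generic in-situ visualization sketch: simulation data stay on the GPU;
    # only the rendered 2D image is copied to the host. Not the ISAAC API.
    import numpy as np

    try:
        import cupy as xp   # GPU path if CuPy is available
    except ImportError:
        import numpy as xp  # CPU fallback so the sketch still runs

    field = xp.random.rand(256, 256, 256)  # simulation volume resident on device

    # "Render" on the device: a max-intensity projection along one axis
    image = field.max(axis=0)              # 256 x 256 instead of 256**3 values

    # Only the small image crosses the device-host boundary; a viewer could
    # then stream it over the network.
    host_image = image.get() if hasattr(image, "get") else image
    ```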

    ICON-Sapphire: simulating the components of the Earth system and their interactions at kilometer and subkilometer scales

    State-of-the-art Earth system models typically employ grid spacings of O(100 km), which is too coarse to explicitly resolve main drivers of the flow of energy and matter across the Earth system. In this paper, we present the new ICON-Sapphire model configuration, which targets a representation of the components of the Earth system and their interactions with a grid spacing of 10 km and finer. Through the use of selected simulation examples, we demonstrate that ICON-Sapphire can (i) be run coupled globally on seasonal timescales with a grid spacing of 5 km, on monthly timescales with a grid spacing of 2.5 km, and on daily timescales with a grid spacing of 1.25 km; (ii) resolve large eddies in the atmosphere using hectometer grid spacings on limited-area domains in atmosphere-only simulations; (iii) resolve submesoscale ocean eddies by using a global uniform grid of 1.25 km or a telescoping grid with the finest grid spacing at 530 m, the latter coupled to a uniform atmosphere; and (iv) simulate biogeochemistry in an ocean-only simulation integrated for 4 years at 10 km. Comparison of basic features of the climate system to observations reveals no obvious pitfalls, even though some observed aspects remain difficult to capture. The throughput of the coupled 5 km global simulation is 126 simulated days per day employing 21% of the latest machine of the German Climate Computing Center. Extrapolating from these results, multi-decadal global simulations including interactive carbon are now possible, and short global simulations resolving large eddies in the atmosphere and submesoscale eddies in the ocean are within reach.
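    The extrapolation in the closing sentence can be made concrete with simple arithmetic from the reported throughput of 126 simulated days per day (ignoring queueing and I/O overheads):

    ```python
    # Back-of-the-envelope wall-clock cost implied by 126 simulated days/day
    sdpd = 126  # reported throughput of the coupled 5 km global run

    for years in (1, 10, 30):
        wall_days = 365 * years / sdpd
        print(f"{years:>2} simulated years ~ {wall_days:5.1f} wall-clock days")
    # 30 simulated years take roughly 87 wall-clock days, which is why
    # multi-decadal coupled simulations are described as now possible
    ```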
