
    Air pollution modelling using a graphics processing unit with CUDA

    The Graphics Processing Unit (GPU) is a powerful tool for parallel computing. In recent years the performance and capabilities of GPUs have increased, and the Compute Unified Device Architecture (CUDA), a parallel computing architecture developed by NVIDIA, makes this performance available for general-purpose computation. Here we show for the first time a possible application of GPUs to environmental studies, serving as a basis for decision-making strategies. A stochastic Lagrangian particle model has been developed in CUDA to estimate the transport and transformation of radionuclides from a single point source during an accidental release. Our results show that the parallel implementation achieves typical speedups of 80-120 times compared with a single-threaded CPU implementation on a 2.33 GHz desktop computer. Only very small differences have been found between the results obtained from GPU and CPU simulations, comparable with the effect of stochastic transport phenomena in the atmosphere. The relatively high speedup, with no additional cost to maintain this parallel architecture, could lead to wide usage of GPUs for diverse environmental applications in the near future.
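
    The core of such a model is an update that every particle applies independently, which is what makes it map so naturally onto a GPU. Below is a minimal NumPy sketch of a stochastic Lagrangian particle step (mean advection by the wind plus a Gaussian random displacement standing in for turbulent diffusion); the parameter values and array shapes are illustrative assumptions rather than the paper's exact scheme, and a CUDA version would assign one thread per particle.

        import numpy as np

        def advance_particles(pos, wind, sigma, dt, rng):
            """One stochastic Lagrangian step: deterministic advection by the
            wind plus a Gaussian random walk representing turbulent diffusion
            (illustrative physics, not the paper's exact scheme)."""
            drift = wind * dt
            diffusion = rng.normal(0.0, sigma * np.sqrt(dt), size=pos.shape)
            return pos + drift + diffusion

        rng = np.random.default_rng(0)
        pos = np.zeros((100_000, 3))          # all particles released at a point source
        wind = np.array([5.0, 1.0, 0.0])      # constant wind vector in m/s (assumed)
        for _ in range(600):                  # e.g. 600 one-second steps
            pos = advance_particles(pos, wind, sigma=1.5, dt=1.0, rng=rng)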

    Monte Carlo Algorithms for Linear Problems

    MSC Subject Classification: 65C05, 65U05. Monte Carlo methods are a powerful tool in many fields of mathematics, physics and engineering. It is known that these methods give statistical estimates for a functional of the solution by performing random sampling of a certain chance variable whose mathematical expectation is the desired functional. Monte Carlo methods are methods for solving problems using random variables. In the book [16] edited by Yu. A. Shreider one can find the following definition of the Monte Carlo method.
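
    As a concrete illustration of the idea that a chance variable's expectation can be made to equal the desired functional, the sketch below estimates one component of the solution of x = Ax + b with the classical von Neumann-Ulam random-walk scheme. The matrix, vector and walk parameters are made up for the example; the only requirement is that the Neumann series for A converges.

        import numpy as np

        def mc_solve_component(A, b, i, n_walks=20_000, max_len=50, seed=0):
            """Estimate x_i for x = A x + b (spectral radius of A < 1) by random
            walks: the expected accumulated score equals the Neumann series
            sum_k (A^k b)_i, which is the desired functional of the solution."""
            rng = np.random.default_rng(seed)
            n = len(b)
            p = 1.0 / n                       # uniform transition probabilities
            total = 0.0
            for _ in range(n_walks):
                state, weight, score = i, 1.0, b[i]
                for _ in range(max_len):      # truncate the geometrically decaying tail
                    nxt = rng.integers(n)
                    weight *= A[state, nxt] / p
                    score += weight * b[nxt]
                    state = nxt
                total += score
            return total / n_walks

        A = np.array([[0.1, 0.3], [0.2, 0.1]])
        b = np.array([1.0, 2.0])
        print(mc_solve_component(A, b, i=0))            # Monte Carlo estimate
        print(np.linalg.solve(np.eye(2) - A, b)[0])     # direct reference value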

    PARALLEL COMPUTATIONS WITH LARGE-SCALE AIR POLLUTION MODELS

    Large-scale mathematical models are very powerful tools in the efforts to provide more, and more detailed, information about pollution levels, especially about pollution levels which exceed certain critical values. However, the model used must satisfy at least two conditions: (i) it must be verified that the model results are reliable and (ii) it should be possible to carry out different studies by using the model. It is clear that comprehensive studies of the relationships between different input parameters and the model results can only be carried out (a) if the numerical methods used in the model are sufficiently fast and (b) if the code runs efficiently on the available high-speed computers. Some results obtained recently by a new unified version of the Danish Eulerian Model will be presented in this paper.

    Assimilation of OMI NO2 retrievals into the limited-area chemistry-transport model DEHM (V2009.0) with a 3-D OI algorithm

    Data assimilation is the process of combining real-world observations with a modelled geophysical field. The increasing abundance of satellite retrievals of atmospheric trace gases makes chemical data assimilation an increasingly viable method for deriving more accurate analysed fields and initial conditions for air quality forecasts. We implemented a three-dimensional optimal interpolation (OI) scheme to assimilate retrievals of NO2 tropospheric columns from the Ozone Monitoring Instrument into the Danish Eulerian Hemispheric Model (DEHM, version V2009.0), a three-dimensional, regional-scale, offline chemistry-transport model. The background error covariance matrix, B, was estimated based on differences in the NO2 concentration field between paired simulations using different meteorological inputs. Background error correlations were modelled as non-separable, horizontally homogeneous and isotropic. Parameters were estimated for each month and for each hour to allow for seasonal and diurnal patterns in NO2 concentrations. Three experiments were run to compare the effects of observation thinning and the choice of observation errors. Model performance was assessed by comparing the analysed fields to an independent set of observations: ground-based measurements from European air-quality monitoring stations. The analysed NO2 and O3 concentrations were more accurate than those from a reference simulation without assimilation, with increased temporal correlation for both species. Thinning of satellite data and the use of constant observation errors yielded a better balance between the observed increments and the prescribed error covariances, with no appreciable degradation in the surface concentrations due to the observation thinning. Forecasts were also considered and these showed rather limited influence from the initial conditions once the effects of the diurnal cycle were accounted for. The simple OI scheme was effective and computationally feasible in this context, where only a single species was assimilated, adjusting the three-dimensional field for this compound. Limitations of the assimilation scheme are discussed.
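
    For reference, the optimal interpolation analysis is the standard update x_a = x_b + B H^T (H B H^T + R)^{-1} (y - H x_b). The sketch below applies it to a tiny made-up state and observation vector; operationally, as in the paper, B is modelled with homogeneous, isotropic correlations rather than stored as an explicit dense matrix, so all numbers and dimensions here are illustrative assumptions.

        import numpy as np

        def oi_update(xb, B, H, R, y):
            """One optimal-interpolation analysis step:
            xa = xb + B H^T (H B H^T + R)^{-1} (y - H xb)."""
            innovation = y - H @ xb                  # observation minus background
            S = H @ B @ H.T + R                      # innovation covariance
            return xb + B @ H.T @ np.linalg.solve(S, innovation)

        xb = np.array([10.0, 12.0, 8.0])             # background NO2 (arbitrary units)
        d  = np.abs(np.subtract.outer(np.arange(3), np.arange(3)))
        B  = 4.0 * np.exp(-d / 2.0)                  # smooth, isotropic background errors
        H  = np.array([[1.0, 0.0, 0.0],
                       [0.0, 0.0, 1.0]])             # observe grid points 0 and 2
        R  = np.diag([1.0, 1.0])                     # constant observation errors
        y  = np.array([14.0, 6.0])                   # satellite-derived values
        print(oi_update(xb, B, H, R, y))             # analysed field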

    Short-term fire front spread prediction using inverse modelling and airborne infrared images

    A wildfire forecasting tool capable of estimating the fire perimeter position sufficiently in advance of the actual fire arrival will assist firefighting operations and optimise available resources. However, owing to limited knowledge of fire event characteristics (e.g. fuel distribution and characteristics, weather variability) and the short time available to deliver a forecast, most of the current models only provide a rough approximation of the forthcoming fire positions and dynamics. The problem can be tackled by coupling data assimilation and inverse modelling techniques. We present an inverse modelling-based algorithm that uses infrared airborne images to forecast short-term wildfire dynamics with a positive lead time. The algorithm is applied to two real-scale mallee-heath shrubland fire experiments, of 9 and 25 ha, successfully forecasting the fire perimeter shape and position in the short term. Forecast dependency on the assimilation windows is explored to prepare the system to meet real scenario constraints. It is envisaged that the system will be applied at larger time and space scales.
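
    The inverse-modelling idea can be reduced to a very small example: pick spread-model parameters that best reproduce the front positions already observed in the infrared images, then run the model forward with those parameters to obtain a forecast. The sketch below does this for a toy one-dimensional front with a constant rate of spread and invented observation values; the paper's algorithm works on two-dimensional perimeters and a real fire-spread model.

        import numpy as np
        from scipy.optimize import least_squares

        def front_position(t, rate, origin):
            """Toy 1-D spread model: the front advances at a constant rate."""
            return origin + rate * t

        # Hypothetical front positions extracted from successive airborne IR
        # images (minutes, metres); purely illustrative values.
        t_obs = np.array([0.0, 5.0, 10.0, 15.0])
        x_obs = np.array([0.0, 42.0, 79.0, 124.0])

        def residuals(params):
            rate, origin = params
            return front_position(t_obs, rate, origin) - x_obs

        fit = least_squares(residuals, x0=[1.0, 0.0])   # invert for the model parameters
        rate_hat, origin_hat = fit.x
        print("forecast front position at t = 25 min:",
              front_position(25.0, rate_hat, origin_hat))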

    The ESCAPE project : Energy-efficient Scalable Algorithms for Weather Prediction at Exascale

    In the simulation of complex multi-scale flows arising in weather and climate modelling, one of the biggest challenges is to satisfy strict service requirements in terms of time to solution and to satisfy budgetary constraints in terms of energy to solution, without compromising the accuracy and stability of the application. These simulations require algorithms that minimise the energy footprint along with the time required to produce a solution, maintain the physically required level of accuracy, are numerically stable, and are resilient in case of hardware failure. The European Centre for Medium-Range Weather Forecasts (ECMWF) led the ESCAPE (Energy-efficient Scalable Algorithms for Weather Prediction at Exascale) project, funded by Horizon 2020 (H2020) under the FET-HPC (Future and Emerging Technologies in High Performance Computing) initiative. The goal of ESCAPE was to develop a sustainable strategy to evolve weather and climate prediction models to next-generation computing technologies. The project partners incorporate the expertise of leading European regional forecasting consortia, university research, experienced high-performance computing centres, and hardware vendors. This paper presents an overview of the ESCAPE strategy: (i) identify domain-specific key algorithmic motifs in weather prediction and climate models (which we term Weather & Climate Dwarfs), (ii) categorise them in terms of computational and communication patterns while (iii) adapting them to different hardware architectures with alternative programming models, (iv) analyse the challenges in optimising, and (v) find alternative algorithms for the same scheme. The participating weather prediction models are the following: IFS (Integrated Forecasting System); ALARO, a combination of AROME (Application de la Recherche a l'Operationnel a Meso-Echelle) and ALADIN (Aire Limitee Adaptation Dynamique Developpement International); and COSMO-EULAG, a combination of COSMO (Consortium for Small-scale Modeling) and EULAG (Eulerian and semi-Lagrangian fluid solver). For many of the weather and climate dwarfs ESCAPE provides prototype implementations on different hardware architectures (mainly Intel Skylake CPUs, NVIDIA GPUs, Intel Xeon Phi, Optalysys optical processor) with different programming models. The spectral transform dwarf represents a detailed example of the co-design cycle of an ESCAPE dwarf. The dwarf concept has proven to be extremely useful for the rapid prototyping of alternative algorithms and their interaction with hardware; e.g. the use of a domain-specific language (DSL). Manual adaptations have led to substantial accelerations of key algorithms in numerical weather prediction (NWP) but are not a general recipe for the performance portability of complex NWP models. Existing DSLs are found to require further evolution but are promising tools for achieving the latter. Measurements of energy and time to solution suggest that a future focus needs to be on exploiting the simultaneous use of all available resources in hybrid CPU-GPU arrangements
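
    The dwarf concept amounts to extracting one algorithmic motif into a small, self-contained kernel that can be benchmarked and ported in isolation. The sketch below is a stand-in for that workflow using a plain 2-D Fourier round trip; the actual ESCAPE spectral-transform dwarf is based on the spherical-harmonics transforms of IFS, so both the transform and the grid size here are illustrative assumptions.

        import time
        import numpy as np

        def spectral_roundtrip(field):
            """A dwarf-like kernel: one isolated motif (grid-point space ->
            spectral space -> grid-point space) that can be timed and ported
            on its own. The real dwarf uses spherical harmonics, not an FFT."""
            spec = np.fft.rfft2(field)                  # forward transform
            return np.fft.irfft2(spec, s=field.shape)   # inverse transform

        field = np.random.default_rng(0).standard_normal((1024, 2048))
        t0 = time.perf_counter()
        out = spectral_roundtrip(field)
        print(f"round trip: {time.perf_counter() - t0:.3f} s, "
              f"max error {np.max(np.abs(out - field)):.2e}")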