
    Modelling transport and deposition of caesium and iodine from the Chernobyl accident using the DREAM model

    A tracer model, DREAM (the Danish Rimpuff and Eulerian Accidental release Model), has been developed for modelling the transport, dispersion and deposition (wet and dry) of radioactive material from accidental releases such as the Chernobyl accident. The model combines a Lagrangian model, which handles near-source dispersion, with an Eulerian model describing the long-range transport. The performance of the transport model has previously been tested within the European Tracer Experiment (ETEX), which included transport and dispersion of an inert, non-depositing tracer from a controlled release. The focus of this paper is the model performance with respect to the deposition of 137Cs, 134Cs and 131I from the Chernobyl accident, using several parameterizations ranging from relatively simple to comprehensive. The performance of different combinations of wet and dry deposition schemes has been evaluated against measurements, using different statistical tests.
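The wet and dry deposition schemes compared in the abstract above can be sketched in a minimal form. This is an illustrative analogue, not DREAM's actual parameterization: the deposition velocity, scavenging constants and power-law form are common textbook choices, and all numerical values here are assumptions.

```python
import numpy as np

def dry_deposition_flux(conc, v_d=1.5e-3):
    """Dry deposition flux F = v_d * C (Bq/m2/s for C in Bq/m3).
    v_d = 1.5e-3 m/s is an illustrative deposition velocity for
    caesium-bearing aerosol, not a calibrated model parameter."""
    return v_d * conc

def wet_scavenging_coefficient(precip_mm_h, a=8.4e-5, b=0.79):
    """Below-cloud scavenging coefficient Lambda = a * P**b (1/s),
    a widely used power-law form; the constants are illustrative."""
    return a * precip_mm_h ** b

def apply_wet_deposition(conc, precip_mm_h, dt):
    """Exponential depletion of an air concentration over one time
    step: C(t + dt) = C(t) * exp(-Lambda * dt)."""
    lam = wet_scavenging_coefficient(precip_mm_h)
    return conc * np.exp(-lam * dt)
```

Comprehensive schemes replace the constant `v_d` with a resistance-based formulation and make `Lambda` depend on cloud properties; the simple forms above are the baseline such studies compare against.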

    Subgrid-scale treatment for major point sources in an Eulerian model: A sensitivity study on the European Tracer Experiment (ETEX) and Chernobyl cases

    We investigate the plume-in-grid method for a subgrid-scale treatment of major point sources in the passive case. This method consists of an on-line coupling of a Gaussian puff model and an Eulerian model, which better represents the point emissions without significantly increasing the computational burden. In this paper, the plume-in-grid model implemented in the Polyphemus air quality modelling system is described, with an emphasis on the parameterizations available for the Gaussian dispersion and on the coupling with the Eulerian model. The study evaluates the model for passive tracers at continental scale with the ETEX experiment and the Chernobyl case. The aim is (1) to estimate the model sensitivity to the local-scale parameterizations, and (2) to bring insights on the spatial and temporal scales that are relevant in the use of a plume-in-grid model. It is found that the plume-in-grid treatment improves the vertical diffusion at local scale, thus reducing the bias, especially at the closest stations. Doury's Gaussian parameterization and a column injection method give the best results. There is a strong sensitivity of the results to the injection time and the grid resolution. The "best" injection time actually depends on the resolution, but is difficult to determine a priori. The plume-in-grid method is also found to improve the results more at fine resolutions than with coarse grids, by compensating for the Eulerian tendency to over-predict the concentrations at these resolutions.
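The local-scale half of a plume-in-grid coupling is the Gaussian puff concentration formula. A minimal sketch, with the puff centred at the origin and a ground-reflection image term; the sigma growth laws (such as Doury's, mentioned above) are omitted, and all arguments here are generic rather than Polyphemus API names.

```python
import numpy as np

def gaussian_puff(x, y, z, q, sigma_y, sigma_z, z_src=0.0):
    """Concentration at (x, y, z) from a single Gaussian puff of
    mass q centred at the origin, with horizontal spread sigma_y,
    vertical spread sigma_z, and perfect ground reflection
    (image source at -z_src)."""
    horiz = np.exp(-(x**2 + y**2) / (2.0 * sigma_y**2))
    vert = (np.exp(-(z - z_src)**2 / (2.0 * sigma_z**2))
            + np.exp(-(z + z_src)**2 / (2.0 * sigma_z**2)))
    norm = q / ((2.0 * np.pi)**1.5 * sigma_y**2 * sigma_z)
    return norm * horiz * vert
```

In a plume-in-grid scheme, puffs are tracked in this analytical form near the source and their mass is transferred (injected) into the Eulerian grid once the puff size becomes comparable to the grid cell, which is where the injection-time sensitivity discussed above enters.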

    Inverse modelling-based reconstruction of the Chernobyl source term available for long-range transport

    The reconstruction of the Chernobyl accident source term has previously been carried out using core inventories, but also back-and-forth confrontations between model simulations and activity concentration or deposited activity measurements. The approach presented in this paper is based on inverse modelling techniques. It relies both on the activity concentration measurements and on the adjoint of a chemistry-transport model. The location of the release is assumed to be known, and one looks for a source term available for long-range transport that depends both on time and altitude. The method relies on the maximum entropy on the mean principle and exploits source positivity. The inversion results are mainly sensitive to two tuning parameters: a mass scale and the scale of the prior errors in the inversion. To overcome this difficulty, we resort to the statistical L-curve method to estimate balanced values for these two parameters. Once this is done, many of the retrieved features of the source are robust within a reasonable range of parameter values. Our results favour the acknowledged three-step scenario, with a strong initial release (26 to 27 April), followed by a weak emission period of four days (28 April–1 May), and again a release, longer but less intense than the initial one (2 May–6 May). The retrieved quantities of iodine-131, caesium-134 and caesium-137 that have been released are in good agreement with the latest reported estimations. Yet a stronger apportionment of the total released activity is ascribed to the first period and less to the third one. Finer chronological details are obtained, such as a sequence of eruptive episodes in the first two days, likely related to the modulation of the boundary-layer diurnal cycle. In addition, the first two-day release surges are found to have effectively reached altitudes up to the top of the domain (5000 m).
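The L-curve parameter tuning mentioned above can be illustrated on a simpler stand-in problem. This sketch uses a plain Tikhonov-regularised least-squares inversion rather than the paper's maximum-entropy-on-the-mean scheme: for each trial weight it records the residual norm against the solution norm on log axes and picks the weight at the corner of the resulting curve, estimated by a finite-difference curvature. All names are illustrative.

```python
import numpy as np

def l_curve_corner(H, y, lambdas):
    """For each lam, solve min ||H x - y||^2 + lam ||x||^2, record
    the point (log residual norm, log solution norm), then return
    the lam at the point of maximum discrete curvature (the
    'corner' balancing data fit against regularisation)."""
    pts = []
    for lam in lambdas:
        x = np.linalg.solve(H.T @ H + lam * np.eye(H.shape[1]), H.T @ y)
        pts.append((np.log(np.linalg.norm(H @ x - y) + 1e-30),
                    np.log(np.linalg.norm(x) + 1e-30)))
    pts = np.array(pts)
    # curvature of the parametric curve via first/second differences
    d1 = np.gradient(pts, axis=0)
    d2 = np.gradient(d1, axis=0)
    curv = (np.abs(d1[:, 0] * d2[:, 1] - d1[:, 1] * d2[:, 0])
            / (d1[:, 0]**2 + d1[:, 1]**2 + 1e-30)**1.5)
    return lambdas[int(np.argmax(curv))]
```

The appeal of the corner criterion, in this sketch as in the paper, is that it selects a balance point without requiring the true error scales to be known in advance.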

    RODOS: decision support for nuclear emergencies


    Estimation of the caesium-137 source term from the Fukushima Daiichi nuclear power plant using a consistent joint assimilation of air concentration and deposition observations

    Inverse modelling techniques can be used to estimate the amount of radionuclides and the temporal profile of the source term released into the atmosphere during the accident of the Fukushima Daiichi nuclear power plant in March 2011. In Winiarek et al. (2012b), the lower bounds of the caesium-137 and iodine-131 source terms were estimated with such techniques, using activity concentration measurements. The importance of an objective assessment of prior errors (the observation errors and the background errors) was emphasised for a reliable inversion. In such a critical context, where the meteorological conditions can make the source term partly unobservable and where only a few observations are available, such prior estimation techniques are mandatory, the retrieved source term being very sensitive to this estimation. We propose to extend the use of these techniques to the estimation of prior errors when assimilating observations from several data sets. The aim is to compute an estimate of the caesium-137 source term jointly using all available data about this radionuclide, such as activity concentrations in the air, but also daily fallout measurements and total cumulated fallout measurements. It is crucial to properly and simultaneously estimate the background errors and the prior errors relative to each data set. A proper estimation of prior errors is also a necessary condition to reliably estimate the a posteriori uncertainty of the estimated source term. Using such techniques, we retrieve a total released quantity of caesium-137 in the interval 11.6–19.3 PBq, with an estimated standard deviation range of 15–20% depending on the method and the data sets. The "blind" time intervals of the source term have also been strongly mitigated compared to the first estimations with only activity concentration data.
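The structure of such a joint assimilation can be sketched as a quadratic cost in which each data set (air concentrations, daily fallout, cumulated fallout) contributes a misfit term weighted by its own prior error variance, plus a background term. This is a generic least-squares analogue of the paper's scheme, not its actual formulation; the function and argument names are illustrative.

```python
import numpy as np

def joint_cost(source, datasets, sigma_b=None, b2=1.0):
    """Joint-inversion cost: sum over data sets k of
    0.5 * ||y_k - H_k source||^2 / r2_k, plus an optional
    background term 0.5 * ||source - sigma_b||^2 / b2.
    Each (H, y, r2) tuple carries its own observation operator,
    observation vector and prior error variance r2."""
    j = 0.0
    for H, y, r2 in datasets:
        d = y - H @ source
        j += 0.5 * float(d @ d) / r2
    if sigma_b is not None:
        d = source - sigma_b
        j += 0.5 * float(d @ d) / b2
    return j
```

The point the abstract makes is that the per-set variances `r2` and the background scale `b2` must themselves be estimated consistently; fixing them by hand biases both the retrieved source and its a posteriori uncertainty.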

    Air pollution modelling using a graphics processing unit with CUDA

    The Graphics Processing Unit (GPU) is a powerful tool for parallel computing. In the past years the performance and capabilities of GPUs have increased, and the Compute Unified Device Architecture (CUDA), a parallel computing architecture, has been developed by NVIDIA to utilize this performance in general-purpose computations. Here we show for the first time a possible application of GPUs for environmental studies serving as a basis for decision-making strategies. A stochastic Lagrangian particle model has been developed on CUDA to estimate the transport and transformation of radionuclides from a single point source during an accidental release. Our results show that the parallel implementation achieves typical acceleration values on the order of 80-120 times compared to a single-threaded CPU implementation on a 2.33 GHz desktop computer. Only very small differences have been found between the results obtained from GPU and CPU simulations, comparable with the effect of stochastic transport phenomena in the atmosphere. The relatively high speedup with no additional costs to maintain this parallel architecture could result in wide usage of GPUs for diversified environmental applications in the near future.
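The per-particle update in a stochastic Lagrangian model of this kind can be sketched as mean-wind advection plus a Gaussian random-walk term. The NumPy version below is a CPU reference sketch with illustrative parameter values; in a CUDA port each thread would carry one particle through the same arithmetic, which is why the method maps so well to GPUs.

```python
import numpy as np

def advect_particles(pos, wind, dt, k_diff, rng):
    """One time step for an (N, 3) array of particle positions:
    displacement = wind * dt plus a Gaussian step with standard
    deviation sqrt(2 * K * dt) per component, the random-walk
    equivalent of a diffusivity K (m2/s). Values are illustrative."""
    step = np.sqrt(2.0 * k_diff * dt)
    return pos + wind * dt + step * rng.standard_normal(pos.shape)
```

Because every particle is independent within a step, the GPU-vs-CPU differences the abstract reports reduce to the different random number streams, which is consistent with their being comparable to the stochastic transport scatter itself.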