9 research outputs found

    Improving transfer functions to describe radiocesium wash-off fluxes for the Niida River by a Bayesian approach

    No full text
    This paper proposes methodological refinements of the generic transfer function approach for reconstructing radiocesium wash-off fluxes from contaminated catchments, through the integration of hydrological descriptors (volume of water passed, flow-rate fluctuations and antecedent flow conditions). The approach was applied to the Niida River (Fukushima prefecture, Japan) for the period 03/2011-03/2015, for which daily flow rates (m³/s) and infrequent total radiocesium concentrations (Bq/L) were available from the literature. Three models were defined: the generic transfer function (Φ0), a flow-corrected time-variant model (Φ1) and an antecedent-flow-corrected variant (Φ2). Calibration of the model parameters was performed with a Bayesian approach, which is particularly well suited to limited datasets and censored information and provides parameter distributions. Model selection showed strong evidence in favour of model Φ2 (indicated by the marginal likelihood), which integrates current and recent hydrology in its formulation, as well as lower prediction errors (indicated by RMSE and ME). Models Φ1 and Φ2 described the wash-off dynamics better than model Φ0 owing to the inclusion of one or several hydrological descriptors. From March 2011 to March 2015, model Φ2 estimated the 137Cs export from the Niida catchment at between 0.32 and 0.67 TBq, with a median value of 0.49 TBq, which represents around 0.27% of the initial fallout and could constitute a significant source term to the ocean compared with the direct release from the Fukushima Dai-ichi Nuclear Power Plant (FDNPP). Moreover, the remaining 99% of the initial radiocesium fallout within the catchment may constitute a persistent contamination source for wash-off. Although the proposed methodology improved the assessment of wash-off fluxes, it remains an empirical interpolation method with limited predictive power, particularly for recent low activities. To improve predictions, modelling approaches require more observed data (particularly more activity values covering more hydrological conditions) and the inclusion of additional hydrological descriptors. © 2016 Elsevier Ltd
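
    The exact formulations of models Φ0, Φ1 and Φ2 are not given in this abstract, so the following is only a minimal Python sketch of the general workflow it describes: an assumed concentration-discharge transfer function with a decline term, calibrated against sparse concentration observations with a basic Metropolis sampler, and the posterior propagated to a cumulative 137Cs export with a credible interval. All data, the model form and the parameter values below are illustrative assumptions, not those of the paper.

        import numpy as np

        rng = np.random.default_rng(0)

        # Assumed daily discharge record (m3/s) and sparse 137Cs observations (Bq/L);
        # both are synthetic stand-ins for the literature data used in the paper.
        t = np.arange(1461.0)                                  # days, 03/2011-03/2015
        Q = 10 + 5 * np.sin(2 * np.pi * t / 365.0) + rng.gamma(2.0, 2.0, t.size)
        obs = rng.choice(t.size, 30, replace=False)
        C_obs = (0.5 * np.exp(-0.0015 * t[obs]) * Q[obs] ** 0.3
                 * np.exp(rng.normal(0.0, 0.3, obs.size)))

        def conc(theta, tt, qq):
            """Assumed rating-curve transfer function C = c0 * exp(-lam*t) * Q**b."""
            c0, lam, b = theta
            return c0 * np.exp(-lam * tt) * qq ** b

        def log_post(theta, sigma=0.3):
            c0, lam, b = theta
            if c0 <= 0 or lam <= 0:                            # positivity priors
                return -np.inf
            resid = np.log(C_obs) - np.log(conc(theta, t[obs], Q[obs]))
            return -0.5 * np.sum((resid / sigma) ** 2)         # lognormal errors

        # Basic Metropolis random walk over the three parameters.
        theta, lp, chain = np.array([0.4, 1e-3, 0.2]), -np.inf, []
        for _ in range(20000):
            prop = theta + rng.normal(0.0, [0.02, 1e-4, 0.02])
            lp_prop = log_post(prop)
            if np.log(rng.uniform()) < lp_prop - lp:
                theta, lp = prop, lp_prop
            chain.append(theta)
        chain = np.array(chain[5000:])

        # Propagate the posterior to the cumulative export over the whole period
        # (Bq/L * m3/s * 86400 s/d * 1000 L/m3, summed over days).
        export = np.array([np.sum(conc(s, t, Q) * Q * 86400 * 1000)
                           for s in chain[::100]])
        print("median export %.2e Bq, 95%% CI %s Bq"
              % (np.median(export), np.percentile(export, [2.5, 97.5])))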

    Uncertainty analysis in post-accidental risk assessment models: an application to the Fukushima accident

    No full text
    Environmental contamination following the atmospheric releases of the Fukushima accident resulted in high radioactivity concentrations in feed and foodstuffs. Producing a realistic health risk assessment after severe nuclear accidents, and developing a sufficient understanding of environmental transfer and exposure processes, appears to be a research priority. Specifically, the characterization of uncertainties along the human ingestion pathway, as outlined by the radioecological community, is of great interest. The present work aims to (i) characterize the spatial variability and parametric uncertainties arising from the processes involved in the transfer of radionuclides (134Cs and 137Cs) into terrestrial ecosystems after the atmospheric releases of the Fukushima accident, and (ii) study the impact of this variability and these uncertainties on the radioactive contamination of leafy vegetables. The implemented approach quantified uncertainties within a probabilistic modelling framework, resulting in probability distributions derived mainly from Bayesian inference, and propagated them by performing transfer calculations in the modelling platform SYMBIOSE. © 2015 Elsevier Ltd. All rights reserved.
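
    As a hedged illustration of the kind of probabilistic propagation described above (not the SYMBIOSE implementation itself), the short Python sketch below samples assumed distributions for deposition, foliar interception, weathering and crop yield, and propagates them to a distribution of 137Cs activity in leafy vegetables. Every distribution and parameter value here is an illustrative placeholder.

        import numpy as np

        rng = np.random.default_rng(1)
        n = 100_000

        # Assumed parameter distributions (illustrative only).
        deposit    = rng.lognormal(np.log(5e4), 0.5, n)    # 137Cs deposition, Bq/m2
        intercept  = rng.beta(2.0, 5.0, n)                 # foliar interception fraction
        crop_yield = rng.lognormal(np.log(2.0), 0.2, n)    # kg fresh mass per m2
        t_half     = rng.triangular(8.0, 14.0, 25.0, n)    # weathering half-life, days
        t_harvest  = 10.0                                  # days between fallout and harvest

        # Simple interception/weathering model for the activity in leafy vegetables.
        weathering = 0.5 ** (t_harvest / t_half)
        activity = deposit * intercept * weathering / crop_yield   # Bq/kg fresh

        print("median %.0f Bq/kg, 95th percentile %.0f Bq/kg"
              % (np.median(activity), np.percentile(activity, 95)))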

    Identifiability of sorption parameters in stirred flow-through reactor experiments and their identification with a Bayesian approach

    No full text
    This paper addresses the methodological conditions (particularly experimental design and statistical inference) ensuring the identifiability of sorption parameters from breakthrough curves measured during stirred flow-through reactor experiments, also known as continuous-flow stirred-tank reactor (CSTR) experiments. The equilibrium-kinetic (EK) sorption model was selected as a non-equilibrium parameterization embedding the Kd approach. Parameter identifiability was studied formally on the equations governing the outlet concentrations, and numerically on six simulated CSTR experiments on a soil with known equilibrium-kinetic sorption parameters. EK sorption parameters cannot be identified from a single breakthrough curve of a CSTR experiment, because Kd,1 and k− were diagnosed as collinear. For pairs of CSTR experiments, Bayesian inference allowed the correct sorption and error models to be selected among the alternatives. Bayesian inference was conducted with the SAMCAT software (Sensitivity Analysis and Markov Chain simulations Applied to Transfer models), which launched the simulations through the embedded simulation engine GNU-MCSim and automated their configuration and post-processing. Experimental designs consisting of varying flow rates between experiments that reach equilibrium during the contamination stage were found to be optimal, because they simultaneously gave accurate sorption parameters and predictions. Bayesian results were comparable to those of the maximum likelihood method but avoided convergence problems; in addition, the marginal likelihood allowed all models to be compared, and the credible intervals directly gave the uncertainty of the sorption parameters θ. Although these findings are limited to the specific conditions studied here, in particular the sorption model considered, the chosen parameter values and the error structure, they help in the design and analysis of future CSTR experiments with radionuclides whose kinetic behaviour is suspected.
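
    The abstract does not spell out the EK equations, so the Python sketch below only illustrates one common equilibrium-kinetic parameterization (an instantaneous site with Kd1 plus a kinetic site with rates k_plus and k_minus) inside a CSTR mass balance, to show what a simulated breakthrough curve for contamination followed by rinsing looks like before any identifiability analysis. All parameter values are arbitrary assumptions, not the paper's.

        import numpy as np
        from scipy.integrate import solve_ivp

        # Assumed experimental conditions and EK parameters (illustrative only).
        Q   = 2.0e-3   # flow rate, L/min
        V   = 0.08     # reactor volume, L
        m   = 0.01     # soil mass, kg
        Kd1 = 5.0      # instantaneous-site distribution coefficient, L/kg
        k_plus  = 0.5  # kinetic sorption rate, L/kg/min
        k_minus = 0.01 # kinetic desorption rate, 1/min

        def rhs(t, y, c_in):
            """CSTR mass balance with one equilibrium site (Kd1) and one kinetic site."""
            C, S2 = y                                # dissolved Bq/L, kinetic-site Bq/kg
            kinetic = k_plus * C - k_minus * S2      # net uptake by the kinetic site
            dC = (Q * (c_in - C) - m * kinetic) / (V + m * Kd1)
            return [dC, kinetic]

        # Contamination stage (inlet at 100 Bq/L), then rinsing with clean solution.
        contam = solve_ivp(rhs, (0, 600), [0.0, 0.0], args=(100.0,), dense_output=True)
        rinse = solve_ivp(rhs, (600, 1200), contam.y[:, -1], args=(0.0,), dense_output=True)

        t1, t2 = np.linspace(0, 600, 100), np.linspace(600, 1200, 100)
        breakthrough = np.concatenate([contam.sol(t1)[0], rinse.sol(t2)[0]])
        print("outlet concentration at end of contamination: %.1f Bq/L" % breakthrough[99])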

    Systematic influences of gamma-ray spectrometry data near the decision threshold for radioactivity measurements in the environment

    No full text
    Several methods for reporting the outcomes of gamma-ray spectrometric measurements of environmental samples for dose calculations are presented and discussed. The measurement outcomes can be reported as primary measurement results, primary measurement results modified according to the quantification limit, best estimates obtained from the Bayesian posterior (ISO 11929), best estimates obtained by a procedure resembling shifting of the probability density distribution, or according to the procedure recommended by the European Commission (EC). The annual dose is calculated from the arithmetic average of the outcomes obtained with any of these five procedures. It was shown that the primary measurement results modified according to the quantification limit could lead to an underestimation of the annual dose, whereas the best estimates lead to an overestimation. The annual doses calculated from the measurement outcomes obtained with the EC's recommended procedure, which does not account for the uncertainties, fluctuate between under- and overestimation, depending on how many of the measurement results exceed the limit of detection. In the extreme case, when no measurement results above the detection limit occur, the average over the primary measurement results modified according to the quantification limit underestimates the average over the primary measurement results by about 80%. The average over the best estimates calculated according to the procedure resembling shifting overestimates the average over the primary measurement results by 35%, the average obtained from the Bayesian posterior by 85%, and the average obtained with the EC-recommended treatment by 89%. © 2016 Elsevier Ltd
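
    To make the reporting alternatives concrete, the Python sketch below compares annual averages of simulated weekly measurement results near the detection capability when (a) the primary results are kept as they are, (b) results below an assumed quantification limit are set to zero, (c) they are set to the limit itself, and (d) they are crudely truncated at zero as a stand-in for a best estimate. This is a deliberately simplified illustration, not the ISO 11929 or EC procedures themselves, and all numbers are assumptions.

        import numpy as np

        rng = np.random.default_rng(2)

        true_activity = 0.2    # Bq/kg, close to the detection capability (assumed)
        sigma = 0.15           # standard measurement uncertainty, Bq/kg (assumed)
        ql = 0.4               # assumed quantification limit, Bq/kg

        # 52 weekly primary measurement results; some may be negative or below ql.
        primary = true_activity + rng.normal(0.0, sigma, 52)
        below = primary < ql

        variants = {
            "primary results":      primary,
            "below QL set to zero": np.where(below, 0.0, primary),  # tends to underestimate
            "below QL set to QL":   np.where(below, ql, primary),   # tends to overestimate
            "truncated at zero":    np.clip(primary, 0.0, None),    # crude 'best estimate'
        }
        for name, values in variants.items():
            print("%-22s annual mean = %.3f Bq/kg" % (name, values.mean()))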

    Evaluating variability and uncertainty in radiological impact assessment using SYMBIOSE

    No full text
    SYMBIOSE is a modelling platform that accounts for variability and uncertainty in radiological impact assessments when simulating the environmental fate of radionuclides and assessing doses to human populations. The default database of SYMBIOSE is partly based on parameter values summarized in International Atomic Energy Agency (IAEA) documents. To characterize the uncertainty in the transfer parameters, 331 probability distribution functions (PDFs) were defined from the summary statistics provided in the IAEA documents (i.e. sample size, minimum and maximum values, arithmetic and geometric means, and standard and geometric standard deviations) and are made available as spreadsheet files. The methods used to derive the PDFs from these summary statistics alone, without the complete data sets, are presented. A simple case study then illustrates the use of the database in a second-order Monte Carlo calculation that separates parametric uncertainty from inter-individual variability. © 2014 Elsevier Ltd
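
    A minimal Python sketch of the two steps mentioned in this abstract is given below: building a lognormal PDF from a reported geometric mean and geometric standard deviation, and using it inside a second-order Monte Carlo loop in which the outer loop samples parametric uncertainty and the inner loop samples inter-individual variability. The transfer factor, intake rate, soil activity and the nominal 137Cs ingestion dose coefficient used here are illustrative assumptions, not values taken from the SYMBIOSE database.

        import numpy as np

        rng = np.random.default_rng(3)

        # (i) Lognormal PDF reconstructed from a reported geometric mean and GSD.
        gm, gsd = 0.06, 2.5                      # soil-to-plant transfer factor (assumed)
        mu, sigma = np.log(gm), np.log(gsd)

        # (ii) Second-order Monte Carlo: outer loop = parametric uncertainty,
        #      inner loop = inter-individual variability.
        n_outer, n_inner = 200, 5000
        soil_activity = 1.0e3                    # Bq/kg dry soil (assumed)
        dose_coeff = 1.3e-8                      # Sv per Bq ingested, nominal 137Cs adult value

        p95 = []
        for _ in range(n_outer):
            mu_u = rng.normal(mu, 0.10)          # assumed uncertainty on the PDF parameters
            sigma_u = abs(rng.normal(sigma, 0.05))
            tf = rng.lognormal(mu_u, sigma_u, n_inner)          # variability in transfer
            intake = rng.lognormal(np.log(20.0), 0.3, n_inner)  # kg/y leafy vegetables (assumed)
            dose = soil_activity * tf * intake * dose_coeff     # Sv per year
            p95.append(np.percentile(dose, 95))

        print("95th-percentile annual dose: median %.2e Sv, 90%% uncertainty band %s Sv"
              % (np.median(p95), np.percentile(p95, [5, 95])))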
