147 research outputs found

    Adoption des TIC, proximité et diffusion localisée des connaissances

    We use a specially designed survey of French firms (in the Haute-Savoie area) to provide empirical evidence that IT adoption is influenced not only by the traditional factors of technology diffusion (rank, stock-order and epidemic effects, and organizational practices) but also by localized knowledge spillovers. We make several advances. First, we study the adoption of genuine Information and Communication Technologies, whereas the recent empirical literature has focused mainly on computer capital stocks or automation tools. Second, we construct measures that replace the traditional epidemic effect with different proximity variables. Third, we examine different channels of knowledge transmission among nearby firms, from unintended knowledge spillovers to well-regulated arrangements. Our econometric methodology is designed to deal with the biases encountered when implementing technology adoption equations and when testing organizational complementarity. In particular, we explicitly address the problem of simultaneous technological choices by using bivariate adoption equations.
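Bivariate adoption equations of this kind are typically estimated as a bivariate probit: two latent adoption equations whose error terms are allowed to correlate. A minimal sketch on synthetic data, fitted by maximum likelihood with SciPy; the single covariate, coefficients, and sample size are made up for the example and do not come from the study:

```python
import numpy as np
from scipy.stats import multivariate_normal
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Synthetic data: two adoption decisions per firm, driven by one observed
# covariate ("firm size") plus correlated latent shocks -- the simultaneity
# a bivariate probit is meant to capture.
n = 300
size = rng.normal(size=n)
rho_true = 0.5
eps = rng.multivariate_normal([0.0, 0.0], [[1.0, rho_true], [rho_true, 1.0]], size=n)
y1 = (0.8 * size + eps[:, 0] > 0).astype(int)
y2 = (0.5 * size + eps[:, 1] > 0).astype(int)

# Sign trick: P(y1, y2) = Phi2(s1*x*b1, s2*x*b2; s1*s2*rho)
s1 = np.where(y1 == 1, 1.0, -1.0)
s2 = np.where(y2 == 1, 1.0, -1.0)

def neg_loglik(params):
    b1, b2, atanh_rho = params
    rho = np.tanh(atanh_rho)  # keeps the correlation inside (-1, 1)
    pts = np.column_stack([s1 * b1 * size, s2 * b2 * size])
    same = s1 * s2 > 0
    ll = 0.0
    for mask, r in ((same, rho), (~same, -rho)):
        p = multivariate_normal.cdf(pts[mask], mean=[0.0, 0.0],
                                    cov=[[1.0, r], [r, 1.0]])
        ll += np.sum(np.log(np.clip(p, 1e-300, None)))
    return -ll

res = minimize(neg_loglik, x0=[0.1, 0.1, 0.0], method="Nelder-Mead",
               options={"xatol": 1e-3, "fatol": 1e-3})
b1_hat, b2_hat, rho_hat = res.x[0], res.x[1], np.tanh(res.x[2])
print(f"b1 ~ {b1_hat:.2f}, b2 ~ {b2_hat:.2f}, rho ~ {rho_hat:.2f}")
```

The hyperbolic-tangent reparameterisation is a standard way of keeping the error correlation inside its valid range during unconstrained optimisation.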

    Diagnosis and impacts of non-Gaussianity of innovations in data assimilation

    The Best Linear Unbiased Estimator (BLUE) has been widely used in atmospheric and oceanic data assimilation (DA). However, when the errors in the data (observations and background forecasts) have non-Gaussian probability density functions (pdfs), the BLUE differs from the absolute Minimum Variance Unbiased Estimator (MVUE), which minimizes the mean square a posteriori error. The non-Gaussianity of errors can be due to the inherent statistical skewness and positiveness of some physical observables (e.g. moisture, chemical species) or to the nonlinearity of the data assimilation models and observation operators acting on Gaussian errors. Non-Gaussianity of assimilated data errors can be justified from a priori hypotheses or inferred from statistical diagnostics of innovations (observation minus background). Following this rationale, we compute measures of innovation non-Gaussianity, namely its skewness and kurtosis, relating them to: a) the non-Gaussianity of the individual errors themselves, b) the correlation between nonlinear functions of errors, and c) the heteroscedasticity of errors within diagnostic samples. Those relationships impose bounds on the skewness and kurtosis of errors which depend critically on the error variances, thus requiring a tuning of error variances in order to achieve consistency with innovations. We evaluate the sub-optimality of the BLUE compared with the MVUE, in terms of excess error variance, in the presence of non-Gaussian errors. The error pdfs are obtained by the maximum entropy method constrained by error moments up to fourth order, from which the Bayesian probability density function and the MVUE are computed. The impact is higher for skewed extreme innovations and grows on average with the skewness of the data errors, especially if those skewnesses have the same sign.
    The method has been applied to the quality-accepted ECMWF innovations of brightness temperatures for a set of High Resolution Infrared Sounder (HIRS) channels. In this context, the MVUE has led in some extreme cases to a potential 20-60% reduction in error variance compared with the BLUE.
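The innovation diagnostics described above reduce to computing sample skewness and excess kurtosis of observation-minus-background values. A minimal illustration on synthetic innovations, assuming a Gaussian background error and a skewed (shifted exponential) observation error, both made up for the example:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Synthetic innovations: observation minus background. A Gaussian background
# error combined with a skewed observation error yields non-Gaussian
# innovations, which the moment diagnostics should detect.
background_error = rng.normal(0.0, 1.0, size=10_000)
observation_error = rng.exponential(1.0, size=10_000) - 1.0  # zero-mean, skewed
innovations = observation_error - background_error

skew = stats.skew(innovations)
excess_kurt = stats.kurtosis(innovations)  # Fisher definition: 0 for a Gaussian

print(f"skewness        = {skew:.3f}")
print(f"excess kurtosis = {excess_kurt:.3f}")
```

For this mixture the theoretical values are a skewness of 2/2^1.5 (about 0.71) and an excess kurtosis of 6/4 = 1.5, both clearly away from the Gaussian value of zero.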

    Estimation of errors in the inverse modeling of accidental release of atmospheric pollutant: Application to the reconstruction of the cesium-137 and iodine-131 source terms from the Fukushima Daiichi power plant

    A major difficulty when inverting the source term of an atmospheric tracer dispersion problem is the estimation of the prior errors: those of the atmospheric transport model, those ascribed to the representativeness of the measurements, those that are instrumental, and those attached to the prior knowledge of the variables one seeks to retrieve. In the case of an accidental release of pollutant, the reconstructed source is sensitive to these assumptions. This sensitivity makes the quality of the retrieval dependent on the methods used to model and estimate the prior errors of the inverse modeling scheme. We propose an estimation method for the error amplitudes based on the maximum likelihood principle. Under semi-Gaussian assumptions, it takes into account, without approximation, the positivity assumption on the source. We apply the method to the estimation of the Fukushima Daiichi source term using activity concentrations in the air. The results are compared with an L-curve estimation technique and with Desroziers's scheme. The total reconstructed activities depend significantly on the chosen method. Because of the poor observability of the Fukushima Daiichi emissions, these methods provide lower bounds for the reconstructed cesium-137 and iodine-131 activities. These lower-bound estimates, 1.2 × 10¹⁶ Bq for cesium-137, with an estimated standard deviation range of 15-20%, and 1.9-3.8 × 10¹⁷ Bq for iodine-131, with an estimated standard deviation range of 5-10%, are of the same order of magnitude as those provided by the Japanese Nuclear and Industrial Safety Agency and about 5 to 10 times less than the Chernobyl atmospheric releases.
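The positivity assumption on the source can be illustrated with a toy inversion: a non-negative least-squares fit of release rates against synthetic observations. The source-receptor matrix, release profile, and noise level below are random stand-ins for the actual transport-model sensitivities, not the paper's setup:

```python
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(1)

# Synthetic source-receptor (Jacobian) matrix H: each row maps the hourly
# release rates to one activity-concentration measurement.
n_obs, n_source = 40, 10
H = rng.uniform(0.0, 1.0, size=(n_obs, n_source))

true_source = np.zeros(n_source)
true_source[3:6] = [5.0, 8.0, 2.0]                        # short positive release episode
y = H @ true_source + rng.normal(0.0, 0.1, size=n_obs)    # noisy observations

# Non-negative least squares enforces the positivity of the source term
# directly in the inversion, instead of truncating a signed solution.
retrieved, residual_norm = nnls(H, y)
print(np.round(retrieved, 2))
```

The non-negativity constraint tends to zero out the quiet hours exactly, which an unconstrained least-squares solution would instead fill with small spurious negative and positive rates.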

    Towards the operational application of inverse modelling for the source identification and plume forecast of an accidental release of radionuclides

    In the event of an accidental atmospheric release of radionuclides from a nuclear power plant, accurate real-time forecasting of the activity concentrations of radionuclides is required by decision makers for the preparation of adequate countermeasures. Yet the accuracy of the forecast plume depends strongly on the source term estimation. Inverse modelling and data assimilation techniques should help in that respect. In this presentation, a semi-automatic method is proposed for the sequential reconstruction of the plume, implementing a sequential data assimilation algorithm based on inverse modelling, with care taken to develop realistic methods for operational risk agencies. The performance of the assimilation scheme has been assessed through an intercomparison between French and Finnish frameworks. Three dispersion models have been used: Polair3D, with or without plume-in-grid, both developed at CEREA, and SILAM, developed at FMI. Different release locations, as well as different meteorological situations, are tested. The existing and newly planned surveillance networks are used, and realistically large observational errors are assumed. Statistical indicators to evaluate the efficiency of the method are presented and the results are discussed. In addition, for the case where the power plant responsible for the accidental release is not known, robust statistical tools are developed and tested to discriminate between candidate release sites.
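Sequential reconstruction of this kind can be caricatured as a regularised least-squares inversion refreshed each time a batch of observations arrives. Everything below (batch sizes, noise level, regularisation weight) is an assumed toy setup, not the operational configuration:

```python
import numpy as np

rng = np.random.default_rng(4)

# Toy sequential inversion: observations arrive in batches and the
# release-rate estimate is refreshed after each batch using all data so far.
n_source = 6
true_source = np.array([0.0, 4.0, 7.0, 3.0, 0.0, 0.0])
alpha = 0.1  # Tikhonov weight, standing in for the prior (background) term

A = alpha * np.eye(n_source)   # accumulated normal-equation matrix H^T H + alpha*I
b = np.zeros(n_source)         # accumulated right-hand side H^T y

for batch in range(5):
    H = rng.uniform(0.0, 1.0, size=(20, n_source))        # new sensitivities
    y = H @ true_source + rng.normal(0.0, 0.3, size=20)   # new observations
    A += H.T @ H                                          # cheap sequential update
    b += H.T @ y
    estimate = np.linalg.solve(A, b)
    print(f"after batch {batch + 1}: {np.round(estimate, 1)}")
```

Accumulating the normal equations means each refresh costs only one small solve, which is the property a sequential scheme needs to run in real time.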

    Towards the operational estimation of a radiological plume using data assimilation after a radiological accidental atmospheric release

    In the event of an accidental atmospheric release of radionuclides from a nuclear power plant, accurate real-time forecasting of the activity concentrations of radionuclides is required by decision makers for the preparation of adequate countermeasures. The accuracy of the forecast plume depends strongly on the source term estimation. On several academic test cases, including real data, inverse modelling and data assimilation techniques have proven helpful in the assessment of the source term. In this paper, a semi-automatic method is proposed for the sequential reconstruction of the plume, implementing a sequential data assimilation algorithm based on inverse modelling, with care taken to develop realistic methods for operational risk agencies. The performance of the assimilation scheme has been assessed through an intercomparison between French and Finnish frameworks. Two dispersion models have been used: Polair3D and SILAM, developed in two different research centres. Different release locations, as well as different meteorological situations, are tested. The existing and newly planned surveillance networks are used, and realistically large multiplicative observational errors are assumed. The inverse modelling scheme accounts for the strong error bias encountered with such errors. The efficiency of the data assimilation system is tested via statistical indicators. For France and Finland, the average performance of the data assimilation system is strong. However, there are outlying situations where the inversion fails because observability is too poor. In addition, for the case where the power plant responsible for the accidental release is not known, robust statistical tools are developed and tested to discriminate between candidate release sites.
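Discriminating candidate release sites can be sketched as choosing, among candidate source-receptor columns, the one whose best-fitting non-negative release rate leaves the smallest residual. The matrix, rates, and noise level below are synthetic assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(5)

# One column of synthetic sensitivities per candidate site: column j predicts
# the observations if the release came from site j at unit rate.
n_obs, n_sites = 60, 4
H = rng.uniform(0.0, 1.0, size=(n_obs, n_sites))

true_site, true_rate = 2, 5.0
y = true_rate * H[:, true_site] + rng.normal(0.0, 0.2, size=n_obs)

residuals = []
for j in range(n_sites):
    h = H[:, j]
    rate = max(h @ y / (h @ h), 0.0)        # least-squares rate, kept non-negative
    residuals.append(np.sum((y - rate * h) ** 2))

best = int(np.argmin(residuals))
print(f"selected site: {best}, residuals: {np.round(residuals, 1)}")
```

With well-separated sensitivities the residual at the true site is driven by observation noise alone, while wrong sites leave most of the signal unexplained, so the gap between candidates is large.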

    Estimation of the caesium-137 source term from the Fukushima Daiichi nuclear power plant using a consistent joint assimilation of air concentration and deposition observations

    Inverse modelling techniques can be used to estimate the amount of radionuclides and the temporal profile of the source term released into the atmosphere during the accident at the Fukushima Daiichi nuclear power plant in March 2011. In Winiarek et al. (2012b), the lower bounds of the caesium-137 and iodine-131 source terms were estimated with such techniques, using activity concentration measurements. The importance of an objective assessment of prior errors (the observation errors and the background errors) was emphasised for a reliable inversion. In such a critical context, where the meteorological conditions can make the source term partly unobservable and where only a few observations are available, such prior estimation techniques are mandatory, since the retrieved source term is very sensitive to this estimation. We propose to extend the use of these techniques to the estimation of prior errors when assimilating observations from several data sets. The aim is to compute an estimate of the caesium-137 source term jointly using all available data for this radionuclide: activity concentrations in the air, but also daily fallout measurements and total cumulated fallout measurements. It is crucial to estimate properly and simultaneously the background errors and the prior errors relative to each data set. A proper estimation of prior errors is also a necessary condition for a reliable estimate of the a posteriori uncertainty of the retrieved source term. Using such techniques, we retrieve a total released quantity of caesium-137 in the interval 11.6-19.3 PBq, with an estimated standard deviation range of 15-20% depending on the method and the data sets. The "blind" time intervals of the source term have also been strongly reduced compared with the first estimations based only on activity concentration data.

    Embracing the Unreliability of Memory Devices for Neuromorphic Computing

    The emergence of resistive non-volatile memories opens the way to highly energy-efficient computation near or in memory. However, this type of computation is not compatible with conventional error-correcting codes (ECC) and has to cope with device unreliability. Inspired by the architecture of animal brains, we present a manufactured differential hybrid CMOS/RRAM memory architecture, suitable for neural network implementation, that functions without formal ECC. We also show that using low-energy but error-prone programming conditions only slightly reduces network accuracy.
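The differential principle can be sketched in a few lines: each bit is stored in a pair of devices and read as the sign of their conductance difference, which tolerates far more device variability than a single-device read against a fixed threshold. The conductance levels and spread below are illustrative assumptions, not the paper's measured values:

```python
import numpy as np

rng = np.random.default_rng(3)

# Each synaptic bit is stored as a pair of devices (G_plus, G_minus) and read
# as sign(G_plus - G_minus). Device-to-device variability is modelled as
# Gaussian noise on the programmed conductances.
n_bits = 100_000
bits = rng.integers(0, 2, size=n_bits)

G_HIGH, G_LOW, SIGMA = 100e-6, 10e-6, 25e-6   # assumed conductances (siemens)

g_plus = np.where(bits == 1, G_HIGH, G_LOW) + rng.normal(0.0, SIGMA, n_bits)
g_minus = np.where(bits == 1, G_LOW, G_HIGH) + rng.normal(0.0, SIGMA, n_bits)

# Differential read: compare the two devices of each pair.
diff_read = (g_plus > g_minus).astype(int)
bit_error_rate = np.mean(diff_read != bits)

# Single-ended read of one device against a fixed mid-point threshold.
single_read = (g_plus > (G_HIGH + G_LOW) / 2).astype(int)
single_error_rate = np.mean(single_read != bits)

print(f"differential BER ~ {bit_error_rate:.4f}")
print(f"single-ended BER ~ {single_error_rate:.4f}")
```

The differential read doubles the signal separation for only a sqrt(2) increase in noise, so its raw bit error rate is markedly lower at the same device variability.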

    Barbery – Les Tuileries de Barbery

    Written documentation from the late Middle Ages and the early modern period concerning buildings in the Caen region regularly mentions the use of Barbery tile as a roofing material. In the 16th century, in his Recherches et antiquités, Charles de Bourgueville thus cites "les tuileries de Barbery" as one of the important activities near Caen. Yet almost nothing is known about the chronology of these establishments, the nature of their production, or their modes of commercia…