
    Dynamic Data-Driven Event Reconstruction for Atmospheric Releases

    Accidental or terrorist releases of hazardous materials into the atmosphere can impact large populations and cause significant loss of life or property damage. Plume predictions have been shown to be extremely valuable in guiding an effective and timely response. The two greatest sources of uncertainty in the prediction of the consequences of hazardous atmospheric releases result from poorly characterized source terms and lack of knowledge about the state of the atmosphere as reflected in the available meteorological data. We have developed a new event reconstruction methodology that provides probabilistic source term estimates from field measurement data for both accidental and clandestine releases. Accurate plume dispersion prediction requires the following questions to be answered: What was released? When was it released? How much material was released? Where was it released? To answer these questions, we developed a dynamic-data-driven event reconstruction capability that couples data and predictive methods through Bayesian inference to obtain a solution to this inverse problem. The solution consists of a probability distribution of unknown source term parameters. For consequence assessment, we then use this probability distribution to construct a 'composite' forward plume prediction that accounts for the uncertainties in the source term. Since in most cases of practical significance it is impossible to find a closed form solution, Bayesian inference is accomplished by utilizing stochastic sampling methods. This approach takes into consideration both measurement and forward model errors and thus incorporates all the sources of uncertainty in the solution to the inverse problem. Stochastic sampling methods have the additional advantage of being suitable for problems characterized by a non-Gaussian distribution of source term parameters and for cases in which the underlying dynamical system is nonlinear.
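The stochastic-sampling idea above can be sketched in a few lines. The example below is a minimal random-walk Metropolis-Hastings sampler for a single source-term parameter (the release rate Q); the linear forward model, the sensor "footprint" kernel, the noise level, and all numerical values are illustrative assumptions, not the operational INPUFF/LODI setup described in the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical linear forward model: the predicted concentration at
# sensor i is the release rate Q times a fixed plume footprint k_i.
k = np.array([0.8, 0.5, 0.3, 0.1])    # assumed plume kernel per sensor
Q_true = 100.0                         # "unknown" true release rate
sigma = 2.0                            # assumed measurement noise std
obs = Q_true * k + rng.normal(0.0, sigma, size=k.size)

def log_posterior(Q):
    """Flat prior on Q >= 0 with a Gaussian measurement-error likelihood."""
    if Q < 0:
        return -np.inf
    resid = obs - Q * k
    return -0.5 * np.sum(resid**2) / sigma**2

# Random-walk Metropolis-Hastings over the single parameter Q.
samples = []
Q = 50.0                               # arbitrary starting guess
lp = log_posterior(Q)
for _ in range(20000):
    Q_prop = Q + rng.normal(0.0, 5.0)  # symmetric proposal step
    lp_prop = log_posterior(Q_prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        Q, lp = Q_prop, lp_prop        # accept the proposed move
    samples.append(Q)

posterior = np.array(samples[5000:])   # discard burn-in
print(f"posterior mean Q = {posterior.mean():.1f}")
```

The retained samples approximate the posterior distribution of Q, which is exactly the object the 'composite' forward plume prediction is built from; in the real system the one-line forward model would be replaced by a full dispersion code and Q by the multi-parameter source term (what, when, where, how much).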
We initially developed a Markov Chain Monte Carlo (MCMC) stochastic methodology and demonstrated its effectiveness by reconstructing a wide range of release scenarios, using synthetic as well as real-world data. Data for evaluation of our event reconstruction capability were drawn from the short-range Prairie Grass, Copenhagen, and Joint Urban 2003 field experiments and a continental-scale real-world accidental release in Algeciras, Spain. The method was tested using a variety of forward models, including a Gaussian puff dispersion model INPUFF, the regional-to-continental scale Lagrangian dispersion model LODI (the work-horse real-time operational dispersion model used by the National Atmospheric Release Advisory Center), the empirical urban model UDM, and the building-scale computational fluid dynamics code FEM3MP. The robustness of the Bayesian methodology was demonstrated via the use of subsets of the available concentration data and by introducing error into some of the measurements (Fig. 1). These tests showed that the Bayesian approach is capable of providing reliable estimates of source characteristics even in cases of limited or significantly corrupted data. An example of an urban release scenario is shown in Fig. 2. For more effective treatment of strongly time-dependent problems, we developed a Sequential Monte Carlo (SMC) approach. To achieve the best performance under a wide range of conditions we combined SMC and MCMC sampling into a hybrid methodology. We compared the effectiveness and advantages of this approach relative to MCMC using a set of synthetic data examples. We created a modular, scalable computational framework to accommodate the full set of stochastic methodologies (e.g., MCMC, SMC, hybrid stochastic algorithms, 'Green's function', 'reciprocal' methods), as well as a selection of key classes of dispersion models. 
This design provides a clear separation of stochastic algorithms from predictive models and supports parallelization at both the stochastic-algorithm and individual-model levels. In other words, it supports a parallel stochastic algorithm (e.g., SMC) that invokes parallel forward models. The framework is written in Python and utilizes pyMPI. It invokes forward models either through system calls or as shared objects. Our dynamic-data-driven event reconstruction capability seamlessly integrates observational data streams with predictive models in order to provide the best possible estimates of unknown source-term parameters, as well as optimal and timely situation analyses consistent with both models and data. This new methodology is shown to be both flexible and robust, adaptable for use with any atmospheric dispersion model, and suitable for use in operational emergency response applications.
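The "system call" coupling style mentioned above can be illustrated as follows. This is only a sketch: the stand-in forward model (a one-line script executed via the Python interpreter) and the JSON parameter format are assumptions standing in for a real dispersion code such as INPUFF or LODI and its actual input/output files.

```python
import json
import subprocess
import sys

# Stand-in "external forward model": reads source-term parameters on
# stdin as JSON and writes predicted concentrations to stdout.
STAND_IN_MODEL = (
    "import json,sys;"
    "p=json.load(sys.stdin);"
    "print(json.dumps([p['Q']*k for k in p['kernel']]))"
)

def run_forward_model(params):
    """Invoke the external model as a subprocess and parse its output."""
    result = subprocess.run(
        [sys.executable, "-c", STAND_IN_MODEL],
        input=json.dumps(params),
        capture_output=True, text=True, check=True,
    )
    return json.loads(result.stdout)

pred = run_forward_model({"Q": 100.0, "kernel": [0.8, 0.5, 0.3, 0.1]})
print(pred)  # → [80.0, 50.0, 30.0, 10.0]
```

Because the stochastic driver only sees a function from parameters to predicted observations, any dispersion model with a command-line interface can be swapped in without touching the sampling code, which is the separation of concerns the framework design describes.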

    Crystal structure of 4-bromo-2-(1H-pyrazol-3-yl)phenol, C9H7BrN2O (vol 232, pg 507, 2017)

    C9H7BrN2O, monoclinic, C2/c (no. 15), a = 16.255(3) Å, b = 4.4119(9) Å, c = 25.923(5) Å, β = 107.99(3)°, V = 1768.2(7) Å³, Z = 8, R_gt(F) = 0.0450, wR_ref(F²) = 0.0960, T = 150 K
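The reported cell parameters are internally consistent, which can be checked with the standard monoclinic volume formula V = a·b·c·sin β (the formula is textbook crystallography; the numbers are those quoted above).

```python
import math

# Consistency check on the reported monoclinic unit cell: V = a*b*c*sin(beta).
a, b, c = 16.255, 4.4119, 25.923   # cell edges in angstroms, from the abstract
beta = math.radians(107.99)        # monoclinic angle

V = a * b * c * math.sin(beta)
print(f"V = {V:.1f} A^3")          # matches the reported 1768.2(7) A^3
```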

    Convectively Induced Secondary Circulations in Fine-Grid Mesoscale Numerical Weather Prediction Models

    Mesoscale numerical weather prediction models using fine-grid [O(1) km] meshes for weather forecasting, environmental assessment, and other applications capture aspects of larger-than-grid-mesh-size, convectively induced secondary circulations (CISCs) such as cells and rolls that occur in the convective planetary boundary layer (PBL). However, 1-km grid spacing is too large for the simulation of the interaction of CISCs with smaller-scale turbulence. The existence of CISCs also violates the neglect of horizontal gradients of turbulent quantities in current PBL schemes. Both aspects—poorly resolved CISCs and a violation of the assumptions behind PBL schemes—are examples of what occurs in Wyngaard’s “terra incognita,” where horizontal grid spacing is comparable to the scale of the simulated motions. Thus, model CISCs (M-CISCs) cannot be simulated reliably. This paper describes how the superadiabatic layer in the lower convective PBL, together with increased horizontal resolution, allows the critical Rayleigh number to be exceeded and thus allows generation of M-CISCs like those in nature; and how the M-CISCs eventually neutralize the virtual temperature stratification, lowering the Rayleigh number and stopping their growth. Two options for removing M-CISCs while retaining their fluxes are 1) introducing nonlocal closure schemes for more effective removal of heat from the surface and 2) restricting the effective Rayleigh number to remain subcritical. It is demonstrated that CISCs are correctly handled by large-eddy simulation (LES) and thus may provide a way to improve representation of them or their effects. For some applications, it may suffice to allow M-CISCs to develop, but account for their shortcomings during interpretation.
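The onset mechanism described above can be illustrated with the classical dry-convection Rayleigh number, Ra = (g/T₀)·ΔT·h³/(ν·κ). All values below are assumed order-of-magnitude boundary-layer numbers, not figures from the paper, and the viscosity/diffusivity are the model's effective eddy coefficients rather than molecular values.

```python
# Illustrative Rayleigh-number check for the M-CISC onset mechanism.
g = 9.81           # m s^-2, gravitational acceleration
T0 = 300.0         # K, assumed reference virtual temperature
dT = 1.0           # K, assumed superadiabatic virtual-temperature excess
h = 1000.0         # m, assumed convective boundary-layer depth
nu = kappa = 10.0  # m^2 s^-1, assumed eddy viscosity/diffusivity

Ra = (g / T0) * dT * h**3 / (nu * kappa)
Ra_crit = 657.5    # classical critical value (free-free boundaries)

print(f"Ra = {Ra:.0f}, supercritical: {Ra > Ra_crit}")
```

With these assumed values Ra exceeds the critical value by orders of magnitude, consistent with the paper's point that once the superadiabatic layer is resolved, convection-like M-CISCs are free to grow until they neutralize the stratification; the second proposed remedy amounts to forcing this effective Ra back below Ra_crit.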