Solving the Fokker-Planck equation on a massively parallel computer
The Fokker-Planck package FPPAC has been converted to the Connection Machine 2 (CM2). For fine-mesh cases the CM2 outperforms the Cray-2 in time-integrating the difference equations, and for long Legendre expansions the CM2 is also faster at computing the Fokker-Planck coefficients.
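The data-parallel structure that favors a machine like the CM2 is visible even in a minimal time-integration step. The sketch below advances a generic 1-D conservative finite-difference Fokker-Planck equation with vectorized NumPy operations; it is illustrative only, does not reproduce FPPAC's actual 2-D, multi-species scheme, and the names f, A, D, dv, and dt are assumptions.

```python
# Hedged sketch: one explicit, vectorized time step for a 1-D conservative
# Fokker-Planck equation df/dt = d/dv [ A(v) f + d/dv ( D(v) f ) ].
# FPPAC's actual implicit, multi-species scheme is not reproduced here;
# f, A, D, dv, and dt are illustrative names and values.
import numpy as np

def fp_step(f, A, D, dv, dt):
    """Advance f by one explicit step; every operation is elementwise or a
    stencil over the whole mesh, which maps well to a data-parallel machine
    such as the CM2."""
    flux = A * f + np.gradient(D * f, dv)   # A f + d(Df)/dv
    return f + dt * np.gradient(flux, dv)   # add dt * d(flux)/dv

v = np.linspace(0.0, 5.0, 256)
f = np.exp(-v**2)                  # Maxwellian-like initial distribution
A, D = -v, np.ones_like(v)         # toy drag and diffusion coefficients
dv = v[1] - v[0]
for _ in range(100):
    f = fp_step(f, A, D, dv, dt=1e-4)
```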
Novel symmetric and asymmetric plasmonic nanostructures
Metal-dielectric nanostructures capable of supporting electromagnetic resonances at optical frequencies are the vital components of the emerging technology called plasmonics. A plasmon is an electromagnetic wave confined at a metal-dielectric interface, which can couple efficiently to external electromagnetic excitation at wavelengths much larger than the geometric size of the supporting structure. Plasmonics can improve virtually any electromagnetic technology by providing subwavelength waveguides, field-enhancing and field-concentrating structures, and nanometer-scale wavelength-selective components. The focus of this work is the fabrication, characterization, and modeling of novel plasmonic nanostructures. Effects of symmetry in plasmonic structures are studied. Symmetric metal nanoparticle clusters have been investigated and show highly tunable plasmon resonances with high sensitivity to the dielectric environment. Efficient, highly scalable methods for nanoparticle self-assembly and for controlled partial submicron metal-sphere coatings are developed. These partially Au-coated dielectric spheres show striking properties such as high tunability and control over resonant electromagnetic field enhancement and scattering direction; these effects are of vital importance for plasmonics applications. Optical resonances in metal-dielectric nanostructures were correlated with LC-circuit resonances, elaborating on resonance tunability, dielectric-environment, symmetry-breaking, and mode-coupling (Fano resonance) effects.
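The LC-circuit correlation mentioned above can be illustrated with the textbook resonance formula omega_0 = 1/sqrt(LC): raising the permittivity of the surrounding dielectric raises the effective capacitance and lowers the resonant frequency, the circuit analogue of the dielectric-environment sensitivity reported here. The values below are arbitrary placeholders, not fitted to any measured structure.

```python
# Hedged illustration of the LC-circuit picture of a plasmon resonance.
# L and C are arbitrary placeholder values, not extracted from experiment.
import math

def lc_resonance(L, C):
    """Angular resonant frequency of an ideal LC circuit."""
    return 1.0 / math.sqrt(L * C)

L = 1e-19                             # effective inductance (H), illustrative
for eps_rel in (1.0, 1.5, 2.0):       # relative permittivity of environment
    C = eps_rel * 1e-18               # capacitance scales with permittivity
    print(f"eps_rel={eps_rel}: omega_0={lc_resonance(L, C):.3e} rad/s")
```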
Climate Modeling using High-Performance Computing
The Center for Applied Scientific Computing (CASC) and the LLNL Climate and Carbon Science Group of Energy and Environment (E&E) are working together to improve predictions of future climate by applying the best available computational methods and computer resources to this problem. Over the last decade, researchers at the Lawrence Livermore National Laboratory (LLNL) have developed a number of climate models that provide state-of-the-art simulations on a wide variety of massively parallel computers. We are now developing and applying a second generation of high-performance climate models
Dynamic Data-Driven Event Reconstruction for Atmospheric Releases
Accidental or terrorist releases of hazardous materials into the atmosphere can impact large populations and cause significant loss of life or property damage. Plume predictions have been shown to be extremely valuable in guiding an effective and timely response. The two greatest sources of uncertainty in predicting the consequences of a hazardous atmospheric release are poorly characterized source terms and limited knowledge of the state of the atmosphere as reflected in the available meteorological data. We have developed a new event reconstruction methodology that provides probabilistic source-term estimates from field measurement data for both accidental and clandestine releases. Accurate plume dispersion prediction requires the following questions to be answered: What was released? When was it released? How much material was released? Where was it released? We have developed a dynamic-data-driven event reconstruction capability that couples data and predictive methods through Bayesian inference to obtain a solution to this inverse problem. The solution consists of a probability distribution of the unknown source-term parameters. For consequence assessment, we then use this probability distribution to construct a 'composite' forward plume prediction that accounts for the uncertainties in the source term.

Since in most cases of practical significance no closed-form solution exists, Bayesian inference is accomplished with stochastic sampling methods. This approach takes into consideration both measurement and forward-model errors and thus incorporates all sources of uncertainty in the solution to the inverse problem. Stochastic sampling methods have the additional advantage of being suitable for problems with non-Gaussian distributions of source-term parameters and for cases in which the underlying dynamical system is nonlinear. We initially developed a Markov chain Monte Carlo (MCMC) stochastic methodology and demonstrated its effectiveness by reconstructing a wide range of release scenarios using synthetic as well as real-world data. Data for evaluating our event reconstruction capability were drawn from the short-range Prairie Grass, Copenhagen, and Joint Urban 2003 field experiments and a continental-scale real-world accidental release in Algeciras, Spain. The method was tested with a variety of forward models, including the Gaussian puff dispersion model INPUFF, the regional-to-continental-scale Lagrangian dispersion model LODI (the workhorse real-time operational dispersion model used by the National Atmospheric Release Advisory Center), the empirical urban model UDM, and the building-scale computational fluid dynamics code FEM3MP. The robustness of the Bayesian methodology was demonstrated by using subsets of the available concentration data and by introducing error into some of the measurements (Fig. 1). These tests showed that the Bayesian approach can provide reliable estimates of source characteristics even with limited or significantly corrupted data. An example of an urban release scenario is shown in Fig. 2. For more effective treatment of strongly time-dependent problems, we developed a Sequential Monte Carlo (SMC) approach, and to achieve the best performance under a wide range of conditions we combined SMC and MCMC sampling into a hybrid methodology. We compared the effectiveness and advantages of this approach relative to MCMC using a set of synthetic data examples.
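As a rough illustration of the MCMC step described above, the following sketch runs a random-walk Metropolis sampler over a toy source term (location and release rate) against noisy synthetic sensor data. The forward model is a stand-in inverse-square decay, not INPUFF, LODI, UDM, or FEM3MP, and every name, prior, and noise level here is an assumption.

```python
# Hedged sketch of Bayesian source-term reconstruction via random-walk
# Metropolis MCMC. The 'forward' model, sensor layout, priors, and noise
# levels are all illustrative placeholders.
import numpy as np

rng = np.random.default_rng(0)
sensors = np.array([[0.5, 0.0], [1.0, 1.0], [-1.0, 0.5]])

def forward(x0, q):
    """Toy dispersion model: concentration ~ q / (distance^2 + eps)."""
    d2 = np.sum((sensors - x0) ** 2, axis=1)
    return q / (d2 + 0.1)

true_c = forward(np.array([0.2, 0.3]), 2.0)
obs = true_c + rng.normal(0.0, 0.05, true_c.shape)  # noisy measurements

def log_post(theta):
    """Gaussian measurement likelihood with a flat, positivity-constrained
    prior on the release rate q."""
    x0, q = theta[:2], theta[2]
    if q <= 0:
        return -np.inf
    resid = obs - forward(x0, q)
    return -0.5 * np.sum(resid**2) / 0.05**2

theta = np.array([0.0, 0.0, 1.0])   # initial guess for (x, y, q)
samples = []
for _ in range(5000):
    prop = theta + rng.normal(0.0, 0.05, 3)          # random-walk proposal
    if np.log(rng.random()) < log_post(prop) - log_post(theta):
        theta = prop                                 # accept
    samples.append(theta)
samples = np.array(samples)   # posterior distribution over (x, y, q)
```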
We created a modular, scalable computational framework to accommodate the full set of stochastic methodologies (e.g., MCMC, SMC, hybrid stochastic algorithms, 'Green's function' and 'reciprocal' methods), as well as a selection of key classes of dispersion models. This design cleanly separates the stochastic algorithms from the predictive models and supports parallelization at both the stochastic-algorithm and individual-model levels; in other words, a parallel stochastic algorithm (e.g., SMC) can itself invoke parallel forward models. The framework is written in Python, utilizes pyMPI, and invokes forward models either through system calls or as shared objects. Our dynamic-data-driven event reconstruction capability seamlessly integrates observational data streams with predictive models to provide the best possible estimates of unknown source-term parameters, as well as optimal and timely situation analyses consistent with both models and data. This new methodology is shown to be both flexible and robust, adaptable for use with any atmospheric dispersion model, and suitable for use in operational emergency response applications.
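A minimal sketch of the separation of concerns described above: the sampler sees a dispersion model only as a callable, so MCMC, SMC, or hybrid algorithms can drive different forward codes interchangeably. The class and function names are illustrative and do not reflect the framework's actual API; 'plume_model' is a hypothetical executable assumed to read parameters from its arguments and print a single predicted value.

```python
# Hedged sketch of the modular design: the stochastic algorithm treats the
# dispersion model as an opaque callable, so samplers and forward models can
# be swapped independently. All names are illustrative, not the framework's
# actual API.
import subprocess
from typing import Callable, Sequence

class ExternalDispersionModel:
    """Forward model invoked through a system call, mirroring the
    framework's system-call mode for external dispersion codes."""
    def __init__(self, executable: str):
        self.executable = executable

    def __call__(self, params: Sequence[float]) -> float:
        result = subprocess.run(
            [self.executable, *map(str, params)],
            capture_output=True, text=True, check=True,
        )
        return float(result.stdout)   # assumed single-value output

def run_sampler(log_posterior: Callable[[Sequence[float]], float],
                n_steps: int) -> None:
    """Placeholder for any stochastic algorithm (MCMC, SMC, or hybrid);
    it only ever sees log_posterior, never the model internals."""
    ...

# Wiring: the model enters the posterior through a closure, keeping the
# sampler and the dispersion code fully decoupled.
model = ExternalDispersionModel("./plume_model")   # hypothetical executable
log_posterior = lambda params: -0.5 * model(params) ** 2   # toy likelihood
```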
A scalable implementation of a finite-volume dynamical core in the Community Atmosphere Model
A distributed-memory, message-passing parallel implementation of a finite-volume discretization of the primitive equations in the Community Atmosphere Model is presented. Due to the data dependencies resulting from the polar singularity of the latitude-longitude coordinate system, we employ two separate domain decompositions within the dynamical core: one in latitude/level space and the other in longitude/latitude space. This requires that the data be periodically redistributed between these two decompositions. In addition, the domains contain halo regions that cover the nearest-neighbor data dependencies. A combination of several techniques, such as one-sided communication and multithreading, is presented to optimize data movement. The resulting algorithm is shown to scale to very large machine configurations, even at relatively coarse resolutions.
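The two decompositions and the redistribution between them can be pictured with a serial NumPy stand-in: a (longitude, latitude, level) field is split across ranks either as level slabs or as latitude bands, and switching between phases of the dynamical core requires a global reshuffle that, in the real code, is an MPI transpose. The array names and sizes below are illustrative.

```python
# Hedged, serial illustration of the two domain decompositions. The
# "redistribution" here is a pure-NumPy reshuffle on one process; in CAM it
# is a distributed-memory message-passing transpose.
import numpy as np

nlon, nlat, nlev, nranks = 8, 6, 4, 2
field = np.arange(nlon * nlat * nlev, dtype=float).reshape(nlon, nlat, nlev)

# Decomposition 1 (latitude/level): each rank owns all longitudes for a
# contiguous slab of levels.
yz_blocks = [field[:, :, r * nlev // nranks:(r + 1) * nlev // nranks]
             for r in range(nranks)]

# Redistribute to decomposition 2 (longitude/latitude): each rank owns all
# levels for a contiguous band of latitudes. Gathering and re-slicing stands
# in for the MPI transpose between the two decompositions.
gathered = np.concatenate(yz_blocks, axis=2)
xy_blocks = [gathered[:, r * nlat // nranks:(r + 1) * nlat // nranks, :]
             for r in range(nranks)]
```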