NNSA ASC Exascale Environment Planning, Applications Working Group Report, February 2011
The scope of the Apps WG covers three areas of interest: Physics and Engineering Models (PEM), multi-physics Integrated Codes (IC), and Verification and Validation (V&V). Each places different demands on the exascale environment. The exascale challenge will be to provide environments that optimize all three.

PEM serve as a test bed for both model development and 'best practices' for IC code development, as well as their use as standalone codes to improve scientific understanding. Rapidly achieving reasonable performance for a small team is the key to maintaining PEM innovation. Thus, the environment must provide the ability to develop portable code at a higher level of abstraction, which can then be tuned, as needed. PEM concentrate their computational footprint in one or a few kernels that must perform efficiently. Their comparative simplicity permits extreme optimization, so the environment must provide the ability to exercise significant control over the lower software and hardware levels.

IC serve as the underlying software tools employed for most ASC problems of interest. Often coupling dozens of physics models into very large, very complex applications, ICs are usually the product of hundreds of staff-years of development, with lifetimes measured in decades. Thus, emphasis is placed on portability, maintainability and overall performance, with optimization done on the whole rather than on individual parts. The exascale environment must provide a high-level standardized programming model with effective tools and mechanisms for fault detection and remediation.

Finally, V&V addresses the infrastructure and methods to facilitate the assessment of code and model suitability for applications, and uncertainty quantification (UQ) methods for assessment and quantification of margins of uncertainty (QMU).
V&V employs both PEM and IC, with somewhat differing goals, i.e., parameter studies and error assessments to determine both the quality of the calculation and to estimate expected deviations of simulations from experiments. The exascale environment must provide a performance envelope suitable both for capacity calculations (high throughput) and full-system capability runs (high performance). Analysis of the results places shared demands on both the I/O and the visualization subsystems.
The Valley-of-Death: Reciprocal sign epistasis constrains adaptive trajectories in a constant, nutrient-limiting environment
The fitness landscape is a powerful metaphor for describing the relationship between genotype and phenotype for a population under selection. However, empirical data as to the topography of fitness landscapes are limited, owing to difficulties in measuring fitness for large numbers of genotypes under any condition. We previously reported a case of reciprocal sign epistasis (RSE), where two mutations individually increased yeast fitness in a glucose-limited environment, but reduced fitness when combined, suggesting the existence of two peaks on the fitness landscape. We sought to determine whether a ridge connected these peaks so that populations founded by one mutant could reach the peak created by the other, avoiding the low-fitness Valley-of-Death between them. Sequencing clones after 250 generations of further evolution provided no evidence for such a ridge, but did reveal many presumptive beneficial mutations, adding to a growing body of evidence that clonal interference pervades evolving microbial populations
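The reciprocal-sign-epistasis condition described above can be stated compactly in code. The fitness values below are invented illustration values, not measurements from the study; the predicate itself is the standard RSE definition:

```python
# Minimal sketch of reciprocal sign epistasis (RSE). The relative
# fitness values are hypothetical, chosen only to exhibit the pattern:
# each single mutant beats the wild type, but the double mutant falls
# into the low-fitness "valley" between the two peaks.
WT, A, B, AB = "wt", "A", "B", "AB"

fitness = {WT: 1.00, A: 1.05, B: 1.06, AB: 0.95}  # hypothetical values

def is_reciprocal_sign_epistasis(f):
    """RSE: each mutation is beneficial on the wild-type background
    but deleterious on the background carrying the other mutation."""
    a_alone_beneficial = f[A] > f[WT]
    b_alone_beneficial = f[B] > f[WT]
    a_harms_b_background = f[AB] < f[B]
    b_harms_a_background = f[AB] < f[A]
    return (a_alone_beneficial and b_alone_beneficial
            and a_harms_b_background and b_harms_a_background)

print(is_reciprocal_sign_epistasis(fitness))  # True for these values
```

Under this definition, a population fixed at either single-mutant peak cannot reach the other peak through single steps without passing through the lower-fitness double mutant, which is the "Valley-of-Death" the abstract refers to.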
Laser Ray Tracing in a Parallel Arbitrary Lagrangian Eulerian Adaptive Mesh Refinement Hydrocode
Interface Reconstruction in Two- and Three-Dimensional Arbitrary Lagrangian-Eulerian Adaptive Mesh Refinement Simulations
Modeling of high power laser and ignition facilities requires new techniques because of the higher energies and higher operational costs. We report on the development and application of a new interface reconstruction algorithm for chamber modeling code that combines ALE (Arbitrary Lagrangian Eulerian) techniques with AMR (Adaptive Mesh Refinement). The code is used for the simulation of complex target elements in the National Ignition Facility (NIF) and other similar facilities. The interface reconstruction scheme is required to adequately describe the debris/shrapnel (including fragments or droplets) resulting from energized materials that could affect optics or diagnostic sensors. Traditional ICF modeling codes that choose to implement ALE + AMR techniques will also benefit from this new scheme. The ALE formulation requires material interfaces (including those of generated particles or droplets) to be tracked. We present the interface reconstruction scheme developed for NIF's ALE-AMR and discuss how it is affected by adaptive mesh refinement and the ALE mesh. Results of the code are shown for NIF and OMEGA target configurations
Modeling NIF Experimental Designs with Adaptive Mesh Refinement and Lagrangian Hydrodynamics
Incorporation of adaptive mesh refinement (AMR) into Lagrangian hydrodynamics algorithms allows for the creation of a highly powerful simulation tool effective for complex target designs with three-dimensional structure. We are developing an advanced modeling tool that includes AMR and traditional arbitrary Lagrangian-Eulerian (ALE) techniques. Our goal is the accurate prediction of vaporization, disintegration and fragmentation in National Ignition Facility (NIF) experimental target elements. Although our focus is on minimizing the generation of shrapnel in target designs and protecting the optics, the general techniques are applicable to modern advanced targets that include three-dimensional effects such as those associated with capsule fill tubes. Several essential computations in ordinary radiation hydrodynamics need to be redesigned in order to allow for AMR to work well with ALE, including algorithms associated with radiation transport. Additionally, for our goal of predicting fragmentation, we include elastic/plastic flow into our computations. We discuss the integration of these effects into a new ALE-AMR simulation code. Applications of this newly developed modeling tool as well as traditional ALE simulations in two and three dimensions are applied to NIF early-light target designs
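The core AMR idea the abstract relies on, refining the mesh only where the solution varies sharply, can be illustrated with a simple gradient-based flagging criterion. This is a generic sketch, not the NIF ALE-AMR code; the threshold and the 1-D density field are invented for illustration:

```python
# Illustrative gradient-based AMR refinement criterion (generic sketch,
# not the ALE-AMR implementation): flag cells whose relative density
# jump to the right neighbor exceeds a threshold.
def flag_for_refinement(density, threshold=0.5):
    """Return indices of 1-D cells to refine, based on the relative
    jump to the right neighbor (hypothetical criterion and threshold)."""
    flagged = []
    for i in range(len(density) - 1):
        jump = abs(density[i + 1] - density[i]) / max(abs(density[i]), 1e-30)
        if jump > threshold:
            flagged.append(i)
    return flagged

# A smooth region followed by a shock-like jump: only the cell at the
# discontinuity is flagged, so refinement is concentrated where needed.
rho = [1.0, 1.0, 1.01, 1.02, 4.0, 4.0]
print(flag_for_refinement(rho))  # [3]
```

In a real code the flagged cells would be replaced by finer subgrids (and coarsened again when the feature passes), which is what makes AMR affordable for three-dimensional target simulations.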
Hierarchical Material Models for Fragmentation Modeling in NIF-ALE-AMR
Fragmentation is a fundamental process that naturally spans micro to macroscopic scales. Recent advances in algorithms, computer simulations, and hardware enable us to connect the continuum to microstructural regimes in a real simulation through a heterogeneous multiscale mathematical model. We apply this model to the problem of predicting how targets in the NIF chamber dismantle, so that optics and diagnostics can be protected from damage. The mechanics of the initial material fracture depend on the microscopic grain structure. In order to effectively simulate the fragmentation, this process must be modeled at the subgrain level with computationally expensive crystal plasticity models. However, there are not enough computational resources to model the entire NIF target at this microscopic scale. In order to accomplish these calculations, a hierarchical material model (HMM) is being developed. The HMM will allow fine-scale modeling of the initial fragmentation using computationally expensive crystal plasticity, while the elements at the mesoscale can use polycrystal models, and the macroscopic elements use analytical flow stress models. The HMM framework is built upon an adaptive mesh refinement (AMR) capability. We present progress in implementing the HMM in the NIF-ALE-AMR code. Additionally, we present test simulations relevant to NIF targets
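The tiering the abstract describes, expensive models only at the finest scales, amounts to a dispatch on mesh level. The sketch below is a hypothetical illustration of that idea; the level cutoffs are invented and do not come from the NIF-ALE-AMR implementation:

```python
# Hedged sketch of the hierarchical material model (HMM) idea:
# choose a material model by AMR level, reserving the expensive
# crystal-plasticity model for the finest (subgrain) scale.
# The cutoffs below are illustrative assumptions, not HMM parameters.
def select_material_model(amr_level, finest_level):
    """Map an AMR level to a material-model tier (hypothetical cutoffs)."""
    if amr_level == finest_level:
        return "crystal plasticity"      # subgrain scale, most expensive
    elif amr_level >= finest_level - 2:
        return "polycrystal"             # mesoscale
    else:
        return "analytic flow stress"    # macroscale, cheapest

for level in range(5):
    print(level, select_material_model(level, finest_level=4))
```

The payoff is the one the abstract states: the whole target can be meshed, but the crystal-plasticity cost is paid only in the small regions where fracture actually initiates.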
Experiments for the Validation of Debris and Shrapnel Calculations
The debris and shrapnel generated by laser targets are important factors in the operation of a large laser facility such as NIF, LMJ, and Orion. Past experience has shown that it is possible for such target debris to render diagnostics inoperable and also to penetrate or damage optical protection (debris) shields. We are developing the tools to allow evaluation of target configurations in order to better mitigate the generation and impact of debris, including development of dedicated modeling codes. In order to validate these predictive simulations, we briefly describe a series of experiments aimed at determining the amount of debris and/or shrapnel produced in controlled situations. We use glass and aerogel to capture generated debris/shrapnel. The experimental targets include hohlraums (halfraums) and thin foils in a variety of geometries. Post-shot analysis includes scanning electron microscopy and x-ray tomography. We show the results of some of these experiments and discuss modeling efforts
Laser coupling to reduced-scale targets at NIF Early Light
Deposition of maximum laser energy into a small, high-Z enclosure in a short laser pulse creates a hot environment. Such targets were recently included in an experimental campaign using the first four of the 192 beams of the National Ignition Facility [J. A. Paisner, E. M. Campbell, and W. J. Hogan, Fusion Technology 26, 755 (1994)], under construction at the University of California Lawrence Livermore National Laboratory. These targets demonstrate good laser coupling, reaching a radiation temperature of 340 eV. In addition, the Raman backscatter spectrum contains features consistent with Brillouin backscatter of Raman forward scatter [A. B. Langdon and D. E. Hinkel, Physical Review Letters 89, 015003 (2002)]. Also, NIF Early Light diagnostics indicate that 20% of the direct backscatter from these reduced-scale targets is in the polarization orthogonal to that of the incident light
The effective I/O bandwidth benchmark (b{_}eff{_}io)
The effective I/O bandwidth benchmark (b{_}eff{_}io) covers two goals: (1) to achieve a characteristic average number for the I/O bandwidth achievable with parallel MPI-I/O applications, and (2) to get detailed information about several access patterns and buffer lengths. The benchmark examines ''first write'', ''rewrite'' and ''read'' access, strided (individual and shared pointers) and segmented collective patterns on one file per application and non-collective access to one file per process. The number of parallel accessing processes is also varied, and well-formed I/O is compared with non-well-formed I/O. On systems meeting the rule that the total memory can be written to disk in 10 minutes, the benchmark should not need more than 15 minutes for a first pass of all patterns. The benchmark is designed analogously to the effective bandwidth benchmark for message passing (b{_}eff) that characterizes the message passing capabilities of a system in a few minutes. First results of the b{_}eff{_}io benchmark are given for IBM SP and Cray T3E systems and compared with existing benchmarks based on parallel POSIX I/O.
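The reduction of many per-pattern measurements into one "effective bandwidth" number can be sketched as a simple aggregation. The bandwidth values below are invented illustration values, not benchmark output, and the real b{_}eff{_}io averaging scheme spans far more patterns and buffer sizes than this toy:

```python
# Toy reduction of per-pattern I/O bandwidth measurements (MB/s) into a
# single effective-bandwidth figure. Values are hypothetical; the real
# b_eff_io benchmark aggregates many access patterns and buffer lengths.
measurements = {
    "first write": [120.0, 140.0, 130.0],
    "rewrite":     [200.0, 210.0, 190.0],
    "read":        [400.0, 380.0, 420.0],
}

def effective_bandwidth(results):
    """Average each pattern's mean bandwidth, then average the means,
    so no single access pattern dominates the headline number."""
    pattern_means = [sum(v) / len(v) for v in results.values()]
    return sum(pattern_means) / len(pattern_means)

print(round(effective_bandwidth(measurements), 1))  # 243.3
```

Averaging per pattern first (rather than pooling all samples) is the design choice that keeps a fast "read" path from masking a slow "first write" path.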
Helical axis stellarator equilibrium model
An asymptotic model is developed to study MHD equilibria in toroidal systems with a helical magnetic axis. Using a characteristic coordinate system based on the vacuum field lines, the equilibrium problem is reduced to a two-dimensional generalized partial differential equation of the Grad-Shafranov type. A stellarator-expansion free-boundary equilibrium code is modified to solve the helical-axis equations. The expansion model is used to predict the equilibrium properties of Asperators NP-3 and NP-4. Numerically determined flux surfaces, magnetic well, transform, and shear are presented. The equilibria show a toroidal Shafranov shift
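For reference, the standard axisymmetric Grad-Shafranov equation, of which the helical-axis reduction above is a generalization, can be written as follows (standard cylindrical coordinates $(R, Z)$ and poloidal flux function $\psi$, not the paper's helical characteristic coordinates):

```latex
% Standard axisymmetric Grad-Shafranov equation; p(\psi) is the pressure
% and F(\psi) = R B_{\phi} the poloidal current function.
\Delta^{*}\psi \equiv
  R\,\frac{\partial}{\partial R}\!\left(\frac{1}{R}\,\frac{\partial\psi}{\partial R}\right)
  + \frac{\partial^{2}\psi}{\partial Z^{2}}
  = -\mu_{0} R^{2}\,\frac{dp}{d\psi} - F\,\frac{dF}{d\psi}
```

The helical-axis model replaces the axisymmetric coordinates with a characteristic system following the vacuum field lines but retains this two-dimensional elliptic structure, which is what lets an existing stellarator-expansion free-boundary code be adapted to solve it.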