The role of dynamic experimentation for computational analysis
This paper presents a brief description of the dynamic experimental techniques commonly available for determining material properties. For many impact applications, the material experiences a complex loading path. In most cases, the initial loading conditions can be represented by the shock state commonly referred to as the Hugoniot state. Subsequent loading or release states, i.e., off-Hugoniot states, depend on the physical processes dominating the material behavior. The credibility of a material model is tested by the accuracy of its predictions of off-Hugoniot states. Experimental techniques commonly used to determine off-Hugoniot states are discussed in this survey.
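A Hugoniot state of the kind described above is conventionally computed from the Rankine-Hugoniot jump conditions together with a linear shock-velocity/particle-velocity fit. A minimal sketch follows; the material parameters are representative of aluminum and are assumptions for illustration, not values taken from the paper:

```python
# Rankine-Hugoniot jump conditions with a linear Us-up fit: Us = c0 + s*up.
# Parameters below are representative of aluminum and are assumed for
# illustration only.
RHO0 = 2.70   # g/cm^3, initial density
C0 = 5.35     # km/s, bulk sound speed (intercept of the Us-up fit)
S = 1.34      # dimensionless slope of the Us-up fit

def hugoniot_state(up):
    """Return shock velocity (km/s), pressure (GPa), and shocked density
    (g/cm^3) for a given particle velocity up in km/s."""
    us = C0 + S * up                 # linear shock-velocity fit
    p = RHO0 * us * up               # momentum jump: P = rho0 * Us * up (GPa)
    rho = RHO0 * us / (us - up)      # mass conservation across the shock front
    return us, p, rho

us, p, rho = hugoniot_state(1.0)     # 1 km/s particle velocity
print(f"Us = {us:.2f} km/s, P = {p:.1f} GPa, rho = {rho:.2f} g/cm^3")
```

With km/s for velocities and g/cm^3 for density, the momentum-jump product comes out directly in GPa, which is why no unit conversion appears in the code.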
Axial focusing of impact energy in the Earth's interior: Proof-of-principle tests of a new hypothesis
A causal link between major impact events and global processes would probably require a significant change in the thermal state of the Earth's interior, presumably brought about by coupling of impact energy. One possible mechanism for such energy coupling from the surface to the deep interior would be through focusing due to axial symmetry. Antipodal focusing of surface and body waves from earthquakes is a well-known phenomenon which has previously been exploited by seismologists in studies of the Earth's deep interior. Antipodal focusing from impacts on the Moon, Mercury, and icy satellites has also been invoked by planetary scientists to explain unusual surface features opposite some of the large impact structures on these bodies. For example, 'disrupted' terrains have been observed antipodal to the Caloris impact basin on Mercury and the Imbrium Basin on the Moon. Very recently there have been speculations that antipodal focusing of impact energy within the mantle may lead to flood basalt and hotspot activity, but there has not yet been an attempt at a rigorous model. A new hypothesis was proposed, and preliminary proof-of-principle tests for the coupling of energy from major impacts to the mantle by axial focusing of seismic waves were performed. Because of the axial symmetry of the explosive source, the phases and amplitudes depend only on ray parameter (or takeoff angle) and are independent of azimuthal angle. For a symmetric and homogeneous Earth, all the seismic energy radiated by the impact at a given takeoff angle will be refocused (minus attenuation) on the axis of symmetry, regardless of the number of reflections and refractions it has experienced. Mantle material near the axis of symmetry will experience more strain cycles with much greater amplitude than elsewhere and will therefore experience more irreversible heating.
The situation is very different from that of a giant earthquake, which, in addition to releasing less energy, has an asymmetric focal mechanism and a larger source area. Two independent proof-of-principle approaches were used. The first makes use of seismic simulations, performed with a realistic Earth model, to determine the degree of focusing along the axis and to estimate the volume of material, if any, that experiences significant irreversible heating. The second involves two-dimensional hydrodynamic code simulations to determine the stress history, internal energy, and temperature rise as a function of radius along the axis.
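The geometric origin of antipodal focusing can be illustrated with the standard spreading factor for surface waves on a sphere, where amplitude grows as 1/sqrt(sin Δ) with epicentral distance Δ and formally diverges at the antipode (Δ = 180°). The toy sketch below is an illustration of that geometry, not a calculation from the paper; real amplitudes stay finite because of attenuation and finite wavelength:

```python
import math

def focusing_factor(delta_deg):
    """Relative surface-wave amplitude from geometric spreading on a sphere,
    normalized to 1 at delta = 90 degrees. It diverges as delta -> 180 deg,
    illustrating why energy refocuses on the axis of symmetry."""
    return 1.0 / math.sqrt(math.sin(math.radians(delta_deg)))

for delta in (90, 150, 170, 179):
    print(f"{delta:3d} deg -> amplitude factor {focusing_factor(delta):.2f}")
```

The same qualitative picture holds for the body-wave phases discussed in the abstract: every takeoff angle maps back onto the axis, so contributions that would disperse azimuthally for an asymmetric earthquake source instead add up near the antipode.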
Verification Test Suite for Physics Simulation Codes
The DOE/NNSA Advanced Simulation & Computing (ASC) Program directs the development, demonstration, and deployment of physics simulation codes. The defensible use of these codes for high-consequence decisions requires rigorous verification and validation of the simulation software. The physics and engineering codes used at Los Alamos National Laboratory (LANL), Lawrence Livermore National Laboratory (LLNL), and Sandia National Laboratories (SNL) are arguably among the most complex utilized in computational science. Verification represents an important aspect of the development, assessment, and application of simulation software for physics and engineering. The purpose of this note is to formally document the existing tri-laboratory suite of verification problems used by LANL, LLNL, and SNL, i.e., the Tri-Lab Verification Test Suite. Verification is often described as ensuring that "the [discrete] equations are solved [numerically] correctly". More precisely, verification develops evidence of mathematical consistency between continuum partial differential equations (PDEs) and their discrete analogues, and provides an approach by which to estimate discretization errors. There are two variants of verification: (1) code verification, which compares simulation results to known analytical solutions, and (2) calculation verification, which estimates convergence rates and discretization errors without knowledge of an exact solution. Together, these verification analyses support defensible verification and validation (V&V) of physics and engineering codes that are used to simulate complex problems that do not possess analytical solutions. Discretization errors (e.g., spatial and temporal errors) are embedded in the numerical solutions of the PDEs that model the relevant governing equations. Quantifying discretization errors, which comprise only a portion of the total numerical simulation error, is possible through code and calculation verification.
Code verification computes the absolute value of discretization errors relative to an exact solution of the governing equations. In contrast, calculation verification, which does not utilize a reference solution, combines an assessment of stable self-convergence and exact-solution prediction to quantitatively estimate discretization errors. In FY01, representatives of the V&V programs at LANL, LLNL, and SNL identified a set of verification test problems for the Accelerated Strategic Computing Initiative (ASCI) Program. Specifically, they agreed upon a set of code verification test problems that exercise the relevant single- and multiple-physics packages. The verification test suite problems can be evaluated in multidimensional geometry and span both smooth and non-smooth behavior.
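The two verification variants described above reduce to simple convergence arithmetic. The following sketch is illustrative only (a toy finite-difference problem, not part of the Tri-Lab suite): code verification measures observed order from error norms against an exact solution, while calculation verification estimates the order from solutions on three grid levels with no exact solution at all.

```python
import math

def observed_order(errors, ratio=2.0):
    """Code verification: given error norms |numerical - exact| on grids
    refined by a constant ratio, return the observed convergence order
    between each successive pair. It should approach the scheme's formal
    order as the grid is refined."""
    return [math.log(e_coarse / e_fine) / math.log(ratio)
            for e_coarse, e_fine in zip(errors, errors[1:])]

def self_convergence_order(f_coarse, f_medium, f_fine, ratio=2.0):
    """Calculation verification: estimate the convergence order from
    solutions on three grids refined by a constant ratio, with no exact
    solution required (a Richardson-style estimate)."""
    return (math.log(abs(f_medium - f_coarse) / abs(f_fine - f_medium))
            / math.log(ratio))

# Toy problem: forward-difference derivative of sin(x) at x = 1,
# a first-order scheme whose exact answer is cos(1).
exact = math.cos(1.0)
hs = [0.1 / 2**k for k in range(4)]
approx = [(math.sin(1.0 + h) - math.sin(1.0)) / h for h in hs]
errors = [abs(a - exact) for a in approx]

print(observed_order(errors))               # each entry near 1.0
print(self_convergence_order(*approx[:3]))  # also near 1.0, no exact solution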
ALEGRA -- code validation: Experiments and simulations
In this study, the authors provide an experimental test bed for validating features of the ALEGRA code over a broad range of strain rates, with overlapping diagnostics that encompass the multiple responses. A unique feature of the Arbitrary Lagrangian Eulerian Grid for Research Applications (ALEGRA) code is that it allows simultaneous computational treatment, within one code, of a wide range of strain rates varying from hydrodynamic to structural conditions. This range encompasses strain rates characteristic of shock-wave propagation (10^7/s) and those characteristic of structural response (10^2/s). Most previous code validation studies, however, have been restricted to simulating or investigating a single strain-rate regime. What is new and different in this investigation is that the authors have performed well-instrumented experiments which capture features relevant to both hydrodynamic and structural response in a single experiment. Aluminum was chosen for this study because it is a well-characterized material: its EOS and constitutive material properties are well defined over a wide range of loading rates. The current experiments span strain-rate regimes from over 10^7/s to less than 10^2/s in a single experiment. The input conditions are extremely well defined. Velocity interferometers record the high strain-rate response, while strain gauges collect the low strain-rate data.
Clar's Theory, STM Images, and Geometry of Graphene Nanoribbons
We show that Clar's theory of the aromatic sextet is a simple and powerful tool to predict the stability, the π-electron distribution, the geometry, and the electronic/magnetic structure of graphene nanoribbons with different hydrogen edge terminations. We use density functional theory to obtain the equilibrium atomic positions, simulated scanning tunneling microscopy (STM) images, edge energies, band gaps, and edge-induced strains of graphene ribbons that we analyze in terms of Clar formulas. Based on their Clar representation, we propose a classification scheme for graphene ribbons that groups configurations with similar bond length alternations, STM patterns, and Raman spectra. Our simulations show how STM images and Raman spectra can be used to identify the type of edge termination.
Prediction and Uncertainty in Computational Modeling of Complex Phenomena: A Whitepaper
This report summarizes some challenges associated with the use of computational science to predict the behavior of complex phenomena. As such, the document is a compendium of ideas generated by various staff at Sandia. The report emphasizes key components of the use of computational science to predict complex phenomena, including computational complexity and correctness of implementations, the nature of the comparison with data, the importance of uncertainty quantification in comprehending what a prediction is telling us, and the role of risk in making and using computational predictions. Both broad and more narrowly focused technical recommendations for research are given. Several computational problems are summarized to help illustrate the issues we have emphasized. The tone of the report is informal, with virtually no mathematics; however, we have attempted to provide a useful bibliography to assist the interested reader in pursuing the content of this report in greater depth.
Confidence in ASCI scientific simulations
The US Department of Energy's (DOE) Accelerated Strategic Computing Initiative (ASCI) program calls for the development of high-end computing and advanced application simulations as one component of a program to eliminate reliance upon nuclear testing in the US nuclear weapons program. This paper presents results from the ASCI program's examination of needs for focused validation and verification (V and V). These V and V activities will ensure that 100 TeraOP-scale ASCI simulation code development projects apply the appropriate means to achieve high confidence in the use of simulations for stockpile assessment and certification. The authors begin with an examination of the roles for model development and validation in the traditional scientific method. The traditional view is that the scientific method has two foundations: experimental and theoretical. While the traditional scientific method does not acknowledge the role of computing and simulation, this examination establishes a foundation for extending the traditional processes to include verification and scientific software development, resulting in the notional framework known as Sargent's Framework. This framework elucidates the relationships between the processes of scientific model development, computational model verification, and simulation validation. This paper presents a discussion of the methodologies and practices that the ASCI program will use to establish confidence in large-scale scientific simulations. While the effort for a focused program in V and V is just getting started, the ASCI program has been underway for a couple of years. The authors discuss some V and V activities and preliminary results from the ALEGRA simulation code that is under development for ASCI. The breadth of physical phenomena and the advanced computational algorithms employed by ALEGRA make it a subject for V and V that should typify what is required for many ASCI simulations.