Lawrence Livermore National Laboratory Safeguards and Security quarterly progress report ending March 31, 1996
LLNL carries out safeguards and security activities for the DOE Office of Safeguards and Security (OSS) and for other organizations within and outside DOE. LLNL is supporting OSS in six areas: safeguards technology; safeguards and materials accountability; computer security for distributed systems; complex-wide access control; standardization of security systems; and an information technology and security center. This report describes the activities in each of these areas.
The cranial biomechanics and feeding performance of Homo floresiensis
Homo floresiensis is a small-bodied hominin from Flores, Indonesia, that exhibits plesiomorphic dentognathic features, including large premolars and a robust mandible, aspects of which have been considered australopith-like. However, relative to australopith species, H. floresiensis exhibits reduced molar size and a cranium with diminutive midfacial dimensions similar to those of later Homo, suggesting a reduction in the frequency of forceful biting behaviours. Our study uses finite-element analysis to examine the feeding biomechanics of the H. floresiensis cranium. We simulate premolar (P3) and molar (M2) biting in a finite-element model (FEM) of the H. floresiensis holotype cranium (LB1) and compare the mechanical results with FEMs of chimpanzees, modern humans and a sample of australopiths (MH1, Sts 5, OH5). With few exceptions, strain magnitudes in LB1 resemble the elevated levels observed in modern Homo. Our analysis of LB1 suggests that H. floresiensis could produce bite forces with high mechanical efficiency, but was subject to tensile jaw joint reaction forces during molar biting, which perhaps constrained maximum postcanine bite force production. The inferred feeding biomechanics of H. floresiensis closely resemble those of modern humans, suggesting that this pattern may have been present in the last common ancestor of Homo sapiens and H. floresiensis.
Sensitivity Analysis in the Presence of Intrinsic Stochasticity for Discrete Fracture Network Simulations
Large-scale discrete fracture network (DFN) simulators are standard fare for studies involving the sub-surface transport of particles, since direct observation of real-world underground fracture networks is generally infeasible. While these simulators have seen numerous successes across several engineering applications, estimates of quantities of interest (QoI), such as the breakthrough time of particles reaching the edge of the system, suffer from two distinct types of uncertainty. A run of a DFN simulator requires several parameter values to be set that dictate the placement and size of fractures, the density of fractures, and the overall permeability of the system; uncertainty in the proper parameter choices leads to some amount of uncertainty in the QoI, called epistemic uncertainty. Furthermore, since DFN simulators rely on stochastic processes to place fractures and govern flow, understanding how this randomness affects the QoI requires several runs of the simulator at distinct random seeds. The uncertainty in the QoI attributed to different realizations (i.e. different seeds) of the same random process leads to a second type of uncertainty, called aleatoric uncertainty. In this paper, we perform a sensitivity analysis that directly attributes the uncertainty observed in the QoI to the epistemic uncertainty from each input parameter and to the aleatoric uncertainty. We make several design choices to handle an observed heteroskedasticity in DFN simulators, where the aleatoric uncertainty changes for different inputs, since this quality makes several standard statistical methods inadmissible. Beyond the specific takeaways on which input variables affect uncertainty the most for DFN simulators, a major contribution of this paper is the introduction of a statistically rigorous workflow for characterizing the uncertainty in DFN flow simulations that exhibit heteroskedasticity.
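The epistemic/aleatoric split described in this abstract can be illustrated with a toy variance decomposition. The simulator below is a hypothetical stand-in (not the DFN code used in the paper): aleatoric uncertainty is estimated as the average per-input variance across seeds, and epistemic uncertainty as the variance of the seed-averaged QoI across inputs.

```python
import numpy as np

# Hypothetical stand-in for a DFN simulator: breakthrough time as a
# noisy function of a single permeability-like input x.  The noise
# scale grows with x, mimicking the heteroskedasticity the abstract
# describes.
def toy_simulator(x, seed):
    local = np.random.default_rng(seed)
    return 10.0 / x + local.normal(scale=0.5 * x)

xs = np.linspace(0.5, 2.0, 20)   # epistemic parameter sweep
seeds = range(30)                # aleatoric replicates per input

qoi = np.array([[toy_simulator(x, s) for s in seeds] for x in xs])

# Aleatoric: mean over inputs of the per-input variance across seeds.
aleatoric = qoi.var(axis=1, ddof=1).mean()
# Epistemic: variance across inputs of the per-input seed-averaged QoI.
epistemic = qoi.mean(axis=1).var(ddof=1)

print(f"aleatoric component: {aleatoric:.3f}")
print(f"epistemic component: {epistemic:.3f}")
```

Note that a per-input (rather than pooled) variance estimate is what keeps this decomposition honest under heteroskedasticity, since the noise level is not constant across inputs.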
Lawrence Livermore National Laboratory safeguards and security quarterly progress report to the U.S. Department of Energy. Quarter ending September 30, 1996
The paper describes tasks undertaken in each of the following areas: Safeguards technology program (STP); Safeguards and material accountability (SMA); Computer security, distributed systems; Complex-wide access control system (CWAC); and Standardization of security systems (SSS). The STP develops advanced, nondestructive analysis technology for measurement of special nuclear materials. Work focuses on R and D relating to X- and gamma-ray spectrometry and to development of computer codes for interpreting the spectral data obtained by these techniques. The SMA is concerned with four areas: insider protection; material accountability; planning and evaluation; and information security. The Computer Security Technology Center provides expertise and solutions to the many information security problems present in today's computer systems and networks. Incidents of intrusions, computer viruses, the purposeful replacement of legitimate software for illegal purposes, and similar acts are being addressed by the creation of security software, the delivery of incident response expertise, and research and development into secure systems. The purpose of the CWAC is to develop an approach that will allow visitors to use their DOE standard badge in access control systems throughout the DOE complex. The purpose of the SSS project is to support the standardization of security systems to meet DOE orders and requirements, and to support the DOE in offering relevant security technology and capabilities to Federal standardization efforts.
Observation of a multimode plasma response and its relationship to density pumpout and edge-localized mode suppression
Density pumpout and edge-localized mode (ELM) suppression by applied n=2 magnetic fields in low-collisionality DIII-D plasmas are shown to be correlated with the magnitude of the plasma response driven on the high-field side (HFS) of the magnetic axis but not the low-field side (LFS) midplane. These distinct responses are a direct measurement of a multimodal magnetic plasma response, with each structure preferentially excited by a different n=2 applied spectrum and preferentially detected on the LFS or HFS. Ideal and resistive magneto-hydrodynamic (MHD) calculations find that the LFS measurement is primarily sensitive to the excitation of stable kink modes, while the HFS measurement is primarily sensitive to resonant currents (whether fully shielding or partially penetrated). The resonant currents are themselves strongly modified by kink excitation, with the optimal applied field pitch for pumpout and ELM suppression significantly differing from equilibrium field alignment.

This material is based upon work supported by the U.S. Department of Energy, Office of Science, Office of Fusion Energy Sciences, using the DIII-D National Fusion Facility, a DOE Office of Science user facility, under Awards No. DE-FC02-04ER54698, No. DE-AC02-09CH11466, No. DE-FG02-04ER54761, No. DE-AC05-06OR23100, No. DE-SC0001961, and No. DE-AC05-00OR22725. S. R. H. was supported by AINSE and ANSTO.
New species of Macrocranion (Mammalia, Lipotyphla) from the earliest Eocene of North America and its biogeographic implications
373-384. http://deepblue.lib.umich.edu/bitstream/2027.42/48665/2/ID532.pd
RELICS: The Reionization Lensing Cluster Survey and the Brightest High-z Galaxies
Massive foreground galaxy clusters magnify and distort the light of objects behind them, permitting a view into both the extremely distant and intrinsically faint galaxy populations. We present here the z ~ 6-8 candidate high-redshift galaxies from the Reionization Lensing Cluster Survey (RELICS), a Hubble and Spitzer Space Telescope survey of 41 massive galaxy clusters spanning an area of ≈200 arcmin². These clusters were selected to be excellent lenses, and we find similar high-redshift sample sizes and magnitude distributions as the Cluster Lensing And Supernova survey with Hubble (CLASH). We discover 257, 57, and eight candidate galaxies at z ~ 6, 7, and 8, respectively (322 in total). The observed (lensed) magnitudes of the z ~ 6 candidates are as bright as AB mag ~23, making them among the brightest known at these redshifts, comparable with discoveries from much wider, blank-field surveys. RELICS demonstrates the efficiency of using strong gravitational lenses to produce high-redshift samples in the epoch of reionization. These brightly observed galaxies are excellent targets for follow-up study with current and future observatories, including the James Webb Space Telescope.
Progress in the Development of the 1 m Model of the 70 mm Aperture Quadrupole for the LHC Low-β Insertions
Within the LHC magnet development program Oxford Instruments has built a one metre model of the 70 mm aperture low-beta quadrupole. The magnet features a four layer coil wound from two 8.2 mm wide graded NbTi cables, and is designed for 250 T/m at 1.9 K. The magnet has previously been tested between 4.5 K and 2.3 K. In this paper we review the magnet rebuild and the subsequent tests. Results on magnet training at 4.3 K and 1.9 K are presented along with the results related to quench protection studies.
Photocurrent measurements of supercollision cooling in graphene

The cooling of hot electrons in graphene is the critical process underlying the operation of exciting new graphene-based optoelectronic and plasmonic devices, but the nature of this cooling is controversial. We extract the hot-electron cooling rate near the Fermi level by using graphene as a novel photothermal thermometer that measures the electron temperature (T_e) as it cools dynamically. We find the photocurrent generated from graphene junctions is well described by the energy dissipation rate H = A(T_e³ − T_l³), where the electronic heat capacity is linear in T_e and T_l is the base lattice temperature. These results are in disagreement with predictions of electron-phonon emission in a disorder-free graphene system, but in excellent quantitative agreement with recent predictions of a disorder-enhanced supercollision (SC) cooling mechanism. We find that the SC model provides a complete and unified picture of energy loss near the Fermi level over the wide range of electronic (15 to 3000 K) and lattice (10 to 295 K) temperatures investigated.
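A minimal numerical sketch of the supercollision cooling law quoted in the abstract, H = A(T_e³ − T_l³). The coupling constant A below is an assumed placeholder, not a value measured in the paper; the point is only that the cubic law lets one invert absorbed power into a steady-state electron temperature.

```python
# Assumed coupling strength (placeholder units of W/K^3), purely for
# illustration of the functional form of the SC cooling law.
A = 1.0e-3

def sc_cooling_power(T_e, T_l):
    """Energy dissipation rate of hot electrons under the SC model."""
    return A * (T_e**3 - T_l**3)

def steady_state_Te(P, T_l):
    """Electron temperature at which SC cooling balances absorbed power P."""
    return (P / A + T_l**3) ** (1.0 / 3.0)

T_l = 10.0                        # K, base lattice temperature
P = sc_cooling_power(300.0, T_l)  # power that sustains T_e = 300 K
print(steady_state_Te(P, T_l))    # recovers 300.0 by construction
```

Because H grows as T_e³, a modest increase in absorbed power produces only a sub-linear rise in the steady-state electron temperature, which is why the model can cover the wide 15-3000 K electronic temperature range the abstract reports.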