Floating into Thin Air
On May 18, 2005, a giant helium balloon carrying the High Energy Focusing Telescope (HEFT) sailed into the spring sky over the deserts of New Mexico. The spindly steel and aluminum gondola that houses the optics, detectors, and other components of the telescope floated for 25 hours after its launch from Fort Sumner, New Mexico. For 21 of those hours, the balloon was nearly 40 kilometers above Earth's surface--almost four times higher than the altitude routinely flown by commercial jet aircraft. In the upper reaches of Earth's atmosphere, HEFT searched the universe for x-ray sources from highly energetic objects such as binary stars, galaxy clusters, and supermassive black holes. Before landing in Arizona, the telescope observed and imaged a dozen scientific targets by capturing photons emitted from these objects in the high-energy (hard) x-ray range (above 10 kiloelectronvolts). Among these targets were the Crab synchrotron nebula, the black hole Cygnus X-1 (one of the brightest x-ray sources in the sky), and the blazar 3C454.3. The scientific data gathered from these targets are among the first focused hard x-ray images returned from high altitudes.
A Fast Test to Diagnose Flu
People with flu-like symptoms who seek treatment at a medical clinic or hospital often must wait several hours before being examined, possibly exposing many people to an infectious virus. If a patient appears to need more than the routine fluids-and-rest prescription, effective diagnosis requires tests that must be sent to a laboratory. Hours or days may pass before results are available to the doctor, who in the meantime must make an educated guess about the patient's illness. The lengthy diagnostic process places a heavy burden on medical laboratories and can result in improper use of antibiotics or a costly hospital stay. A faster testing method may soon be available. An assay developed by a team of Livermore scientists can diagnose influenza and other respiratory viruses in about two hours once a sample has been taken. Unlike other systems that operate this quickly, the new device, called FluIDx (and pronounced ''fluidics''), can differentiate five types of respiratory viruses, including influenza. FluIDx can analyze samples at the point of patient care--in hospital emergency departments and clinics--allowing medical providers to quickly determine how best to treat a patient, saving time and potentially thousands of dollars per patient. The FluIDx project, which is led by Livermore chemist Mary McBride of the Physics and Advanced Technologies Directorate, received funding from the National Institute of Allergy and Infectious Diseases and the Laboratory Directed Research and Development (LDRD) Program. To test the system and make it as useful as possible, the team worked closely with the Emergency Department staff at the University of California (UC) at Davis Medical Center in Sacramento. Flu kills more than 35,000 people every year in the U.S.
The 2003 outbreak of severe acute respiratory syndrome and the ongoing concern about a possible bird flu pandemic show the need for a fast, reliable test that can differentiate seasonal flu from a potentially pandemic influenza. Such a test should also discriminate influenza from pathogens that cause illnesses with flu-like symptoms. When a precise diagnosis is required to treat an adult patient with serious respiratory symptoms, sample cells are usually obtained with a nasal or throat swab and analyzed with one of several laboratory methods. The gold standard test is viral culturing, a highly sensitive method that can identify the specific strain of virus. However, viral culturing is a labor-intensive process and requires 3-10 days to produce results, too long for early intervention. Enzyme and optical immunoassays offer results in 30 minutes, but these methods are less sensitive than viral culturing so they can produce false positives or negatives. They also cannot distinguish the type of virus found. Direct immunofluorescence antibody (DFA) staining is as sensitive as viral culturing. It also can detect multiple respiratory pathogens simultaneously by a process known as multiplexing. However, DFA staining requires expensive equipment, a skilled microscopist, and samples with enough target cells for testing. In addition, the results are ultimately subjective. Another method, called reverse transcriptase-polymerase chain reaction assay, offers sensitivity and specificity comparable to viral culturing and DFA staining. It also produces results in two hours and can rapidly test a large number of samples. The drawback with these tests, however, is that they must be performed in a laboratory. None of them can be used where they are needed most: in the clinic or emergency department where patients are being treated. Livermore's FluIDx diagnostic system, with its instrumentation and multiplexed assays, is designed specifically for point-of-care diagnosis. 
The fast, easy-to-use system is based on the Autonomous Pathogen Detection System, a homeland security technology developed by LLNL. This R&D 100 Award-winning technology constantly monitors the air to detect airborne bioterrorism agents, such as anthrax. FluIDx is an integrated system designed to perform highly multiplexed polymerase chain reaction (PCR) nucleic-acid-based assays in real time. The FluIDx system processes a sample, analyzes the data, reports the results, and decontaminates itself before another sample is taken. The device currently uses 16 assays--12 for individual nucleic-acid targets and 4 for internal controls. The assays can simultaneously detect influenza A and B, parainfluenza (Types 1 and 3), respiratory syncytial virus, and adenovirus (Groups B, C, and E).
An Accelerated Collaboration Meets with Beaming Success
Maintaining a smaller, aging U.S. nuclear weapons stockpile without underground nuclear testing requires the capability to verify and validate the complex computer calculations on which stockpile confidence is based. This capability, in turn, requires nonnuclear hydrodynamic tests (hydrotests) that can x-ray stages of the implosion process, providing freeze-frame photos of materials imploding at speeds of more than 16,000 kilometers per hour. The images will yield important information on shapes and densities of metals and other materials under the extreme pressures and temperatures generated by the detonation of high explosives. The Dual-Axis Radiographic Hydrodynamic Test (DARHT) Facility at Los Alamos National Laboratory is a two-arm x-ray imaging system that will provide such images, capturing the inner workings of a mock nuclear explosion with high resolution. Scientists compare the radiographic images with computer models, examine the differences, and refine the models to more accurately represent weapon behavior. One of DARHT's arms (now called DARHT-II) recently got a ''leg up'' through a collaboration of Lawrence Livermore and Los Alamos scientists, using a Livermore accelerator to test its subsystems and codes.
Looping through the Lamb Shift
Sometimes in science, a small measurement can have big ramifications. For a team of Livermore scientists, such was the case when they measured a small shift in the spectrum of highly ionized uranium atoms. The measurement involves the Lamb shift, a subtle change in the energy of an electron orbiting an atom's nucleus. The precision of the Livermore result was 10 times greater than that of existing measurements, making it the best measurement to date of a complicated correction to the simplest quantum description of how atoms behave. The measurement introduces a new realm in the search for deviations between the theory of quantum electrodynamics (QED), which is an extension of quantum mechanics, and the real world. Such deviations, if discovered, would have far-reaching consequences, indicating that QED is not a fundamental theory of nature.
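Why highly charged uranium is the natural place to hunt for QED deviations can be sketched with the textbook scaling of the leading one-loop self-energy contribution for a hydrogenlike ion (a standard result, not a finding of the measurement described above):

```latex
% Leading-order Lamb shift for a hydrogenlike ion of nuclear charge Z,
% principal quantum number n; F(Z\alpha) is a slowly varying function.
\Delta E_{\mathrm{Lamb}} \sim \frac{\alpha}{\pi}\,\frac{(Z\alpha)^4}{n^3}\,m_e c^2\,F(Z\alpha)
```

Because the shift grows roughly as Z to the fourth power, the effect in uranium (Z = 92) is some seven orders of magnitude larger than in hydrogen, which is what makes strong-field tests of QED feasible there.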
Planets and Stars under the Magnifying Glass
Looking out to the vastness of the night sky, stargazers often ponder questions about the universe, many wondering if planets like ours can be found somewhere out there. But teasing out the details in astronomical data that point to a possible Earth-like planet is exceedingly difficult. To find an extrasolar planet--a planet that circles a star other than the Sun--astrophysicists have in the past searched for Doppler shifts, changes in the wavelength emitted by an object because of its motion. When an astronomical object moves toward an observer on Earth, the light it emits becomes higher in frequency and shifts to the blue end of the spectrum. When the object moves away from the observer, its light becomes lower in frequency and shifts to the red end. By measuring these changes in wavelength, astrophysicists can precisely calculate how quickly objects are moving toward or away from Earth. When a giant planet orbits a star, the planet's gravitational pull on the star produces a small (meters-per-second) back-and-forth Doppler shift in the star's light. Using the Doppler-shift technique, astrophysicists have identified 179 planets within the Milky Way galaxy. However, most of these are giant gas planets, similar in size to Jupiter and Saturn, and they orbit parent stars that are much closer to them than the Sun is to Earth. Planets similar in size to Earth have also been found, but they, too, are so close to their suns that they would be much hotter than Earth and too hot for life to exist. In 2005, an international collaboration of astronomers working with telescope networks throughout the Southern Hemisphere uncovered clues to a small, rocky or icy planet similar to Earth. The new planet, designated OGLE-2005-BLG-390Lb, is the farthest planet from our solar system detected to date.
The discovery was made by the Probing Lensing Anomalies NETwork (PLANET) using microlensing--a technique developed nearly two decades ago by Livermore astrophysicists as part of the Massive Compact Halo Object (MACHO) Project, which searched for evidence of dark matter.
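The Doppler bookkeeping described above is simple enough to sketch in a few lines. The rest wavelength and stellar wobble used below are illustrative values (an H-alpha line and a Jupiter-scale pull), not data from the survey:

```python
# Nonrelativistic Doppler shift: v = c * (lambda_obs - lambda_rest) / lambda_rest.
# A positive result means the source is receding (redshifted light).

C = 299_792_458.0  # speed of light, m/s

def radial_velocity(lambda_obs_m, lambda_rest_m):
    """Line-of-sight velocity (m/s) inferred from a measured wavelength shift."""
    return C * (lambda_obs_m - lambda_rest_m) / lambda_rest_m

# Illustrative: a Jupiter-like planet tugs its star back and forth at
# roughly 12.5 m/s, shifting the H-alpha line (656.281 nm) by only
# ~0.03 picometers -- the tiny signal the technique must resolve.
rest = 656.281e-9                       # meters
observed = rest * (1.0 + 12.5 / C)      # simulated redshifted wavelength
print(f"{radial_velocity(observed, rest):.2f} m/s")
```

Recovering a meters-per-second velocity from a picometer-scale shift is why this method favors massive planets in tight orbits, as the text notes.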
Keeping an Eye on the Prize
Setting performance goals is part of the business plan for almost every company. The same is true in the world of supercomputers. Ten years ago, the Department of Energy (DOE) launched the Accelerated Strategic Computing Initiative (ASCI) to help ensure the safety and reliability of the nation's nuclear weapons stockpile without nuclear testing. ASCI, which is now called the Advanced Simulation and Computing (ASC) Program and is managed by DOE's National Nuclear Security Administration (NNSA), set an initial 10-year goal to obtain computers that could process up to 100 trillion floating-point operations per second (100 teraflops). Many computer experts thought the goal was overly ambitious, but the program's results have proved them wrong. Last November, a Livermore-IBM team received the 2005 Gordon Bell Prize for achieving more than 100 teraflops while modeling the pressure-induced solidification of molten metal. The prestigious prize, which is named for a founding father of supercomputing, is awarded each year at the Supercomputing Conference to innovators who advance high-performance computing. Recipients for the 2005 prize included six Livermore scientists--physicists Fred Streitz, James Glosli, and Mehul Patel and computer scientists Bor Chan, Robert Yates, and Bronis de Supinski--as well as IBM researchers James Sexton and John Gunnels. This team produced the first atomic-scale model of metal solidification from the liquid phase with results that were independent of system size. The record-setting calculation used Livermore's domain decomposition molecular-dynamics (ddcMD) code running on BlueGene/L, a supercomputer developed by IBM in partnership with the ASC Program. BlueGene/L reached 280.6 teraflops on the Linpack benchmark, the industry standard used to measure computing speed. As a result, it ranks first on the TOP500 list of supercomputer sites released in November 2005.
To evaluate the performance of nuclear weapons systems, scientists must understand how materials behave under extreme conditions. Because experiments at high pressures and temperatures are often difficult or impossible to conduct, scientists rely on computer models that have been validated with obtainable data. Of particular interest to weapons scientists is the solidification of metals. ''To predict the performance of aging nuclear weapons, we need detailed information on a material's phase transitions,'' says Streitz, who leads the Livermore-IBM team. For example, scientists want to know what happens to a metal as it changes from molten liquid to a solid and how that transition affects the material's characteristics, such as its strength.
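The ddcMD code and its interatomic potentials are far beyond the scope of an abstract, but the core loop that any molecular-dynamics code of this kind runs, for two atoms or for billions, can be sketched in miniature. The two-atom Lennard-Jones system and all parameter values below are illustrative assumptions, not the team's model:

```python
# Minimal molecular-dynamics sketch: two atoms on a line interacting via a
# Lennard-Jones potential, advanced with velocity-Verlet time integration.
# Reduced units: epsilon = sigma = mass = 1.

def lj_energy(r):
    sr6 = r ** -6
    return 4.0 * (sr6 * sr6 - sr6)

def lj_force(r):
    # F = -dU/dr; positive means repulsive (pushes the pair apart)
    sr6 = r ** -6
    return 24.0 * (2.0 * sr6 * sr6 - sr6) / r

def simulate(r0=1.5, dt=1e-3, steps=2000):
    x = [0.0, r0]      # positions
    v = [0.0, 0.0]     # velocities
    def forces():
        f = lj_force(x[1] - x[0])
        return [-f, f]  # equal and opposite
    f = forces()
    for _ in range(steps):
        # velocity Verlet: half-kick, drift, recompute forces, half-kick
        v = [vi + 0.5 * dt * fi for vi, fi in zip(v, f)]
        x = [xi + dt * vi for xi, vi in zip(x, v)]
        f = forces()
        v = [vi + 0.5 * dt * fi for vi, fi in zip(v, f)]
    kinetic = 0.5 * (v[0] ** 2 + v[1] ** 2)
    return x[1] - x[0], kinetic + lj_energy(x[1] - x[0])

separation, total_energy = simulate()
print(f"separation={separation:.3f}  total energy={total_energy:.6f}")
```

The symplectic Verlet integrator conserves total energy over long runs, the basic sanity check production MD codes also rely on; the hard part at scale is the domain decomposition that the code's name refers to, which splits atoms across hundreds of thousands of processors.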
A Detector Radioactive Particles Can't Evade
As part of its national security mission, Lawrence Livermore develops technologies to help government agencies prevent terrorists from smuggling nuclear materials into the country. One ongoing effort is to design radiation detectors that can distinguish threat sources from legitimate sources, such as medical isotopes, and naturally occurring radiation. (See S&TR, September 2004, pp. 4-11; May 2006, pp. 4-10.) Detectors intended for use by nonspecialists must be easy to operate and require minimal maintenance. To be most effective, they also must detect both gamma and neutron energies. That may sound like a lot to ask of one instrument, but the Ultrahigh-Resolution Gamma and Neutron Spectrometer (UltraSpec) delivers all of these features. UltraSpec is so sensitive that even the minute thermal energy deposited by a single gamma ray or neutron can be detected with high precision. With this capability, the detector can identify differences in composition that help reveal a material's origin, processing history, and likely intended use. In addition to its application as a counterterrorism technology, UltraSpec can be used to protect nuclear material stored at nuclear power plants, to evaluate weapon stockpiles, and to verify material composition. UltraSpec was developed by a team of scientists and engineers from Livermore's Physics and Advanced Technologies and Engineering directorates working with VeriCold Technologies of Ismaning, Germany. The detector's design builds on a technology base established in three Laboratory Directed Research and Development projects. The UltraSpec team, which is led by Laboratory physicist Stephan Friedrich, received a 2006 R&D 100 Award for the detector's innovative design.
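The ''minute thermal energy'' remark above can be made concrete with the basic calorimeter relation: one absorbed photon raises the absorber temperature by delta-T = E/C. The heat capacity used below is an assumed, order-of-magnitude value for illustration, not a specification of UltraSpec:

```python
# A calorimetric detector converts one absorbed photon's energy into a
# measurable temperature rise, delta_T = E / C. Heat capacity C shrinks
# dramatically at sub-kelvin temperatures, which is what makes thermal
# detection of a single gamma ray possible at all.

KEV_TO_JOULE = 1.602176634e-16  # 1 keV in joules

def temperature_rise_mk(energy_kev, heat_capacity_j_per_k):
    """Temperature rise in millikelvin for one absorbed photon."""
    return energy_kev * KEV_TO_JOULE / heat_capacity_j_per_k * 1e3

# Illustrative: a 100-keV gamma ray stopping in an absorber with an
# assumed heat capacity of 1 pJ/K (a plausible order of magnitude for
# a tiny absorber at ~0.1 K) produces a rise of roughly 16 mK.
print(f"{temperature_rise_mk(100.0, 1e-12):.2f} mK")
```

Measuring that rise precisely, pulse by pulse, is what yields the energy resolution that lets such a spectrometer separate nearly identical gamma lines.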
The Unexpected Role of Evolving Longitudinal Electric Fields in Generating Energetic Electrons in Relativistically Transparent Plasmas
Superponderomotive-energy electrons are observed experimentally from the interaction of an intense laser pulse with a relativistically transparent target. For a relativistically transparent target, kinetic modeling shows that the generation of energetic electrons is dominated by energy transfer within the main, classically overdense, plasma volume. The laser pulse produces a narrowing, funnel-like channel inside the plasma volume that generates a field structure responsible for the electron heating. The field structure combines a slowly evolving azimuthal magnetic field, generated by a strong laser-driven longitudinal electron current, and, unexpectedly, a strong propagating longitudinal electric field, generated by reflections off the walls of the funnel-like channel. The magnetic field assists electron heating by the transverse electric field of the laser pulse through deflections, whereas the longitudinal electric field directly accelerates the electrons in the forward direction. The longitudinal electric field produced by reflections is 30 times stronger than that in the incoming laser beam, and the resulting direct laser acceleration contributes roughly one third of the energy transferred by the transverse electric field of the laser pulse to electrons of the superponderomotive tail.
Resonance structures in the multichannel quantum defect theory for the photofragmentation processes involving one closed and many open channels
The transformation introduced by Giusti-Suzor and Fano and extended by Lecomte and Ueda for the study of resonance structures in the multichannel quantum defect theory (MQDT) is used to reformulate MQDT into forms having one-to-one correspondence with those in Fano's configuration-mixing (CM) theory of resonance for photofragmentation processes involving one closed and many open channels. The reformulation thus allows MQDT to have the full power of the CM theory while keeping its own strengths, such as the fundamental description of resonance phenomena without an assumption of the presence of a discrete state as in CM.
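For readers unfamiliar with the configuration-mixing picture the abstract refers to, its signature result is Fano's asymmetric resonance profile (a textbook formula, not a result of this paper):

```latex
% Fano lineshape: cross section near an isolated resonance embedded in a
% continuum, in terms of the reduced energy \epsilon and the shape
% (asymmetry) parameter q.
\sigma(\epsilon) = \sigma_0\,\frac{(q+\epsilon)^2}{1+\epsilon^2},
\qquad
\epsilon = \frac{E - E_r}{\Gamma/2}
```

Here E_r is the resonance position and Gamma its width; establishing a one-to-one correspondence with this framework is what gives the reformulated MQDT access to quantities such as q and Gamma directly.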