OSI Passive Seismic Experiment at the Former Nevada Test Site
On-site inspection (OSI) is one of the four verification provisions of the Comprehensive Nuclear-Test-Ban Treaty (CTBT). Under the provisions of the CTBT, once the Treaty has entered into force, any signatory party can request an on-site inspection, which can then be carried out after approval (by majority vote) of the Executive Council. Once an OSI is approved, a team of 40 inspectors will be assembled to carry out an inspection to "clarify whether a nuclear weapon test explosion or any other nuclear explosion has been carried out in violation of Article I". One challenging aspect of carrying out an OSI in the case of a purported underground nuclear explosion is detecting and locating the underground effects of the explosion, which may include an explosion cavity, a zone of damaged rock, and/or a rubble zone associated with an underground collapsed cavity. The CTBT (Protocol, Section II, Part D, paragraph 69) prescribes several types of geophysical investigations that can be carried out for this purpose. One of the methods allowed for geophysical investigation is referred to in the Treaty Protocol as "resonance seismometry". This method, which was proposed and strongly promoted by Russia during the Treaty negotiations, is not described in the Treaty itself. Some clarification about the nature of the resonance method can be gained from OSI workshop presentations by Russian experts in the late 1990s. Our understanding is that resonance seismometry is a passive method that relies on seismic reverberations set up in an underground cavity by the passage of waves from regional and teleseismic sources. Only a few examples of the use of this method for detection of underground cavities have been presented, and those were cases where the existence and precise location of an underground cavity were already known.
As with many of the geophysical methods allowed during an OSI under the Treaty, how resonance seismometry really works and how effective it is for OSI purposes have yet to be determined. For this experiment, we took a broad approach to the definition of "resonance seismometry", stretching it to include any passive seismic method used to infer the character of underground materials. In recent years there have been a number of advances in the use of correlation and noise-analysis methods in seismology to obtain information about the subsurface. Our objective in this experiment was to use noise analysis and correlation analysis to evaluate these techniques for detecting and characterizing the underground damage zone from a nuclear explosion. The site chosen for the experiment was the Mackerel test in Area 4 of the former Nevada Test Site (now named the Nevada National Security Site, or NNSS). Mackerel was an underground nuclear test of less than 20 kt conducted in February 1964 (DOE/NV-209-REV 15). We chose this site because a known apical cavity occurs at about 50 m depth above a rubble zone, and because the site had been investigated by the US Geological Survey with active seismic methods in 1965 (Watkins et al., 1967). Note that the delay between detonation of the explosion (1964) and the present survey (2010) is nearly 46 years; this would not be typical of an expected OSI under the CTBT.
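The abstract does not give the correlation procedure itself. As a minimal sketch of the core operation behind ambient-noise correlation analysis (the function name, window handling, and normalization here are illustrative assumptions, not the study's actual processing chain): stacking cross-correlations of noise records from two sensors emphasizes coherent energy, such as cavity reverberations, over incoherent background noise.

```python
import numpy as np

def noise_cross_correlation(a, b, max_lag):
    """Cross-correlate two noise records over lags -max_lag..+max_lag.

    Illustrative sketch: records are demeaned and normalized, and the
    correlation at lag k pairs a[i] with b[i+k]. In practice many such
    windows would be stacked to build up the coherent part.
    """
    a = (a - a.mean()) / a.std()
    b = (b - b.mean()) / b.std()
    n = len(a)
    lags = np.arange(-max_lag, max_lag + 1)
    cc = np.array([np.dot(a[max(0, -k):n - max(0, k)],
                          b[max(0, k):n - max(0, -k)]) for k in lags]) / n
    return lags, cc
```

The lag of the correlation peak gives the travel-time offset between the two sensors for whatever coherent energy dominates the records.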
A Study of the Dielectric Properties of Dry and Saturated Green River Oil Shale
We measured the dielectric permittivity of dry and fluid-saturated Green River oil shale samples over a frequency range of 1 MHz to 1.8 GHz. Dry-sample measurements were carried out between room temperature and 146 °C; saturated-sample measurements were carried out at room temperature. Samples obtained from the Green River formation of Wyoming and from the Anvil Points Mine in Colorado were cored both parallel and perpendicular to layering. The samples, all of which had organic richness in the range of 10-45 gal/ton, showed small variations between samples and relatively little anisotropy in the dielectric properties when dry. The real and imaginary parts of the relative dielectric permittivity of the dry rock were nearly constant over the frequency range observed, with low values for the imaginary part (loss factor). Saturation with de-ionized water and brine greatly increased the real and imaginary parts of the relative permittivity, especially at the lower frequencies. Temperature effects were relatively small, with an initial increase in permittivity up to about 60 °C, followed by slight decreases that diminished as temperature increased. Implications of these observations for in situ electromagnetic, or radio-frequency (RF), heating of oil shale to produce oil and gas are discussed.
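The connection between the measured loss factor and RF heating follows from two standard dielectric relations, sketched below (the function names and any numeric values are illustrative; only the formulas, which are textbook results, come with confidence):

```python
import numpy as np

EPS0 = 8.854e-12  # vacuum permittivity, F/m

def loss_tangent(eps_real, eps_imag):
    """tan(delta) = eps'' / eps': how lossy the dielectric is."""
    return eps_imag / eps_real

def rf_heating_density(freq_hz, eps_imag, e_field_rms):
    """Time-averaged power dissipated per unit volume (W/m^3) in a
    dielectric: P = omega * eps0 * eps'' * E_rms^2. A larger imaginary
    permittivity (e.g. after brine saturation) means more RF energy
    converted to heat at a given field strength."""
    omega = 2.0 * np.pi * freq_hz
    return omega * EPS0 * eps_imag * e_field_rms ** 2
```

This is why the reported rise in the imaginary part upon saturation matters for in situ RF heating: at fixed frequency and field, dissipated power scales linearly with the loss factor.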
Model-Based Hydroacoustic Blockage Assessment and Development of an Explosive Source Database
We are continuing development of the Hydroacoustic Blockage Assessment Tool (HABAT), which is designed to let analysts predict which hydroacoustic monitoring stations can be used in discrimination analysis for any particular event. The research involves two approaches: (1) model-based assessment of blockage, and (2) ground-truth, data-based assessment of blockage. The tool presents the analyst with a map of the world and plots raypath blockages from stations to sources. The analyst inputs source locations and blockage criteria, and the tool returns the blockage status from all source locations to all hydroacoustic stations. We are currently using the tool to assess blockage criteria for simple direct-path arrivals. Hydroacoustic data, predominantly from earthquake sources, are read in and assessed for blockage at all available stations. Several measures are taken. First, can the event be observed at a station above background noise? Second, can we establish a backazimuth from the station to the source? Third, how large is the decibel drop at one station relative to other stations? These observational results are then compared with model estimates to identify the best set of blockage criteria, which are used to create a set of blockage maps for each station. The model-based estimates are currently limited by the coarse bathymetry of existing databases and by the limitations inherent in the raytrace method. In collaboration with BBN Inc., the Hydroacoustic Coverage Assessment Model (HydroCAM), which generates the blockage files that serve as input to HABAT, is being extended to include high-resolution bathymetry databases in key areas to increase the reliability of model-based blockage assessment. An important aspect of this capability is to eventually include reflected T-phases where they reliably occur and to identify the associated reflectors.
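The three observational measures above can be combined into a simple per-station blockage classification. The sketch below is a hypothetical illustration: the data structure, function, and threshold values are assumptions for exposition, not HABAT's actual criteria or calibrated values.

```python
from dataclasses import dataclass

@dataclass
class ArrivalMeasurement:
    snr_db: float            # signal level above background noise, dB
    baz_residual_deg: float  # |observed - predicted| backazimuth, degrees
    rel_level_db: float      # level drop relative to the best station, dB

def is_blocked(m, snr_min_db=3.0, baz_tol_deg=10.0, drop_max_db=20.0):
    """Classify a station-to-source path as blocked using the three
    observational measures: detection above noise, a consistent
    backazimuth, and a bounded relative decibel drop. Thresholds are
    illustrative placeholders."""
    detected = m.snr_db >= snr_min_db
    baz_ok = m.baz_residual_deg <= baz_tol_deg
    level_ok = m.rel_level_db <= drop_max_db
    return not (detected and baz_ok and level_ok)
```

Comparing such data-based classifications against raytrace predictions, path by path, is one way to tune the thresholds the text refers to as blockage criteria.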
To assess how well any given hydroacoustic discriminant separates earthquake and in-water explosion populations, it is necessary to have both a database of reference earthquake events and one of reference in-water explosions. Although reference earthquake events are readily available, explosive reference events are not. Consequently, building an in-water explosion reference database requires compiling events from many sources spanning a long period of time. We have developed a database of small implosive and explosive reference events from the 2003 Indian Ocean Cruise data. These events were recorded at some or all of the IMS Indian Ocean hydroacoustic stations: Diego Garcia, Cape Leeuwin, and Crozet Island. We have also reviewed many historical large in-water explosions and identified five that have adequate source information and can be positively associated with the hydrophone recordings. The five events are: Cannikin, Longshot, CHASE-3, CHASE-5, and IITRI-1. Of these, the first two are nuclear tests on land but near water. The latter three are in-water conventional explosive events with yields from ten to hundreds of tons of TNT equivalent. The objective of this research is to enhance discrimination capabilities for events located in the world's oceans. Two research and development efforts are needed to achieve this: (1) improvement of discrimination algorithms and their joint statistical application to events, and (2) development of an automated and accurate blockage-prediction capability that identifies all stations and phases (direct and reflected) from a given event that will have adequate signal to be used in a discrimination analysis. The strategy for improving blockage prediction in the world's oceans is to improve model-based prediction of blockage and to develop a ground-truth database of reference events against which to assess blockage. Currently, research is focused on the development of a blockage assessment software tool.
The tool is envisioned to develop into a sophisticated, unifying package that optimally and automatically assesses both model-based and data-based blockage predictions in all ocean basins and for all NDC stations, accounting for reflected phases (Pulli et al., 2000). Currently, we have focused our efforts on the Diego Garcia, Cape Leeuwin, and Crozet Island hydroacoustic stations in the Indian Ocean.
Injection monitoring with seismic arrays and adaptive noise cancellation
Although the application of seismic methods, both active and passive, to monitor in-situ reservoir stimulation is not new, seismic arrays and array-processing technology coupled with a new noise cancellation method have not been attempted. Successful application of seismic arrays to passively monitor in-situ reservoir stimulation depends on being able to cancel enough of the large-amplitude background seismic noise typical of an oil or geothermal production environment that small-amplitude seismic signals occurring at depth can be detected and located. This report describes the results of a short field experiment conducted to test both the application of seismic arrays to in-situ reservoir stimulation monitoring and the active noise cancellation technique in a real reservoir production environment. Although successful application of these techniques would have the greatest payoff in the oil industry, the proof-of-concept field experiment site chosen was The Geysers geothermal field in northern California. This site was chosen because of its known high seismicity rates, a relatively shallow production depth, cooperation and some cost sharing from the UNOCAL Corporation, and the close proximity of the site to LLNL. The body of this report describes the Geysers experimental configuration, then discusses the results of the seismic array processing and of the seismic noise cancellation, followed by a brief conclusion. 2 refs., 11 figs.
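The abstract does not specify the cancellation algorithm. A standard approach to this class of problem, shown here only as a hedged illustration of the technique (not the report's method), is least-mean-squares (LMS) adaptive filtering: a noise-only reference channel is adaptively filtered and subtracted from the contaminated primary channel.

```python
import numpy as np

def lms_cancel(primary, reference, n_taps=8, mu=0.01):
    """LMS adaptive noise cancellation (illustrative sketch).

    primary   : signal of interest contaminated by correlated noise
    reference : noise-only channel (e.g. a sensor near surface machinery)
    Returns the error signal: the primary with the coherent noise
    component progressively subtracted as the filter adapts.
    """
    w = np.zeros(n_taps)
    out = np.zeros(len(primary))
    for i in range(n_taps, len(primary)):
        x = reference[i - n_taps:i][::-1]  # most recent samples first
        y = w @ x                          # current noise estimate
        e = primary[i] - y                 # cleaned output sample
        w += 2.0 * mu * e * x              # LMS weight update
        out[i] = e
    return out
```

The step size mu trades convergence speed against steady-state misadjustment; any buried small-amplitude signal that is uncorrelated with the reference survives in the output.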
Detecting, Locating, and Characterizing Remote Power Sources
A feasibility study to determine range and back-azimuth detection methods for an isolated generator powering common loads was completed. The study deployed 3-component E-field and B-field sensors with sampling rates of 100 kHz at a low-noise test location in Southern California. Scripted power and load cycling was recorded at ranges of 40 m to 4 km from the generator/load source. Three loads were tested: a 100-meter string of lights, an inverter powering an air blower, and a resistive heater. No radiated E-field or B-field signals were detected at ranges greater than 40 meters with a signal-to-noise ratio greater than one. Large variations in the broadband background electromagnetic noise were observed and may have been responsible for null detections at some measurement locations. At the 40-meter station, a frequency shift upon generator loading was observed for all load types. Harmonics from the detuned generator (operating at 56.7 Hz) could be observed for all load types but were most pronounced for the inverter source. A back-azimuth estimation methodology was applied to the detected harmonics with stable and consistent results. For the inverter source, consistent back azimuths to the source were determined for the fundamental and for the higher detected harmonics up to the 31st. The method was applied to narrowband "noise" at 60 Hz and produced bimodal directions that roughly pointed to large population centers. Details of the method are withheld in this report pending a record-of-invention submittal. Although the generator/load combinations, which used wiring that tended to minimize stray signals, cannot yet be detected at large standoff range without noise-filtering methods, the back-azimuth method appears promising and should be applied to other source types and frequency ranges in which an E or B field can be detected. A record of invention describing this new back-azimuth method has been submitted to the Intellectual Property Law Group.
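Since the actual method is withheld, the sketch below shows only a generic, textbook polarization-analysis step that illustrates why azimuth estimates from field components are inherently bimodal (180-degree ambiguous), consistent with the bimodal directions reported above. Everything here is an assumption for exposition and is explicitly not the withheld method.

```python
import numpy as np

def polarization_azimuth(bx, by):
    """Azimuth (degrees, measured from the x sensor axis toward y) of
    the major axis of the horizontal-field polarization ellipse, taken
    as the dominant eigenvector of the 2x2 covariance of the two
    horizontal components. Standard polarization analysis; the result
    is only defined modulo 180 degrees, hence bimodal directions."""
    c = np.cov(np.vstack([bx, by]))
    vals, vecs = np.linalg.eigh(c)
    v = vecs[:, np.argmax(vals)]
    return np.degrees(np.arctan2(v[1], v[0])) % 180.0
```

In practice the components would first be bandpassed around the harmonic of interest (e.g. a multiple of 56.7 Hz) before the covariance is formed.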
SPE2 Far-field Seismic Data Quicklook
The purpose of this report is to provide a brief overview of the far-field seismic data collected by the array of instruments (Figures 1 and 2) deployed for Source Physics Experiment shots 1 (roughly 100 kg TNT equivalent at a depth of 60 m) and 2 (roughly 2000 kg TNT equivalent at a depth of 45 m). "Far-field" is taken to refer to instruments in the zone of purely elastic response, at distances of 100 m or greater. The primary focus is data from the main instrument array, so data from other groups are not considered; neither infrasound data nor any remote-sensing data are addressed. Data processing was done at LLNL in parallel with the effort at UNR. Raw RefTek data were sent via hard disk from NSTec. The RefTek data were converted to SEG-Y and then to SAC format, and the data files were renamed according to station and channel information. RefTek logs were reviewed, and the data were checked for consistency with the UNR data on the server. The primary goal was a quality check, and a summary is provided in Tables 1 and 2.
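The renaming step above can be sketched with the standard library; the naming scheme (SHOT.STATION.CHANNEL.sac) and the dry-run structure are hypothetical, since the report's actual convention is not given here.

```python
from pathlib import Path

def sac_filename(shot, station, channel, ext="sac"):
    """Build a descriptive SAC file name from metadata.
    The SHOT.STATION.CHANNEL.ext scheme is an illustrative assumption."""
    return f"{shot}.{station.upper()}.{channel.upper()}.{ext}"

def rename_plan(files):
    """files: dict mapping raw path -> (shot, station, channel).
    Returns an old-name -> new-name mapping without touching the disk,
    so the plan can be reviewed before any files are renamed."""
    return {raw: str(Path(raw).with_name(sac_filename(*meta)))
            for raw, meta in files.items()}
```

Planning the renames as a mapping first, rather than renaming in place, makes the station/channel bookkeeping auditable during a quality check.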