Enhancing Seismic Calibration Research Through Software Automation and Scientific Information Management
The National Nuclear Security Administration (NNSA) Ground-Based Nuclear Explosion Monitoring Research and Engineering (GNEM R&E) Program at LLNL has made significant progress enhancing the process of deriving seismic calibrations and performing scientific integration, analysis, and information management with software automation tools. Several achievements in schema design, data visualization, synthesis, and analysis were completed this year. Our tool efforts address the problematic issues of very large datasets and varied formats encountered during seismic calibration research. As data volumes have increased, scientific information management issues such as data quality assessment, ontology mapping, and metadata collection, all essential for production and validation of derived calibrations, have negatively impacted researchers' ability to produce products. New information management and analysis tools have yielded demonstrated gains in the efficiency of producing scientific data products and improved accuracy of derived seismic calibrations.

Significant software engineering and development efforts have produced an object-oriented framework that provides database-centric coordination between scientific tools, users, and data. Nearly half a billion parameters, signals, measurements, and metadata entries are stored in a relational database accessed by an extensive object-oriented, multi-technology software framework that includes stored procedures, real-time transactional database triggers and constraints, and coupled Java and C++ software libraries to handle the information interchange and validation requirements. Significant resources were applied to schema design to enable recording of processing flow and metadata. A core capability is the ability to rapidly select and present subsets of related signals and measurements to researchers for analysis and distillation, both visually (Java GUI client applications) and in batch mode (instantiation of multi-threaded applications on clusters of processors).

Development of efficient data exploitation methods has become increasingly important throughout the academic and government seismic research communities to address multi-disciplinary, large-scale initiatives. Effective frameworks must simultaneously provide the researcher with robust measurement and analysis tools that can handle and extract groups of events efficiently, while isolating the researcher from the now onerous tasks of database management and metadata collection necessary for validation and error analysis. Sufficient information management robustness is required to avoid loss of metadata, which would lead to incorrect calibration results in addition to increasing the data management burden. Our specific automation methodology and tools improve researchers' ability to assemble quality-controlled research products for delivery into the NNSA Knowledge Base (KB). The software and scientific automation tasks also provide the robust foundation upon which synergistic and efficient development of GNEM R&E Program seismic calibration research may be built.
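To make the batch-mode access pattern concrete, here is a minimal Java sketch of how a framework like this might pull one event's related measurements from a relational database and fan the processing out across threads. The JDBC URL, table names (signal, measurement), and column names are hypothetical placeholders for illustration, not the actual GNEM R&E schema.

// Hypothetical sketch: select a related subset of measurements for one
// event via JDBC, then process the signals in parallel in batch mode.
// Schema and connection details are illustrative placeholders.
import java.sql.*;
import java.util.concurrent.*;

public class BatchMeasurementJob {
    private static final String SUBSET_QUERY =
        "SELECT s.signal_id, m.phase, m.amplitude " +
        "FROM signal s JOIN measurement m ON m.signal_id = s.signal_id " +
        "WHERE s.evid = ?";

    public static void main(String[] args) throws Exception {
        long evid = Long.parseLong(args[0]);
        ExecutorService pool =
            Executors.newFixedThreadPool(Runtime.getRuntime().availableProcessors());

        try (Connection conn = DriverManager.getConnection(
                 "jdbc:oracle:thin:@//dbhost:1521/gnem", "user", "pass");
             PreparedStatement stmt = conn.prepareStatement(SUBSET_QUERY)) {
            stmt.setLong(1, evid);
            try (ResultSet rs = stmt.executeQuery()) {
                while (rs.next()) {
                    final long signalId = rs.getLong("signal_id");
                    final double amp = rs.getDouble("amplitude");
                    // Each signal is measured independently, so batch-mode
                    // processing parallelizes cleanly across processors.
                    pool.submit(() -> process(signalId, amp));
                }
            }
        }
        pool.shutdown();
        pool.awaitTermination(1, TimeUnit.HOURS);
    }

    static void process(long signalId, double amplitude) {
        // Placeholder for the actual measurement/quality-control step.
        System.out.printf("signal %d: amplitude %.3g%n", signalId, amplitude);
    }
}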
Regional Seismic Discrimination Optimization With and Without Nuclear Test Data: Western U.S. Examples
The western U.S. has abundant natural seismicity, historic nuclear explosion data, and widespread mine blasts, making it a good testing ground to study the performance of regional source-type discrimination techniques. We have assembled and measured a large set of these events to systematically explore how best to optimize discrimination performance. Nuclear explosions can be discriminated from a background of earthquakes using regional phase (Pn, Pg, Sn, Lg) amplitude measures such as high-frequency P/S ratios. The discrimination performance is improved if the amplitudes can be corrected for source-size and path-length effects. We show that good results are achieved using earthquakes alone to calibrate for these effects with the MDAC technique (Walter and Taylor, 2001). We show that significant further improvement is then possible by combining multiple MDAC amplitude ratios using an optimized weighting technique such as Linear Discriminant Analysis (LDA). However, this requires data or models for both earthquakes and explosions. In many areas of the world, regional-distance nuclear explosion data are lacking, but mine blast data are available. Mine explosions are often designed to fracture and/or move rock, giving them different frequency and amplitude behavior than contained chemical shots, which seismically look like nuclear tests. Here we explore discrimination performance differences between explosion types, the possible disparity in the optimization parameters that would be chosen if only chemical explosions were available, and the corresponding effect of that disparity on nuclear explosion discrimination.

A variety of additional techniques in the literature also have the potential to improve regional high-frequency P/S discrimination. We explore two of these here: three-component averaging and maximum phase amplitude measures. Typical discrimination studies use only the vertical-component measures, and for some historic regional nuclear records these are all that are available. However, S-waves are often better recorded on the horizontal components, and some studies have shown that using a three-component average, a vertical-P/horizontal-S ratio, or another three-component measure can improve discrimination over using the vertical alone (e.g., Kim et al., 1997; Bowers et al., 2001). Here we compare the performance of vertical and three-component measures on the western U.S. test set.

A complication in regional discrimination is the variation in P- and S-wave propagation with region. The dominantly observed regional high-frequency S-wave can vary with path between Sn and Lg in a spatially complex way. Since a relative lack of high-frequency S-waves is the signature of an explosion, failing to account for this could lead to misidentifying an earthquake as an explosion. The regional P phases Pn and Pg vary similarly with path and also with distance, with Pg sometimes being a strong phase at near-regional distances but not at far-regional ones. One way to handle these issues is to correct for all four regional phases but choose the phase with the maximum amplitude. A variation on this strategy is to always use Pn but choose the maximum S phase (e.g., Bottone et al., 2002). Here we compare the discrimination performance of several different (max P)/(max S) measures to vertical, three-component, and multivariate measures.

Our preliminary results show that multivariate measures perform much better than single ratios, though transportability of the LDA weights between regions is an issue. Also in our preliminary results, we do not find large discrimination performance improvements with three-component averages and maximum phase amplitude measures compared to using the vertical component alone.
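As a concrete illustration of how these amplitude measures combine, the following minimal Java sketch forms single log10(P/S) ratios, the (max P)/(max S) variant, and a weighted LDA score. The amplitudes are assumed to be MDAC-corrected already, and the weights and numeric values are invented placeholders, not values from this study.

// Illustrative sketch of combining regional phase amplitudes into
// discriminants. Amplitudes are assumed MDAC-corrected; the LDA weights
// below are invented placeholders, not fitted values.
public class PSDiscriminant {

    /** Single-ratio discriminant, e.g. high-frequency log10(Pn/Lg). */
    static double logRatio(double p, double s) {
        return Math.log10(p / s);
    }

    /** (max P)/(max S): strongest P phase over strongest S phase, which
     *  guards against paths where Pg or Lg is blocked or weak. */
    static double maxPhaseRatio(double pn, double pg, double sn, double lg) {
        return Math.log10(Math.max(pn, pg) / Math.max(sn, lg));
    }

    /** LDA combination: a weighted sum of several corrected log ratios.
     *  Under this sign convention, higher scores indicate more
     *  explosion-like (S-poor) behavior. */
    static double ldaScore(double[] logRatios, double[] weights, double offset) {
        double score = offset;
        for (int i = 0; i < logRatios.length; i++) {
            score += weights[i] * logRatios[i];
        }
        return score;
    }

    public static void main(String[] args) {
        // Hypothetical corrected amplitudes for one event at one station.
        double pn = 4.2e-8, pg = 3.1e-8, sn = 2.0e-8, lg = 2.6e-8;

        double pnLg = logRatio(pn, lg);
        double pnSn = logRatio(pn, sn);
        double maxPS = maxPhaseRatio(pn, pg, sn, lg);

        // Placeholder LDA weights over two ratios.
        double score = ldaScore(new double[] {pnLg, pnSn},
                                new double[] {1.4, 0.9}, -0.2);

        System.out.printf("Pn/Lg=%.2f Pn/Sn=%.2f maxP/maxS=%.2f LDA=%.2f%n",
                          pnLg, pnSn, maxPS, score);
    }
}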
Enhancing Seismic Calibration Research Through Software Automation and Scientific Information Management
The National Nuclear Security Administration (NNSA) Ground-Based Nuclear Explosion Monitoring Research and Engineering (GNEM R&E) Program has made significant progress enhancing the process of deriving seismic calibrations and performing scientific integration with automation tools. We present an overview of our software automation and scientific data management efforts and discuss frameworks to address the problematic issues of very large datasets and varied formats utilized during seismic calibration research. The software and scientific automation initiatives directly support the rapid collection of raw and contextual seismic data used in research, provide efficient interfaces for researchers to measure and analyze data, and provide a framework for research dataset integration. The automation also improves researchers' ability to assemble quality-controlled research products for delivery into the NNSA Knowledge Base (KB). The software and scientific automation tasks provide the robust foundation upon which synergistic and efficient development of GNEM R&E Program seismic calibration research may be built.

The task of constructing many seismic calibration products is labor intensive and complex, hence expensive. However, aspects of calibration product construction are susceptible to automation and future economies. We are applying software and scientific automation to problems within two distinct phases, or "tiers," of the seismic calibration process. The first tier involves initial collection of the waveform and parameter (bulletin) data that comprise the "raw materials" from which signal travel-time and amplitude correction surfaces are derived; it is highly suited to software automation. The second tier of seismic research content development activities includes development of correction surfaces and other calibrations. This second tier is less susceptible to complete automation, as these activities require the judgment of scientists skilled in the interpretation of often highly unpredictable event observations. Even partial automation of this second tier, through development of prototype tools to extract observations and make many thousands of scientific measurements, has significantly increased the efficiency of the scientists who construct and validate integrated calibration surfaces. This gain in efficiency and quality control is likely to continue and even accelerate through continued application of information science and scientific automation.

Data volume and calibration research requirements have increased by several orders of magnitude over the past decade. Whereas in the past it was possible for individual researchers to download individual waveforms and make time-consuming measurements event by event, with the terabytes of data available today a software automation framework must exist to efficiently populate and deliver quality data to the researcher. This framework must also simultaneously provide the researcher with robust measurement and analysis tools that can handle and extract groups of events effectively, while isolating the researcher from the now onerous tasks of database management and metadata collection necessary for validation and error analysis. Lack of information management robustness or loss of metadata can lead to incorrect calibration results in addition to increasing the data management burden. To address these issues we have succeeded in automating several aspects of the collection, parsing, reconciliation, and extraction tasks individually. Several software automation prototypes have been produced and have resulted in demonstrated gains in the efficiency of producing scientific data products. Future software automation tasks will continue to leverage database and information management technologies in addressing additional scientific calibration research tasks.
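As one illustration of a first-tier task that automates cleanly, below is a minimal Java sketch of a bulletin reconciliation step that keeps a single preferred origin per event. The record layout and the source-preference order are hypothetical choices for this sketch, not the program's actual reconciliation rules.

// Minimal sketch of a tier-one reconciliation step: merge origin rows
// from multiple bulletin sources, keeping one preferred origin per event.
// Record layout and preference order are illustrative assumptions.
import java.util.*;

public class BulletinReconciler {

    record Origin(long evid, String source, double lat, double lon,
                  double depth, double time) {}

    // Hypothetical preference order among bulletin sources.
    private static final List<String> PREFERENCE = List.of("REB", "ISC", "NEIC");

    static Map<Long, Origin> reconcile(List<Origin> origins) {
        Map<Long, Origin> best = new HashMap<>();
        for (Origin o : origins) {
            Origin cur = best.get(o.evid());
            if (cur == null || rank(o.source()) < rank(cur.source())) {
                best.put(o.evid(), o);  // keep the higher-preference origin
            }
        }
        return best;
    }

    private static int rank(String source) {
        int i = PREFERENCE.indexOf(source);
        return i < 0 ? Integer.MAX_VALUE : i;
    }

    public static void main(String[] args) {
        List<Origin> raw = List.of(
            new Origin(1001, "NEIC", 37.1, -116.1, 0.6, 8.1e8),
            new Origin(1001, "REB",  37.0, -116.0, 0.5, 8.1e8),
            new Origin(1002, "ISC",  36.5, -115.8, 7.2, 8.2e8));
        reconcile(raw).values().forEach(System.out::println);
    }
}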