177 research outputs found

    Seismotectonics of the San Andreas Fault System Between Point Arena and Cape Mendocino in Northern California: Implications for the Development and Evolution of a Young Transform

    The northernmost and relatively youthful segment of the San Andreas fault system is situated within a 100+ km wide zone of northwest-trending strike-slip faults that includes, from west to east, the San Andreas, Maacama, and Bartlett Springs faults. Although the San Andreas fault is the principal strike-slip fault in this system, it has been virtually aseismic since the 1906 earthquake. Moderate levels of seismicity locate to the east along the Maacama fault and, to a lesser extent, the Bartlett Springs fault at focal depths typical of other strike-slip faults within the San Andreas fault system in central California. North of the San Andreas fault system, within the Cape Mendocino area, earthquakes occur at depths of up to 40 km and primarily reflect internal deformation of the subducting Gorda slab and slip along the Mendocino Fracture Zone. Seismicity along the Maacama and Bartlett Springs faults is dominated by right-lateral to oblique-reverse slip along fault planes that dip 50°-75° to the northeast. The northern extent of seismicity along these faults terminates near the surface projection of the southern edge of the Gorda slab. The onset of seismicity along these faults may be related to the abrupt change in the elastic thickness of the North American plate as it enters the asthenospheric window. The Maacama and Bartlett Springs faults are strike-parallel with active reverse faults within the forearc region of the Cascadia subduction zone. This preexisting structural fabric of northwest-trending reverse faults in the forearc area appears to have strongly influenced the initial dip and complexity of these faults.
Continuation of the moderately dipping Maacama fault to the southeast along the steeply dipping Healdsburg and Rodgers Creek fault zones and the near-vertical Hayward and Calaveras fault zones in the San Francisco Bay area suggests that these faults evolve toward a more vertical dip to minimize the shear stresses that tend to resist plate motion.

    Fault healing inferred from time dependent variations in source properties of repeating earthquakes

    We analyze two sets of repeating earthquakes on the Calaveras fault to estimate in-situ rates of fault strengthening (healing). Earthquake recurrence intervals tr range from 3 to 803 days. Variations in relative moment and duration are combined to study changes in stress drop, rupture dimension, rupture velocity, and particle velocity as a function of tr. Healing rates and source variations are compared with predictions of laboratory-derived friction laws. Two interpretations of event duration τ are used: one in which τ is given by the ratio of slip to particle velocity and one in which it scales as rupture dimension divided by rupture velocity. Our data indicate that faults strengthen during the interseismic period. We infer that source dimension decreases with tr due to aseismic creep within the region surrounding the repeating events. Stress drop increases 1-3 MPa per decade increase in tr, which represents an increase of a factor of 2-3 relative to events with tr between 10 and 100 days. This rate of fault healing is consistent with extrapolations of laboratory measurements of healing rates if fault strength is high, on the order of 60 MPa, and stress drop is roughly 10% of this value.
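    The log-linear healing trend described in this abstract can be sketched as a small model. The reference stress drop (5 MPa at a 30-day recurrence interval) is a placeholder, not a fitted value from the study; the 2 MPa/decade rate is one value inside the 1-3 MPa/decade range quoted above:

    ```python
    import math

    def stress_drop_mpa(tr_days, dsigma_ref=5.0, rate_per_decade=2.0, tr_ref=30.0):
        """Log-linear fault-healing model: stress drop grows linearly with
        log10 of the recurrence interval tr.  All parameter values here are
        illustrative placeholders, not results from the paper."""
        return dsigma_ref + rate_per_decade * math.log10(tr_days / tr_ref)

    # Across the observed 3-803 day range the modeled stress drop grows
    # from about 3 MPa to about 7.9 MPa, i.e. roughly a factor of 2-3,
    # consistent with the change quoted in the abstract.
    low, high = stress_drop_mpa(3.0), stress_drop_mpa(803.0)
    ```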

    The Algorithm Theoretical Basis Document for the GLAS Atmospheric Data Products

    The purpose of this document is to present a detailed description of the algorithm theoretical basis for each of the GLAS data products. This will be the final version of this document. The algorithms were initially designed and written based on the authors' prior experience with high-altitude lidar data on systems such as the Cloud and Aerosol Lidar System (CALS) and the Cloud Physics Lidar (CPL), both of which fly on the NASA ER-2 high-altitude aircraft. These lidar systems have been employed in many field experiments around the world, and algorithms have been developed to analyze these data for a number of atmospheric parameters. CALS data have been analyzed for cloud top height, thin cloud optical depth, cirrus cloud emittance (Spinhirne and Hart, 1990) and boundary layer depth (Palm and Spinhirne, 1987, 1998). The successor to CALS, the CPL, has also been extensively deployed in field missions since 2000, including the validation of GLAS and CALIPSO. The CALS and early CPL data sets also served as the basis for the construction of simulated GLAS data sets, which were then used to develop and test the GLAS analysis algorithms.

    National Seismic System Science Plan

    Recent developments in digital communication and seismometry are allowing seismologists to propose revolutionary new ways to reduce vulnerability from earthquakes, volcanoes, and tsunamis, and to better understand these phenomena as well as the basic structure and dynamics of the Earth. This document provides a brief description of some of the critical new problems that can be addressed using modern digital seismic networks. It also provides an overview of existing seismic networks and suggests ways to integrate these together into a National Seismic System. A National Seismic System will consist of a number of interconnected regional networks (such as southern California, central and northern California, northeastern United States, northwestern United States, and so on) that are jointly operated by Federal, State, and private seismological research institutions. Regional networks will provide vital information concerning the hazards of specific regions. Parts of these networks will be linked to provide uniform rapid response on a national level (the National Seismic Network). A National Seismic System promises to significantly reduce societal risk from earthquake losses and to open new areas of fundamental basic research. The following is a list of some of the uses of a National Seismic System.

    QuakeFlow: A Scalable Machine-learning-based Earthquake Monitoring Workflow with Cloud Computing

    Earthquake monitoring workflows are designed to detect earthquake signals and to determine source characteristics from continuous waveform data. Recent developments in deep learning seismology have been used to improve tasks within earthquake monitoring workflows that allow the fast and accurate detection of up to orders of magnitude more small events than are present in conventional catalogs. To facilitate the application of machine-learning algorithms to large-volume seismic records, we developed a cloud-based earthquake monitoring workflow, QuakeFlow, that applies multiple processing steps to generate earthquake catalogs from raw seismic data. QuakeFlow uses a deep learning model, PhaseNet, for picking P/S phases and a machine learning model, GaMMA, for phase association with approximate earthquake location and magnitude. Each component in QuakeFlow is containerized, allowing straightforward updates to the pipeline with new deep learning/machine learning models, as well as the ability to add new components, such as earthquake relocation algorithms. We built QuakeFlow in Kubernetes to make it auto-scale for large datasets and to make it easy to deploy on cloud platforms, which enables large-scale parallel processing. We used QuakeFlow to process three years of continuous archived data from Puerto Rico, and found more than ten times as many events as in the existing catalog, occurring largely on the same structures as previously known seismicity. We applied QuakeFlow to monitor frequent earthquakes in Hawaii and found over an order of magnitude more events than are in the standard catalog, including many events that illuminate the deep structure of the magmatic system. We also added Kafka and Spark streaming to deliver real-time earthquake monitoring results. QuakeFlow is an effective and efficient approach both for improving real-time earthquake monitoring and for mining archived seismic data sets.
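    The pick-then-associate structure of a QuakeFlow-style pipeline can be sketched in a few lines. This is not the QuakeFlow code or the PhaseNet/GaMMA APIs: the function names, the pass-through picker, and the fixed-time-window association rule are all simplified placeholders standing in for the deep-learning picker and the probabilistic associator the abstract describes.

    ```python
    from dataclasses import dataclass, field

    @dataclass
    class Pick:
        station: str
        time: float   # arrival time, seconds
        phase: str    # "P" or "S"

    @dataclass
    class Event:
        origin_time: float
        picks: list = field(default_factory=list)

    def pick_phases(traces):
        """Stage 1 (PhaseNet's role in QuakeFlow): P/S phase picking.
        Stubbed here as a pass-through over (station, time, phase) tuples."""
        return [Pick(*t) for t in traces]

    def associate(picks, min_picks=4, window_s=10.0):
        """Stage 2 (GaMMA's role in QuakeFlow): group picks into candidate
        events.  Toy rule: picks within `window_s` of the first pick in a
        bucket form one event; the real associator also estimates an
        approximate location and magnitude."""
        events, bucket = [], []
        for p in sorted(picks, key=lambda p: p.time):
            if bucket and p.time - bucket[0].time > window_s:
                if len(bucket) >= min_picks:
                    events.append(Event(bucket[0].time, bucket))
                bucket = []
            bucket.append(p)
        if len(bucket) >= min_picks:
            events.append(Event(bucket[0].time, bucket))
        return events

    # Two bursts of picks separated by ~100 s -> two candidate events
    data = [("A", 0.0, "P"), ("B", 1.0, "P"), ("C", 2.0, "S"), ("D", 3.0, "S"),
            ("A", 100.0, "P"), ("B", 101.0, "P"), ("C", 102.0, "S"), ("D", 103.0, "S")]
    events = associate(pick_phases(data))
    ```

    Because each stage only exchanges plain pick/event records, each can live in its own container, which is what makes swapping in a new picker or adding a relocation stage straightforward.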

    The Cloud-Aerosol Transport System (CATS): a New Lidar for Aerosol and Cloud Profiling from the International Space Station

    Spaceborne lidar profiling of aerosol and cloud layers has been successfully implemented during a number of prior missions, including LITE, ICESat, and CALIPSO. Each successive mission has added increased capability and further expanded the role of these unique measurements in a wide variety of applications ranging from climate, to air quality, to special event monitoring (i.e., volcanic plumes). Many researchers have come to rely on the availability of profile data from CALIPSO, especially data coincident with measurements from other A-Train sensors. The CALIOP lidar on CALIPSO continues to operate well as it enters its fifth year of operations. However, active instruments have more limited lifetimes than their passive counterparts, and we are faced with a potential gap in lidar profiling from space if the CALIOP lidar fails before a new mission is operational. The ATLID lidar on EarthCARE is not expected to launch until 2015 or later, and the lidar component of NASA's proposed Aerosols, Clouds, and Ecosystems (ACE) mission would not be until after 2020. Here we present a new aerosol and cloud lidar that was recently selected to provide profiling data from the International Space Station (ISS) starting in 2013. The Cloud-Aerosol Transport System (CATS) is a three-wavelength (1064, 532, 355 nm) elastic backscatter lidar with HSRL capability at 532 nm. Depolarization measurements will be made at all wavelengths. The primary objective of CATS is to continue the CALIPSO aerosol and cloud profile data record, ideally with overlap between both missions and EarthCARE. In addition, the near-real-time data capability of the ISS will enable CATS to support operational applications such as air quality and special event monitoring. The HSRL channel will provide a demonstration of technology and a data testbed for direct extinction retrievals in support of ACE mission development.
An overview of the instrument and mission will be provided, along with a summary of the science objectives and simulated data.
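    The "direct extinction retrieval" mentioned above can be illustrated with a minimal synthetic example. This is not the CATS algorithm as flown; it only sketches the standard HSRL idea that, because the molecular backscatter profile is known from atmospheric density, the attenuated molecular-channel return gives extinction directly, without assuming a lidar ratio. The profile shapes and the 0.2 km^-1 aerosol layer below are invented for the illustration.

    ```python
    import numpy as np

    r = np.linspace(1.0, 10.0, 500)           # range from instrument, km
    beta_mol = 1.5e-3 * np.exp(-r / 8.0)      # known molecular backscatter, 1/(km sr)
    alpha_true = np.where((r > 3) & (r < 5), 0.2, 0.0)  # synthetic aerosol layer, 1/km

    # Forward model of the attenuated molecular return:
    #   P_mol(r) ~ beta_mol(r) / r^2 * exp(-2 * optical_depth(r))
    tau = np.cumsum(alpha_true) * (r[1] - r[0])
    P_mol = beta_mol / r**2 * np.exp(-2.0 * tau)

    # Direct retrieval: alpha(r) = -1/2 d/dr ln( P_mol(r) r^2 / beta_mol(r) )
    # No backscatter-to-extinction (lidar ratio) assumption is needed,
    # because beta_mol is known and cancels the molecular contribution.
    alpha_ret = -0.5 * np.gradient(np.log(P_mol * r**2 / beta_mol), r)
    ```

    Inside the synthetic layer the retrieved extinction recovers 0.2 km^-1, and outside it recovers zero; an elastic-only channel would instead need an assumed lidar ratio to separate extinction from backscatter.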

    The Cloud-Aerosol Transport System (CATS): A New Lidar for Aerosol and Cloud Profiling from the International Space Station

    Spaceborne lidar profiling of aerosol and cloud layers has been successfully implemented during a number of prior missions, including LITE, ICESat, and CALIPSO. Each successive mission has added increased capability and further expanded the role of these unique measurements in a wide variety of applications ranging from climate, to air quality, to special event monitoring (i.e., volcanic plumes). Many researchers have come to rely on the availability of profile data from CALIPSO, especially data coincident with measurements from other A-Train sensors. The CALIOP lidar on CALIPSO continues to operate well as it enters its fifth year of operations. However, active instruments have more limited lifetimes than their passive counterparts, and we are faced with a potential gap in lidar profiling from space if the CALIOP lidar fails before a new mission is operational. The ATLID lidar on EarthCARE is not expected to launch until 2015 or later, and the lidar component of NASA's proposed Aerosols, Clouds, and Ecosystems (ACE) mission would not be until after 2020. Here we present a new aerosol and cloud lidar that was recently selected to provide profiling data from the International Space Station (ISS) starting in 2013. The Cloud-Aerosol Transport System (CATS) is a three-wavelength (1064, 532, 355 nm) elastic backscatter lidar with HSRL capability at 532 nm. Depolarization measurements will be made at all wavelengths. The primary objective of CATS is to continue the CALIPSO aerosol and cloud profile data record, ideally with overlap between both missions and EarthCARE. In addition, the near-real-time (NRT) data capability of the ISS will enable CATS to support operational applications such as aerosol and air quality forecasting and special event monitoring. The HSRL channel will provide a demonstration of technology and a data testbed for direct extinction retrievals in support of ACE mission development.
An overview of the instrument and mission will be provided, along with a summary of the science objectives and simulated data. Input from the ICAP community is desired to help plan our NRT mission goals and interactions with ICAP forecasters.

    Induced seismicity response of hydraulic fracturing: results of a multidisciplinary monitoring at the Wysin site, Poland

    Shale oil and gas exploitation by hydraulic fracturing has expanded strongly worldwide in recent years, accompanied by a substantial increase in related induced seismicity, a consequence of either fracturing itself or wastewater injection. In Europe, unconventional hydrocarbon resources remain underdeveloped and their exploitation controversial. In the UK, fracturing operations were stopped after the Mw 2.3 Blackpool induced earthquake; in Poland, operations were halted in 2017 due to adverse oil market conditions. One of the last wells operated, at Wysin, Poland, was monitored independently in the framework of the EU project SHEER, through a multidisciplinary system including seismic, water, and air quality monitoring. The hybrid seismic network combines surface mini-arrays, broadband sensors, and shallow borehole sensors. This paper summarizes the outcomes of the seismological analysis of these data. Shallow artificial seismic noise sources, active during the fracturing stages, were detected and located at the wellhead. Local microseismicity was also detected, located, and characterised, culminating in two events of Mw 1.0 and 0.5 that occurred days after the stimulation in the vicinity of the operational well, but at very shallow depths. A sharp methane peak was detected ~19 hours after the Mw 0.5 event. No correlation was observed between injected volumes, seismicity, and groundwater parameters.