
    QuakeFlow: A Scalable Machine-learning-based Earthquake Monitoring Workflow with Cloud Computing

    Earthquake monitoring workflows are designed to detect earthquake signals and to determine source characteristics from continuous waveform data. Recent developments in deep-learning seismology have improved tasks within earthquake monitoring workflows, allowing the fast and accurate detection of up to orders of magnitude more small events than are present in conventional catalogs. To facilitate the application of machine-learning algorithms to large-volume seismic records, we developed a cloud-based earthquake monitoring workflow, QuakeFlow, that applies multiple processing steps to generate earthquake catalogs from raw seismic data. QuakeFlow uses a deep learning model, PhaseNet, for picking P/S phases and a machine learning model, GaMMA, for phase association with approximate earthquake location and magnitude. Each component in QuakeFlow is containerized, allowing straightforward updates to the pipeline with new deep learning/machine learning models, as well as the ability to add new components, such as earthquake relocation algorithms. We built QuakeFlow on Kubernetes to make it auto-scale for large datasets and easy to deploy on cloud platforms, enabling large-scale parallel processing. We used QuakeFlow to process three years of continuous archived data from Puerto Rico and found more than a factor of ten more events, occurring on much the same structures as previously known seismicity. We also applied QuakeFlow to monitoring frequent earthquakes in Hawaii and found over an order of magnitude more events than are in the standard catalog, including many events that illuminate the deep structure of the magmatic system. In addition, we added Kafka and Spark streaming to deliver real-time earthquake monitoring results. QuakeFlow is an effective and efficient approach both for improving real-time earthquake monitoring and for mining archived seismic data sets.
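
    As one concrete illustration of the real-time hand-off described above, phase picks can be published to a Kafka topic and consumed by the association stage. The sketch below uses the kafka-python client with an assumed topic name and message schema; the abstract does not specify QuakeFlow's actual message format or client library.

    # Minimal sketch: publish phase picks to a Kafka topic and consume them for
    # association. Topic name ("picks") and message fields are assumptions.
    import json
    from kafka import KafkaProducer, KafkaConsumer

    producer = KafkaProducer(
        bootstrap_servers="localhost:9092",
        value_serializer=lambda v: json.dumps(v).encode("utf-8"),
    )

    def publish_pick(station: str, phase: str, time_iso: str, prob: float) -> None:
        # One message per P/S pick; a downstream associator groups picks into events.
        producer.send("picks", {"station": station, "phase": phase,
                                "time": time_iso, "prob": prob})

    consumer = KafkaConsumer(
        "picks",
        bootstrap_servers="localhost:9092",
        value_deserializer=lambda m: json.loads(m.decode("utf-8")),
    )
    for message in consumer:           # blocks, yielding picks as they arrive
        pick = message.value           # feed into the association step (GaMMA in the text)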

    On Efficiently Partitioning a Topic in Apache Kafka

    Apache Kafka addresses the general problem of delivering extremely high-volume event data to diverse consumers via a publish-subscribe messaging system. It uses partitions to scale a topic across many brokers so that producers can write data in parallel, and also to facilitate parallel reading by consumers. Even though Apache Kafka provides some out-of-the-box optimizations, it does not strictly define how each topic should be efficiently distributed into partitions. The well-formulated fine-tuning needed to improve the performance of an Apache Kafka cluster is still an open research problem. In this paper, we first model the Apache Kafka topic partitioning process for a given topic. Then, given the set of brokers, constraints and application requirements on throughput, OS load, replication latency and unavailability, we formulate the optimization problem of finding how many partitions are needed and show that it is computationally intractable, being an integer program. Furthermore, we propose two simple yet efficient heuristics to solve the problem: the first tries to minimize and the second to maximize the number of brokers used in the cluster. Finally, we evaluate their performance via large-scale simulations, considering as benchmarks some Apache Kafka cluster configuration recommendations provided by Microsoft and Confluent. We demonstrate that, unlike the recommendations, the proposed heuristics respect the hard constraints on replication latency and perform better with respect to unavailability time and OS load, using the system resources in a more prudent way. Comment: This work has been submitted to the IEEE for possible publication. Copyright may be transferred without notice, after which this version may no longer be accessible. This work was funded by the European Union's Horizon 2020 research and innovation programme MARVEL under grant agreement No 95733
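
    To make the flavor of such heuristics concrete, the sketch below greedily grows the partition count until a target throughput is met while a caller-supplied check guards the latency/load constraints. The cost model and numbers are illustrative assumptions; the paper's actual formulation is an integer program with its own constraint definitions.

    # Illustrative greedy partition-count search under assumed throughput and
    # latency constraints (not the paper's formulation).
    def choose_partitions(target_mbps: float,
                          per_partition_mbps: float,
                          brokers: int,
                          max_partitions_per_broker: int,
                          latency_ok) -> int:
        """Return the smallest partition count meeting throughput, or raise."""
        p = 1
        while p <= brokers * max_partitions_per_broker:
            meets_throughput = p * per_partition_mbps >= target_mbps
            if meets_throughput and latency_ok(p):
                return p
            p += 1
        raise ValueError("no feasible partition count under the given constraints")

    # Example: 400 MB/s target, ~50 MB/s per partition, 6 brokers, and a stand-in
    # latency check that caps partitions per broker at 20.
    n = choose_partitions(400, 50, 6, 100, lambda p: p / 6 <= 20)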

    Moment tensor and focal mechanism data of earthquakes recorded by Servicio Geológico Colombiano from 2014 to 2021

    The Servicio Geológico Colombiano presents the seismic moment tensors and focal mechanisms computed for earthquakes located in the national territory and border regions from 2014 to 2021. These solutions were obtained using different methods based on waveform inversion (SWIFT, SCMTV, W phase and ISOLA) and first-arrival polarities (FPFIT). This information has been organized in a database and made available to the public through a web page that supports searches by date, circular area or quadrant. Centroid moment tensor solutions are fundamental for understanding the fault geometry, the seismic source that produces an earthquake, its magnitude, and the energy it releases. This information also makes it possible to interpret plate tectonics, analyze crustal stresses and their dynamics, build dynamic and kinematic source models, and study active faults and the tsunamigenic potential of earthquakes, among other aspects.
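
    The circular-area and date-range search mentioned above can also be reproduced client-side once catalog rows are downloaded. A minimal sketch follows; the field names (origin_time, lat, lon, mw) and the sample row are illustrative assumptions, since the summary does not describe the web service's actual query interface.

    # Filter catalog rows by date window and great-circle distance from a center point.
    from math import radians, sin, cos, asin, sqrt
    from datetime import datetime

    def haversine_km(lat1, lon1, lat2, lon2):
        # Great-circle distance on a spherical Earth (R ~ 6371 km).
        lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
        a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
        return 2 * 6371.0 * asin(sqrt(a))

    def in_circle_and_window(event, center_lat, center_lon, radius_km, t0, t1):
        t = datetime.fromisoformat(event["origin_time"])
        return (t0 <= t <= t1 and
                haversine_km(event["lat"], event["lon"], center_lat, center_lon) <= radius_km)

    catalog = [{"origin_time": "2017-11-01T02:30:00", "lat": 4.47, "lon": -74.20, "mw": 4.1}]
    hits = [e for e in catalog
            if in_circle_and_window(e, 4.6, -74.1, 100.0,
                                    datetime(2014, 1, 1), datetime(2021, 12, 31))]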

    DARE: A Reflective Platform Designed to Enable Agile Data-Driven Research on the Cloud

    The DARE platform has been designed to help research developers deliver user-facing applications and solutions over diverse underlying e-infrastructures, data and computational contexts. The platform is Cloud-ready and relies on the exposure of APIs, which are suitable for raising the abstraction level and hiding complexity. At its core, the platform implements the cataloguing and execution of fine-grained, Python-based dispel4py workflows as services. Reflection is achieved via a logical knowledge base comprising multiple internal catalogues, registries and semantics, while the platform supports persistent and pervasive data provenance. This paper presents design and implementation aspects of the DARE platform and provides directions for future development. Published: San Diego (CA, USA).
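
    For context, a fine-grained dispel4py workflow of the kind the platform catalogues and executes can be as small as a producer connected to one processing element. The sketch below follows the dispel4py base-class style; the class names, the default "output"/"input" port names and the command-line runner are taken from the library's tutorials and may differ between versions.

    # A two-PE dispel4py workflow: a producer emitting numbers into a squaring PE.
    from dispel4py.workflow_graph import WorkflowGraph
    from dispel4py.base import ProducerPE, IterativePE

    class NumberProducer(ProducerPE):
        def _process(self, inputs):
            for i in range(5):
                self.write("output", i)      # emit 0..4 on the default output port

    class Square(IterativePE):
        def _process(self, data):
            return data * data               # returned values go to the output port

    graph = WorkflowGraph()
    graph.connect(NumberProducer(), "output", Square(), "input")
    # Run with the sequential mapping, e.g.: dispel4py simple <module-with-this-graph>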

    A Geophysical and Field Survey in Central New Hampshire to Search for the Source Region of the Magnitude 6.5 Earthquake of 1638

    Thesis advisor: John E. Ebel. In 1638, an earthquake with an estimated MLg of 6.5 ± 0.5 struck New England and adjacent southeastern Canada, producing severe shaking in Boston, Massachusetts and Trois-Rivières, Quebec. Previously published analyses of felt reports place the possible epicenter somewhere within a broad region including NY, NH, VT and ME. The possible source region had been further refined by applying Omori's law of aftershock decay combined with a rupture extent estimated from modern seismicity, which together suggest that a seismic event of MLg 6.5 ± 0.5 could have occurred in central New Hampshire in 1638. In order to more clearly define the possible active fault for this earthquake and determine its seismotectonic framework within central New Hampshire, three geophysical methods were used to analyze recent, digitally recorded seismic data: a relative location analysis, computation of focal mechanisms, and computation of focal depths based on fundamental-mode Rayleigh waves. The combined results of the analyses are consistent with a thrust fault trending NNW-SSE and possibly dipping eastward in this postulated 1638 epicentral zone. Modern earthquakes in the postulated source area of the 1638 earthquake occur at focal depths of ~3 to 10 km, with many of the events occurring below 5 km, suggesting that this is the depth range of the 1638 rupture. Depending on the depth of the pre-Silurian basement of the Central Maine Terrane, the source of the MLg 6.5 ± 0.5 earthquake of 1638 may be a basement-involved thrust fault or a reactivated east-dipping thrust fault located between the nappes of the overlying Silurian-Devonian metasedimentary rocks. When the postulated fault plane is projected to the surface, portions of the Pemigewasset and Merrimack Rivers are found to flow within its surface expression, which suggests that the courses of these rivers may be fault controlled. A fourth research technique, a field survey, was undertaken to search for earthquake-induced liquefaction features along the Pemigewasset, Merrimack and Winnipesaukee Rivers as well as at the Suncook River avulsion site. Several small strata-bound soft-sediment deformation structures were found during the survey. Although some of the features may be seismically induced, they may also have formed as the result of depositional processes and therefore cannot be attributed to the 1638 earthquake. Thesis (MS) — Boston College, 2013. Submitted to: Boston College, Graduate School of Arts and Sciences. Discipline: Earth and Environmental Sciences.
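
    As a small illustration of the surface-projection step mentioned above: for an east-dipping plane, seismicity between depths z1 and z2 projects onto a band offset horizontally from the fault's surface trace by z/tan(dip). The 45° dip below is an assumed value for illustration only; the thesis constrains the fault to trend NNW-SSE and possibly dip eastward.

    # Horizontal offsets of the surface projection of a dipping fault's seismogenic depth range.
    from math import tan, radians

    def surface_projection_offsets_km(z_top_km: float, z_bottom_km: float, dip_deg: float):
        """Distances from the fault's surface trace to the vertical projections of
        the top and bottom of the seismogenic depth range."""
        t = tan(radians(dip_deg))
        return z_top_km / t, z_bottom_km / t

    # Depth range 3-10 km (from the relocated seismicity) and an assumed 45-degree dip:
    near_km, far_km = surface_projection_offsets_km(3.0, 10.0, 45.0)  # ~3 km to ~10 km from the trace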

    M_L:M_0 as a regional seismic discriminant

    The m_b:M_S ratio determined by teleseismic observations has proven to be an effective discriminant, because explosive sources tend to be significantly richer in short-period energy than are earthquakes. Unfortunately, this method is limited by the detection threshold of teleseismic surface waves. However, recent advances in instrumentation allowing low-amplitude surface wave measurements, coupled with new analytical techniques, make it feasible to use regional waveform data to determine the long-period source excitation level of low-magnitude events. We propose using the ratio of M_L (local magnitude) to M_0 (scalar seismic moment) as an analogous regional discriminant. We applied this criterion to a data set of 299 earthquakes and 178 explosions and found that this ratio seems to be diagnostic of source type. For a given M_0, the M_L of an explosion is more than 0.5 magnitude units larger than that of an earthquake. This separation of populations with respect to source type can be attributed to the fact that M_L is a short-period (1 Hz) energy measurement, whereas seismic moment is determined from long-period body wave phases (period > 4 s) and surface waves (10 to 40 s). Using regional stations with sources 200 to 600 km away, the effective threshold for magnitude measurements for this discriminant is found to be M_L = 3.1 for earthquakes and M_L = 3.6 for explosions. This method does require the determination of regional crustal models and path calibrations from master events or by other means.
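
    As a rough illustration of how the M_L:M_0 comparison could be applied as a screen, the sketch below fits the earthquake population's M_L-versus-log10(M_0) trend and flags events whose M_L sits well above it, following the reported separation of more than 0.5 magnitude units. The fitting procedure, toy numbers and threshold placement are assumptions, not the paper's actual calibration.

    # Fit an earthquake ML-vs-log10(M0) trend and flag explosion-like residuals.
    import numpy as np

    def fit_earthquake_trend(log_m0, ml):
        """Least-squares line ML = a*log10(M0) + b from an earthquake training set."""
        a, b = np.polyfit(log_m0, ml, 1)
        return a, b

    def ml_m0_residual(log_m0, ml, a, b):
        return ml - (a * log_m0 + b)   # residuals near/above ~0.5 suggest an explosion-like source

    # Toy numbers only (not from the paper):
    eq_log_m0 = np.array([14.0, 15.0, 16.0, 17.0])
    eq_ml = np.array([2.5, 3.2, 3.9, 4.6])
    a, b = fit_earthquake_trend(eq_log_m0, eq_ml)
    flagged = ml_m0_residual(15.5, 4.3, a, b) > 0.5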

    Seismic hazard studies in Egypt

    The study of earthquake activity and seismic hazard assessment in Egypt is very important due to the rapid growth of large investments in national projects, especially the nuclear power plant to be built in the northern part of Egypt. Although Egypt is characterized by low seismicity, it has experienced damaging earthquakes throughout its history. The seismotectonic setting of Egypt suggests that large earthquakes are possible, particularly along the Gulf of Aqaba–Dead Sea transform, the subduction zone along the Hellenic and Cyprean Arcs, and the northern Red Sea triple-junction point. In addition, some significant inland sources at Aswan, Dahshour, and the Cairo–Suez district should be considered. The seismic hazard for Egypt is calculated using a probabilistic approach (on a 0.5° × 0.5° grid) within a logic-tree framework. Alternative seismogenic models and ground-motion scaling relationships are selected to account for epistemic uncertainty. Seismic hazard values on rock were calculated to create contour maps for four ground-motion spectral periods and for different return periods. In addition, uniform hazard spectra for rock sites at 25 different periods and probabilistic hazard curves for the cities of Cairo and Alexandria are presented. The highest peak ground acceleration (PGA) values were found close to the Gulf of Aqaba, at about 220 gal for a 475-year return period, while the lowest PGA values, less than 25 gal, were found in the western part of the Western Desert.
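
    A hazard curve of the kind used to build such maps can be sketched as the sum of scenario rates weighted by ground-motion exceedance probabilities. The snippet below is a minimal illustration with a lognormal ground-motion model and placeholder scenario rates; it is not the study's logic tree, seismogenic models or ground-motion relationships.

    # Minimal probabilistic hazard curve: annual rate of exceeding a PGA level.
    import numpy as np
    from scipy.stats import norm

    def exceedance_rate(pga_g, scenarios):
        """Annual rate of exceeding pga_g; scenarios = (annual_rate, ln_median_g, sigma_ln)."""
        total = 0.0
        for rate, ln_median, sigma in scenarios:
            total += rate * (1.0 - norm.cdf(np.log(pga_g), loc=ln_median, scale=sigma))
        return total

    scenarios = [(0.05, np.log(0.08), 0.6),    # frequent source with moderate shaking (placeholder)
                 (0.002, np.log(0.25), 0.7)]   # rare source with strong shaking (placeholder)
    levels = np.array([0.05, 0.1, 0.2, 0.4])   # PGA levels in g
    curve = [exceedance_rate(x, scenarios) for x in levels]
    # A 475-year return period corresponds to an annual exceedance rate of about 1/475.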

    Intraplate earthquakes and the state of stress in oceanic lithosphere

    Thesis (Ph.D.) — Massachusetts Institute of Technology, Dept. of Earth, Atmospheric and Planetary Sciences, 1984. By Eric Allen Bergman, Ph.D. Microfiche copy available in Archives and Science. Bibliography: leaves 378-403.

    Improving the determination of moment tensors, moment magnitudes and focal depths of earthquakes below Mw 4.0 using regional broadband seismic data

    Thesis advisors: Michael J. Naughton, John E. Ebel. Determining accurate source parameters of small-magnitude earthquakes is important for understanding the source physics and tectonic processes that activate a seismic source, as well as for making more accurate estimates of the probabilities of recurrence of large earthquakes based on the statistics of smaller earthquakes. Accurate determination of the focal depths and focal mechanisms of small earthquakes is required to constrain the potential seismic source zones of future large earthquakes, whereas accurate determination of the seismic moment is required to calculate the sizes (best represented by moment magnitudes) of earthquakes. Precise determination of focal depths, moment magnitudes and focal mechanisms of small earthquakes can greatly advance our knowledge of the potentially active faults in an area and thus help produce accurate seismic hazard and risk maps for that area. Focal depths, moment magnitudes and focal mechanisms of earthquakes of magnitude Mw 4.0 and less recorded by a sparse seismic network are usually poorly constrained due to the lack of an appropriate method for finding these parameters with a sparse set of observations. This dissertation presents a new method that can accurately determine focal depths, moment magnitudes and focal mechanisms of earthquakes with magnitudes between Mw 4.0 and Mw 2.5 using broadband seismic waveforms recorded by local and regional seismic stations. For the determination of focal depths and moment magnitudes, the observed and synthetic seismograms are filtered through a 1-3 Hz bandpass filter, whereas for the determination of focal mechanisms they are filtered through a 1.5-2.5 Hz bandpass filter. Both frequency passbands have a good signal-to-noise ratio (SNR) for small earthquakes of the magnitudes analyzed in this dissertation. The waveforms are processed to their envelopes in order to make them relatively simple to model. A grid search is performed over all possible dip, rake and strike angles, as well as over possible depths and scalar moments, to find the optimal focal depth and scalar moment. To find the optimal focal mechanism, a non-linear moment-tensor inversion is performed in addition to the coarse grid search over the possible dip, rake and strike angles at fixed values of the focal depth and scalar moment. The method is tested on 18 aftershocks of Mw between 3.70 and 2.60 of the 2011 Mineral, Virginia Mw 5.7 earthquake, and on 5 aftershocks of Mw between 3.62 and 2.63 of the 2013 Ladysmith, Quebec Mw 4.5 earthquake. Reliable focal depths and moment magnitudes are obtained for all of these events using waveforms from as few as one seismic station at epicentral distances of 68-424 km with SNR greater than or equal to 5. Similarly, reliable focal mechanisms are obtained for all of the events with Mw 3.70-3.04 using waveforms from at least three seismic stations at epicentral distances of 60-350 km, each with SNR greater than or equal to 10. Tests show that the moment magnitudes and focal depths are not very sensitive to the crustal model used, although systematic variations in the focal depths are observed with the total crustal thickness. Tests also show that the focal mechanisms obtained with different crustal structures vary by a Kagan angle of 30° on average for the events and crustal structures tested. This means that the moment magnitude and focal mechanism determinations are only somewhat sensitive to the uncertainties in the crustal models tested. The method, developed by analyzing data from eastern North America, is also applied to some aftershocks of the Mw 7.8 2015 Gorkha, Nepal earthquake and appears to give good results in a very different tectonic environment in a different part of the world. This study confirms that the method of modeling envelopes of seismic waveforms developed in this dissertation can be used to extract accurate focal depths and moment magnitudes of earthquakes with Mw 3.70-2.60 using broadband seismic data recorded by local and regional seismic stations at epicentral distances of 68-424 km, and accurate focal mechanisms of earthquakes with Mw 3.70-3.04 at epicentral distances of 60-350 km. Thesis (PhD) — Boston College, 2019. Submitted to: Boston College, Graduate School of Arts and Sciences. Discipline: Physics.
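
    A compact sketch of the envelope-modeling and grid-search steps described above: records are bandpass filtered to 1-3 Hz, converted to envelopes via the Hilbert transform, and trial depths and scalar moments are scored by envelope misfit. The synthetic-envelope generator is left as a placeholder, since producing synthetics for each trial source requires a crustal model and a wave-propagation code outside this sketch.

    # Envelope computation and a simple depth/moment grid search by L2 misfit.
    import numpy as np
    from scipy.signal import butter, sosfiltfilt, hilbert

    def envelope_1_3hz(trace: np.ndarray, fs: float) -> np.ndarray:
        """Bandpass a record to 1-3 Hz and return its Hilbert-transform envelope."""
        sos = butter(4, [1.0, 3.0], btype="bandpass", fs=fs, output="sos")
        return np.abs(hilbert(sosfiltfilt(sos, trace)))

    def grid_search(observed_env, synthetic_env_for, depths_km, log_moments):
        """Return the (depth, log10 M0) pair minimizing the envelope misfit.
        synthetic_env_for(depth, log_m0) is a placeholder for a synthetic generator."""
        best, best_misfit = None, np.inf
        for z in depths_km:
            for lm in log_moments:
                misfit = np.sum((observed_env - synthetic_env_for(z, lm)) ** 2)
                if misfit < best_misfit:
                    best, best_misfit = (z, lm), misfit
        return best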