Leveraging time series analysis of radar coherence and normalized difference vegetation index ratios to characterize pre-failure activity of the Mud Creek landslide, California
Assessing landslide activity at large scales has historically been a challenging problem. Here, we present a different approach based on radar coherence and normalized difference vegetation index (NDVI) analyses – metrics that are typically used to map landslides post-failure – and leverage a time series analysis to characterize the pre-failure activity of the Mud Creek landslide in California. Our method computes the ratio of mean interferometric coherence or NDVI on the unstable slope relative to that of the surrounding hillslope. This approach has the advantage that it eliminates the negative impacts of long temporal baselines that can interfere with the analysis of interferometric synthetic aperture radar (InSAR) data, as well as interference from atmospheric and environmental factors. We show that the coherence ratio of the Mud Creek landslide dropped by 50 % when the slide began to accelerate 5 months prior to its catastrophic failure in 2017. Concurrently, the NDVI ratio began a near-linear decline. A similar behavior is visible during an earlier acceleration of the landslide in 2016. This suggests that radar coherence and NDVI ratios may be useful for assessing landslide activity. Our study demonstrates that data from the ascending track provide the more reliable coherence ratios, despite being poorly suited to measure the slope's precursory deformation. Combined, these insights suggest that this type of analysis may complement traditional InSAR analysis in useful ways and provide an opportunity to assess landslide activity at regional scales.
Associated data: https://doi.org/10.5281/zenodo.3727304
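For readers who want to experiment with the ratio idea, a minimal sketch in Python is given below. The raster reader, file names, and mask variables are illustrative assumptions, not the authors' processing code.

```python
# Sketch of the coherence-ratio idea: for each interferogram, compare mean
# coherence on the unstable slope to that of the surrounding hillslope.
# File paths, masks, and the rasterio-based reader are assumptions.
import numpy as np
import rasterio

def mean_in_mask(raster_path, mask):
    """Mean of valid pixels of a single-band raster inside a boolean mask."""
    with rasterio.open(raster_path) as src:
        data = src.read(1).astype(float)
    valid = mask & np.isfinite(data)
    return np.nanmean(data[valid])

def coherence_ratio(raster_path, slide_mask, hillslope_mask):
    """Ratio of mean coherence on the landslide to the surrounding hillslope.

    A sustained drop in this ratio flags decorrelation of the slide relative
    to its stable surroundings."""
    return mean_in_mask(raster_path, slide_mask) / mean_in_mask(raster_path, hillslope_mask)

# Example usage (paths and masks are placeholders):
# ratios = [coherence_ratio(f, slide_mask, hill_mask) for f in sorted(coherence_files)]
```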
Topography and self-gravitation interaction in elastic-gravitational modeling
Changes in gravity due to volcanic loading of the crust are influenced by topography. We investigate the relative importance of topography and self-gravitation in the interpretation of gravity changes. We show that modeling of gravity changes can be more precise when topographic relief is introduced, even if self-gravitation of the medium is neglected. This paper exploits this result by suggesting a mathematical simplification that could be useful in the future development of a numerical technique to accurately include topographic effects in the modeling of deformation and gravity changes. Finally, we perform an inversion of the gravity changes observed at Mayon volcano (Philippines) between December 1992 and December 1996, including topographic effects, by varying the depth of the source. Failure to account for topographic influences can bias estimates of source parameters, particularly when the lateral extent of the relief is of the same order of magnitude as the source depth.
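The elastic-gravitational formulation itself is not reproduced here; as a toy illustration of what varying the source depth means in a gravity-change inversion, the sketch below grid-searches the depth of a buried point-mass source beneath a flat surface, so the topographic effect discussed in the paper is deliberately absent. The point-mass approximation, station geometry, and observation arrays are assumptions for illustration only.

```python
# Toy illustration (not the paper's elastic-gravitational model): fit observed
# gravity changes with a buried point-mass source by grid search over depth,
# solving for the best-fitting mass change at each trial depth.
import numpy as np

G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2

def point_mass_dg(r, depth):
    """Vertical gravity change per unit mass change for a point source at
    `depth`, observed at horizontal distance `r` on a flat surface (microGal/kg)."""
    return 1e8 * G * depth / (r**2 + depth**2) ** 1.5  # 1 m/s^2 = 1e8 microGal

def grid_search_depth(r_obs, dg_obs, depths):
    """Return (best_depth, best_dM, rms) minimizing misfit to dg_obs (microGal)."""
    best = None
    for d in depths:
        kernel = point_mass_dg(r_obs, d)
        dM = np.dot(kernel, dg_obs) / np.dot(kernel, kernel)  # 1-parameter least squares
        rms = np.sqrt(np.mean((dg_obs - dM * kernel) ** 2))
        if best is None or rms < best[2]:
            best = (d, dM, rms)
    return best
```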
Earthquake forecasting and its verification
No proven method is currently available for the reliable short-term prediction of earthquakes (minutes to months). However, it is possible to make probabilistic hazard assessments for earthquake risk. These are primarily based on the association of small earthquakes with future large earthquakes. In this paper we discuss a new approach to earthquake forecasting. This approach is based on a pattern informatics (PI) method which quantifies temporal variations in seismicity. The output is a map of areas in a seismogenic region ("hotspots") where earthquakes are forecast to occur in a future 10-year time span. This approach has been successfully applied to California, to Japan, and on a worldwide basis. These forecasts are binary: an earthquake is forecast either to occur or not to occur. The standard approach to the evaluation of a binary forecast is the relative operating characteristic (ROC) diagram, which is a more restrictive test and less subject to bias than maximum likelihood tests. To test our PI method, we made two types of retrospective forecasts for California. The first is the PI method and the second is a relative intensity (RI) forecast based on the hypothesis that future earthquakes will occur where earthquakes have occurred in the recent past. While both retrospective forecasts are for the ten-year period 1 January 2000 to 31 December 2009, we performed an interim analysis 5 years into the forecast. The PI method outperforms the RI method under most circumstances.

Comment: 10(+1) pages, 5 figures, 2 tables. Submitted to Nonlinear Processes in Geophysics on 5 August 200
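A minimal sketch of ROC evaluation for a gridded binary forecast is given below; the score and occurrence arrays are placeholders, and the implementation is illustrative rather than the authors' testing code.

```python
# Sketch of the ROC evaluation described above: rank grid cells by a forecast
# score (e.g. a PI or RI value), sweep the hotspot threshold, and record hit
# rate vs. false-alarm rate against cells where target earthquakes occurred.
import numpy as np

def roc_curve(scores, occurred):
    """Hit rate and false-alarm rate as the hotspot threshold is lowered.

    scores   : 1-D array of forecast scores, one per grid cell
    occurred : boolean array, True where a target event occurred"""
    order = np.argsort(scores)[::-1]        # most-alarmed cells first
    hits = np.cumsum(occurred[order])       # cumulative true positives
    falses = np.cumsum(~occurred[order])    # cumulative false positives
    hit_rate = hits / max(occurred.sum(), 1)
    false_rate = falses / max((~occurred).sum(), 1)
    return false_rate, hit_rate

def roc_skill(false_rate, hit_rate):
    """Area under the ROC curve; 0.5 indicates no skill, 1.0 a perfect forecast."""
    return np.trapz(hit_rate, false_rate)
```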
Improved Real-Time Natural Hazard Monitoring Using Automated DInSAR Time Series
As part of the collaborative GeoSciFramework project, we are establishing a monitoring system for the Yellowstone volcanic area that integrates multiple geodetic and seismic data sets into an advanced cyber-infrastructure framework that will enable real-time streaming data analytics and machine learning and allow us to better characterize associated long- and short-term hazards. The goal is to continuously ingest both remote sensing (GNSS, DInSAR) and ground-based (seismic, thermal and gas observations, strainmeter, tiltmeter and gravity measurements) data and query and analyze them in near-real time. In this study, we focus on DInSAR data processing and the effects of using various atmospheric corrections and real-time orbits on the automated processing and results. We find that the atmospheric correction provided by the European Centre for Medium-Range Weather Forecasts (ECMWF) is currently the best suited for automated DInSAR processing and that the use of real-time orbits is sufficient for the early-warning application in question. We present an analysis of atmospheric corrections and real-time orbits in a test case over the Kilauea volcanic area in Hawaii. Finally, using these findings, we present results of displacement time series in the Yellowstone area between May 2018 and October 2019, which are in good agreement with GNSS data where available. These results will contribute to a baseline model that will be the basis of a future early-warning system that will be continuously updated with new DInSAR data acquisitions.
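As a rough sketch of the weather-model correction step, the snippet below subtracts a modeled tropospheric delay difference, mapped into the radar line of sight, from an unwrapped interferogram. The delay grids, geometry, and sign convention are assumptions; operational systems obtain such corrections from ECMWF-based products.

```python
# Minimal sketch of a weather-model tropospheric correction: the difference in
# modeled zenith delay between the two acquisition dates is mapped into the
# radar line of sight and removed from the unwrapped phase. Inputs are
# placeholders; sign conventions vary between processors.
import numpy as np

def correct_interferogram(unw_phase, ztd_ref, ztd_sec, incidence_deg, wavelength):
    """Remove a modeled tropospheric signal from an unwrapped interferogram.

    unw_phase        : unwrapped phase (radians), reference minus secondary date
    ztd_ref, ztd_sec : zenith total delay grids for the two dates (meters)
    incidence_deg    : local incidence angle grid (degrees)
    wavelength       : radar wavelength (meters), ~0.0555 for Sentinel-1 C-band"""
    slant_delay = (ztd_ref - ztd_sec) / np.cos(np.radians(incidence_deg))
    tropo_phase = 4.0 * np.pi / wavelength * slant_delay  # two-way path delay
    return unw_phase - tropo_phase
```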
Measuring the state and temporal evolution of glaciers in Alaska and Yukon using synthetic-aperture-radar-derived (SAR-derived) 3D time series of glacier surface flow
Climate change has reduced global ice mass over the last 2 decades as enhanced warming has accelerated surface melt and runoff rates. Glaciers have undergone dynamic processes in response to a warming climate that impacts the surface geometry and mass distribution of glacial ice. Until recently no single technique could consistently measure the evolution of surface flow for an entire glaciated region in three dimensions with high temporal and spatial resolution. We have improved upon earlier methods by developing a technique for mapping, in unprecedented detail, the temporal evolution of glaciers. Our software computes north, east, and vertical flow velocity and/or displacement time series from the synthetic aperture radar (SAR) ascending and descending range and azimuth speckle offsets. The software can handle large volumes of satellite data and is designed to work on high-performance computers (HPCs) as well as workstations by utilizing multiple parallelization methods. We then compute flow velocity and displacement time series for glaciers in southeastern Alaska during 2016–2021 and observe seasonal and interannual variations in flow velocities at Seward and Malaspina glaciers as well as culminating phases of surging at Klutlan, Walsh, and Kluane glaciers. On a broader scale, this technique can be used for reconstructing the response of worldwide glaciers to the warming climate using archived SAR data and for near-real-time monitoring of these glaciers using rapid revisit SAR data from satellites, such as Sentinel-1 (6 or 12 d revisit period) and the forthcoming NISAR mission (12 d revisit period).
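The geometric core of such a 3D retrieval can be sketched as a small least-squares problem: ascending and descending range and azimuth offsets are projected onto east, north, and up components. The angles and sign conventions below are illustrative assumptions, not the software's actual implementation.

```python
# Sketch of the geometric inversion behind a 3-D time series: four speckle
# offsets per pixel and epoch (ascending/descending, range/azimuth) are
# combined by least squares to recover east, north, and up motion.
import numpy as np

def design_row_range(inc_deg, heading_deg):
    """Projection of (E, N, U) motion onto the range (line-of-sight) direction.
    Assumes a right-looking SAR with positive values toward the satellite."""
    inc, head = np.radians(inc_deg), np.radians(heading_deg)
    return np.array([-np.sin(inc) * np.cos(head),
                      np.sin(inc) * np.sin(head),
                      np.cos(inc)])

def design_row_azimuth(heading_deg):
    """Projection of (E, N, U) motion onto the along-track direction."""
    head = np.radians(heading_deg)
    return np.array([np.sin(head), np.cos(head), 0.0])

def solve_enu(offsets, rows):
    """Least-squares (E, N, U) displacement from stacked offset observations."""
    A = np.vstack(rows)
    d = np.asarray(offsets)
    return np.linalg.lstsq(A, d, rcond=None)[0]

# Example with illustrative Sentinel-1-like geometries (placeholder angles):
# rows = [design_row_range(39, -12),  design_row_azimuth(-12),    # ascending
#         design_row_range(39, -168), design_row_azimuth(-168)]   # descending
# enu = solve_enu([dr_asc, da_asc, dr_desc, da_desc], rows)
```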
SAR-derived flow velocity and its link to glacier surface elevation change and mass balance
Modern remote sensing techniques, such as Synthetic Aperture Radar (SAR), can measure the direction and intensity of glacier flow. Yet the question remains as to what these measurements reveal about glaciers' adjustment to the warming climate. Here, we present a technique that addresses this question by linking the SAR-derived velocity measurements with the glacier elevation change and the specific mass balance (i.e. mass balance per unit area). The technique computes, from the speckle offset tracking results, the north, east and vertical flow displacement time series, with the vertical component further split into a Surface Parallel Flow (SPF) advection component due to the motion along a glacier surface slope and a non-Surface Parallel Flow (nSPF) component. The latter links the glacier surface elevation change with the specific mass balance and strain rates. We apply this technique to ascending and descending Sentinel-1 data to derive the four-dimensional flow displacement time series for glaciers in southeast Alaska during 2016–2019. Time series extracted for a few characteristic regions demonstrate remarkable temporal variability in flow velocities. The seasonal signal observed in the nSPF component is modeled using the Positive Degree Day model. This method can be used for computing either mass balance or glacier surface elevation change if one of these two parameters is known from external observations.
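A minimal sketch of the SPF/nSPF split and of a simple positive-degree-day melt estimate is shown below; grids, spacings, and the degree-day factor are placeholder assumptions rather than the paper's calibrated values.

```python
# Sketch of the decomposition described above: the vertical velocity expected
# from surface-parallel flow (SPF) is the horizontal velocity advected along
# the surface slope; the residual (nSPF) relates to elevation change and mass
# balance. A simple positive-degree-day (PDD) melt estimate is included.
import numpy as np

def spf_vertical(v_east, v_north, surface_dem, dx, dy):
    """Vertical velocity implied by surface-parallel flow: v_h . grad(z)."""
    dzdy, dzdx = np.gradient(surface_dem, dy, dx)  # gradients along y (rows), x (cols)
    return v_east * dzdx + v_north * dzdy

def nspf(v_up, v_east, v_north, surface_dem, dx, dy):
    """Non-surface-parallel vertical component (observed minus SPF)."""
    return v_up - spf_vertical(v_east, v_north, surface_dem, dx, dy)

def pdd_melt(daily_temp_c, degree_day_factor=0.004):
    """Positive-degree-day melt (m w.e.) from daily mean temperatures (deg C).
    The degree-day factor is an assumed, site-dependent parameter."""
    return degree_day_factor * np.sum(np.clip(daily_temp_c, 0.0, None))
```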
Systematic procedural and sensitivity analysis of the pattern informatics method for forecasting large (M > 5) earthquake events in southern California
Recent studies in the literature have introduced a new approach to earthquake forecasting based on representing the space-time patterns of localized seismicity by a time-dependent system state vector in a real-valued Hilbert space and deducing information about future space-time fluctuations from the phase angle of the state vector. While the success rate of this Pattern Informatics (PI) method has been encouraging, the method is still in its infancy. Procedural analysis, statistical testing, parameter sensitivity investigation and optimization all still need to be performed. In this paper, we attempt to optimize the PI approach by developing quantitative values for "predictive goodness" and analyzing possible variations in the proposed procedure. In addition, we attempt to quantify the systematic dependence on the quality of the input catalog of historic data and develop methods for combining catalogs from regions of different seismic rates.

Comment: 39 pages, 4 tables, 9 figures. Submitted to Pure and Applied Geophysics on 30 November 200
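For orientation, a highly simplified sketch of a PI-style index is given below: cellwise seismicity rates are spatially normalized, rate changes are averaged over a range of base times, and the squared, demeaned result is read as a relative probability of change. Binning and parameters are placeholders, not the tuned procedure analyzed in the paper.

```python
# Highly simplified PI-style index over a gridded catalog. `counts` holds
# binned event counts with shape (n_time_bins, n_cells); t1 < t2 index the
# change interval, and base_times are earlier starting bins.
import numpy as np

def spatial_normalize(rates):
    """Remove the spatial mean and scale by the spatial standard deviation."""
    return (rates - rates.mean()) / (rates.std() + 1e-12)

def pi_index(counts, t1, t2, base_times):
    """Relative probability-of-change map over the grid cells.

    Assumes every base time precedes t1, and t1 precedes t2."""
    deltas = []
    for tb in base_times:
        rate1 = spatial_normalize(counts[tb:t1].mean(axis=0))
        rate2 = spatial_normalize(counts[tb:t2].mean(axis=0))
        deltas.append(rate2 - rate1)
    mean_delta = np.mean(deltas, axis=0)
    dP = mean_delta ** 2
    return dP - dP.mean()   # hotspots: cells where this exceeds a chosen threshold
```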
Changing the Culture of Fieldwork in the Geosciences
Field-based investigations are an integral part of university-based research programs in the geosciences and frequently take scientists to near and far corners of the globe, from populated urban environs to remote wilderness areas and all types of locations in between. As a result, scientists find themselves in situations that can be both empowering—allowing them to succeed in challenging environments through synergistic teamwork—and intimidating, such as when unfamiliar surroundings or conditions push comfort zones or when one’s colleagues in the field pose unexpected or unwelcome hazards.