
    Geometric and Level Set Tomography using Ensemble Kalman Inversion

    Tomography is one of the cornerstones of geophysics, enabling detailed spatial descriptions of otherwise invisible processes. However, due to the fundamental ill-posedness of tomography problems, the choice of parametrizations and regularizations for inversion significantly affects the result. Parametrizations for geophysical tomography typically reflect the mathematical structure of the inverse problem. We propose, instead, to parametrize the tomographic inverse problem using a geologically motivated approach. We build a model from explicit geological units that reflect the a priori knowledge of the problem. To solve the resulting large-scale nonlinear inverse problem, we employ the efficient Ensemble Kalman Inversion scheme, a highly parallelizable, iteratively regularizing optimizer that uses the ensemble Kalman filter to perform a derivative-free approximation of the general iteratively regularized Levenberg–Marquardt method. The combination of a model specification framework that explicitly encodes geological structure and a robust, derivative-free optimizer enables the solution of complex inverse problems involving non-differentiable forward solvers and significant a priori knowledge. We illustrate the model specification framework using synthetic and real data examples of near-surface seismic tomography, using the factored eikonal fast marching method as a forward solver for first-arrival traveltimes. The geometric and level set framework allows us to describe geophysical hypotheses in concrete terms, and then optimize and test these hypotheses, helping us to answer targeted geophysical questions.
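The core of the Ensemble Kalman Inversion scheme described above can be sketched in a few lines. This is a minimal, hedged illustration on a toy linear forward model (standing in for the eikonal solver), not the authors' implementation: the operator, noise level, and iteration count are all assumptions.

```python
import numpy as np

# Minimal stochastic Ensemble Kalman Inversion (EKI) sketch on a toy
# linear forward model. All sizes and the forward operator are
# illustrative assumptions, not from the paper.
rng = np.random.default_rng(0)

n_params, n_obs, n_ens = 4, 6, 50
A = rng.normal(size=(n_obs, n_params))    # toy forward operator

def forward(m):
    # stands in for a (possibly non-differentiable) forward solver
    return A @ m

m_true = rng.normal(size=n_params)
noise_cov = 0.01 * np.eye(n_obs)
y = forward(m_true) + rng.multivariate_normal(np.zeros(n_obs), noise_cov)

# Ensemble of parameter vectors, one per column
M = rng.normal(size=(n_params, n_ens))

for _ in range(20):
    G = np.column_stack([forward(M[:, j]) for j in range(n_ens)])
    m_mean = M.mean(axis=1, keepdims=True)
    g_mean = G.mean(axis=1, keepdims=True)
    # Cross- and auto-covariances estimated from the ensemble
    C_mg = (M - m_mean) @ (G - g_mean).T / (n_ens - 1)
    C_gg = (G - g_mean) @ (G - g_mean).T / (n_ens - 1)
    K = C_mg @ np.linalg.inv(C_gg + noise_cov)
    # Perturbed observations preserve ensemble spread (stochastic variant)
    Y = y[:, None] + rng.multivariate_normal(
        np.zeros(n_obs), noise_cov, size=n_ens).T
    M = M + K @ (Y - G)

m_est = M.mean(axis=1)
```

Note that no derivatives of `forward` are ever taken: the ensemble covariances play the role of the Jacobian, which is what makes the scheme applicable to non-differentiable forward solvers.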

    Rayleigh-Wave H/V via Noise Cross Correlation in Southern California

    We study the crustal structure of southern California by inverting horizontal‐to‐vertical (H/V) amplitudes of Rayleigh waves observed in noise cross‐correlation signals. This study constitutes a useful addition to traditional phase‐velocity‐based tomographic inversions due to the localized sensitivity of H/V measurements to the near surface of the measurement station site. The continuous data of 222 permanent broadband stations of the Southern California Seismic Network (SCSN) were used to produce noise cross‐correlation waveforms, resulting in a spatially dense set of measurements for the southern California region in the 1–15 s period band. The fine interstation spacing of the SCSN allows retrieval of high signal‐to‐noise ratio Rayleigh waves at periods as low as 1 s, significantly improving the vertical resolution of the resulting tomographic image compared to previous studies with minimum periods of 5–10 s. In addition, horizontal resolution is naturally improved by increased station density. Tectonic subregions including the Los Angeles basin and Salton trough are clearly visible due to their high short‐period H/V ratios, whereas the Transverse and Peninsular Ranges exhibit low H/V at all periods.
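An H/V measurement of the kind described above can be sketched as a ratio of band-passed envelope amplitudes on the horizontal and vertical components. The waveforms below are synthetic placeholders (a Gaussian-windowed pulse), and the sampling rate and filter band are assumptions; real measurements use the radial and vertical noise cross-correlation components.

```python
import numpy as np
from scipy.signal import hilbert, butter, filtfilt

# Hedged sketch of an H/V amplitude measurement. The signals are
# synthetic stand-ins for noise cross-correlation waveforms.
fs = 20.0                              # sampling rate in Hz (assumed)
t = np.arange(0, 60, 1 / fs)

def bandpass(x, lo, hi):
    # zero-phase Butterworth band-pass
    b, a = butter(4, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    return filtfilt(b, a, x)

# Synthetic Rayleigh-wave pulse; horizontal set to twice the vertical
pulse = np.exp(-0.5 * ((t - 30) / 2) ** 2) * np.sin(2 * np.pi * 0.2 * t)
vertical = pulse
horizontal = 2.0 * pulse

# Measure H/V around the 5 s period from envelope peak amplitudes
lo, hi = 1 / 7.0, 1 / 3.0
hv = (np.abs(hilbert(bandpass(horizontal, lo, hi))).max()
      / np.abs(hilbert(bandpass(vertical, lo, hi))).max())
```

Because the synthetic horizontal component is exactly twice the vertical, the measured ratio recovers 2 regardless of the filter band, which is a quick sanity check on the linearity of the measurement chain.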

    Did Oldham Discover the Core After All? Handling Imprecise Historical Data with Hierarchical Bayesian Model Selection Methods

    Historical seismic data are essential to fill in the gaps in geophysical knowledge caused by the low rate of significant seismic events. Handling historical data in the context of geophysical inverse problems requires special care, due to the large errors in the data collection process. Using Oldham's data for the discovery of Earth's core as a case study, we illustrate how a hierarchical Bayesian model selection methodology using leave‐one‐out cross validation can robustly and efficiently answer quantitative questions using even poor‐quality geophysical data. We find that there is statistically significant evidence for the existence of the core using only the P‐wave data that Oldham effectively discarded in his discussion.
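Model selection by leave-one-out cross-validation, as used above, can be sketched by comparing two candidate models on their held-out predictive density. Everything below is illustrative: the synthetic "traveltime residuals", the 120° change point (loosely evoking a shadow zone), and the two simple models are assumptions, not Oldham's data or the paper's models.

```python
import numpy as np

# Hedged sketch of leave-one-out (LOO) cross-validation for model
# selection: a "no structure" single-mean model versus a change-point
# model, scored on synthetic residuals.
rng = np.random.default_rng(1)
x = np.linspace(0, 180, 40)                           # distance in degrees
y = np.where(x < 120, 0.0, 3.0) + rng.normal(0, 1.0, x.size)

def loo_log_score(predict):
    """Sum of held-out Gaussian log densities, refitting per point."""
    score = 0.0
    for i in range(x.size):
        mask = np.arange(x.size) != i
        mu, sigma = predict(x[mask], y[mask], x[i])
        score += (-0.5 * np.log(2 * np.pi * sigma**2)
                  - (y[i] - mu) ** 2 / (2 * sigma**2))
    return score

def flat_model(xt, yt, xq):
    # single mean and spread for all distances
    return yt.mean(), yt.std(ddof=1)

def step_model(xt, yt, xq):
    # separate mean and spread on either side of 120 degrees
    side = xt < 120 if xq < 120 else xt >= 120
    return yt[side].mean(), yt[side].std(ddof=1)

score_flat = loo_log_score(flat_model)
score_step = loo_log_score(step_model)
```

The model with the higher held-out log score is preferred; here the change-point model wins, mirroring the logic of testing for structure (a core) against a structureless alternative.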

    Probabilistic lowermost mantle P-wave tomography from hierarchical Hamiltonian Monte Carlo and model parametrization cross-validation

    Bayesian methods, powered by Markov Chain Monte Carlo estimates of posterior densities, have become a cornerstone of geophysical inverse theory. These methods have special relevance to the deep Earth, where data are sparse and uncertainties are large. We present a strategy for efficiently solving hierarchical Bayesian geophysical inverse problems for fixed parametrizations using Hamiltonian Monte Carlo sampling, and highlight an effective methodology for determining optimal parametrizations from a set of candidates by using efficient approximations to leave-one-out cross-validation for model complexity. To illustrate these methods, we use a case study of differential traveltime tomography of the lowermost mantle, using short-period P-wave data carefully selected to minimize the contributions of the upper mantle and inner core. The resulting tomographic image of the lowermost mantle has a relatively weak degree 2; instead, there is substantial heterogeneity at all low spherical harmonic degrees (less than 15). This result further reinforces the dichotomy in the lowermost mantle between relatively simple, degree-2-dominated long-period S-wave tomographic models and more complex short-period P-wave tomographic models. JBM would like to thank the General Sir John Monash Foundation and the Origin Energy Foundation for financial support.
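A single Hamiltonian Monte Carlo transition, of the kind powering the sampler above, can be sketched with a leapfrog integrator and a Metropolis accept/reject step. This is a minimal illustration on a toy 2-D Gaussian target; the step size, trajectory length, and target are all assumptions, far from the hierarchical tomographic posterior in the paper.

```python
import numpy as np

# Minimal Hamiltonian Monte Carlo (HMC) sketch on a standard 2-D
# Gaussian target. Parameters are illustrative assumptions.
rng = np.random.default_rng(2)

def neg_log_post(q):
    return 0.5 * q @ q          # -log density of N(0, I), up to a constant

def grad(q):
    return q                    # gradient of the negative log density

def hmc_step(q, eps=0.2, n_leap=20):
    p = rng.normal(size=q.size)            # resample momentum
    q_new, p_new = q.copy(), p.copy()
    # Leapfrog integration of Hamiltonian dynamics
    p_new -= 0.5 * eps * grad(q_new)
    for _ in range(n_leap - 1):
        q_new += eps * p_new
        p_new -= eps * grad(q_new)
    q_new += eps * p_new
    p_new -= 0.5 * eps * grad(q_new)
    # Metropolis correction on the change in total energy
    dH = (neg_log_post(q_new) + 0.5 * p_new @ p_new) \
       - (neg_log_post(q) + 0.5 * p @ p)
    return q_new if np.log(rng.uniform()) < -dH else q

q = np.zeros(2)
samples = []
for _ in range(2000):
    q = hmc_step(q)
    samples.append(q.copy())
samples = np.array(samples)
```

The gradient-guided leapfrog trajectories are what let HMC explore high-dimensional posteriors far more efficiently than random-walk proposals, which is the practical motivation for using it on large tomographic parametrizations.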

    JASPer controls interphase histone H3S10 phosphorylation by chromosomal kinase JIL-1 in Drosophila

    In flies, the chromosomal kinase JIL-1 is responsible for most interphase histone H3S10 phosphorylation and has been proposed to protect active chromatin from acquiring heterochromatic marks, such as dimethylated histone H3K9 (H3K9me2) and HP1. Here, we show that JIL-1's targeting to chromatin depends on the PWWP domain-containing protein JASPer (JIL-1 Anchoring and Stabilizing Protein). The JASPer-JIL-1 (JJ) complex is the major form of the kinase in vivo and is targeted to active genes and telomeric transposons, via binding of the PWWP domain of JASPer to H3K36me3 nucleosomes, to modulate transcriptional output. JIL-1 and JJ-complex depletion in cycling cells leads to small changes in H3K9me2 distribution at active genes and telomeric transposons. Finally, we identify interactors of the endogenous JJ-complex and propose that JIL-1 not only prevents heterochromatin formation but also coordinates chromatin-based regulation in the transcribed part of the genome.

    Effects of the number of markers per haplotype and clustering of haplotypes on the accuracy of QTL mapping and prediction of genomic breeding values

    The aim of this paper was to compare the effect of haplotype definition on the precision of QTL mapping and on the accuracy of predicted genomic breeding values. In a multiple-QTL model using identity-by-descent (IBD) probabilities between haplotypes, various haplotype definitions were tested, i.e. haplotypes including 2, 6, 12 or 20 marker alleles, and clustering of base haplotypes related with an IBD probability of > 0.55, 0.75 or 0.95. Simulated data contained 1100 animals with known genotypes and phenotypes and 1000 animals with known genotypes and unknown phenotypes. Genomes comprising 3 Morgan were simulated and contained 74 polymorphic QTL and 383 polymorphic SNP markers with an average r2 value of 0.14 between adjacent markers. The total number of haplotypes decreased by up to 50% when the window size was increased from two to 20 markers, and decreased by at least 50% when haplotypes related with an IBD probability of > 0.55 instead of > 0.95 were clustered. An intermediate window size led to more precise QTL mapping. Window size and clustering had a limited effect on the accuracy of predicted total breeding values, ranging from 0.79 to 0.81. Our conclusion is that different optimal window sizes should be used in QTL mapping versus genome-wide breeding value prediction.
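The clustering step described above, merging haplotypes whose relatedness exceeds a threshold, can be sketched with a simple greedy scheme. This is a hedged illustration only: allele agreement is used as a crude stand-in for the paper's IBD probability, the data are random rather than the simulated genomes above, and the greedy rule is an assumption, not the authors' clustering method.

```python
import numpy as np

# Hedged sketch of threshold-based haplotype clustering. Similarity is
# the fraction of matching marker alleles, a stand-in for an IBD
# probability; data and thresholds are illustrative.
rng = np.random.default_rng(3)
haplotypes = rng.integers(0, 2, size=(60, 12))   # 60 haplotypes, 12 markers

def n_clusters(threshold):
    """Greedy clustering: each haplotype joins the first existing
    cluster whose representative it matches above `threshold`,
    otherwise it founds a new cluster."""
    reps = []
    for h in haplotypes:
        for r in reps:
            if (h == r).mean() > threshold:
                break
        else:
            reps.append(h)
    return len(reps)

loose = n_clusters(0.55)   # permissive clustering, fewer haplotype classes
strict = n_clusters(0.95)  # near-exact matching, many haplotype classes
```

Lowering the threshold merges more haplotypes into fewer classes, which is the mechanism behind the reduction in haplotype count that the abstract reports when clustering at IBD > 0.55 instead of > 0.95.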