
    A differential semblance algorithm for the inverse problem of reflection seismology

    This paper presents an analysis of stability and convergence for a special case of differential semblance optimization (DSO). This approach to model estimation for reflection seismology is a variant of the output least squares inversion of seismograms, enjoying analytical and numerical properties superior to those of more straightforward versions. We study a specialization of DSO appropriate to the inversion of convolutional-approximation plane-wave seismograms over layered constant-density acoustic media. We prove that the differential semblance variational principle is locally convex in suitable model classes for a range of data noise. Moreover, the structure of the convexity estimates suggests a family of quasi-Newton algorithms. We describe an implementation of one of these algorithms and present some numerical results.
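
    For orientation, a hedged sketch of the variational principle in question, in notation of our choosing rather than the paper's: in the layered plane-wave setting the model is a pair (background velocity v, reflectivity r(z, p) indexed by ray parameter p), and the differential semblance objective penalizes dependence of the reflectivity on p,

        J_\lambda[v, r] = \tfrac{1}{2}\,\| F[v]\, r - d \|^2
                        + \tfrac{\lambda^2}{2}\, \Big\| \frac{\partial r}{\partial p} \Big\|^2,

    where F[v] is the (convolutional-approximation) linearized forward map and d the plane-wave data. At a kinematically correct v the estimated reflectivity is independent of p, so the semblance term vanishes; local convexity of such an objective in suitable model classes is the property the paper establishes.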

    A mathematical framework for inverse wave problems in heterogeneous media

    This paper provides a theoretical foundation for some common formulations of inverse problems in wave propagation, based on hyperbolic systems of linear integro-differential equations with bounded and measurable coefficients. The coefficients of these time-dependent partial differential equations represent parametrically the spatially varying mechanical properties of materials. Rocks, manufactured materials, and other wave propagation environments often exhibit spatial heterogeneity in mechanical properties at a wide variety of scales, and coefficient functions representing these properties must mimic this heterogeneity. We show how to choose domains (classes of nonsmooth coefficient functions) and data definitions (traces of weak solutions) so that optimization formulations of inverse wave problems satisfy some of the prerequisites for application of Newton's method and its relatives. These results follow from the properties of a class of abstract first-order evolution systems, of which various physical wave systems appear as concrete instances. Finite speed of propagation for linear waves with bounded, measurable mechanical parameter fields is one of the by-products of this theory.
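
    As one concrete instance of the abstract first-order evolution systems mentioned above (a standard example, not a formulation quoted from the paper), take linear acoustics with density \rho(x) and bulk modulus \kappa(x), both merely bounded and measurable:

        \rho(x)\,\partial_t v = -\nabla p, \qquad
        \kappa(x)^{-1}\,\partial_t p = -\nabla \cdot v + f(x, t),

    i.e. \partial_t u = A u + F with state u = (v, p), where A is formally skew-adjoint in the energy inner product weighted by \rho and \kappa^{-1}. The coefficients enter only as weights in that inner product, which is consistent with the nonsmooth coefficient classes the paper admits.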

    An adaptive multiscale algorithm for efficient extended waveform inversion

    Subsurface-offset extended full-waveform inversion (FWI) may converge to kinematically accurate velocity models without the accurate low-frequency data required for standard data-domain FWI. However, this robust alternative approach to waveform inversion suffers from a very high computational cost resulting from its use of nonlocal wave physics: the computation of strain from stress involves an integral over the subsurface-offset axis, which must be performed at every space-time grid point. We found that a combination of data-fit-driven offset limits, grid coarsening, and low-pass data filtering can reduce the cost of extended inversion by one to two orders of magnitude.
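
    To make the cost issue concrete, here is a minimal Python sketch (array shapes and the name apply_nonlocal are our illustration, not the paper's code) of a nonlocal constitutive apply; truncating the offset axis to |h| <= h_max cuts the per-grid-point work proportionally, which is one of the three cost levers cited above:

        import numpy as np

        def apply_nonlocal(kappa_ext, strain, h_max):
            # Nonlocal constitutive apply: stress(x) = sum_h kappa_ext(x, h) * strain(x + h),
            # the offset integral that must run at every space-time grid point.
            nx, nh = kappa_ext.shape
            offsets = np.arange(nh) - nh // 2
            stress = np.zeros(nx)
            for j, h in enumerate(offsets):
                if abs(h) > h_max:                  # data-fit-driven offset limit
                    continue
                idx = np.clip(np.arange(nx) + h, 0, nx - 1)   # crude boundary treatment
                stress += kappa_ext[:, j] * strain[idx]
            return stress

    Grid coarsening and low-pass filtering shrink nx and the number of time steps as well, which is how the combined savings reach one to two orders of magnitude.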

    IWAVE Implementation of Born Simulation

    The single-scattering (or Born) approximation is the most fundamental assumption shared by all seismic imaging methods, and plays a crucial role in nonlinear waveform inversion, an iterative process of linearized inversions. The Born simulator (linearized forward map) shares a computational core with the corresponding simulator (forward map), which has been well implemented in the modeling package IWAVE. This report focuses on implementing the Born simulator based on IWAVE, and reviews the main adaptations we made in IWAVE to accommodate such an implementation in C++. Our goal is to construct a C++ wrapper of IWAVE that fits into a general framework for inversion. This report is the first of several describing an implementation of such a wrapper.
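
    The report concerns IWAVE internals not reproduced here; as a generic illustration of the Born linearization itself (a 1D constant-density sketch under our own naming, not IWAVE's API), the scattered field solves the same wave equation as the background field, with a right-hand side built from the model perturbation:

        import numpy as np

        def born_1d(v, dv, src, dx, dt, nt):
            # Background field u solves u_tt = m u_xx + src with m = v**2.
            # Linearizing in m (dm = 2 v dv), the scattered field du solves
            # du_tt = m du_xx + dm * u_xx: same stencil, new right-hand side.
            nx = v.size
            m, dm = v**2, 2.0 * v * dv
            u = np.zeros((nt, nx))
            du = np.zeros((nt, nx))
            lap = lambda f: (np.roll(f, -1) - 2.0 * f + np.roll(f, 1)) / dx**2
            for it in range(1, nt - 1):             # crude periodic boundaries
                Lu = lap(u[it])
                u[it + 1] = 2.0 * u[it] - u[it - 1] + dt**2 * (m * Lu + src[it])
                du[it + 1] = 2.0 * du[it] - du[it - 1] + dt**2 * (m * lap(du[it]) + dm * Lu)
            return u, du

    This is the structural point behind the shared computational core: one stencil serves both the forward and the linearized simulation.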

    Getting It Right Without Knowing the Answer: Quality Control in a Large Seismic Modeling Project

    Phase I of the SEAM Project will produce a variable-density acoustic synthetic survey over a 3D geological model simulating a deepwater subsalt exploration target. Due to the intended use of the data, the project places a premium on accuracy. Commercially produced Phase I synthetics will be spot-checked against benchmark simulations to assure quality, so the accuracy of the benchmark simulator itself required careful assessment. The authors designed and implemented the benchmark simulator used in this program, subjected it to verification tests, and assisted in the qualification phase of the Phase I project. The key lessons we have learned so far from this assessment are that (1) the few verification tools available to us - a few analytic solutions and Richardson extrapolation - seem to be adequate, at least in a rough way, and (2) the standard approach to this type of simulation - finite difference methods on regular grids - requires surprisingly fine grid steps to produce small relative RMS errors for models of the type defined by this project.
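
    Of the two verification tools named, Richardson extrapolation is easy to make concrete; a generic sketch (ours, not the project's QC code) for estimating the observed convergence order and a fine-grid error from three nested grids:

        import numpy as np

        def richardson_check(u_h, u_h2, u_h4):
            # Solutions sampled on a common set of points, computed with grid
            # steps h, h/2, h/4. For a scheme of order p, u_h - u_exact ~ C h**p,
            # so successive differences shrink by 2**p and p can be observed.
            d1 = np.linalg.norm(u_h - u_h2)
            d2 = np.linalg.norm(u_h2 - u_h4)
            p_obs = np.log2(d1 / d2)                        # observed order
            err_est = d2 / (2**p_obs - 1)                   # finest-grid error estimate
            u_ref = u_h4 + (u_h4 - u_h2) / (2**p_obs - 1)   # extrapolated reference
            return p_obs, err_est, u_ref

    A check of this kind needs no analytic solution, which is what makes it usable for quality control when the true answer is unknown.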

    Task Level Parallelization for Seismic Modeling and Inversion

    Projet IDENT. This paper presents experience with using PVM to parallelize a seismic inversion code developed in The Rice Inversion Project. We use coarse-grain parallelism to dynamically distribute simulations over several workstations. When doing modeling, this strategy works efficiently, allowing us to reach a speedup of almost 4.5 on a cluster of 6 IBM RS6000 workstations. When doing inversion, however, we are currently limited to speedups of 2.4 on 3 workstations.
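
    PVM itself is legacy; as a stand-in for the coarse-grain master/worker scheme described here, a minimal dynamic task-distribution sketch using Python's multiprocessing (the shot decomposition and function names are our illustration):

        import multiprocessing as mp

        def simulate_shot(shot_id):
            # Placeholder for one forward simulation; in the paper's setting this
            # is a full modeling run farmed out to a workstation.
            return shot_id, sum(i * i for i in range(100_000))

        if __name__ == "__main__":
            shots = range(24)
            # The pool hands out tasks as workers free up, mirroring dynamic
            # distribution of simulations; load balance drives the speedup.
            with mp.Pool(processes=6) as pool:
                for shot_id, _result in pool.imap_unordered(simulate_shot, shots):
                    print(f"shot {shot_id} done")

    Modeling parallelizes nearly ideally over independent simulations; inversion interleaves sequential optimization steps between batches of simulations, which plausibly accounts for the lower inversion speedup reported.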

    An Infeasible Point Method for Minimizing the Lennard-Jones Potential

    Minimizing the Lennard-Jones potential, the most-studied model problem for molecular conformation, is an unconstrained global optimization problem with a large number of local minima. In this paper, the problem is reformulated as an equality constrained nonlinear programming problem with only linear constraints. This formulation allows the solution to be approached through infeasible configurations, increasing the basin of attraction of the global solution. In this way the likelihood of finding a global minimizer is increased. An algorithm for solving this nonlinear program is discussed, and results of numerical tests are presented.
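
    For reference, the objective in question, in reduced units with pair minimum -1 at unit distance, together with a naive multistart local minimization; this illustrates the trapping problem, not the paper's infeasible-point reformulation (names and parameters are ours):

        import numpy as np
        from scipy.optimize import minimize

        def lj_energy(x, n):
            # Total Lennard-Jones energy in reduced units: sum over pairs of
            # r**-12 - 2*r**-6, whose pairwise minimum is -1 at r = 1.
            p = x.reshape(n, 3)
            e = 0.0
            for i in range(n - 1):
                d = p[i + 1:] - p[i]                # vectors to all later atoms
                r2 = np.sum(d * d, axis=1)          # squared pair distances
                e += np.sum(1.0 / r2**6 - 2.0 / r2**3)
            return e

        n = 7
        rng = np.random.default_rng(0)
        # Naive multistart: run a local minimizer from random configurations and
        # keep the best; with many local minima this often misses the global one.
        best = min(
            (minimize(lj_energy, rng.normal(scale=1.5, size=3 * n), args=(n,))
             for _ in range(20)),
            key=lambda res: res.fun,
        )
        print(best.fun)

    The routine failure of such multistart schemes on the unconstrained problem is the motivation for enlarging the basin of attraction via the constrained reformulation.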

    Assessing the cost of global biodiversity and conservation knowledge

    Knowledge products comprise assessments of authoritative information supported by standards, governance, quality control, data, tools, and capacity building mechanisms. Considerable resources are dedicated to developing and maintaining knowledge products for biodiversity conservation, and they are widely used to inform policy and advise decision makers and practitioners. However, the financial cost of delivering this information is largely undocumented. We evaluated the costs and funding sources for developing and maintaining four global biodiversity and conservation knowledge products: The IUCN Red List of Threatened Species, the IUCN Red List of Ecosystems, Protected Planet, and the World Database of Key Biodiversity Areas. These are secondary data sets, built on primary data collected by extensive networks of expert contributors worldwide. We estimate that US$160 million (range: US$116–204 million), plus 293 person-years of volunteer time (range: 278–308 person-years) valued at US$14 million (range: US$12–16 million), were invested in these four knowledge products between 1979 and 2013. More than half of this financing was provided through philanthropy, and nearly three-quarters was spent on personnel costs. The estimated annual cost of maintaining data and platforms for three of these knowledge products (excluding the IUCN Red List of Ecosystems, for which annual costs were not possible to estimate for 2013) is US$6.5 million in total (range: US$6.2–6.7 million). We estimated that an additional US$114 million will be needed to reach pre-defined baselines of data coverage for all four knowledge products, and that once achieved, annual maintenance costs will be approximately US$12 million. These costs are much lower than those to maintain many other, similarly important, global knowledge products. Ensuring that biodiversity and conservation knowledge products are sufficiently up to date, comprehensive, and accurate is fundamental to inform decision-making for biodiversity conservation and sustainable development. Thus, the development and implementation of plans for sustainable long-term financing for them is critical.