6 research outputs found

    Data Assimilation for a Geological Process Model Using the Ensemble Kalman Filter

    We consider the problem of conditioning a geological process-based computer simulation, which produces basin models by simulating transport and deposition of sediments, to data. Emphasising uncertainty quantification, we frame this as a Bayesian inverse problem, and propose to characterize the posterior probability distribution of the geological quantities of interest by using a variant of the ensemble Kalman filter, an estimation method which linearly and sequentially conditions realisations of the system state to data. A test case involving synthetic data is used to assess the performance of the proposed estimation method, and to compare it with similar approaches. We further apply the method to a more realistic test case, involving real well data from the Colville foreland basin, North Slope, Alaska. (Comment: 34 pages, 10 figures, 4 tables)
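
The linear, sequential conditioning of realisations that the abstract describes can be illustrated with a minimal stochastic EnKF analysis step. This is a generic textbook sketch, not the paper's implementation; the observation operator, variances, and ensemble sizes below are illustrative assumptions.

```python
import numpy as np

def enkf_update(ensemble, obs, obs_op, obs_var, rng):
    """One stochastic EnKF analysis step: linearly condition each realization on data.

    ensemble: (n_state, n_ens) array of state realizations
    obs:      (n_obs,) observation vector
    obs_op:   (n_obs, n_state) linear observation operator H
    obs_var:  observation-error variance (scalar, errors assumed uncorrelated)
    """
    n_obs, n_ens = obs.shape[0], ensemble.shape[1]
    # Ensemble anomalies around the ensemble mean
    mean = ensemble.mean(axis=1, keepdims=True)
    A = ensemble - mean
    HA = obs_op @ A
    # Sample covariances and the Kalman gain
    P_xy = A @ HA.T / (n_ens - 1)                       # state-observation cross-covariance
    P_yy = HA @ HA.T / (n_ens - 1) + obs_var * np.eye(n_obs)
    K = P_xy @ np.linalg.inv(P_yy)
    # Perturbed observations, one copy per ensemble member
    D = obs[:, None] + rng.normal(0.0, np.sqrt(obs_var), (n_obs, n_ens))
    return ensemble + K @ (D - obs_op @ ensemble)

rng = np.random.default_rng(0)
ens = rng.normal(0.0, 1.0, (3, 50))                     # 3 state variables, 50 members
H = np.array([[1.0, 0.0, 0.0]])                         # observe the first variable only
updated = enkf_update(ens, np.array([0.5]), H, 0.1, rng)
```

After the update, the spread of the observed variable shrinks toward the observation-error level, while unobserved variables are adjusted through their sample cross-covariance with the observed one.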

    A revised implicit equal-weights particle filter

    Particle filters are fully non-linear data assimilation methods and as such are highly relevant. While the standard particle filter degenerates for high-dimensional systems, recent developments have opened the way for new particle filters that can be used in such systems. The implicit equal-weights particle filter (IEWPF) is an efficient approach which avoids filter degeneracy because it gives equal particle weights by construction. The method uses implicit sampling whereby auxiliary vectors drawn from a proposal distribution undergo a transformation before they are added to each particle. In the original formulation of the IEWPF, the proposal distribution has a gap causing all but one particle to have an inaccessible region in state space. We show that this leads to a systematic bias in the predictions and we modify the proposal distribution to eliminate the gap. We achieved this by using a two-stage proposal method, where a single variance parameter is tuned to obtain adequate statistical coverage properties of the predictive distribution. We discuss properties of the implicit mapping from an auxiliary random vector to the state vector, keeping in mind the aim of avoiding particle resampling. The revised filter is tested on linear and weakly nonlinear dynamical models in low-dimensional and moderately high-dimensional settings, demonstrating the success of the new methodology in removing the bias.
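
The degeneracy of the standard particle filter in high dimensions, which motivates equal-weights constructions such as the IEWPF, is easy to demonstrate: with a fixed number of particles, the effective sample size of the importance weights collapses as the state dimension grows. The Gaussian prior and likelihood in this toy setup are illustrative assumptions, not the paper's model.

```python
import numpy as np

rng = np.random.default_rng(1)

def effective_sample_size(log_w):
    """ESS computed from log importance weights; values near 1 indicate degeneracy."""
    w = np.exp(log_w - log_w.max())      # stabilize before exponentiating
    w /= w.sum()
    return 1.0 / np.sum(w**2)

n_particles = 100
ess = {}
for dim in (1, 10, 100):
    # Particles drawn from a standard-normal prior; the observation is 0 in every component
    particles = rng.normal(size=(n_particles, dim))
    # Gaussian likelihood with unit observation-error variance
    log_w = -0.5 * np.sum(particles**2, axis=1)
    ess[dim] = effective_sample_size(log_w)
```

As the dimension rises from 1 to 100 the effective sample size drops from most of the ensemble to essentially a single particle, which is exactly the collapse that equal weights by construction avoid.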

    Parametric Wavelet Estimation

    A method for parametric estimation of seismic wavelets from well logs and seismic data is developed. Parameters include amplitude, skewness, length and fluctuation order, and the link between parameters and wavelet properties provides a user-friendly interpretation of the wavelet function. The method is set in a Bayesian framework, and is well-suited for addressing questions about uncertainty related to estimated wavelets. This is accomplished by sampling the posterior distribution using Markov chain Monte Carlo methods. The estimation method is framed as a practical step-wise procedure. An extension of the model to enable joint wavelet estimation from seismic data with multiple incidence angles is also described. The method is tested on simulated data, and on well log and seismic amplitude data from the North Sea. The results in the synthetic case indicate that the method performs well under idealised conditions. When tested on real data, the method produces a realistic wavelet fit and uncertainty range. Uncertainty is substantially reduced from the prior to the posterior distribution, but in general, the shape of the posterior surface could make it hard to explore. A comparison with a wavelet estimator based on a Gaussian process indicates that the proposed parametric form gives a tighter wavelet, and is less prone to overfitting.
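
As a rough illustration of the Bayesian sampling strategy the abstract describes, the sketch below runs a random-walk Metropolis sampler over the parameters of a wavelet under a Gaussian likelihood. The Ricker wavelet and its two parameters are stand-ins: the paper's own parameterization (amplitude, skewness, length and fluctuation order) is not reproduced here, and all numbers are illustrative.

```python
import numpy as np

rng = np.random.default_rng(4)

def ricker(t, amplitude, peak_freq):
    """Ricker wavelet -- a stand-in parametric form, not the paper's parameterization."""
    a = (np.pi * peak_freq * t) ** 2
    return amplitude * (1.0 - 2.0 * a) * np.exp(-a)

# Synthetic "observed" wavelet: true parameters plus Gaussian noise
t = np.linspace(-0.1, 0.1, 101)
true_params = np.array([1.0, 30.0])
noise_sd = 0.05
data = ricker(t, *true_params) + rng.normal(0.0, noise_sd, t.size)

def log_post(params):
    """Log-posterior: flat prior within bounds, Gaussian likelihood."""
    amp, freq = params
    if amp <= 0.0 or not (5.0 < freq < 100.0):
        return -np.inf
    resid = data - ricker(t, amp, freq)
    return -0.5 * np.sum(resid**2) / noise_sd**2

# Random-walk Metropolis over the two wavelet parameters
params = np.array([0.8, 25.0])
lp = log_post(params)
samples = []
for _ in range(5000):
    prop = params + rng.normal(0.0, [0.02, 0.3])
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:    # Metropolis accept/reject
        params, lp = prop, lp_prop
    samples.append(params.copy())
post = np.array(samples[1000:])                 # discard burn-in
```

The retained samples approximate the posterior, so their spread directly quantifies the wavelet-parameter uncertainty the abstract refers to.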

    Ensemble-based data assimilation methods applied to geological process modeling

    Summary: Data assimilation is the art of conditioning a numerical simulation of a physical process on observations of the real process. That is, adjusting estimates so that they agree not only with a mathematical model of reality, but also with direct measurements. Data assimilation is essential for a host of geophysical applications, from circulation models of the atmosphere and oceans to weather forecasting and prediction of floods and droughts. Methods for data assimilation rely on a statistical description of a physical simulation model on the one hand, and of a data-generating process on the other hand. Combining the information from these two sources in a consistent way leads to a compromise between theoretical simulation and empirical observation. The actual state of a complex physical system, even an idealized and simplified one, is almost always underdetermined by data. Multiple possible states could have given rise to the same observations, so in practice the truth can never be uniquely determined. An ensemble, in this context, is a set of versions or realizations of the state of the simulated system. The realizations differ from each other by random variation, and the variation is intended to reflect the epistemic uncertainty associated with the actual system state. In ensemble-based data assimilation methods, an ensemble is formed by simulating each realization forward through time from an initial point to a later point. The ensemble is then compared with observations of the system at that point in time, and based on the comparison the ensemble is manipulated so that the distance to observations is reduced. Thus, information from data is assimilated into the statistical model. The dissertation concerns the application of this type of method to a geologic process model, or more precisely a stratigraphic forward model, which simulates the formation of sedimentary rocks by deposition of sand, silt and clay.
Usually there are no observations of this process unfolding. There may, however, be data related to the result, namely the finished sedimentary layer structure. An example of such data could be measurements taken at varying depths in a well drilled for petroleum exploration purposes. Since the sedimentary rocks are arranged chronologically, with older layers situated below younger layers, it is possible to start at the bottom and assimilate measurements of ever-younger layers while simultaneously simulating the sedimentation process forward through time. The result is a geologic simulation that is conditional on the available observations.
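
The bottom-up, layer-by-layer assimilation described above has the structure of a generic forecast-analysis filtering loop: push every realization forward through the process model, then condition the ensemble on the measurement for that layer. The sketch below uses a toy AR(1) process and scalar observations as stand-ins for the stratigraphic forward model and well data; every model choice here is an illustrative assumption.

```python
import numpy as np

rng = np.random.default_rng(3)

n_ens, n_layers = 200, 20
obs_var = 0.2

# Synthetic truth and one measurement per "layer", oldest first
truth = 0.0
observations = []
for _ in range(n_layers):
    truth = 0.9 * truth + rng.normal(0.0, 0.3)              # toy process model
    observations.append(truth + rng.normal(0.0, np.sqrt(obs_var)))

# Sequential ensemble assimilation, bottom layer to top
state = rng.normal(0.0, 1.0, n_ens)                         # initial ensemble
for y in observations:
    # Forecast: run each realization forward through the same process model
    state = 0.9 * state + rng.normal(0.0, 0.3, n_ens)
    # Analysis: scalar EnKF update with perturbed observations
    var_f = state.var(ddof=1)
    gain = var_f / (var_f + obs_var)
    perturbed = y + rng.normal(0.0, np.sqrt(obs_var), n_ens)
    state = state + gain * (perturbed - state)
```

Each pass through the loop assimilates one layer's measurement, so by the top layer the ensemble is conditioned on the whole well record while remaining consistent with the forward simulation.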

    Parametric spatial covariance models in the ensemble Kalman filter

    Several applications rely on data assimilation methods for complex spatio-temporal problems. The focus of this paper is on ensemble-based methods, where some approaches require estimation of covariances between state variables and observations in the assimilation step. Spurious correlations present a challenge in such cases as they can degrade the quality of the ensemble representation of probability distributions. In particular, prediction variability is often underestimated. We propose to replace the sample covariance estimate by a parametric approach using maximum likelihood estimation for a small number of parameters in a spatial covariance model. Parametric covariance and precision estimation are employed in the context of the ensemble Kalman filter, and applied to a Gauss-linear autoregressive model and a geological process model. We learn that parametric approaches reduce the underestimation in prediction variability. Furthermore, rich non-stationary models do not seem to add much over simpler models with fewer parameters.
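
The proposed replacement of the sample covariance can be sketched generically: estimate the few parameters of a spatial covariance function by maximum likelihood from the ensemble, then use the fitted covariance in place of the raw sample estimate. The exponential covariance model, the grid, and the grid search below are illustrative assumptions, not the paper's setup.

```python
import numpy as np

rng = np.random.default_rng(2)

# 1-D grid and a true exponential covariance C(h) = s2 * exp(-|h| / r)
x = np.arange(50, dtype=float)
h = np.abs(x[:, None] - x[None, :])
true_sigma2, true_range = 2.0, 5.0
C_true = true_sigma2 * np.exp(-h / true_range)

# A small ensemble of spatial realizations, as in the EnKF setting
L = np.linalg.cholesky(C_true + 1e-10 * np.eye(50))
ensemble = L @ rng.normal(size=(50, 30))

def neg_log_lik(theta):
    """Gaussian negative log-likelihood of the ensemble for theta = (log s2, log r)."""
    s2, r = np.exp(theta[0]), np.exp(theta[1])
    C = s2 * np.exp(-h / r) + 1e-8 * np.eye(50)
    _, logdet = np.linalg.slogdet(C)
    Ci = np.linalg.inv(C)
    quad = np.einsum('ij,ik,kj->', ensemble, Ci, ensemble)   # sum of x^T C^-1 x over members
    return 0.5 * (ensemble.shape[1] * logdet + quad)

# Coarse maximum-likelihood fit by grid search over the two log-parameters
best = None
for log_s2 in np.linspace(np.log(0.5), np.log(8.0), 25):
    for log_r in np.linspace(np.log(1.0), np.log(20.0), 25):
        nll = neg_log_lik((log_s2, log_r))
        if best is None or nll < best[0]:
            best = (nll, np.exp(log_s2), np.exp(log_r))
_, sigma2_hat, range_hat = best

# The fitted parametric covariance, free of spurious long-range sample correlations
C_fit = sigma2_hat * np.exp(-h / range_hat)
```

Because `C_fit` is determined by two parameters rather than 50x50 sample entries, distant grid points get exactly the small covariance the model implies, which is the mechanism by which such approaches suppress spurious correlations.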

    Learning resources in introductory mathematics education - quality, accessibility and differentiation

    For students in the Master of Engineering (sivilingeniør) programme at NTNU, the curriculum includes five introductory mathematics courses. Three of these courses, Matematikk 1 (first semester, 1700 students), Matematikk 2 (second semester, 1400 students) and Statistikk (800 students in the third and fourth semesters), underwent major pedagogical changes in 2014-2016 through the NTNU project KTDiM (Kvalitet, tilgjengelighet og differensiering i grunnundervisningen i matematikk; quality, accessibility and differentiation in introductory mathematics teaching). The main goal of the project has been to increase students' learning outcomes compared with traditional teaching formats. The idea is that the increased learning outcome arises because students develop a deeper understanding of mathematical concepts and processes, which in turn makes them better equipped than before to use mathematics and statistics in applications. The students' self-reported activity and preferences, together with usage figures for the electronic resources, form the basis for the conclusions we draw about students' attitudes towards and use of the learning resources in introductory mathematics education.