    Maximum A Posteriori Resampling of Noisy, Spatially Correlated Data

    In any geologic application, noisy data are sources of consternation for researchers, inhibiting interpretability and marring images with unsightly and unrealistic artifacts. Filtering is the typical solution to dealing with noisy data. However, filtering commonly suffers from ad hoc (i.e., uncalibrated, ungoverned) application. We present here an alternative to filtering: a newly developed method for correcting noise in data by finding the “best” value given available information. The motivating rationale is that data points that are close to each other in space cannot differ by “too much,” where “too much” is governed by the field covariance. Data with large uncertainties will frequently violate this condition and therefore ought to be corrected, or “resampled.” Our solution for resampling is determined by the maximum of the a posteriori density function defined by the intersection of (1) the data error probability density function (pdf) and (2) the conditional pdf, determined by the geostatistical kriging algorithm applied to proximal data values. A maximum a posteriori solution can be computed sequentially going through all the data, but the solution depends on the order in which the data are examined. We approximate the global a posteriori solution by randomizing this order and taking the average. A test with a synthetic data set sampled from a known field demonstrates quantitatively and qualitatively the improvement provided by the maximum a posteriori resampling algorithm. The method is also applied to three marine geology/geophysics data examples, demonstrating the viability of the method for diverse applications: (1) three generations of bathymetric data on the New Jersey shelf with disparate data uncertainties; (2) mean grain size data from the Adriatic Sea, which is a combination of both analytic (low uncertainty) and word-based (higher uncertainty) sources; and (3) side-scan backscatter data from the Martha's Vineyard Coastal Observatory which are, as is typical for such data, affected by speckle noise. Compared to filtering, maximum a posteriori resampling provides an objective and optimal method for reducing noise, and better preservation of the statistical properties of the sampled field. The primary disadvantage is that maximum a posteriori resampling is a computationally expensive procedure.
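    The core update in this scheme is compact: when both the data-error pdf and the kriging conditional pdf are Gaussian, the MAP of their product is a precision-weighted mean. Below is a minimal Python sketch of that idea, assuming Gaussian data errors and an exponential field covariance; the function names, neighbour count, and covariance parameters are illustrative assumptions, not the authors' implementation.

```python
# Hedged sketch of MAP resampling: each datum is pulled toward the
# kriging prediction from its neighbours, weighted by the two
# precisions; visiting orders are randomised and the results averaged.
import numpy as np

def exp_cov(h, sill=1.0, rng=10.0):
    """Exponential covariance model C(h) = sill * exp(-h / range)."""
    return sill * np.exp(-h / rng)

def kriging_pdf(xy, z, i, sill=1.0, rng=10.0, n_nbr=8):
    """Simple-kriging mean/variance at point i from its nearest neighbours."""
    d = np.linalg.norm(xy - xy[i], axis=1)
    nbr = np.argsort(d)[1:n_nbr + 1]               # exclude the point itself
    H = np.linalg.norm(xy[nbr, None] - xy[None, nbr], axis=2)
    K = exp_cov(H, sill, rng)
    k = exp_cov(d[nbr], sill, rng)
    w = np.linalg.solve(K + 1e-9 * np.eye(len(nbr)), k)
    return w @ z[nbr], sill - w @ k                # conditional mean, variance

def map_resample(xy, z, sigma, n_passes=20, seed=0):
    """Average MAP solutions over randomised visiting orders."""
    rs = np.random.default_rng(seed)
    out = np.zeros(len(z))
    for _ in range(n_passes):
        zz = z.astype(float).copy()
        for i in rs.permutation(len(z)):
            mu_k, var_k = kriging_pdf(xy, zz, i)
            var_k = max(var_k, 1e-12)
            # MAP of the product of two Gaussians: precision-weighted mean.
            prec = 1.0 / sigma[i]**2 + 1.0 / var_k
            zz[i] = (z[i] / sigma[i]**2 + mu_k / var_k) / prec
        out += zz
    return out / n_passes
```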

    Nonlinear Statistical Filtering and Applications to Segregation in Steels from Microprobe Images

    Microprobe images of solidification studies are well known to be subject to Poisson noise. That is, the radiation count at a pixel x for a certain element may be considered to be an observation of a Poisson random variable whose parameter is equal to the true chemical concentration of the element at x. By modeling the image as a random function, we are able to use geostatistical techniques to perform various filtering operations. This filtering of the image itself may be done using linear kriging. For explicitly nonlinear problems, such as the estimation of the underlying histogram of the noisy image or the estimation of the probability that the concentration locally passes a certain value (this probability is needed for segregation studies), it is usually not possible to use linear techniques as they give biased results. For this reason, we applied the nonlinear technique of Disjunctive Kriging to these nonlinear problems. Linear kriging needs only second-order statistical models (covariance functions or variograms), while disjunctive kriging needs bivariate distribution models. This approach is illustrated by examples of filtering of various X-ray mappings in steel samples.
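    The linear (kriging) filtering step can be sketched under the Poisson property that the noise variance at a pixel equals its expected count: the noise then enters only the diagonal of the kriging matrix, so the estimate targets the noise-free concentration. Disjunctive kriging itself requires bivariate distribution models and is not reproduced here; the covariance callable and neighbour count below are assumptions for illustration.

```python
# Minimal sketch of kriging-based filtering of Poisson-noisy counts.
import numpy as np

def filter_poisson_pixel(xy, counts, i, cov, n_nbr=12):
    """Estimate the noise-free concentration at pixel i.

    cov(h): covariance model of the *underlying* field, e.g.
            cov = lambda h: 2.0 * np.exp(-h / 5.0)   (an assumed fit)
    """
    d = np.linalg.norm(xy - xy[i], axis=1)
    nbr = np.argsort(d)[:n_nbr]                      # include pixel i itself
    H = np.linalg.norm(xy[nbr, None] - xy[None, nbr], axis=2)
    K = cov(H)
    # Poisson noise enters only the diagonal: Var(noise) ~ observed count.
    K[np.diag_indices_from(K)] += counts[nbr]
    k = cov(d[nbr])                                  # signal covariance to pixel i
    w = np.linalg.solve(K, k)
    m = counts.mean()                                # simple-kriging mean
    return m + w @ (counts[nbr] - m)
```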

    Functional Generative Design: An Evolutionary Approach to 3D-Printing

    Consumer-grade printers are widely available, but their ability to print complex objects is limited. Therefore, new designs need to be discovered that serve the same function, but are printable. A representative such problem is to produce a working, reliable mechanical spring. The proposed methodology for discovering solutions to this problem consists of three components: First, an effective search space is learned through a variational autoencoder (VAE); second, a surrogate model for functional designs is built; and third, a genetic algorithm is used to simultaneously update the hyperparameters of the surrogate and to optimize the designs using the updated surrogate. Using a car-launcher mechanism as a test domain, spring designs were 3D-printed and evaluated to update the surrogate model. Two experiments were then performed: In the first, the initial set of designs for the surrogate-based optimizer was selected randomly from the training set that was used for training the VAE model, which resulted in an exploitative search behavior. In the second experiment, the initial set was composed of more uniformly selected designs from the same training set, and a more explorative search behavior was observed. Both experiments showed that the methodology generates interesting, successful, and reliable spring geometries robust to the noise inherent in the 3D printing process. The methodology can be generalized to other functional design problems, thus making consumer-grade 3D printing more versatile.
    Comment: 8 pages, 12 figures, GECCO'18
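    The three components interact in a loop: evaluated designs train the surrogate, and the genetic algorithm proposes new latent vectors scored by it. The schematic Python sketch below stubs the VAE decoder, the surrogate (a k-nearest-neighbour regressor), and the physical evaluation with toy stand-ins; none of these reflect the paper's actual models, and the GA here is mutation-only for brevity.

```python
# Schematic latent-space surrogate optimisation loop (toy stand-ins).
import numpy as np

rng = np.random.default_rng(0)
LATENT_DIM = 8

def decode(z):               # stand-in for the trained VAE decoder
    return np.tanh(z)

def evaluate_print(design):  # stand-in for 3D-printing + physical testing
    return -np.sum((design - 0.5) ** 2) + rng.normal(0, 0.05)  # noisy score

class Surrogate:
    """k-nearest-neighbour regressor as a minimal surrogate model."""
    def __init__(self, k=3):
        self.k, self.X, self.y = k, None, None
    def fit(self, X, y):
        self.X, self.y = np.asarray(X), np.asarray(y)
    def predict(self, z):
        d = np.linalg.norm(self.X - z, axis=1)
        return self.y[np.argsort(d)[:self.k]].mean()

# Seed the surrogate with a few physically evaluated designs.
Z = rng.normal(size=(16, LATENT_DIM))
scores = [evaluate_print(decode(z)) for z in Z]
surrogate = Surrogate(); surrogate.fit(Z, scores)

# Mutation-only GA over latent vectors, scored by the surrogate.
pop = rng.normal(size=(32, LATENT_DIM))
for gen in range(50):
    fit = np.array([surrogate.predict(z) for z in pop])
    parents = pop[np.argsort(fit)[-16:]]                    # truncation selection
    pop = parents[rng.integers(0, 16, 32)] \
          + rng.normal(0, 0.1, (32, LATENT_DIM))            # Gaussian mutation

best = pop[np.argmax([surrogate.predict(z) for z in pop])]
print("surrogate-optimal design:", decode(best))
```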

    Image enhancement from a stabilised video sequence

    The aim of video stabilisation is to create a new video sequence where the motions (i.e. rotations, translations) and scale differences between frames (or parts of a frame) have effectively been removed. These stabilisation effects can be obtained via digital video processing techniques which use the information extracted from the video sequence itself, with no need for additional hardware or knowledge about camera physical motion. A video sequence usually contains a large overlap between successive frames, and regions of the same scene are sampled at different positions. In this paper, this multiple sampling is combined to achieve images with a higher spatial resolution. Higher-resolution imagery plays an important role in assisting the identification of people, vehicles, structures or objects of interest captured by surveillance cameras or by video cameras used in face recognition, traffic monitoring, traffic law enforcement, driver assistance and automatic vehicle guidance systems.
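    Once the stabilisation step has registered the frames, the fusion can be as simple as "shift-and-add": map each low-resolution sample onto a finer grid using its sub-pixel offset and average where samples overlap. The toy sketch below assumes the offsets are already known from registration; the rounding-based mapping is a simplification of a proper reconstruction filter.

```python
# Toy shift-and-add super-resolution from registered frames.
import numpy as np

def shift_and_add(frames, offsets, scale=2):
    """Fuse registered low-res frames onto a grid `scale`x finer.

    frames:  list of (H, W) arrays
    offsets: list of (dy, dx) sub-pixel shifts of each frame, in
             low-res pixels, relative to the reference frame
    """
    H, W = frames[0].shape
    acc = np.zeros((H * scale, W * scale))
    cnt = np.zeros_like(acc)
    for f, (dy, dx) in zip(frames, offsets):
        # Map each low-res pixel centre to the nearest high-res cell.
        ys = np.round((np.arange(H) + dy) * scale).astype(int)
        xs = np.round((np.arange(W) + dx) * scale).astype(int)
        vy = (ys >= 0) & (ys < H * scale)
        vx = (xs >= 0) & (xs < W * scale)
        sub = f[np.ix_(vy, vx)]
        np.add.at(acc, np.ix_(ys[vy], xs[vx]), sub)
        np.add.at(cnt, np.ix_(ys[vy], xs[vx]), 1.0)
    return acc / np.maximum(cnt, 1)   # average where samples overlap
```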

    Real-time modelling and interpolation of spatio-temporal marine pollution

    Due to the complexity of the interactions involved in various dynamic systems, known physical, biological or chemical laws cannot adequately describe the dynamics behind these processes. The study of these systems thus depends on measurements often taken at various discrete spatial locations through time by noisy sensors. For this reason, scientists often require interpolative, visualisation and analytical tools to deal with the large volumes of data common to these systems. The starting point of this study is the seminal research by C. Shannon on sampling and reconstruction theory and its various extensions. Based on recent work on the reconstruction of stochastic processes, this paper develops a novel real-time estimation method for non-stationary stochastic spatio-temporal behaviour based on the Integro-Difference Equation (IDE). This methodology is applied to marine pollution data collected from a Norwegian fjord. Comparison of the results obtained by the proposed method with interpolators from state-of-the-art Geographical Information System (GIS) packages shows that significantly superior results are obtained by including the temporal evolution in the spatial interpolations.
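    In state-space form, an IDE model evolves the field by a kernel convolution, and noisy point observations can then be assimilated in real time with a standard Kalman filter. The sketch below discretises a 1-D domain; the Gaussian mixing kernel, noise levels, and sensor layout are illustrative assumptions, not the paper's fitted model.

```python
# IDE dynamics on a grid + Kalman filtering of noisy point sensors.
import numpy as np

n = 50                                        # grid cells
x = np.linspace(0, 1, n)
# IDE transition: z_{t+1} = M z_t + w_t, with a Gaussian mixing kernel.
M = np.exp(-(x[:, None] - x[None, :])**2 / (2 * 0.05**2))
M /= M.sum(axis=1, keepdims=True)
Q = 0.01 * np.eye(n)                          # process-noise covariance

def kalman_step(z, P, H, y, R):
    """One predict/update cycle for new observations y at sensor rows H."""
    z, P = M @ z, M @ P @ M.T + Q             # predict through the IDE
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)            # Kalman gain
    z = z + K @ (y - H @ z)
    P = (np.eye(n) - K @ H) @ P
    return z, P

# Example: three fixed sensors report a noisy reading at each time step.
sensors = [5, 25, 40]
H = np.zeros((3, n)); H[range(3), sensors] = 1.0
R = 0.1 * np.eye(3)
z, P = np.zeros(n), np.eye(n)
for t in range(100):
    y = np.sin(2 * np.pi * x[sensors] - 0.1 * t) + np.random.normal(0, 0.3, 3)
    z, P = kalman_step(z, P, H, y, R)         # z now interpolates the field
```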

    On Point Spread Function modelling: towards optimal interpolation

    Point Spread Function (PSF) modeling is a central part of any astronomy data analysis relying on measuring the shapes of objects. It is especially crucial for weak gravitational lensing, in order to beat down systematics and allow one to reach the full potential of weak lensing in measuring dark energy. A PSF modeling pipeline is made of two main steps: the first is to assess the PSF shape on stars, and the second is to interpolate it at any desired position (usually galaxies). We focus on the second part, and compare different interpolation schemes, including polynomial interpolation, radial basis functions, Delaunay triangulation and Kriging. For that purpose, we develop simulations of PSF fields, in which stars are built from a set of basis functions defined from a Principal Components Analysis of a real ground-based image. We find that Kriging gives the most reliable interpolation, significantly better than the traditionally used polynomial interpolation. We also note that although a Kriging interpolation on individual images is enough to control systematics at the level necessary for current weak lensing surveys, more elaborate techniques will have to be developed to reach future ambitious surveys' requirements.
    Comment: Accepted for publication in MNRAS
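    For a single scalar PSF property measured on stars (say, one ellipticity component), the Kriging interpolation step looks as follows. This is an illustrative ordinary-kriging sketch with an assumed Gaussian covariance model and toy star field, not the authors' pipeline.

```python
# Ordinary kriging of a PSF property from star positions to a target point.
import numpy as np

def ordinary_krige(xy_star, e_star, xy_target, sill=1.0, rng=0.2):
    cov = lambda h: sill * np.exp(-(h / rng) ** 2)   # assumed covariance model
    n = len(xy_star)
    H = np.linalg.norm(xy_star[:, None] - xy_star[None, :], axis=2)
    # Ordinary-kriging system: covariances plus a Lagrange row enforcing
    # the unbiasedness constraint sum(w) = 1.
    A = np.ones((n + 1, n + 1)); A[:n, :n] = cov(H); A[n, n] = 0.0
    b = np.ones(n + 1)
    b[:n] = cov(np.linalg.norm(xy_star - xy_target, axis=1))
    w = np.linalg.solve(A, b)[:n]
    return w @ e_star

# Usage: stars at known positions, ellipticity e1 measured on each.
stars = np.random.default_rng(1).uniform(0, 1, (40, 2))
e1 = 0.05 * np.sin(2 * np.pi * stars[:, 0])          # toy spatial pattern
print(ordinary_krige(stars, e1, np.array([0.5, 0.5])))
```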

    Detail Enhancing Denoising of Digitized 3D Models from a Mobile Scanning System

    The acquisition process of digitizing a large-scale environment produces an enormous amount of raw geometry data. This data is corrupted by system noise, which leads to 3D surfaces that are not smooth and details that are distorted. Any scanning system has noise associated with the scanning hardware, both digital quantization errors and measurement inaccuracies, but a mobile scanning system has additional system noise introduced by the pose estimation of the hardware during data acquisition. The combined system noise generates data that is not handled well by existing noise reduction and smoothing techniques. This research is focused on enhancing the 3D models acquired by mobile scanning systems used to digitize large-scale environments. These digitization systems combine a variety of sensors – including laser range scanners, video cameras, and pose estimation hardware – on a mobile platform for the quick acquisition of 3D models of real world environments. The data acquired by such systems are extremely noisy, often with significant details being on the same order of magnitude as the system noise. By utilizing a unique 3D signal analysis tool, a denoising algorithm was developed that identifies regions of detail and enhances their geometry, while removing the effects of noise on the overall model. The developed algorithm can be useful for a variety of digitized 3D models, not just those involving mobile scanning systems. The challenges faced in this study were the automatic processing needs of the enhancement algorithm, and the need to fill a gap in the area of 3D model analysis in order to reduce the effect of system noise on the 3D models. In this context, our main contributions are the automation and integration of a data enhancement method not well known to the computer vision community, and the development of a novel 3D signal decomposition and analysis tool. The new technologies featured in this document are intuitive extensions of existing methods to new dimensionality and applications. The totality of the research has been applied towards detail-enhancing denoising of scanned data from a mobile range scanning system, and results from both synthetic and real models are presented.
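    One simple way to express the detail-versus-noise idea is adaptive smoothing: estimate local variance, treat variance above the noise floor as signal, and smooth less where detail is detected. The sketch below uses a 2-D height field as a stand-in for the scanned surface; the detail measure and blending rule are illustrative assumptions, not the dissertation's 3D decomposition tool.

```python
# Detail-aware smoothing of a noisy height field (conceptual stand-in).
import numpy as np
from scipy.ndimage import uniform_filter

def detail_preserving_smooth(z, noise_sigma, win=5):
    """z: 2-D height field corrupted by roughly white noise of std noise_sigma."""
    local_mean = uniform_filter(z, win)
    local_var = uniform_filter(z**2, win) - local_mean**2
    # Detail score in [0, 1]: variance above the noise floor counts as signal.
    detail = np.clip((local_var - noise_sigma**2) / (local_var + 1e-12), 0, 1)
    # Blend: keep the original where detail is high, smooth elsewhere.
    return detail * z + (1 - detail) * local_mean
```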

    How Different Analysis and Interpolation Methods Affect the Accuracy of Ice Surface Elevation Changes Inferred from Satellite Altimetry

    Satellite altimetry has been widely used to determine surface elevation changes in polar ice sheets. The original height measurements are irregularly distributed in space and time. Gridded surface elevation changes are commonly derived by repeat altimetry analysis (RAA) and subsequent spatial interpolation of height change estimates. This article assesses how methodological choices related to those two steps affect the accuracy of surface elevation changes, and how well this accuracy is represented by formal uncertainties. In a simulation environment resembling CryoSat-2 measurements acquired over a region in northeast Greenland between December 2010 and January 2014, different local topography modeling approaches and different cell sizes for RAA, and four interpolation approaches, are tested. Among the simulated cases, the choice of either favorable or unfavorable RAA affects the accuracy of results by about a factor of 6, and the different accuracy levels are propagated into the results of interpolation. For RAA, correcting local topography with an external digital elevation model (DEM) is best if a very precise DEM is available, which is not always the case. Yet the best DEM-independent local topography correction (a nine-parameter model within a 3,000 m diameter cell) is comparable, at the same cell size, to the use of a perfect DEM that exactly represents the ice sheet topography. Interpolation by heterogeneous measurement-error-filtered kriging is significantly more accurate (on the order of 50% error reduction) than interpolation methods that do not account for heterogeneous errors.
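    The measurement-error-filtered kriging step can be sketched directly: each height-change estimate carries its own error variance from the repeat-altimetry fit, and those heterogeneous variances enter only the diagonal of the kriging system, so the prediction targets the noise-free field rather than honoring noisy data exactly. The exponential covariance model and its parameters below are assumptions for illustration.

```python
# Ordinary kriging with per-observation error variances on the diagonal.
import numpy as np

def filtered_krige(xy, dh, sigma2, xy0, sill=1.0, rng=50e3):
    """Predict the noise-free elevation change at xy0.

    xy:     (n, 2) positions in metres
    dh:     (n,) height-change estimates
    sigma2: (n,) error variances of the individual estimates
    """
    cov = lambda h: sill * np.exp(-h / rng)          # assumed field covariance
    n = len(dh)
    H = np.linalg.norm(xy[:, None] - xy[None, :], axis=2)
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = cov(H) + np.diag(sigma2)             # heterogeneous errors here
    A[n, n] = 0.0                                    # Lagrange term, sum(w) = 1
    b = np.ones(n + 1)
    b[:n] = cov(np.linalg.norm(xy - xy0, axis=1))    # no error term on the RHS
    w = np.linalg.solve(A, b)[:n]
    return w @ dh
```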