
    Particle Density Estimation with Grid-Projected Adaptive Kernels

    The reconstruction of smooth density fields from scattered data points has applications in a variety of disciplines, including Lagrangian (particle-based) models of solute transport in fluids. In random walk particle tracking (RWPT) simulations, particle density is directly linked to solute concentration, which is normally the main variable of interest, not only for visualization and post-processing of the results but also for the computation of non-linear processes such as chemical reactions. Previous work has shown the superiority of kernel density estimation (KDE) over alternatives such as binning in terms of its ability to accurately estimate the "true" particle density from a limited amount of information. Here, we develop a grid-projected KDE methodology that determines particle densities by applying kernel smoothing to a pilot binning; this may be seen as a "hybrid" approach between binning and KDE. The kernel bandwidth is optimized locally. Through simple implementation examples, we illustrate several appealing aspects of the proposed approach, including its computational efficiency and its ability to account for typical boundary conditions, which would otherwise be cumbersome in conventional KDE.
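The pilot-binning-plus-smoothing idea can be sketched in a few lines. The following is an illustrative fixed-bandwidth sketch in Python with NumPy/SciPy; the function name is ours, and the paper's method additionally optimizes the bandwidth locally rather than using a single global value:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def grid_projected_kde(x, grid_min, grid_max, n_cells, bandwidth):
    """Estimate a 1-D particle density by binning (the "pilot" histogram)
    followed by Gaussian kernel smoothing on the grid."""
    edges = np.linspace(grid_min, grid_max, n_cells + 1)
    dx = edges[1] - edges[0]
    counts, _ = np.histogram(x, bins=edges)
    # Pilot binning: particle counts normalized to a density.
    pilot = counts / (len(x) * dx)
    # Smooth the pilot binning with a Gaussian kernel; sigma is
    # expressed in grid cells. The boundary mode is where typical
    # boundary conditions could be accounted for.
    return gaussian_filter(pilot, sigma=bandwidth / dx, mode="nearest")

rng = np.random.default_rng(0)
particles = rng.normal(0.0, 1.0, 10_000)
density = grid_projected_kde(particles, -5.0, 5.0, 200, bandwidth=0.3)
```

Because the kernel is applied to grid values rather than to every particle, the cost is dominated by a single histogram pass plus one grid convolution, which is the source of the computational efficiency noted above.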

    T-PHOT version 2.0: improved algorithms for background subtraction, local convolution, kernel registration, and new options

    We present the new release, v2.0, of T-PHOT, a publicly available software package developed to perform PSF-matched, prior-based, multiwavelength deconfusion photometry of extragalactic fields. The new features included in the code are presented and discussed: background estimation, fitting with position-dependent kernels, flux priors, diagnostic statistics on the residual image, exclusion of selected sources from the model and residual images, and individual registration of fitted objects. These new options improve the performance of the code, allowing for more accurate results and providing useful aids for diagnostics. Comment: 7 pages, 8 figures
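The core of prior-based deconfusion photometry—fitting fluxes in a low-resolution image using high-resolution source templates convolved to the target PSF—reduces to a linear least-squares problem. A minimal, self-contained sketch in Python/NumPy (this is an illustration of the general idea, not T-PHOT's actual implementation; the function name and toy scene are invented):

```python
import numpy as np

def fit_fluxes(templates, image):
    """Solve for per-source fluxes by linear least squares:
    image ~ sum_i flux_i * template_i, where each template is a
    high-resolution source profile already convolved with the kernel
    matching the low-resolution PSF."""
    A = np.stack([t.ravel() for t in templates], axis=1)  # (npix, nsrc)
    b = image.ravel()
    fluxes, *_ = np.linalg.lstsq(A, b, rcond=None)
    residual = image - (A @ fluxes).reshape(image.shape)
    return fluxes, residual

# Toy scene: two overlapping Gaussian blobs with known fluxes 3.0 and 1.5.
yy, xx = np.mgrid[0:32, 0:32]
def blob(cx, cy, s=3.0):
    g = np.exp(-((xx - cx)**2 + (yy - cy)**2) / (2 * s**2))
    return g / g.sum()  # unit total flux
t1, t2 = blob(12, 16), blob(20, 16)
img = 3.0 * t1 + 1.5 * t2
fluxes, res = fit_fluxes([t1, t2], img)
```

The residual image returned here is the quantity on which the new diagnostic statistics mentioned above would be computed.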

    Inference for variograms

    The empirical variogram is a standard tool in the investigation and modelling of spatial covariance. However, its properties can be difficult to identify and exploit when exploring the characteristics of individual datasets, particularly when seeking to move beyond description towards inferential statements about the structure of the spatial covariance that may be present. A robust form of empirical variogram based on a fourth-root transformation is used; this takes advantage of the normal approximation, which gives an excellent description of the variation exhibited on this scale. Calculation of the mean, variance and covariance of the binned empirical variogram then allows useful quantities, such as confidence intervals, to be added to the underlying estimator, as illustrated by the comparison of variograms for different datasets. The suitability of simplifying assumptions such as isotropy and stationarity can also be investigated through the construction of appropriate test statistics, and the distributional calculations required for the associated p-values can be performed through quadratic-form methods. Examples of the use of these methods in assessing the form of spatial covariance present in datasets are shown, both through hypothesis tests and in graphical form. A simulation study explores the properties of the tests, while pollution data on mosses in Galicia (north-west Spain) provide a real-data illustration.
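A minimal sketch of the binned empirical variogram computed on the fourth-root scale, i.e. averaging ((z_i - z_j)^2)^(1/4) = |z_i - z_j|^(1/2) within each distance bin (illustrative Python/NumPy; the function name and binning choices are ours, not the paper's):

```python
import numpy as np

def binned_variogram(coords, z, bin_edges):
    """Binned empirical variogram on the fourth-root scale: for each
    distance bin, average |z_i - z_j|^(1/2) over all point pairs whose
    separation falls in that bin. On this scale the variation is well
    described by a normal approximation."""
    # All pairwise separations and fourth-root-scale differences.
    d = np.sqrt(((coords[:, None, :] - coords[None, :, :])**2).sum(-1))
    diff = np.abs(z[:, None] - z[None, :])**0.5
    iu = np.triu_indices(len(z), k=1)      # each unordered pair once
    d, diff = d[iu], diff[iu]
    which = np.digitize(d, bin_edges)      # bin index per pair
    return np.array([diff[which == k].mean() if np.any(which == k) else np.nan
                     for k in range(1, len(bin_edges))])

rng = np.random.default_rng(1)
coords = rng.uniform(0.0, 1.0, (50, 2))
z = rng.normal(0.0, 1.0, 50)
g = binned_variogram(coords, z, np.linspace(0.0, 1.0, 6))
```

Confidence intervals of the kind described above would then come from the (approximately normal) sampling distribution of each binned mean.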

    Multi-scale gapped smoothing algorithm for robust baseline-free damage detection in optical infrared thermography

    Flash thermography is a promising technique for rapid non-destructive testing of composite materials. However, several difficulties are inherent to this approach, such as non-uniform heating, measurement noise and lateral heat diffusion effects. Advanced signal-processing techniques are therefore indispensable for analyzing the recorded dataset. One such technique is the Gapped Smoothing Algorithm, which predicts a gapped pixel's value in its sound state from a measurement in the defected state by evaluating only its neighboring pixels. However, the standard Gapped Smoothing Algorithm uses a fixed spatial gap size, which makes it difficult to detect defects of variable size in a noisy dataset. In this paper, a Multi-Scale Gapped Smoothing Algorithm (MSGSA) is introduced as a baseline-free image-processing technique and an extension of the standard Gapped Smoothing Algorithm. The MSGSA evaluates a wide range of spatial gap sizes so that defects of highly different dimensions are identified. Moreover, it is shown that a weighted combination of all assessed spatial gap sizes significantly improves the detectability of defects and results in an (almost) zero-reference background; the technique thus effectively suppresses measurement noise and excitation non-uniformity. The efficiency of the MSGSA technique is evaluated and confirmed through numerical simulations and flash-thermography experiments on carbon-fiber-reinforced polymers with various defect sizes.
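The gap-and-predict step, and one simple way to combine scales, can be sketched as follows (an illustrative Python/NumPy version of the general idea; the prediction model, boundary handling and the paper's weighting scheme are assumptions, and may differ from the MSGSA as published):

```python
import numpy as np

def gapped_smoothing(img, gap):
    """Predict each pixel's 'sound-state' value from the four neighbors
    at distance `gap` (simple average), and return the squared
    deviation of the measurement from that prediction (a damage map)."""
    padded = np.pad(img, gap, mode="edge")
    center = padded[gap:-gap, gap:-gap]          # original image
    pred = (padded[:-2*gap, gap:-gap] + padded[2*gap:, gap:-gap] +
            padded[gap:-gap, :-2*gap] + padded[gap:-gap, 2*gap:]) / 4.0
    return (center - pred)**2

def multi_scale_gsa(img, gaps):
    """Combine damage maps over gap sizes by multiplication, so only
    pixels anomalous at every scale survive: one simple choice of
    combination that suppresses scale-specific noise."""
    out = np.ones_like(img, dtype=float)
    for g in gaps:
        out *= gapped_smoothing(img, g)
    return out

img = np.add.outer(np.arange(32.0), np.arange(32.0))  # smooth "sound" field
img[16, 16] += 5.0                                    # point defect
dmap = multi_scale_gsa(img, gaps=[1, 2])
```

On a smoothly varying background the neighbor average reproduces the pixel value and the map is near zero, so the defect stands out against an (almost) zero-reference background, as described above.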