100 research outputs found

    Discrete structures, algorithms, and applications


    Data-driven quasi-interpolant spline surfaces for point cloud approximation

    In this paper we investigate a local surface approximation, the Weighted Quasi Interpolant Spline Approximation (wQISA), specifically designed for large and noisy point clouds. We briefly describe the properties of the wQISA representation and introduce a novel data-driven implementation, which combines prediction capability with computational efficiency. We provide an extended comparative analysis with other continuous approximations on real data spanning different surface types and noise levels, including 3D models, terrain data and digital environmental data.
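
    The paper provides no code here, but the core quasi-interpolation idea is easy to sketch: estimate each spline control value as a weighted average of the noisy samples near the corresponding knot, then evaluate a smooth tensor-product spline over those values. The Python sketch below is an illustrative assumption of that scheme (it fits an interpolating bicubic spline through the locally averaged values, and the truncated inverse-distance weight, radius and grid size are arbitrary choices), not the authors' exact wQISA operator.

```python
# Minimal sketch in the spirit of wQISA: each control value is a weighted
# local average of noisy samples, and the surface is a tensor-product spline
# over those control values. The weight function (truncated inverse-distance)
# and all parameters are illustrative assumptions, not the paper's scheme.
import numpy as np
from scipy.interpolate import RectBivariateSpline

rng = np.random.default_rng(0)

# Noisy height-field samples z = g(x, y) + noise.
pts = rng.uniform(0.0, 1.0, size=(5000, 2))
z = np.sin(2 * np.pi * pts[:, 0]) * np.cos(2 * np.pi * pts[:, 1])
z += rng.normal(scale=0.05, size=z.shape)

def weighted_control_values(pts, z, grid_u, grid_v, radius=0.08):
    """Estimate a control value at each grid node as a weighted average
    of the samples within `radius` (inverse-distance weights)."""
    ctrl = np.zeros((grid_u.size, grid_v.size))
    for i, u in enumerate(grid_u):
        for j, v in enumerate(grid_v):
            d = np.hypot(pts[:, 0] - u, pts[:, 1] - v)
            mask = d < radius
            w = 1.0 / (d[mask] + 1e-9)
            ctrl[i, j] = np.average(z[mask], weights=w) if mask.any() else 0.0
    return ctrl

grid_u = np.linspace(0, 1, 20)
grid_v = np.linspace(0, 1, 20)
ctrl = weighted_control_values(pts, z, grid_u, grid_v)

# Fit a smooth bicubic spline through the denoised control values and
# evaluate it anywhere on the domain.
surf = RectBivariateSpline(grid_u, grid_v, ctrl, kx=3, ky=3)
print(surf(0.25, 0.25))  # approximates sin(pi/2) * cos(pi/2) = 0
```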

    Regional wave propagation using the discontinuous Galerkin method

    We present an application of the discontinuous Galerkin (DG) method to regional wave propagation. The method makes use of unstructured tetrahedral meshes, combined with a time integration scheme solving the arbitrary high-order derivative (ADER) Riemann problem. This ADER-DG method is high-order accurate in space and time, beneficial for reliable simulations of high-frequency wavefields over long propagation distances. Due to the ease with which tetrahedral grids can be adapted to complex geometries, undulating topography of the Earth's surface and interior interfaces can be readily implemented in the computational domain. The ADER-DG method is benchmarked for the accurate radiation of elastic waves excited by an explosive and a shear dislocation source. We compare real data measurements with synthetics of the 2009 L'Aquila event (central Italy). We take advantage of the geometrical flexibility of the approach to generate a European model composed of the 3-D EPcrust model, combined with the depth-dependent ak135 velocity model in the upper mantle. The results confirm the applicability of the ADER-DG method for regional scale earthquake simulations, which provides an alternative to existing methodologies.
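
    The full ADER-DG machinery on unstructured tetrahedra is well beyond a snippet, but the underlying discontinuous Galerkin idea can be shown in one dimension. The sketch below solves u_t + a u_x = 0 with piecewise-linear elements, an upwind interface flux, and a two-stage SSP Runge-Kutta integrator standing in for the ADER time integration; it is a generic textbook construction, not the solver described in the abstract.

```python
# Minimal 1D discontinuous Galerkin sketch: linear advection u_t + a u_x = 0
# on a periodic grid, piecewise-linear (P1) Legendre elements, upwind flux,
# and two-stage SSP Runge-Kutta time stepping. A generic illustration of the
# DG idea -- element-local polynomials coupled only through interface
# fluxes -- not the ADER-DG tetrahedral solver of the paper.
import numpy as np

a = 1.0                          # advection speed (a > 0: upwind = left cell)
K = 100                          # number of elements
h = 1.0 / K                      # element width
dt = 0.2 * h / a                 # CFL-limited time step
xc = (np.arange(K) + 0.5) * h    # element centres

# Modal dofs per element: u(x) = u0 + u1 * xi, with xi in [-1, 1].
u0 = np.exp(-200 * (xc - 0.5) ** 2)   # cell means of a Gaussian pulse
u1 = np.zeros(K)

def rhs(u0, u1):
    # Upwind flux through the left face of cell k is a times the right-edge
    # trace (u0 + u1, i.e. xi = +1) of the neighbouring cell k-1.
    fl = a * np.roll(u0 + u1, 1)               # flux through left faces
    fr = np.roll(fl, -1)                       # flux through right faces
    du0 = (fl - fr) / h                        # tested with phi0 = 1 (mass h)
    du1 = 3.0 * (2.0 * a * u0 - fr - fl) / h   # phi1 = xi (mass h/3)
    return du0, du1

for _ in range(int(1.0 / dt)):            # advect for one full period
    k0, k1 = rhs(u0, u1)
    v0, v1 = u0 + dt * k0, u1 + dt * k1   # SSP-RK2 (Heun) stage 1
    w0, w1 = rhs(v0, v1)
    u0 = 0.5 * (u0 + v0 + dt * w0)        # stage 2: average back
    u1 = 0.5 * (u1 + v1 + dt * w1)

# The pulse should return near its initial position after one period.
print(np.max(np.abs(u0 - np.exp(-200 * (xc - 0.5) ** 2))))
```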

    Doctor of Philosophy

    Statistical learning theory has garnered attention during the last decade because it provides the theoretical and mathematical framework for solving pattern recognition problems, such as dimensionality reduction, clustering, and shape analysis. In statis…

    Approximation Algorithms for Point Pattern Matching and Searching

    Point pattern matching is a fundamental problem in computational geometry. Given a reference set and a pattern set, the problem is to find a geometric transformation applied to the pattern set that minimizes some given distance measure with respect to the reference set. This problem has been heavily researched under various distance measures and error models. Point set similarity searching is a variation of this problem in which a large database of point sets is given, and the task is to preprocess this database into a data structure so that, given a query point set, it is possible to rapidly find the nearest point set among the elements of the database. Here, the term nearest is understood in the above sense of pattern matching, where the elements of the database may be transformed to match the given query set. The approach presented here is to compute a low-distortion embedding of the pattern matching problem into an (ideally) low-dimensional metric space and then apply any standard algorithm for nearest neighbor searching over this metric space. The main focus of this dissertation is on two problems in the area of point pattern matching and searching algorithms: (i) improving the accuracy of alignment-based point pattern matching and (ii) computing low-distortion embeddings of point sets into vector spaces. For the first problem, new methods are presented for matching point sets based on alignments of small subsets of points. It is shown that these methods lead to better approximation bounds for alignment-based planar point pattern matching algorithms under the Hausdorff distance. Furthermore, it is shown that these approximation bounds are nearly the best achievable by alignment-based methods. For the second problem, results are presented for two different distance measures. First, point pattern similarity search under translation is considered for point sets in multidimensional integer space, where the distance function is the symmetric difference. A randomized embedding into real space under the L1 metric is given, achieving an expected distortion of O(log^2 n). Second, an algorithm is given for embedding R^d under the Earth Mover's Distance (EMD) into multidimensional integer space under the symmetric difference distance. This embedding achieves a distortion of O(log D), where D is the diameter of the point set. Combining this with the above result implies that point pattern similarity search with translation under the EMD can be embedded into real space under the L1 metric with an expected distortion of O(log^2 n log D).
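
    As a toy illustration of the alignment idea discussed above (the classical single-point alignment heuristic, not the dissertation's improved small-subset variant): try every translation that maps some pattern point onto some reference point, and keep the one that minimizes the directed Hausdorff distance.

```python
# Toy alignment-based point pattern matching under translation: try every
# translation mapping some pattern point onto some reference point, and keep
# the one minimizing the directed Hausdorff distance from pattern to
# reference. This is the classical single-point alignment heuristic, not the
# improved small-subset alignments developed in the dissertation.
import numpy as np

def directed_hausdorff(P, R):
    """Max over p in P of the distance from p to its nearest point of R."""
    d = np.linalg.norm(P[:, None, :] - R[None, :, :], axis=2)
    return d.min(axis=1).max()

def best_translation(P, R):
    best_t, best_d = None, np.inf
    for p in P:                # align each pattern point...
        for r in R:            # ...onto each reference point
            t = r - p
            d = directed_hausdorff(P + t, R)
            if d < best_d:
                best_t, best_d = t, d
    return best_t, best_d

rng = np.random.default_rng(1)
R = rng.uniform(0, 10, size=(40, 2))    # reference set
P = R[:15] + np.array([3.0, -2.0])      # translated, slightly noisy pattern
P += rng.normal(scale=0.01, size=P.shape)
t, d = best_translation(P, R)
print(t, d)   # recovers roughly (-3.0, 2.0) with a small Hausdorff distance
```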

    A Probabilistic Framework for Statistical Shape Models and Atlas Construction: Application to Neuroimaging

    Accurate and reliable registration of shapes and multi-dimensional point sets describing the morphology/physiology of anatomical structures is a prerequisite for constructing statistical shape models (SSMs) and atlases. Such statistical descriptions of variability across populations (regarding shape or other morphological/physiological quantities) are based on homologous correspondences across the multiple samples that comprise the training data. The notion of exact correspondence can be ambiguous when these data contain noise and outliers, missing data, or significant and abnormal variations due to pathology. However, such phenomena are common in medical image-derived data, due, for example, to inconsistencies in image quality and acquisition protocols, presence of motion artefacts, differences in pre-processing steps, and inherent variability across patient populations and demographics. This thesis therefore focuses on formulating a unified probabilistic framework for the registration of shapes and so-called generalised point sets, which is robust to the anomalies and variations described.

    Statistical analysis of shapes across large cohorts demands automatic generation of training sets (image segmentations delineating the structure of interest), as manual and semi-supervised approaches can be prohibitively time consuming. However, automated segmentation and landmarking of images often result in shapes with high levels of outliers and missing data. Consequently, a robust method for registration and correspondence estimation is required. A probabilistic group-wise registration framework for point-based representations of shapes, based on Student's t-mixture model (TMM), and a multi-resolution extension of the same (mrTMM), are formulated to this end. The frameworks exploit the inherent robustness of Student's t-distributions to outliers, which is lacking in existing Gaussian mixture model (GMM)-based approaches. The registration accuracy of the proposed approaches was quantitatively evaluated using synthetic and clinical data and shown to outperform the state of the art. A corresponding improvement in the quality of the SSMs generated subsequently was also shown, particularly for data sets containing high levels of noise. In general, the proposed approach requires fewer user-specified parameters than existing methods, whilst affording much improved robustness to outliers.

    Registration of generalised point sets, which combine disparate features such as spatial positions, directional/axial data, and scalar-valued quantities, was studied next. A hybrid mixture model (HMM), combining different types of probability distributions, was formulated to facilitate the joint registration and clustering of multi-dimensional point sets of this nature. Two variants of the HMM were developed, for modelling: (1) axial data; and (2) directional data. The former, based on a combination of Student's t, Watson and Gaussian distributions, was used to register hybrid point sets comprising magnetic resonance diffusion tensor image (DTI)-derived quantities, such as voxel spatial positions (defining a region/structure of interest), associated fibre orientations, and scalar measures reflecting tissue anisotropy. The latter, formulated using a combination of Student's t and von Mises-Fisher distributions, was used to register shapes represented as hybrid point sets comprising spatial positions and associated surface normal vectors.

    The Watson variant of the HMM facilitates statistical analysis and group-wise comparisons of DTI data across patient populations, presented as an exemplar application of the proposed approach. The Fisher variant, on the other hand, was used to register hybrid representations of shapes, providing substantial improvements over point-based registration approaches in terms of anatomical validity in the estimated correspondences.
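
    The robustness that the TMM exploits comes from the E-step of Student's t EM, where every point receives a weight that decays with its squared distance from the current estimate, so gross outliers are automatically down-weighted. Below is a minimal sketch of that mechanism, fitting only the location of a single multivariate t with fixed degrees of freedom and identity covariance (illustrative simplifications; the group-wise TMM/mrTMM registration is far more involved).

```python
# Minimal sketch of why Student's t estimators resist outliers: in the EM
# E-step each point gets weight (nu + d) / (nu + mahalanobis^2), so distant
# points are down-weighted. This fits only the location of a single
# multivariate t with fixed dof and identity covariance -- a far cry from the
# thesis's group-wise t-mixture registration, but it shows the key mechanism.
import numpy as np

rng = np.random.default_rng(2)
d, nu = 2, 3.0
X = rng.normal(loc=5.0, scale=1.0, size=(200, d))       # inliers near (5, 5)
X = np.vstack([X, rng.uniform(-50, 50, size=(20, d))])  # gross outliers

mu = X.mean(axis=0)               # start from the (outlier-corrupted) mean
for _ in range(50):
    r2 = np.sum((X - mu) ** 2, axis=1)       # squared distances (Sigma = I)
    w = (nu + d) / (nu + r2)                 # E-step: per-point weights
    mu = np.average(X, axis=0, weights=w)    # M-step: weighted mean

print(mu)             # close to (5, 5)
print(X.mean(axis=0)) # the plain mean is dragged off by the outliers
```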

    Statistical computation with kernels

    Modern statistical inference has seen a tremendous increase in the size and complexity of models and datasets. As such, it has become reliant on advanced computational tools for implementation. A first canonical problem in this area is the numerical approximation of integrals of complex and expensive functions. Numerical integration is required for a variety of tasks, including prediction, model comparison and model choice. A second canonical problem is that of statistical inference for models with intractable likelihoods. These include models with intractable normalisation constants, or models which are so complex that their likelihood cannot be evaluated, but from which data can be generated. Examples include large graphical models, as well as many models in imaging or spatial statistics. This thesis proposes to tackle these two problems using tools from the kernel methods and Bayesian non-parametrics literature. First, we analyse a well-known algorithm for numerical integration called Bayesian quadrature, and provide consistency and contraction rates. The algorithm is then assessed on a variety of statistical inference problems, and extended in several directions in order to reduce its computational requirements. We then demonstrate how the combination of reproducing kernels with Stein's method can lead to computational tools which can be used with unnormalised densities, including numerical integration and approximation of probability measures. We conclude by studying two minimum distance estimators derived from kernel-based statistical divergences which can be used for unnormalised and generative models. In each instance, the tractability provided by reproducing kernels and their properties allows us to provide easily-implementable algorithms whose theoretical foundations can be studied in depth.
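
    The Bayesian quadrature algorithm analysed in the thesis admits a compact illustration: place a Gaussian process prior on the integrand f, observe it at n nodes, and report the posterior mean of the integral, z^T K^{-1} f, where z_i = ∫ k(x, x_i) dπ(x) is the kernel mean. For a squared-exponential kernel and the uniform measure on [0, 1] the kernel mean has a closed form via the error function; the lengthscale, nugget, and node placement below are arbitrary illustrative choices.

```python
# One-dimensional Bayesian quadrature sketch: put a GP prior on f, observe it
# at n nodes, and report the posterior mean of  integral_0^1 f(x) dx,  which
# is  z^T K^{-1} f  with z the closed-form kernel mean of a squared-
# exponential kernel under the uniform measure. Lengthscale, nugget and node
# placement are arbitrary illustrative choices.
import numpy as np
from scipy.special import erf

ell = 0.15                                   # kernel lengthscale
f = lambda x: np.sin(3 * x) + x ** 2         # test integrand

x = np.linspace(0, 1, 12)                    # quadrature nodes
K = np.exp(-(x[:, None] - x[None, :]) ** 2 / (2 * ell ** 2))
K += 1e-10 * np.eye(x.size)                  # nugget for conditioning

# Kernel mean  z_i = int_0^1 k(t, x_i) dt  in closed form.
s = np.sqrt(2) * ell
z = ell * np.sqrt(np.pi / 2) * (erf((1 - x) / s) + erf(x / s))

estimate = z @ np.linalg.solve(K, f(x))      # BQ posterior mean of integral
truth = (1 - np.cos(3)) / 3 + 1 / 3          # exact integral for comparison
print(estimate, truth)
```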