
    A variational approach to path estimation and parameter inference of hidden diffusion processes

    We consider a hidden Markov model in which the signal process, given by a diffusion, is only indirectly observed through noisy measurements. The article develops a variational method for approximating the hidden states of the signal process given the full set of observations; in particular, this leads to systematic approximations of the smoothing densities of the signal process. The paper then demonstrates how an efficient inference scheme, based on this variational approximation of the hidden states, can be designed to estimate the unknown parameters of stochastic differential equations. Two examples at the end illustrate the efficacy and accuracy of the presented method.
    Comment: 37 pages, 2 figures, revised
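    The setting can be made concrete with a minimal sketch (not the paper's variational method): for a linear diffusion such as an Ornstein-Uhlenbeck process, the smoothing densities that the variational scheme approximates are available exactly via a Kalman filter plus Rauch-Tung-Striebel backward pass, which makes a useful reference point. All parameter values below are illustrative assumptions.

```python
import numpy as np

# Hypothetical setup: an OU signal dX = -theta*X dt + sigma dW, observed
# at each step with Gaussian noise. For this linear SDE the smoothing
# densities are exact (Kalman + RTS), the baseline that variational
# approximations target in the nonlinear case.
rng = np.random.default_rng(0)
theta, sigma, obs_std, dt, n = 1.0, 0.5, 0.3, 0.01, 1000

# simulate the hidden path (Euler-Maruyama) and noisy observations
x = np.zeros(n)
for k in range(1, n):
    x[k] = x[k-1] - theta * x[k-1] * dt + sigma * np.sqrt(dt) * rng.normal()
y = x + obs_std * rng.normal(size=n)

# Kalman filter for the discretized transition x_k ~ a*x_{k-1} + noise
a, q, r = 1.0 - theta * dt, sigma**2 * dt, obs_std**2
m_f, p_f = np.zeros(n), np.zeros(n)
m, p = 0.0, 1.0
for k in range(n):
    if k > 0:
        m, p = a * m, a * a * p + q            # predict
    g = p / (p + r)                            # Kalman gain
    m, p = m + g * (y[k] - m), (1 - g) * p     # update
    m_f[k], p_f[k] = m, p

# Rauch-Tung-Striebel backward pass -> smoothing means
m_s = m_f.copy()
for k in range(n - 2, -1, -1):
    p_pred = a * a * p_f[k] + q
    c = a * p_f[k] / p_pred
    m_s[k] = m_f[k] + c * (m_s[k+1] - a * m_f[k])

rmse_filter = np.sqrt(np.mean((m_f - x)**2))
rmse_smooth = np.sqrt(np.mean((m_s - x)**2))
```

    As expected, the smoother (which uses all observations) reduces the error of the filter (which uses only past observations), mirroring the role of the smoothing densities in the abstract.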

    Optimal rates of convergence for persistence diagrams in Topological Data Analysis

    Computational topology has recently seen an important development toward data analysis, giving birth to the field of topological data analysis. Topological persistence, or persistent homology, is a fundamental tool in this field. In this paper, we study topological persistence in general metric spaces from a statistical point of view. We show that the use of persistent homology can be naturally considered in general statistical frameworks, and that persistence diagrams can be used as statistics with interesting convergence properties. Some numerical experiments are performed in various contexts to illustrate our results.
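    The simplest instance of the persistence diagrams discussed here is degree-0 persistent homology of a finite metric space, computable with a union-find over pairwise distances: every point is born at scale 0, and a connected component "dies" at the single-linkage merge distance. A minimal sketch (illustrative, not the paper's estimators):

```python
import numpy as np
from itertools import combinations

def persistence_0d(points):
    """Finite death times of 0-dim persistent homology (Vietoris-Rips)."""
    n = len(points)
    edges = [(np.linalg.norm(points[i] - points[j]), i, j)
             for i, j in combinations(range(n), 2)]
    parent = list(range(n))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path halving
            i = parent[i]
        return i

    deaths = []
    for dist, i, j in sorted(edges):
        ri, rj = find(i), find(j)
        if ri != rj:
            parent[ri] = rj
            deaths.append(dist)  # one component dies at this merge scale
    return sorted(deaths)

# two well-separated clusters on the line: the one large death time
# in the diagram reflects the gap between the clusters
pts = np.array([[0.0], [0.1], [0.2], [5.0], [5.1]])
deaths = persistence_0d(pts)
```

    The large death time (about the inter-cluster gap) is the kind of stable, topologically meaningful feature whose convergence properties the paper studies.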

    Continuity of the Maximum-Entropy Inference

    We study the inverse problem of inferring the state of a finite-level quantum system from expected values of a fixed set of observables, by maximizing a continuous ranking function. We have proved earlier that the maximum-entropy inference can be a discontinuous map from the convex set of expected values to the convex set of states, because the image contains states of reduced support, while this map restricts to a smooth parametrization of a Gibbsian family of fully supported states. Here we prove, for arbitrary ranking functions, that the inference is continuous up to boundary points. This follows from a continuity condition in terms of the openness of the restricted linear map from states to their expected values. The openness condition also shows that ranking functions with a discontinuous inference are typical. Moreover, it shows that the inference is continuous in the restriction to any polytope, which implies that a discontinuity belongs to the quantum domain of non-commutative observables and that a geodesic closure of a Gibbsian family equals the set of maximum-entropy states. We discuss eight descriptions of the set of maximum-entropy states with proofs of accuracy and an analysis of deviations.
    Comment: 34 pages, 1 figure
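    A classical, commutative analogue of the Gibbsian family may help fix ideas (an illustrative sketch under simplifying assumptions, not the paper's quantum setting): the maximum-entropy distribution on outcomes {0, ..., d-1} with a prescribed expected value of an observable f is a Gibbs state p_i proportional to exp(lam * f_i), and the multiplier lam can be found by bisection. As the prescribed value approaches an extreme value of f, mass concentrates and the support effectively shrinks, the classical shadow of the reduced-support boundary states mentioned above.

```python
import numpy as np

def maxent(f, c, lo=-50.0, hi=50.0, iters=200):
    """Max-entropy distribution with E[f] = c, via bisection on the
    Lagrange multiplier of the Gibbs family p ~ exp(lam * f)."""
    f = np.asarray(f, dtype=float)

    def mean(lam):
        w = np.exp(lam * (f - f.max()))  # shift for numerical stability
        p = w / w.sum()
        return p @ f, p

    for _ in range(iters):
        lam = 0.5 * (lo + hi)
        m, p = mean(lam)
        if m < c:       # E[f] is monotone increasing in lam
            lo = lam
        else:
            hi = lam
    return p

f = np.array([0.0, 1.0, 2.0])
p_mid = maxent(f, 1.0)    # midpoint constraint -> uniform distribution
p_edge = maxent(f, 1.99)  # near-extreme constraint -> support concentrates
```

    In the quantum problem the same inference runs over density matrices and non-commuting observables, which is exactly where the discontinuities discussed in the abstract can appear.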

    Expert systems and finite element structural analysis - a review

    Finite element analysis of many engineering systems is practised more as an art than as a science. It involves high-level expertise (analytical as well as heuristic) regarding problem modelling (e.g. problem specification, choosing the appropriate type of elements, etc.), optimal mesh design for achieving the specified accuracy (e.g. initial mesh selection, adaptive mesh refinement), selection of the appropriate type of analysis and solution routines and, finally, diagnosis of the finite element solutions. Very often such expertise is highly dispersed and is not available at a single place with a single expert. The design of an expert system, such that the necessary expertise is available to a novice to perform the same job even in the absence of trained experts, becomes an attractive proposition.
    In this paper, the areas of finite element structural analysis which require experience and decision-making capabilities are explored. A simple expert system, with a feasible knowledge base for problem modelling, optimal mesh design, type of analysis and solution routines, and diagnosis, is outlined. Several efforts in these directions, reported in the open literature, are also reviewed in this paper.
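    The knowledge-base idea can be illustrated with a toy rule-based sketch (the rules and names below are invented for illustration, not taken from the paper): each rule pairs a condition on the problem description with a recommendation, mimicking how dispersed modelling heuristics could be encoded for a novice user.

```python
# A toy rule base for finite element modelling decisions. Each entry is
# (condition on the problem description, recommendation); the "inference
# engine" simply fires every rule whose condition holds.
RULES = [
    (lambda p: p.get("geometry") == "thin_plate",
     "use shell elements"),
    (lambda p: p.get("geometry") == "solid",
     "use 3D brick elements"),
    (lambda p: p.get("loading") == "dynamic",
     "perform transient (time-history) analysis"),
    (lambda p: p.get("expected_gradient") == "high",
     "refine the mesh adaptively near stress concentrations"),
]

def advise(problem):
    """Collect the recommendations of all applicable rules."""
    return [advice for cond, advice in RULES if cond(problem)]

problem = {"geometry": "thin_plate", "loading": "dynamic",
           "expected_gradient": "high"}
recommendations = advise(problem)
```

    A real system of the kind reviewed in the paper would add certainty factors, rule chaining, and an explanation facility, but the structure (a declarative knowledge base separated from a generic inference engine) is the same.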

    Robust Geometry Estimation using the Generalized Voronoi Covariance Measure

    The Voronoi Covariance Measure of a compact set K of R^d is a tensor-valued measure that encodes geometric information on K and which is known to be resilient to Hausdorff noise but sensitive to outliers. In this article, we generalize this notion to any distance-like function delta and define the delta-VCM. We show that the delta-VCM is resilient to Hausdorff noise and to outliers, thus providing a tool to robustly estimate normals from a point-cloud approximation. We present experiments showing the robustness of our approach for normal and curvature estimation and sharp feature detection.
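    The basic mechanism of covariance-based normal estimation can be sketched in a much cruder form than the VCM (an illustrative sketch; the function name, neighbourhood radius, and test data are assumptions): take the eigenvector of the local covariance matrix with the smallest eigenvalue as the normal direction.

```python
import numpy as np

def estimate_normal(points, query, radius):
    """Normal at `query`: direction of least variance of the neighbours
    within `radius` (plain local PCA, not the tensor-valued VCM)."""
    nbrs = points[np.linalg.norm(points - query, axis=1) < radius]
    centered = nbrs - nbrs.mean(axis=0)
    cov = centered.T @ centered / len(nbrs)
    vals, vecs = np.linalg.eigh(cov)  # eigenvalues in ascending order
    return vecs[:, 0]                 # smallest-eigenvalue eigenvector

# noisy samples of the plane z = 0: the estimate should align with e_z
rng = np.random.default_rng(1)
pts = np.column_stack([rng.uniform(-1, 1, 500),
                       rng.uniform(-1, 1, 500),
                       0.01 * rng.normal(size=500)])
n_hat = estimate_normal(pts, np.array([0.0, 0.0, 0.0]), radius=0.5)
```

    Unlike this plain local PCA, which a few outliers can skew badly, the delta-VCM of the paper integrates covariance information against a distance-like function precisely to keep such estimates stable under outliers.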

    Transform-based particle filtering for elliptic Bayesian inverse problems

    We introduce optimal transport based resampling in adaptive SMC. We consider elliptic inverse problems of inferring hydraulic conductivity from pressure measurements. We consider two parametrizations of hydraulic conductivity: by a Gaussian random field, and by a set of scalar (non-)Gaussian distributed parameters and Gaussian random fields. We show that for scalar parameters optimal transport based SMC performs comparably to multinomial based SMC, but for high-dimensional Gaussian random fields optimal transport based SMC outperforms multinomial based SMC. When comparing to ensemble Kalman inversion with mutation (EKI), we observe that for Gaussian random fields, optimal transport based SMC gives comparable or worse performance than EKI depending on the complexity of the parametrization. For non-Gaussian distributed parameters optimal transport based SMC outperforms EKI.
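    The baseline scheme the comparison refers to is multinomial resampling: draw N particle indices in proportion to the importance weights. This preserves the weighted posterior mean in expectation but injects resampling noise, which is the variance that transport-based resampling aims to reduce. A minimal sketch with illustrative names and data:

```python
import numpy as np

def multinomial_resample(particles, weights, rng):
    """Draw N indices with probability proportional to the weights."""
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    return particles[idx]

rng = np.random.default_rng(2)
particles = rng.normal(size=5000)             # prior samples, N(0, 1)
w = np.exp(-0.5 * (particles - 1.0) ** 2)     # likelihood for a datum y = 1
w /= w.sum()                                  # normalized importance weights

resampled = multinomial_resample(particles, w, rng)
weighted_mean = w @ particles                 # self-normalized estimate
resampled_mean = resampled.mean()             # estimate after resampling
```

    An optimal transport based scheme replaces the random index draw with a (deterministic) transformation of the weighted ensemble into an equally weighted one, removing the extra Monte Carlo noise visible in the gap between `weighted_mean` and `resampled_mean`.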

    Selective inference after feature selection via multiscale bootstrap

    It is common to report the confidence intervals or p-values of selected features, or predictor variables in regression, but these often involve selection bias. The selective inference approach removes this bias by conditioning on the selection event. Most existing studies of selective inference consider a specific algorithm, such as the Lasso, for feature selection, and thus have difficulty handling more complicated algorithms. Moreover, existing studies often condition on unnecessarily restrictive events, leading to over-conditioning and lower statistical power. Our novel and widely applicable resampling method addresses these issues to compute an approximately unbiased selective p-value for the selected features. We prove that the p-value computed by our resampling method is more accurate and more powerful than those of existing methods, while the computational cost is of the same order as the classical bootstrap method. Numerical experiments demonstrate that our algorithm works well even for more complicated feature selection methods such as non-convex regularization.
    Comment: The title has changed (the previous title was "Selective inference after variable selection via multiscale bootstrap"). 23 pages, 11 figures
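    The selection bias the paper addresses is easy to demonstrate (this sketch shows the problem, not the paper's multiscale bootstrap correction; dimensions and sample sizes are illustrative): under a global null, selecting the feature with the largest sample mean and then testing it with a naive two-sided z-test yields p-values that are far too small.

```python
import math
import numpy as np

def naive_p(z):
    """Two-sided p-value of a z-statistic, ignoring the selection step."""
    return 2.0 * (1.0 - 0.5 * (1.0 + math.erf(abs(z) / math.sqrt(2.0))))

rng = np.random.default_rng(3)
d, n, reps = 10, 25, 2000
pvals = []
for _ in range(reps):
    X = rng.normal(size=(n, d))             # all d true means are zero
    j = np.argmax(np.abs(X.mean(axis=0)))   # feature selection step
    z = X[:, j].mean() * math.sqrt(n)       # z-statistic of the winner
    pvals.append(naive_p(z))

# valid p-values would fall below 0.05 about 5% of the time; here the
# rejection rate is several times that because selection is ignored
frac_below_05 = np.mean(np.array(pvals) < 0.05)
```

    Conditioning on the selection event, as selective inference does, is what restores the uniform null distribution of the p-value.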