
    Multi-Object Analysis of Volume, Pose, and Shape Using Statistical Discrimination

    One goal of statistical shape analysis is the discrimination between two populations of objects. Whereas traditional shape analysis was mostly concerned with studying single objects, analysis of multi-object complexes presents new challenges related to alignment and relative object pose. In this paper, we present a methodology for discriminant analysis of sets of multiple shapes. Shapes are represented by sampled medial manifolds that include normals to the boundary. Non-Euclidean metrics that describe geodesic distances between sets of sampled representations are used for shape alignment and discrimination. Our choice of discriminant method is the distance-weighted discriminant (DWD) because of its generalization ability in high-dimensional, low-sample-size settings. Using an unbiased, soft discrimination score, we can associate a statistical hypothesis test with the discrimination results. Furthermore, the localization and nature of significant differences between populations can be visualized via the average best discriminating axis.
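    As a rough illustration of the kind of composite geodesic distance such sampled medial representations call for, the sketch below combines a Euclidean term for the hub position, a log-scale term for the radius, and spherical arc lengths for the spoke directions. The atom parameterization, the equal weighting of the terms, and all names are assumptions for illustration, not the paper's exact metric.

```python
import numpy as np

def sphere_geodesic(u, v):
    """Arc length (angle) between two unit vectors on the sphere S^2."""
    return np.arccos(np.clip(np.dot(u, v), -1.0, 1.0))

def atom_geodesic_sq(a, b):
    """Squared composite distance between two medial atoms, each given as a
    dict with hub position 'x' (R^3), radius 'r' (> 0), and two unit spoke
    directions 'u0', 'u1' (S^2).  Equal weighting of the terms is an assumption."""
    d_pos = np.linalg.norm(a["x"] - b["x"])
    d_rad = abs(np.log(a["r"]) - np.log(b["r"]))
    d_dir = sphere_geodesic(a["u0"], b["u0"]) + sphere_geodesic(a["u1"], b["u1"])
    return d_pos**2 + d_rad**2 + d_dir**2

def shape_distance(atoms_a, atoms_b):
    """Distance between two equally sampled medial representations."""
    return np.sqrt(sum(atom_geodesic_sq(a, b) for a, b in zip(atoms_a, atoms_b)))
```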

    Multi-object analysis of volume, pose, and shape using statistical discrimination

    One goal of statistical shape analysis is the discrimination between two populations of objects. Whereas traditional shape analysis was mostly concerned with single objects, analysis of multi-object complexes presents new challenges related to alignment and pose. In this paper, we present a methodology for discriminant analysis of multiple objects represented by sampled medial manifolds. Non-Euclidean metrics that describe geodesic distances between sets of sampled representations are used for alignment and discrimination. Our choice of discriminant method is the distance-weighted discriminant because of its generalization ability in high-dimensional, low-sample-size settings. Using an unbiased, soft discrimination score, we associate a statistical hypothesis test with the discrimination results. We explore the effectiveness of different choices of features as input to the discriminant analysis, using measures such as volume, pose, shape, and the combination of pose and shape. Our method is applied to a longitudinal pediatric autism study with 10 subcortical brain structures in a population of 70 subjects. It is shown that the choice of global alignment and the choice of intrinsic versus extrinsic shape features, the latter being sensitive to relative pose, are crucial factors for group discrimination and for explaining the nature of shape change in terms of the application domain.
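    The sketch below shows one way a soft discrimination score can be paired with a permutation-based hypothesis test. Computing the actual DWD direction requires a second-order cone solver, so a simple mean-difference direction stands in for it here; the score definition, the small regularisers, and all function names are assumptions, not the paper's procedure.

```python
import numpy as np

def mean_difference_direction(X, y):
    """Stand-in for the DWD direction: unit vector along the difference
    of the two class means."""
    d = X[y == 1].mean(axis=0) - X[y == 0].mean(axis=0)
    return d / (np.linalg.norm(d) + 1e-12)

def discrimination_score(X, y, direction_fn=mean_difference_direction):
    """Soft discrimination score: separation of the projected class means
    relative to the pooled spread of the projections."""
    proj = X @ direction_fn(X, y)
    return abs(proj[y == 1].mean() - proj[y == 0].mean()) / (proj.std(ddof=1) + 1e-12)

def permutation_pvalue(X, y, n_perm=1000, seed=0):
    """Fraction of label permutations whose score matches or exceeds the
    score obtained with the true labels."""
    rng = np.random.default_rng(seed)
    observed = discrimination_score(X, y)
    null = [discrimination_score(X, rng.permutation(y)) for _ in range(n_perm)]
    return (1 + sum(s >= observed for s in null)) / (n_perm + 1)
```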

    Power Euclidean metrics for covariance matrices with application to diffusion tensor imaging

    Various metrics for comparing diffusion tensors have recently been proposed in the literature. We consider a broad family of metrics indexed by a single power parameter. A likelihood-based procedure is developed for choosing the most appropriate metric from the family for the dataset at hand. The approach is analogous to using the Box-Cox transformation frequently employed in regression analysis. The methodology is illustrated with a simulation study and an application to a real dataset of diffusion tensor images of canine hearts.
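    A minimal sketch of one standard form of the power-Euclidean distance between symmetric positive-definite matrices is shown below, with matrix powers taken through the eigendecomposition. The function names and the synthetic tensors are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def spd_power(S, alpha):
    """Fractional power of a symmetric positive-definite matrix via its
    eigendecomposition."""
    w, V = np.linalg.eigh(S)
    return (V * w**alpha) @ V.T

def power_euclidean_distance(S1, S2, alpha):
    """Power-Euclidean distance: Frobenius norm of the difference of the
    alpha-powers, scaled by 1/|alpha|.  alpha = 1 gives the ordinary
    Euclidean metric; alpha -> 0 approaches the log-Euclidean metric."""
    return np.linalg.norm(spd_power(S1, alpha) - spd_power(S2, alpha), "fro") / abs(alpha)

# Example with two synthetic diffusion tensors (3 x 3 SPD matrices).
D1 = np.diag([1.7, 0.3, 0.2])
D2 = np.diag([1.2, 0.8, 0.4])
print(power_euclidean_distance(D1, D2, alpha=0.5))
```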

    Empirical geodesic graphs and CAT(k) metrics for data analysis

    A methodology is developed for data analysis based on empirically constructed geodesic metric spaces. For a probability distribution, the length along a path between two points can be defined as the amount of probability mass accumulated along the path. The geodesic is then the shortest such path and defines a geodesic metric. Such metrics are transformed in a number of ways to produce parametrised families of geodesic metric spaces, empirical versions of which allow computation of intrinsic means and associated measures of dispersion. These reveal geometric properties of the data that are difficult to see from the raw Euclidean distances. Examples of application include clustering and classification. For certain parameter ranges, the spaces become CAT(0) spaces and the intrinsic means are unique. In one case, a minimal spanning tree of a graph based on the data becomes CAT(0). In another, a so-called "metric cone" construction allows extension to CAT(k) spaces. It is shown how to empirically tune the parameters of the metrics, making it possible to apply them to a number of real cases.
    Comment: Statistics and Computing, 201
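    As a loose illustration of an empirically constructed geodesic metric, the sketch below builds a k-nearest-neighbour graph with powered Euclidean edge lengths, takes all-pairs shortest paths as the empirical geodesics, and picks an intrinsic (Fréchet-type) mean among the sample points. The edge weighting, the parameters, and the names are assumptions and do not reproduce the paper's probability-mass construction.

```python
import numpy as np
from scipy.sparse import csr_matrix
from scipy.sparse.csgraph import shortest_path
from scipy.spatial.distance import cdist

def empirical_geodesic_distances(X, k=10, beta=2.0):
    """All-pairs shortest-path distances on a k-nearest-neighbour graph whose
    edge lengths are Euclidean distances raised to a power beta (an assumed
    surrogate for path lengths derived from probability mass)."""
    D = cdist(X, X)
    rows, cols, weights = [], [], []
    for i in range(len(X)):
        for j in np.argsort(D[i])[1:k + 1]:      # k nearest neighbours of point i
            rows.append(i)
            cols.append(j)
            weights.append(D[i, j] ** beta)
    graph = csr_matrix((weights, (rows, cols)), shape=(len(X), len(X)))
    return shortest_path(graph, directed=False)

def intrinsic_mean_index(G):
    """Index of the sample point minimising the sum of squared geodesic
    distances: an empirical Frechet-type mean restricted to the observed points."""
    return int(np.argmin((G**2).sum(axis=1)))
```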