2,894 research outputs found

    Finsler geometry on higher order tensor fields and applications to high angular resolution diffusion imaging.

    We study 3D-multidirectional images, using Finsler geometry. The application considered here is in medical image analysis, specifically in High Angular Resolution Diffusion Imaging (HARDI) (Tuch et al. in Magn. Reson. Med. 48(6):1358–1372, 2004) of the brain. The goal is to reveal the architecture of the neural fibers in brain white matter. To the variety of existing techniques, we wish to add novel approaches that exploit differential geometry and tensor calculus. In Diffusion Tensor Imaging (DTI), the diffusion of water is modeled by a symmetric positive definite second order tensor, leading naturally to a Riemannian geometric framework. A limitation is that it is based on the assumption that there exists a single dominant direction of fibers restricting the thermal motion of water molecules. Using HARDI data and higher order tensor models, we can extract multiple relevant directions, and Finsler geometry provides the natural geometric generalization appropriate for multi-fiber analysis. In this paper we provide an exact criterion to determine whether a spherical function satisfies the strong convexity criterion essential for a Finsler norm. We also show a novel fiber tracking method in the Finsler setting. Our model incorporates a scale parameter, which can be beneficial in view of the noisy nature of the data. We demonstrate our methods on analytic as well as simulated and real HARDI data.
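
    For context on the strong convexity requirement mentioned above, the standard pointwise condition on a Finsler norm F (a general statement, not the paper's specific criterion for spherical functions) is that the fundamental tensor obtained from the Hessian of F² in the directional argument be positive definite:

```latex
% Standard strong convexity condition for a Finsler norm F(x, y):
% the fundamental tensor must be positive definite for every x and every y \neq 0.
g_{ij}(x, y) \;=\; \frac{1}{2}\,
  \frac{\partial^{2} F^{2}(x, y)}{\partial y^{i}\,\partial y^{j}}
  \;\succ\; 0 , \qquad y \neq 0 .
```

    In the HARDI setting the norm is built from the higher order tensor model, so this positive definiteness has to be verified for the spherical functions involved; the paper's exact criterion addresses precisely that check.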

    On k-Convex Polygons

    We introduce a notion of k-convexity and explore polygons in the plane that have this property. Polygons which are k-convex can be triangulated with fast yet simple algorithms. However, recognizing them in general is a 3SUM-hard problem. We give a characterization of 2-convex polygons, a particularly interesting class, and show how to recognize them in O(n log n) time. A description of their shape is given as well, which leads to Erdős–Szekeres type results regarding subconfigurations of their vertex sets. Finally, we introduce the concept of generalized geometric permutations, and show that their number can be exponential in the number of 2-convex objects considered. Comment: 23 pages, 19 figures.
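
    As a rough illustration, the sketch below assumes the common formulation in which a polygon is k-convex when every line meets it in at most k connected components (the abstract does not spell the definition out, so this is an assumption). It estimates a lower bound on k by sampling random lines; it is only a heuristic check, not the recognition algorithm described in the paper.

```python
import numpy as np

def line_components(poly, p, d):
    """Number of connected pieces of (line through p with direction d) ∩ polygon,
    assuming the simple polygon 'poly' ((n, 2) array of vertices, in order) is in
    generic position with respect to the line (no vertex lies exactly on it)."""
    v = poly - p
    side = d[0] * v[:, 1] - d[1] * v[:, 0]            # signed side of each vertex
    crossings = np.sum(side * np.roll(side, -1) < 0)  # edges strictly crossed
    return crossings // 2        # crossings alternate between entering and leaving

def estimate_k(poly, trials=5000, seed=0):
    """Monte-Carlo lower bound on the smallest k for which 'poly' is k-convex."""
    rng = np.random.default_rng(seed)
    lo, hi = poly.min(axis=0), poly.max(axis=0)
    k = 0
    for _ in range(trials):
        p = rng.uniform(lo, hi)                       # random point in the bounding box
        ang = rng.uniform(0.0, np.pi)
        d = np.array([np.cos(ang), np.sin(ang)])      # random direction
        k = max(k, line_components(poly, p, d))
    return k

# A convex quadrilateral should report k = 1.
square = np.array([[0.0, 0.0], [1.0, 0.0], [1.0, 1.0], [0.0, 1.0]])
print(estimate_k(square))
```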

    The Average Projected Area Theorem - Generalization to Higher Dimensions

    In 3-d, the average projected area of a convex solid is 1/4 the surface area, as Cauchy showed in the 19th century. In general, the ratio in n dimensions may be obtained from Cauchy's surface area formula, which is in turn a special case of Kubota's theorem. However, while these latter results are well known to those working in integral geometry or the theory of convex bodies, they are largely unknown to the physics community, so much so that even the 3-d result is sometimes said to have first been proven by an astronomer in the early 20th century! This is likely because the standard proofs in the mathematical literature are, by and large, couched in terms of concepts that may not be familiar to many physicists. Therefore, in this work, we present a simple geometrical method of calculating the ratio of average projected area to surface area for convex bodies in arbitrary dimensions. We focus on a pedagogical, physically intuitive treatment that, it is hoped, will be useful to those in the physics community. We discuss the mathematical background of the theorem as well, pointing those who may be interested to sources that offer the proofs standard in the fields of integral geometry and the theory of convex bodies. We also provide a discussion of applications of the theorem, especially noting that higher-dimensional ratios may be of use for constructing observational tests of string theory. Finally, we examine the limiting behavior of the ratio with the goal of offering intuition on its behavior, pointing out a suggestive connection with a well-known fact in statistics. Comment: 12 pages, 3 figures, submitted to JGP after addition of discussion of previous work on this topic.
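
    For reference, the ratio described above can be written explicitly from Cauchy's surface area formula. With κ_m denoting the volume of the unit ball in R^m, the mean (n−1)-dimensional projected volume of a convex body K in R^n relates to its surface area S(K) as follows (a standard statement, paraphrased here rather than quoted from the paper):

```latex
% Cauchy's surface area formula and the resulting mean-projection ratio.
S(K) \;=\; \frac{1}{\kappa_{n-1}} \int_{S^{n-1}}
      \operatorname{vol}_{n-1}\!\bigl(K \,|\, u^{\perp}\bigr)\, \mathrm{d}u ,
\qquad
\frac{\bigl\langle \operatorname{vol}_{n-1}(K \,|\, u^{\perp}) \bigr\rangle}{S(K)}
   \;=\; \frac{\kappa_{n-1}}{n\,\kappa_{n}} ,
\qquad
\kappa_{m} \;=\; \frac{\pi^{m/2}}{\Gamma\!\bigl(\tfrac{m}{2}+1\bigr)} .
```

    Setting n = 3 gives κ₂/(3κ₃) = π/(4π) = 1/4, recovering the classical Cauchy result quoted above.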

    Generalized Weiszfeld algorithms for Lq optimization

    In many computer vision applications, a desired model of some type is computed by minimizing a cost function based on several measurements. Typically, one may compute the model that minimizes the L2 cost, that is, the sum of squares of measurement errors with respect to the model. However, the Lq solution, which minimizes the sum of the qth power of errors, usually gives more robust results in the presence of outliers for some values of q, for example q = 1. The Weiszfeld algorithm is a classic algorithm for finding the geometric L1 mean of a set of points in Euclidean space. It is provably optimal and requires neither differentiation nor line search. The Weiszfeld algorithm has also been generalized to find the L1 mean of a set of points on a Riemannian manifold of non-negative curvature. This paper shows that the Weiszfeld approach may be extended to a wide variety of problems to find an Lq mean for 1 ≤ q < 2, while maintaining simplicity and provable convergence. We apply this approach to both single-rotation averaging (for which the algorithm provably finds the global Lq optimum) and multiple rotation averaging (for which no such proof exists). Experimental results of Lq optimization for rotations show improved reliability and robustness compared to L2 optimization. This research has been funded by National ICT Australia.
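
    For readers unfamiliar with the classic iteration, a minimal sketch follows: for q = 1 it is the standard Weiszfeld update for the geometric median in Euclidean space, and the d^(q−2) weighting is one straightforward way to target an Lq cost. It is only an illustration and may differ in detail from the paper's generalization (in particular it does nothing for Riemannian manifolds or rotation averaging).

```python
import numpy as np

def weiszfeld_lq(points, q=1.0, iters=200, eps=1e-9):
    """Iteratively re-weighted mean for the cost sum_i ||x - y_i||^q.
    For q = 1 this is the classic Weiszfeld iteration for the geometric median."""
    x = points.mean(axis=0)                    # start from the L2 mean
    for _ in range(iters):
        d = np.linalg.norm(points - x, axis=1)
        d = np.maximum(d, eps)                 # guard against division by zero
        w = d ** (q - 2.0)                     # q - 2 = -1 in the classic case
        x_new = (w[:, None] * points).sum(axis=0) / w.sum()
        if np.linalg.norm(x_new - x) < eps:
            break
        x = x_new
    return x

pts = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [10.0, 10.0]])  # one outlier
print(weiszfeld_lq(pts, q=1.0))  # pulled far less toward the outlier than the mean
```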

    Estimating Vessel Efficiency Using a Bootstrapped Data Envelopment Analysis Model

    Technical efficiency, which measures how well a firm transforms inputs into outputs, gives fishery managers important information concerning the economic status of the fishing fleet and how regulations may be impacting vessel profitability. Data envelopment analysis (DEA) and the stochastic production frontier (SPF) have emerged as preferred methods to estimate efficiency in fisheries. Although each of the approaches has strengths and weaknesses, DEA has often been criticized because it is "deterministic" and fails to account for noise in the data. This paper presents a method for examining the underlying statistical structure of DEA models using bootstrap methods and readily available software. The approach is then applied to a case study of the U.S. mid-Atlantic sea scallop dredge fleet. Results show that the 95% confidence interval for technically efficient output is well above the maximum sustained yield (MSY) level of output. Keywords: bootstrap methods, data envelopment analysis, technical efficiency. Subject: Research Methods/Statistical Methods. JEL: C44, Q22.
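
    To make the idea concrete, here is a minimal sketch of an input-oriented CCR DEA linear program solved with scipy, wrapped in a naive resampling bootstrap that yields a percentile confidence interval. The vessel data are made up, and the paper's procedure (like the usual Simar–Wilson smoothed bootstrap for DEA) is more involved, so treat this purely as an illustration.

```python
import numpy as np
from scipy.optimize import linprog

def dea_efficiency(X, Y, o):
    """Input-oriented CCR efficiency of unit o.
    X: (n_units, n_inputs), Y: (n_units, n_outputs). Score <= 1, 1 = efficient."""
    n, m = X.shape
    s = Y.shape[1]
    c = np.zeros(n + 1)
    c[0] = 1.0                                 # decision vars: [theta, lambda_1..n]
    A_ub, b_ub = [], []
    for i in range(m):                         # sum_j lam_j * x_ij <= theta * x_io
        A_ub.append(np.concatenate(([-X[o, i]], X[:, i])))
        b_ub.append(0.0)
    for r in range(s):                         # sum_j lam_j * y_rj >= y_ro
        A_ub.append(np.concatenate(([0.0], -Y[:, r])))
        b_ub.append(-Y[o, r])
    bounds = [(0, None)] * (n + 1)
    res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
                  bounds=bounds, method="highs")
    return res.x[0]

def bootstrap_ci(X, Y, o, n_boot=200, alpha=0.05, seed=0):
    """Naive percentile bootstrap CI for unit o, resampling the reference set."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    scores = []
    for _ in range(n_boot):
        idx = rng.integers(0, n, size=n)
        Xb = np.vstack([X[o], X[idx]])         # keep unit o in every resample
        Yb = np.vstack([Y[o], Y[idx]])
        scores.append(dea_efficiency(Xb, Yb, 0))
    return np.quantile(scores, [alpha / 2, 1 - alpha / 2])

# Tiny made-up example: 6 vessels, 2 inputs (days at sea, crew), 1 output (landings).
X = np.array([[10, 4], [12, 5], [8, 3], [15, 6], [9, 4], [11, 5]], dtype=float)
Y = np.array([[20], [22], [18], [25], [15], [21]], dtype=float)
print(dea_efficiency(X, Y, 0), bootstrap_ci(X, Y, 0))
```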

    Blaschke, Separation Theorems and some Topological Properties for Orthogonally Convex Sets

    In this paper, we deal with analytic and geometric properties of orthogonally convex sets. We establish a Blaschke-type theorem for path-connected and orthogonally convex sets in the plane using orthogonally convex paths. The separation of these sets is established using suitable grids. Consequently, a closed and orthogonally convex set is represented as the intersection of staircase halfplanes in the plane. Some topological properties of orthogonally convex sets in finite-dimensional spaces are also given. Comment: 17 pages, 10 figures, adding more references.
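
    As a small illustration of the underlying notion, the sketch below checks orthogonal convexity for an axis-aligned pixel set: a set is orthogonally convex when its intersection with every horizontal and every vertical line is connected. The discrete grid setting is an assumption made for illustration; the paper works with general path-connected sets in the plane.

```python
import numpy as np

def is_orthogonally_convex(mask):
    """True if every row and every column of the boolean grid 'mask'
    has its filled cells in one contiguous run (possibly empty)."""
    def runs_ok(m):
        for line in m:
            idx = np.flatnonzero(line)
            if idx.size and (idx[-1] - idx[0] + 1) != idx.size:
                return False      # a gap inside the run of filled cells
        return True
    return runs_ok(mask) and runs_ok(mask.T)

# A plus-shaped region is orthogonally convex but not convex.
plus = np.array([[0, 1, 0],
                 [1, 1, 1],
                 [0, 1, 0]], dtype=bool)
print(is_orthogonally_convex(plus))   # True
```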