A probabilistic approach to reducing the algebraic complexity of computing Delaunay triangulations
Computing Delaunay triangulations in R^d involves evaluating the
so-called in_sphere predicate that determines if a point x lies inside, on
or outside the sphere circumscribing d+1 points p_0, ..., p_d. This
predicate reduces to evaluating the sign of a multivariate polynomial of degree
d+2 in the coordinates of the points x, p_0, ..., p_d. Despite
much progress on exact geometric computing, the fact that the degree of the
polynomial increases with d makes the evaluation of the sign of such a
polynomial problematic except in very low dimensions. In this paper, we propose
a new approach that is based on the witness complex, a weak form of the
Delaunay complex introduced by Carlsson and de Silva. The witness complex
Wit(L, W) is defined from two sets L and W in some metric space
X: a finite set L of points on which the complex is built, and a set W of
witnesses that serves as an approximation of X. A fundamental result of de
Silva states that Wit(L, W) = Del(L) if W = X = R^d.
In this paper, we give conditions on L that ensure that the witness complex
and the Delaunay triangulation coincide when W is a finite set, and we
introduce a new perturbation scheme to compute a perturbed set L' close to L
such that Del(L') = Wit(L', W). Our perturbation
algorithm is a geometric application of the Moser-Tardos constructive proof of
the Lovász local lemma. The only numerical operations we use are (squared)
distance comparisons (i.e., predicates of degree 2). The time-complexity of the
algorithm is sublinear in |W|. Interestingly, although the algorithm does not
compute any measure of simplex quality, a lower bound on the thickness of the
output simplices can be guaranteed. Comment: 24 pages
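To illustrate why the in_sphere predicate has degree d+2: it can be expressed as the sign of a (d+2)x(d+2) determinant over the points lifted onto the paraboloid, where the squared-norm column raises the degree by two. Below is a minimal NumPy sketch; the function name, the sign convention (which depends on the orientation of the input simplex), and the floating-point rounding are illustrative stand-ins, since a robust implementation would use exact arithmetic as practiced in exact geometric computing:

```python
import numpy as np

def in_sphere(points, x):
    """Sign of the in_sphere predicate in R^d.

    points: (d+1, d) array holding the simplex vertices p_0, ..., p_d
    x:      (d,) query point

    Returns +1, 0, or -1 according to whether x lies inside, on, or
    outside the circumscribing sphere (up to simplex orientation).
    """
    pts = np.vstack([points, x])                              # (d+2, d)
    # Paraboloid lift: append |p|^2 and a homogenizing 1 to each row.
    lifted = np.hstack([
        pts,
        (pts ** 2).sum(axis=1, keepdims=True),
        np.ones((len(pts), 1)),
    ])                                                        # (d+2, d+2)
    # The determinant is a polynomial of degree d+2 in the coordinates;
    # rounding stands in for the exact sign evaluation a real kernel uses.
    return int(np.sign(np.round(np.linalg.det(lifted), 12)))
```

For the counterclockwise triangle (0,0), (1,0), (0,1), whose circumcircle is centered at (0.5, 0.5), the circumcenter tests inside, a far-away point tests outside, and the vertex-opposite corner (1,1) lies exactly on the circle.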
Discrete Geometric Structures in Homogenization and Inverse Homogenization with application to EIT
We introduce a new geometric approach for the homogenization and inverse
homogenization of the divergence form elliptic operator with rough conductivity
coefficients in dimension two. We show that conductivity
coefficients are in one-to-one correspondence with divergence-free matrices and
convex functions over the domain Ω. Although homogenization is a
non-linear and non-injective operator when applied directly to conductivity
coefficients, homogenization becomes a linear interpolation operator over
triangulations of Ω when re-expressed using convex functions, and is a
volume averaging operator when re-expressed with divergence-free matrices.
Using optimal weighted Delaunay triangulations for linearly interpolating
convex functions, we obtain an optimally robust homogenization algorithm for
arbitrary rough coefficients. Next, we consider inverse homogenization and show
how to decompose it into a linear ill-posed problem and a well-posed non-linear
problem. We apply this new geometric approach to Electrical Impedance
Tomography (EIT). It is known that the EIT problem admits at most one isotropic
solution. If an isotropic solution exists, we show how to compute it from any
conductivity having the same boundary Dirichlet-to-Neumann map. It is known
that the EIT problem admits a unique solution (stable with respect to
G-convergence) in the space of divergence-free matrices. As such, we suggest
that the space of convex functions is the natural space in which to
parameterize solutions of the EIT problem.
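A one-dimensional analogue may help fix intuition for homogenization becoming an averaging operator in the right variables: for a layered medium, the effective conductivity is the harmonic mean of the layer conductivities, i.e., the arithmetic average of 1/σ. This is a standard fact of one-dimensional homogenization, not the two-dimensional divergence-free-matrix construction of the paper; the function name is illustrative:

```python
import numpy as np

def effective_conductivity_1d(sigma):
    """Effective coefficient of a 1-D layered conductivity with layers
    of equal width: homogenization reduces to the harmonic mean, i.e.
    averaging the reciprocal 1/sigma and inverting."""
    sigma = np.asarray(sigma, dtype=float)
    return 1.0 / np.mean(1.0 / sigma)
```

For two equal layers with conductivities 1 and 4, the effective value is 2 / (1 + 1/4) = 1.6, strictly below the arithmetic mean 2.5, which is why naive averaging of rough coefficients fails.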
Adaptive, Anisotropic and Hierarchical cones of Discrete Convex functions
We address the discretization of optimization problems posed on the cone of
convex functions, motivated in particular by the principal agent problem in
economics, which models the impact of monopoly on product quality. Consider a
two-dimensional domain, sampled on a grid of N points. We show that the cone of
restrictions to the grid of convex functions is in general characterized by N^2
linear inequalities; a direct computational use of this description therefore
has a prohibitive complexity. We thus introduce a hierarchy of sub-cones of
discrete convex functions, associated to stencils which can be adaptively,
locally, and anisotropically refined. Numerical experiments optimize the
accuracy/complexity tradeoff through the use of a-posteriori stencil refinement
strategies. Comment: 35 pages, 11 figures. (Second version fixes a small bug in
Lemma 3.2; the modifications are minor.)
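To make the stencil sub-cones concrete: each offset e in a stencil contributes, at every admissible grid point x, the linear inequality u(x+e) + u(x-e) - 2u(x) >= 0, and enlarging the stencil tightens the sub-cone toward the full cone of grid restrictions of convex functions. A minimal Python sketch of the membership test (the function name and the particular stencil are illustrative, not the paper's adaptive refinement algorithm):

```python
import numpy as np

def is_stencil_convex(u, stencil, tol=1e-12):
    """Test membership in the sub-cone of discrete convex functions
    associated with a stencil of integer offsets.

    u:       2-D array of grid values
    stencil: list of offsets (di, dj); each contributes the centered
             second-difference inequalities
             u[x + e] + u[x - e] - 2*u[x] >= 0 at interior points.
    """
    n, m = u.shape
    for di, dj in stencil:
        for i in range(abs(di), n - abs(di)):
            for j in range(abs(dj), m - abs(dj)):
                if u[i + di, j + dj] + u[i - di, j - dj] - 2 * u[i, j] < -tol:
                    return False
    return True
```

A restriction of a smooth convex function, such as u(i, j) = (i-2)^2 + (j-2)^2 on a 5x5 grid, passes the test for the axis and diagonal stencil, while its negation fails it.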
VoroCrust: Voronoi Meshing Without Clipping
Polyhedral meshes are increasingly becoming an attractive option with
particular advantages over traditional meshes for certain applications. What
has been missing is a robust polyhedral meshing algorithm that can handle broad
classes of domains exhibiting arbitrarily curved boundaries and sharp features.
In addition, the power of primal-dual mesh pairs, exemplified by
Voronoi-Delaunay meshes, has been recognized as an important ingredient in
numerous formulations. The VoroCrust algorithm is the first provably-correct
algorithm for conforming polyhedral Voronoi meshing for non-convex and
non-manifold domains with guarantees on the quality of both surface and volume
elements. A robust refinement process estimates a suitable sizing field that
enables the careful placement of Voronoi seeds across the surface circumventing
the need for clipping and avoiding its many drawbacks. The algorithm has the
flexibility of filling the interior by either structured or random samples,
while preserving all sharp features in the output mesh. We demonstrate the
capabilities of the algorithm on a variety of models and compare against
state-of-the-art polyhedral meshing methods based on clipped Voronoi cells,
establishing the clear advantage of VoroCrust output. Comment: 18 pages
(including appendix), 18 figures. Version without compressed
images available on https://www.dropbox.com/s/qc6sot1gaujundy/VoroCrust.pdf.
Supplemental materials available on
https://www.dropbox.com/s/6p72h1e2ivw6kj3/VoroCrust_supplemental_materials.pd
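The seed-placement idea can be illustrated in a few lines: mirroring each surface sample across its normal yields a seed pair whose perpendicular bisector is the tangent plane at the sample, so the Voronoi facet the two seeds share (when it survives in the full diagram) conforms to the surface with no clipping required. A hypothetical NumPy sketch; the function name and the uniform offset r are illustrative simplifications, whereas the actual algorithm estimates a sizing field and treats sharp features specially:

```python
import numpy as np

def mirrored_seed_pairs(samples, normals, r):
    """Place paired Voronoi seeds mirrored across a sampled surface.

    samples: (n, d) surface sample points
    normals: (n, d) outward normals at the samples
    r:       offset radius (a uniform stand-in for a sizing field)

    Each pair (p - r*n, p + r*n) has the tangent plane at p as its
    perpendicular bisector, so the shared Voronoi facet of the pair
    lies on the sampled surface.
    """
    normals = normals / np.linalg.norm(normals, axis=1, keepdims=True)
    return samples - r * normals, samples + r * normals

# Example: samples on the unit circle, where the outward normal at a
# sample equals the sample itself; midpoints of the pairs recover the
# samples and each pair is separated by 2*r along the normal.
t = np.linspace(0.0, 2.0 * np.pi, 16, endpoint=False)
circle = np.stack([np.cos(t), np.sin(t)], axis=1)
inside, outside = mirrored_seed_pairs(circle, circle, r=0.1)
```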