Reconstruction of freeform surfaces for metrology
The application of freeform surfaces has increased since their complex shapes closely express a product's functional specifications and they can now be machined with high accuracy. In particular, optical surfaces exhibit enhanced performance when they take aspheric forms or more complex forms with multiple undulations. This study focuses on the reconstruction of complex shapes such as freeform optical surfaces, and on the characterization of their form. The computer graphics community has proposed various algorithms for constructing a mesh from a cloud of sample points. The mesh is a piecewise-linear approximation of the surface and an interpolation of the point set. The mesh can further be processed for fitting parametric surfaces (Polyworks® or Geomagic®). The metrology community investigates direct fitting approaches. If the surface's mathematical model is given, fitting is a straightforward task. Nonetheless, if the surface model is unknown, fitting is only possible through the association of polynomial spline parametric surfaces. In this paper, a comparative study of methods proposed by the computer graphics community is presented to elucidate the advantages of these approaches. We stress the importance of the pre-processing phase as well as the significance of initial conditions. We further emphasize the importance of the meshing phase by noting that a proper mesh has two major advantages. First, it organizes the initially unstructured point set and provides insight into orientation, neighbourhood and curvature, yielding information on both the geometry and topology of the surface. Second, it enables a better segmentation of the space, leading to a correct patching and association of parametric surfaces.
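The "direct fitting" approach mentioned above can be sketched in its simplest form: a least-squares fit of a low-degree bivariate polynomial to a sampled point cloud. This is an illustrative toy (all names and data here are ours, not the paper's); real metrology pipelines associate spline or NURBS patches rather than a single global polynomial.

```python
import numpy as np

# Synthetic point cloud sampled from a known quadratic surface plus noise
# (hypothetical data for illustration only).
rng = np.random.default_rng(0)
x, y = rng.uniform(-1, 1, (2, 500))
z = 0.5 * x**2 - 0.3 * y**2 + 0.1 * x * y + rng.normal(0, 1e-3, 500)

# Design matrix of degree-2 monomials: 1, x, y, x^2, xy, y^2
A = np.stack([np.ones_like(x), x, y, x**2, x * y, y**2], axis=1)

# Least-squares estimate of the surface coefficients
coeffs, *_ = np.linalg.lstsq(A, z, rcond=None)
residual = np.abs(A @ coeffs - z).max()
```

When the surface model is known, as here, the fit reduces to one linear solve; the difficulty the abstract points to arises precisely when no such model is available.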
Topological Data Analysis with Bregman Divergences
Given a finite set in a metric space, topological data analysis generalizes
hierarchical clustering using a 1-parameter family of homology groups to
quantify connectivity in all dimensions. The connectivity is compactly
described by the persistence diagram. One limitation of the current framework
is the reliance on metric distances, whereas in many practical applications
objects are compared by non-metric dissimilarity measures. Examples are the
Kullback-Leibler divergence, which is commonly used for comparing text and
images, and the Itakura-Saito divergence, popular for speech and sound. These
are two members of the broad family of dissimilarities called Bregman
divergences.
We show that the framework of topological data analysis can be extended to
general Bregman divergences, widening the scope of possible applications. In
particular, we prove that appropriately generalized Cech and Delaunay (alpha)
complexes capture the correct homotopy type, namely that of the corresponding
union of Bregman balls. Consequently, their filtrations give the correct
persistence diagram, namely the one generated by the uniformly growing Bregman
balls. Moreover, we show that, unlike in the metric setting, the filtration of
Vietoris-Rips complexes may fail to approximate the persistence diagram. We
propose algorithms to compute the thus generalized Cech, Vietoris-Rips and
Delaunay complexes and experimentally test their efficiency. Lastly, we explain
their surprisingly good performance by making a connection with discrete Morse
theory.
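The two divergences named in the abstract are both instances of the general Bregman construction D_F(p, q) = F(p) - F(q) - ⟨∇F(q), p - q⟩ for a strictly convex generator F. A minimal sketch (our own illustration, not code from the paper), recovering the generalized Kullback-Leibler and Itakura-Saito divergences from their generators:

```python
import numpy as np

def bregman(F, gradF, p, q):
    """Bregman divergence D_F(p, q) = F(p) - F(q) - <grad F(q), p - q>."""
    return F(p) - F(q) - np.dot(gradF(q), p - q)

# Generator of the (generalized) Kullback-Leibler divergence
F_kl = lambda x: np.sum(x * np.log(x) - x)
g_kl = lambda x: np.log(x)

# Generator of the Itakura-Saito divergence
F_is = lambda x: -np.sum(np.log(x))
g_is = lambda x: -1.0 / x

p = np.array([0.2, 0.5, 0.3])
q = np.array([0.3, 0.4, 0.3])

d_kl = bregman(F_kl, g_kl, p, q)  # equals sum p*log(p/q) - p + q
d_is = bregman(F_is, g_is, p, q)  # equals sum p/q - log(p/q) - 1
```

Note that neither divergence is symmetric in p and q, which is exactly why the metric-based framework does not apply directly.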
The Topology ToolKit
This system paper presents the Topology ToolKit (TTK), a software platform
designed for topological data analysis in scientific visualization. TTK
provides a unified, generic, efficient, and robust implementation of key
algorithms for the topological analysis of scalar data, including: critical
points, integral lines, persistence diagrams, persistence curves, merge trees,
contour trees, Morse-Smale complexes, fiber surfaces, continuous scatterplots,
Jacobi sets, Reeb spaces, and more. TTK is easily accessible to end users due
to a tight integration with ParaView. It is also easily accessible to
developers through a variety of bindings (Python, VTK/C++) for fast prototyping
or through direct, dependency-free C++, to ease integration into pre-existing
complex systems. While developing TTK, we faced several algorithmic and
software engineering challenges, which we document in this paper. In
particular, we present an algorithm for the construction of a discrete gradient
that complies with the critical points extracted in the piecewise-linear setting.
This algorithm guarantees a combinatorial consistency across the topological
abstractions supported by TTK, and importantly, a unified implementation of
topological data simplification for multi-scale exploration and analysis. We
also present a cached triangulation data structure that supports time-efficient
and generic traversals, self-adjusts its memory usage on demand for input
simplicial meshes, and implicitly emulates a triangulation for regular grids
with no memory overhead. Finally, we describe an original
software architecture, which guarantees memory-efficient and direct access to
TTK features, while still offering researchers powerful and easy bindings and
extensions. TTK is open source (BSD license) and its code, online
documentation and video tutorials are available on TTK's website.
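To give an idea of the simplest object in TTK's catalogue, here is a self-contained sketch (our own toy, not TTK's implementation or API) of 0-dimensional persistence pairing for a 1D scalar field: sweep the vertices in increasing order of value and, with a union-find, pair each local minimum with the saddle at which its component merges into an older one. This is the 1D special case of the persistence diagrams and merge trees listed above.

```python
def persistence_pairs_1d(values):
    """Pair local minima with merge saddles of a scalar field on a line."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    parent = {}  # union-find forest over the vertices added so far
    birth = {}   # component representative -> value of its minimum

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path halving
            i = parent[i]
        return i

    pairs = []
    for i in order:                      # add vertices by increasing value
        parent[i] = i
        birth[i] = values[i]
        for j in (i - 1, i + 1):         # neighbours on the line
            if j in parent:
                ri, rj = find(i), find(j)
                if ri != rj:
                    # the younger component (larger birth value) dies here
                    young, old = (ri, rj) if birth[ri] > birth[rj] else (rj, ri)
                    if birth[young] < values[i]:   # skip zero-persistence pairs
                        pairs.append((birth[young], values[i]))
                    parent[young] = old
    return pairs
```

The global minimum is never paired (it generates an essential class); in a multi-scale setting, pairs of low persistence are the ones a simplification pass would cancel.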
Curve counting, instantons and McKay correspondences
We survey some features of equivariant instanton partition functions of
topological gauge theories on four and six dimensional toric Kahler varieties,
and their geometric and algebraic counterparts in the enumerative problem of
counting holomorphic curves. We discuss the relations of instanton counting to
representations of affine Lie algebras in the four-dimensional case, and to
Donaldson-Thomas theory for ideal sheaves on Calabi-Yau threefolds. For
resolutions of toric singularities, an algebraic structure induced by a quiver
determines the instanton moduli space through the McKay correspondence and its
generalizations. The correspondence elucidates the realization of gauge theory
partition functions as quasi-modular forms, and reformulates the computation of
noncommutative Donaldson-Thomas invariants in terms of the enumeration of
generalized instantons. New results include a general presentation of the
partition functions on ALE spaces as affine characters, a rigorous treatment of
equivariant partition functions on Hirzebruch surfaces, and a putative
connection between the special McKay correspondence and instanton counting on
Hirzebruch-Jung spaces.

Comment: 79 pages, 3 figures; v2: typos corrected, reference added, new
summary section included; final version to appear in Journal of Geometry and
Physics.
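As a concrete instance of the "enumeration of generalized instantons" mentioned above (a standard fact recalled for orientation, not a result of this survey): the degree-zero Donaldson-Thomas partition function of $\mathbb{C}^3$ counts plane partitions, and up to the sign redefinition $q \mapsto -q$ it is MacMahon's generating function

```latex
\[
  Z_{\mathrm{DT}}\bigl(\mathbb{C}^3;\,-q\bigr)
  \;=\; \sum_{\pi \,\text{plane partition}} q^{|\pi|}
  \;=\; M(q)
  \;=\; \prod_{n \ge 1} \bigl(1 - q^n\bigr)^{-n},
\]
```

where $|\pi|$ is the number of boxes of $\pi$; in the noncommutative setting these plane partitions play the role of the generalized instantons.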
Surface reconstruction by computing restricted Voronoi cells in parallel
We present a method for reconstructing a 3D surface triangulation from an input point set. The main component of the method is an algorithm that computes the restricted Voronoi diagram. In our specific case, it corresponds to the intersection between the 3D Voronoi diagram of the input points and a set of disks centered at the points and orthogonal to the estimated normal directions. The method does not require coherent normal orientations (just directions). Our algorithm is based on a property of the restricted Voronoi cells that leads to an embarrassingly parallel implementation. We tested our algorithm on scanned point sets with up to 100 million vertices, which were processed within a few minutes on a standard computer. The complete implementation is provided.
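The core geometric step can be sketched as follows (a 2D toy of our own, not the paper's 3D code): the restricted Voronoi cell of a point p is its disk clipped against the bisector half-plane it shares with every other point. Since each cell depends only on p and its neighbours, all cells can be computed independently, which is the source of the embarrassingly parallel structure.

```python
import numpy as np

def clip_halfplane(poly, a, b):
    """Keep the part of polygon `poly` where dot(a, x) <= b."""
    out = []
    n = len(poly)
    for i in range(n):
        p, q = poly[i], poly[(i + 1) % n]
        inp, inq = np.dot(a, p) <= b, np.dot(a, q) <= b
        if inp:
            out.append(p)
        if inp != inq:  # edge crosses the bisector: keep the intersection
            t = (b - np.dot(a, p)) / np.dot(a, q - p)
            out.append(p + t * (q - p))
    return out

def restricted_voronoi_cell(p, others, radius, k=32):
    """Disk of `radius` around p, clipped against all bisectors (2D toy)."""
    angles = np.linspace(0, 2 * np.pi, k, endpoint=False)
    cell = [p + radius * np.array([np.cos(t), np.sin(t)]) for t in angles]
    for q in others:
        a = q - p  # bisector of p and q: dot(a, x) <= dot(a, (p + q) / 2)
        cell = clip_halfplane(cell, a, np.dot(a, (p + q) / 2))
    return np.array(cell)
```

In the paper's 3D setting, the disk lies in the plane orthogonal to the estimated normal direction, and only nearby points contribute non-trivial bisectors, which keeps each cell computation local and cheap.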