Least Squares Ranking on Graphs
Given a set of alternatives to be ranked, and some pairwise comparison data,
ranking is a least squares computation on a graph. The vertices are the
alternatives, and the edge values comprise the comparison data. The basic idea
is very simple and old: come up with values on vertices such that their
differences match the given edge data. Since an exact match will usually be
impossible, one settles for matching in a least squares sense. This formulation
was first described by Leake in 1976 for ranking football teams and appears as
an example in Professor Gilbert Strang's classic linear algebra textbook. If
one is willing to look into the residual a little further, then the problem
really comes alive, as shown effectively by the remarkable recent paper of
Jiang et al. With or without this twist, the humble least squares problem on
graphs has far-reaching connections with many current areas of research. These
connections are to theoretical computer science (spectral graph theory, and
multilevel methods for graph Laplacian systems); numerical analysis (algebraic
multigrid, and finite element exterior calculus); other mathematics (Hodge
decomposition, and random clique complexes); and applications (arbitrage, and
ranking of sports teams). Not all of these connections are explored in this
paper, but many are. The underlying ideas are easy to explain, requiring only
the four fundamental subspaces from elementary linear algebra. One of our aims
is to explain these basic ideas and connections, to get researchers in many
fields interested in this topic. Another aim is to use our numerical
experiments for guidance on selecting methods and exposing the need for further
development.
Comment: Added missing references, comparison of linear solvers overhauled, conclusion section added, some new figures added.
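The basic least squares formulation is small enough to sketch directly. In this toy example (the comparison data is invented for illustration), each triple (i, j, y) records that alternative j beat alternative i by margin y, and the ratings are the least squares solution of the edge-incidence system:

```python
import numpy as np

# Toy comparison data: (i, j, y) means "j beat i by margin y".
comparisons = [(0, 1, 3.0), (1, 2, 1.0), (0, 2, 5.0), (2, 3, 2.0)]
n = 4

# Incidence matrix B: one row per comparison; r_j - r_i should match y.
B = np.zeros((len(comparisons), n))
y = np.zeros(len(comparisons))
for row, (i, j, diff) in enumerate(comparisons):
    B[row, i], B[row, j] = -1.0, 1.0
    y[row] = diff

# Least squares ratings; they are defined only up to an additive
# constant (the constant vector spans B's null space), so centre them.
r, *_ = np.linalg.lstsq(B, y, rcond=None)
r -= r.mean()
ranking = np.argsort(-r)  # best alternative first
```

The null space of B is exactly the four fundamental subspaces story the abstract alludes to: only rating differences are determined by the data.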
The Topology ToolKit
This system paper presents the Topology ToolKit (TTK), a software platform
designed for topological data analysis in scientific visualization. TTK
provides a unified, generic, efficient, and robust implementation of key
algorithms for the topological analysis of scalar data, including: critical
points, integral lines, persistence diagrams, persistence curves, merge trees,
contour trees, Morse-Smale complexes, fiber surfaces, continuous scatterplots,
Jacobi sets, Reeb spaces, and more. TTK is easily accessible to end users due
to a tight integration with ParaView. It is also easily accessible to
developers through a variety of bindings (Python, VTK/C++) for fast prototyping
or through direct, dependence-free, C++, to ease integration into pre-existing
complex systems. While developing TTK, we faced several algorithmic and
software engineering challenges, which we document in this paper. In
particular, we present an algorithm for the construction of a discrete gradient
that complies with the critical points extracted in the piecewise-linear setting.
This algorithm guarantees a combinatorial consistency across the topological
abstractions supported by TTK, and importantly, a unified implementation of
topological data simplification for multi-scale exploration and analysis. We
also present a cached triangulation data structure that supports time-efficient
and generic traversals, self-adjusts its memory usage on demand for input
simplicial meshes, and implicitly emulates a triangulation for
regular grids with no memory overhead. Finally, we describe an original
software architecture, which guarantees memory-efficient and direct access to
TTK features, while still offering researchers powerful and easy bindings and
extensions. TTK is open source (BSD license) and its code, online
documentation and video tutorials are available on TTK's website.
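To give a flavour of the kind of topological abstraction TTK computes (this is a self-contained toy, not TTK's API), here is the 0-dimensional persistence pairing of the sub-level sets of a 1D scalar field, computed with the elder rule: when two components merge, the one born at the higher minimum dies.

```python
# Toy 0-dimensional persistence of a 1D scalar field (elder rule).
# Not TTK's API; TTK handles general scalar data on triangulations.
def persistence_pairs_1d(f):
    order = sorted(range(len(f)), key=lambda i: f[i])
    comp = {}           # vertex -> representative (the component's minimum)
    pairs = []
    for v in order:
        nbrs = [comp[u] for u in (v - 1, v + 1) if u in comp]
        if not nbrs:
            comp[v] = v                      # a new minimum is born
        elif len(nbrs) == 1 or nbrs[0] == nbrs[1]:
            comp[v] = nbrs[0]
        else:
            # two components merge at v; the younger one dies (elder rule)
            elder, younger = sorted(nbrs, key=lambda rep: f[rep])
            pairs.append((f[younger], f[v]))  # (birth, death)
            for u in list(comp):
                if comp[u] == younger:
                    comp[u] = elder
            comp[v] = elder
    for rep in sorted({comp[u] for u in comp}, key=lambda rep: f[rep]):
        pairs.append((f[rep], float("inf")))  # essential class
    return pairs
```

Plotting the (birth, death) pairs gives a persistence diagram; discarding low-persistence pairs is the simplification step the abstract describes.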
Discrete differential operators on polygonal meshes
Geometry processing of surface meshes relies heavily on the discretization of differential operators such as gradient, Laplacian, and covariant derivative. While a variety of discrete operators over triangulated meshes have been developed and used for decades, a similar construction over polygonal meshes remains far less explored despite the prevalence of non-simplicial surfaces in geometric design and engineering applications. This paper introduces a principled construction of discrete differential operators on surface meshes formed by (possibly non-flat and non-convex) polygonal faces. Our approach is based on a novel mimetic discretization of the gradient operator that is linear-precise on arbitrary polygons. Equipped with this discrete gradient, we draw upon ideas from the Virtual Element Method in order to derive a series of discrete operators commonly used in graphics that are now valid over polygonal surfaces. We demonstrate the accuracy and robustness of our resulting operators through various numerical examples, before incorporating them into existing geometry processing algorithms.
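The notion of linear precision is easy to demonstrate with a much simpler stand-in (a plain least-squares fit over a planar polygon, not the paper's mimetic construction): a discrete gradient is linear-precise if it reproduces the gradient of any affine function exactly from its vertex values.

```python
import numpy as np

# Linear-precise discrete gradient on a planar polygon, via a
# least-squares fit of an affine function g . x + c to vertex values.
# A stand-in illustration, not the paper's mimetic discretization.
def polygon_gradient(verts, u):
    verts = np.asarray(verts, float)                  # (n, 2) vertices
    A = np.hstack([verts, np.ones((len(verts), 1))])  # [x, y, 1] per vertex
    coef, *_ = np.linalg.lstsq(A, np.asarray(u, float), rcond=None)
    return coef[:2]                                   # drop the constant

# Linear precision check: u(x, y) = 2x - y + 1 on an irregular pentagon.
pent = [(0, 0), (2, 0), (3, 1), (1.5, 2.5), (-0.5, 1)]
u = [2 * x - y + 1 for x, y in pent]
g = polygon_gradient(pent, u)   # recovers (2, -1) exactly
```

Because the fit is exact whenever u is affine, the recovered gradient matches the true one to machine precision, which is precisely the linear-precision property.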
A Posteriori Error Control for the Binary Mumford-Shah Model
The binary Mumford-Shah model is a widespread tool for image segmentation and
can be considered as a basic model in shape optimization with a broad range of
applications in computer vision, ranging from basic segmentation and labeling
to object reconstruction. This paper presents robust a posteriori error
estimates for a natural error quantity, namely the area of the improperly
segmented region. To this end, a suitable strictly convex and unconstrained
relaxation of the originally non-convex functional is investigated and Repin's
functional approach for a posteriori error estimation is used to control the
numerical error for the relaxed problem in the -norm. In combination with
a suitable cut out argument, a fully practical estimate for the area mismatch
is derived. This estimate is incorporated in an adaptive meshing strategy. Two
different adaptive primal-dual finite element schemes, and the most frequently
used finite difference discretization are investigated and compared. Numerical
experiments show qualitative and quantitative properties of the estimates and
demonstrate their usefulness in practical applications.
Comment: 18 pages, 7 figures, 1 table.
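The error quantity itself can be illustrated directly (a toy sketch with invented data, not the paper's a posteriori estimator): threshold the relaxed solution u in [0, 1] to a binary segmentation, then measure the area of the mis-segmented region against a reference.

```python
import numpy as np

# Toy illustration of the error quantity from the abstract: the area of
# the improperly segmented region, i.e. the symmetric difference between
# a thresholded relaxed solution and a reference segmentation.
def area_mismatch(u_relaxed, reference, h=1.0, threshold=0.5):
    seg = u_relaxed >= threshold            # binarise the relaxed solution
    wrong = seg != reference.astype(bool)   # mis-labelled cells
    return wrong.sum() * h * h              # cell count times cell area

ref = np.zeros((4, 4))
ref[1:3, 1:3] = 1                     # reference: a 2x2 square
u = ref.astype(float)
u[1, 1] = 0.2                         # one cell of the square mislabelled
mismatch = area_mismatch(u, ref, h=0.5)   # one cell of area 0.25
```

The paper's contribution is to bound this quantity a posteriori, without knowing the reference; the sketch only shows what is being bounded.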
The VOLNA code for the numerical modelling of tsunami waves: generation, propagation and inundation
A novel tool for tsunami wave modelling is presented. This tool has the
potential of being used for operational purposes: indeed, the numerical code
VOLNA is able to handle the complete life-cycle of a tsunami (generation,
propagation and run-up along the coast). The algorithm works on unstructured
triangular meshes and thus can be run in arbitrarily complex domains. This paper
contains the detailed description of the finite volume scheme implemented in
the code. The numerical treatment of the wet/dry transition is explained. This
point is crucial for accurate run-up/run-down computations. Most existing
tsunami codes use semi-empirical techniques at this stage, which are not always
sufficient for tsunami hazard mitigation. Indeed the decision to evacuate
inhabitants is based on inundation maps which are produced with this type of
numerical tools. We present several realistic test cases that partially
validate our algorithm. Comparisons with analytical solutions and experimental
data are performed. Finally the main conclusions are outlined and the
perspectives for future research are presented.
Comment: 47 pages, 27 figures. Other author's papers can be downloaded at
http://www.lama.univ-savoie.fr/~dutykh
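A heavily simplified 1D sketch conveys the finite volume idea (VOLNA itself works on unstructured triangular meshes with a dedicated wet/dry treatment, both omitted here): shallow-water equations advanced with a Rusanov (local Lax-Friedrichs) flux, which conserves mass by construction.

```python
import numpy as np

# 1D shallow-water finite volume step with a Rusanov flux.
# Toy sketch only: no unstructured mesh, no wet/dry transition.
g = 9.81

def flux(h, hu):
    u = hu / h
    return np.array([hu, hu * u + 0.5 * g * h * h])

def step(h, hu, dx, dt):
    U = np.array([h, hu])
    F = np.array([flux(h[i], hu[i]) for i in range(len(h))]).T
    c = np.abs(hu / h) + np.sqrt(g * h)          # local wave speeds
    a = np.maximum(c[:-1], c[1:])                # speed at each interface
    # Rusanov flux at interior interfaces i + 1/2
    Fh = 0.5 * (F[:, :-1] + F[:, 1:]) - 0.5 * a * (U[:, 1:] - U[:, :-1])
    Un = U.copy()                                # boundary cells held fixed
    Un[:, 1:-1] -= dt / dx * (Fh[:, 1:] - Fh[:, :-1])
    return Un[0], Un[1]

# Dam break: still water with a step in the free surface.
n, dx, dt = 100, 0.1, 0.005
h = np.where(np.arange(n) < n // 2, 2.0, 1.0)
hu = np.zeros(n)
h1, hu1 = step(h, hu, dx, dt)
```

Because interface fluxes telescope, the total water mass changes only through the boundary fluxes, which vanish for this still-water setup; that is the discrete conservation property the abstract emphasises.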
An exact general remeshing scheme applied to physically conservative voxelization
We present an exact general remeshing scheme to compute analytic integrals of
polynomial functions over the intersections between convex polyhedral cells of
old and new meshes. In physics applications this allows one to ensure global
mass, momentum, and energy conservation while applying higher-order polynomial
interpolation. We elaborate on applications of our algorithm arising in the
analysis of cosmological N-body data, computer graphics, and continuum
mechanics problems.
We focus on the particular case of remeshing tetrahedral cells onto a
Cartesian grid such that the volume integral of the polynomial density function
given on the input mesh is guaranteed to equal the corresponding integral over
the output mesh. We refer to this as "physically conservative voxelization".
At the core of our method is an algorithm for intersecting two convex
polyhedra by successively clipping one against the faces of the other. This
algorithm is an implementation of the ideas presented abstractly by Sugihara
(1994), who suggests using the planar graph representations of convex polyhedra
to ensure topological consistency of the output. This makes our implementation
robust to geometric degeneracy in the input. We employ a simplicial
decomposition to calculate moment integrals up to quadratic order over the
resulting intersection domain.
We also address practical issues arising in a software implementation,
including numerical stability in geometric calculations, management of
cancellation errors, and extension to two dimensions. In a comparison to recent
work, we show substantial performance gains. We provide a C implementation
intended to be a fast, accurate, and robust tool for geometric calculations on
polyhedral mesh elements.
Comment: Code implementation available at https://github.com/devonmpowell/r3
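The 2D analogue of the core clipping step is compact enough to sketch (an illustration, not the paper's implementation): successively clip one convex polygon against the half-planes bounding another, then integrate over the result, here just the zeroth moment, the area.

```python
# 2D analogue of successive clipping: intersect convex polygons by
# clipping against half-planes (Sutherland-Hodgman), then integrate
# the zeroth moment (area) with the shoelace formula.
def clip_halfplane(poly, n, d):
    # Keep the part of poly with n . p <= d.
    out = []
    for i, p in enumerate(poly):
        q = poly[(i + 1) % len(poly)]
        fp = n[0] * p[0] + n[1] * p[1] - d
        fq = n[0] * q[0] + n[1] * q[1] - d
        if fp <= 0:
            out.append(p)
        if (fp < 0) != (fq < 0) and fp != fq:
            t = fp / (fp - fq)   # edge crosses the plane: add intersection
            out.append((p[0] + t * (q[0] - p[0]), p[1] + t * (q[1] - p[1])))
    return out

def area(poly):  # shoelace formula
    return 0.5 * abs(sum(p[0] * q[1] - q[0] * p[1]
                         for p, q in zip(poly, poly[1:] + poly[:1])))

# Intersect the unit square with a copy shifted by (0.5, 0.5),
# expressed as four bounding half-planes (n, d) with n . p <= d.
square = [(0, 0), (1, 0), (1, 1), (0, 1)]
planes = [((-1, 0), -0.5), ((1, 0), 1.5), ((0, -1), -0.5), ((0, 1), 1.5)]
clipped = square
for nrm, d in planes:
    clipped = clip_halfplane(clipped, nrm, d)   # overlap area is 0.25
```

The 3D algorithm of the paper adds the planar-graph bookkeeping of Sugihara (1994) to keep this process topologically consistent under geometric degeneracy, and raises the moment integrals to quadratic order.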
Tetrahedral mesh improvement using moving mesh smoothing, lazy searching flips, and RBF surface reconstruction
Given a tetrahedral mesh and objective functionals measuring the mesh quality
which take into account the shape, size, and orientation of the mesh elements,
our aim is to improve the mesh quality as much as possible. In this paper, we
combine the moving mesh smoothing, based on the integration of an ordinary
differential equation coming from a given functional, with the lazy flip
technique, a reversible edge removal algorithm to modify the mesh connectivity.
Moreover, we utilize radial basis function (RBF) surface reconstruction to
improve tetrahedral meshes with curved boundary surfaces. Numerical tests show
that the combination of these techniques into a mesh improvement framework
achieves results that are comparable to, and even better than, previously
reported ones.
Comment: Revised and improved version.
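One common example of the element-shape measures such quality functionals aggregate (one standard choice, not necessarily the one used in the paper) is a normalized volume-to-RMS-edge-length ratio: it equals 1 for the regular tetrahedron and approaches 0 for degenerate, sliver-like elements.

```python
import math

# One common tetrahedral shape-quality measure: 6*sqrt(2) * volume
# divided by the cube of the RMS edge length. Equals 1 for the regular
# tetrahedron, tends to 0 for degenerate elements.
def tet_quality(a, b, c, d):
    def sub(p, q): return tuple(pi - qi for pi, qi in zip(p, q))
    def dot(p, q): return sum(pi * qi for pi, qi in zip(p, q))
    def cross(p, q):
        return (p[1] * q[2] - p[2] * q[1],
                p[2] * q[0] - p[0] * q[2],
                p[0] * q[1] - p[1] * q[0])
    u, v, w = sub(b, a), sub(c, a), sub(d, a)
    vol = abs(dot(u, cross(v, w))) / 6.0
    edges = [sub(p, q) for p, q in
             [(a, b), (a, c), (a, d), (b, c), (b, d), (c, d)]]
    lrms = math.sqrt(sum(dot(e, e) for e in edges) / 6.0)
    return 6.0 * math.sqrt(2.0) * vol / lrms ** 3

# Regular tetrahedron (alternate cube corners) scores 1; squashing it
# along z degrades the quality.
reg = [(1, 1, 1), (1, -1, -1), (-1, 1, -1), (-1, -1, 1)]
q_reg = tet_quality(*reg)
q_flat = tet_quality(*[(x, y, 0.1 * z) for x, y, z in reg])
```

Smoothing moves vertices and flipping rewires connectivity; both are judged by how much they raise the worst (or mean) of such per-element scores.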