A finite element approach for vector- and tensor-valued surface PDEs
We derive a Cartesian componentwise description of the covariant derivative
of tangential tensor fields of any degree on general manifolds. This allows us to
reformulate any vector- and tensor-valued surface PDE in a form suitable to be
solved by established tools for scalar-valued surface PDEs. We consider
piecewise linear Lagrange surface finite elements on triangulated surfaces and
validate the approach by a vector- and a tensor-valued surface Helmholtz
problem on an ellipsoid. We experimentally show optimal (linear) order of
convergence for these problems. The full functionality is demonstrated by
solving a surface Landau-de Gennes problem on the Stanford bunny. All tools
required to apply this approach to other vector- and tensor-valued surface PDEs
are provided.
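The key ingredient, in hedged outline (our notation, not necessarily the paper's exact statement), is that covariant differentiation of a tangential field can be expressed through Cartesian derivatives of an extension and the surface projection:

```latex
% Surface \Gamma embedded in R^3 with unit normal n and projection P = I - n n^T.
% For a tangential vector field u with smooth extension \tilde{u} to a
% neighborhood of \Gamma, the covariant derivative can be written as
\nabla_{\Gamma} u \;=\; P\,(\nabla \tilde{u})\,P ,
% so each Cartesian component of a vector- or tensor-valued surface PDE becomes
% a scalar-valued surface PDE, coupled through P, and can be discretized with
% standard scalar surface finite elements.
```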
Shape Calculus for Shape Energies in Image Processing
Many image processing problems are naturally expressed as energy minimization
or shape optimization problems, in which the free variable is a shape, such as
a curve in 2d or a surface in 3d. Examples include image segmentation, multiview
stereo reconstruction, and geometric interpolation from point clouds. To
obtain the solution of such a problem, one usually resorts to an iterative
approach such as gradient descent, which updates a candidate shape by
gradually deforming it into the optimal shape. Computing the gradient descent
updates requires the knowledge of the first variation of the shape energy, or
rather the first shape derivative. In addition to the first shape derivative,
one can also utilize the second shape derivative and develop a Newton-type
method with faster convergence. Unfortunately, the knowledge of shape
derivatives for shape energies in image processing is patchy. The second shape
derivatives are known for only two of the energies in the image processing
literature, and many results for the first shape derivative are limited, in the
sense that they hold only for planar curves, or were developed for a specific
representation of the shape or for a very specific functional form of the shape
energy. In this work, these limitations are overcome and the first and second
shape derivatives are computed for large classes of shape energies that are
representative of the energies found in image processing. Many of the formulas
we obtain are new and some generalize existing results. These results
are valid for general surfaces in any number of dimensions. This work is
intended to serve as a cookbook for researchers who deal with shape energies
for various applications in image processing and need to develop algorithms to
compute the shapes minimizing these energies.
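A representative instance of such formulas (a standard result of shape calculus, stated here as illustration rather than as the paper's exact statement): for a weighted area energy, the first shape derivative in the direction of a velocity field V reads

```latex
% Energy over a closed surface \Gamma, integrand g defined in the ambient space:
E(\Gamma) = \int_{\Gamma} g \,\mathrm{d}A ,
\qquad
dE(\Gamma; V) = \int_{\Gamma} \bigl( \partial_n g + g\,H \bigr)\,(V \cdot n)\,\mathrm{d}A ,
% where n is the outward unit normal, \partial_n g the normal derivative of g,
% and H the additive mean curvature (sum of the principal curvatures).
```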
Discrete exterior calculus (DEC) for the surface Navier-Stokes equation
We consider a numerical approach for the incompressible surface Navier-Stokes
equation. The approach is based on the covariant form and uses discrete
exterior calculus (DEC) in space and a semi-implicit discretization in time.
The discretization is described in detail and related to finite difference
schemes on staggered grids in flat space for which we demonstrate second order
convergence. We compare computational results with a vorticity-stream function
approach for surfaces with genus 0 and demonstrate the interplay between
topology, geometry and flow properties. Our discretization also allows us to
handle harmonic vector fields, which we demonstrate on a torus.
Comment: 21 pages, 9 figures
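A generic semi-implicit step of the kind described, with velocity as a 1-form u and pressure as a 0-form p (our sketch; the paper's exact scheme may differ), treats diffusion and pressure implicitly and linearizes advection at the previous time step:

```latex
\frac{u^{n+1} - u^{n}}{\tau}
  + \nabla_{u^{n}} u^{n+1}
  \;=\; -\mathrm{d} p^{n+1} + \nu\,\Delta u^{n+1},
\qquad
\delta u^{n+1} = 0 ,
% with \Delta = -(\mathrm{d}\delta + \delta\mathrm{d}) the Laplace-de Rham
% operator; in DEC, d and \delta (and hence \Delta) become sparse mesh
% operators, so each time step amounts to one sparse linear solve.
```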
Geometric deep learning: going beyond Euclidean data
Many scientific fields study data with an underlying structure that is a
non-Euclidean space. Some examples include social networks in computational
social sciences, sensor networks in communications, functional networks in
brain imaging, regulatory networks in genetics, and meshed surfaces in computer
graphics. In many applications, such geometric data are large and complex (in
the case of social networks, on the scale of billions), and are natural targets
for machine learning techniques. In particular, we would like to use deep
neural networks, which have recently proven to be powerful tools for a broad
range of problems from computer vision, natural language processing, and audio
analysis. However, these tools have been most successful on data with an
underlying Euclidean or grid-like structure, and in cases where the invariances
of these structures are built into networks used to model them. Geometric deep
learning is an umbrella term for emerging techniques attempting to generalize
(structured) deep neural models to non-Euclidean domains such as graphs and
manifolds. The purpose of this paper is to overview different examples of
geometric deep learning problems and present available solutions, key
difficulties, applications, and future research directions in this nascent
field.
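As a concrete illustration of one such generalization (our minimal sketch of a graph convolution in the spirit of Kipf and Welling's GCN, not an example from this paper), a single layer propagates node features over a degree-normalized adjacency matrix:

```python
import numpy as np

def gcn_layer(A, H, W):
    """One graph-convolution layer: H' = ReLU(D^{-1/2} (A + I) D^{-1/2} H W).

    A: (n, n) adjacency matrix, H: (n, f_in) node features,
    W: (f_in, f_out) learnable weights.
    """
    n = A.shape[0]
    A_hat = A + np.eye(n)                     # add self-loops
    d = A_hat.sum(axis=1)                     # node degrees
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))    # D^{-1/2}
    A_norm = D_inv_sqrt @ A_hat @ D_inv_sqrt  # symmetric normalization
    return np.maximum(A_norm @ H @ W, 0.0)    # linear map + ReLU

# Tiny example: a 4-node path graph, 2 input features, 3 output features.
rng = np.random.default_rng(0)
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
H = rng.standard_normal((4, 2))
W = rng.standard_normal((2, 3))
H_out = gcn_layer(A, H, W)
print(H_out.shape)  # (4, 3)
```

Because the normalized adjacency replaces the regular-grid stencil of a standard convolution, the same layer applies to any graph, which is the sense in which such models go beyond Euclidean data.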