2,030 research outputs found
The matching problem between functional shapes via a BV-penalty term: a Γ-convergence result
In this paper we study a variant of the matching model between functional
shapes introduced in \cite{ABN}. Such a model allows one to compare surfaces
equipped with a signal, and the matching energy is defined by a norm of the
signal on the surface and a varifold-type attachment term.
In this work we study the problem with fixed geometry, which means that we
optimize the initial signal (supported on the initial surface) with respect to
a target signal supported on a different surface. In particular, we consider a
BV-penalty for the signal instead of a norm penalty. Several
numerical examples are shown in order to demonstrate that the BV-penalty improves
the quality of the matching. Moreover, we prove a Γ-convergence result
for the discrete matching energy towards the continuous one.
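As a rough illustration of why a BV (total-variation) penalty favors sharp signal transitions, here is a minimal one-dimensional sketch; the regular grid, the smoothed TV term, the plain L2 attachment, and all parameter values are our own simplifications, not the paper's varifold-based energy:

```python
import numpy as np

# 1D sketch: a regular grid stands in for the surface, and a plain L2
# attachment replaces the paper's varifold-type term. We minimize
#   E(f) = ||f - g||^2 + lam * TV_eps(f),
# where TV_eps is a smoothed total-variation (BV-type) penalty.

def matching_energy(f, g, lam, eps=1e-3):
    jumps = np.diff(f)
    return np.sum((f - g) ** 2) + lam * np.sum(np.sqrt(jumps ** 2 + eps ** 2))

def optimize_signal(g, lam=0.1, steps=500, lr=0.1, eps=1e-3):
    f = np.zeros_like(g)
    for _ in range(steps):
        jumps = np.diff(f)
        w = jumps / np.sqrt(jumps ** 2 + eps ** 2)  # derivative of smoothed TV
        grad_tv = np.zeros_like(f)
        grad_tv[:-1] -= w          # each jump f[i+1]-f[i] depends on f[i] ...
        grad_tv[1:] += w           # ... and on f[i+1]
        f -= lr * (2.0 * (f - g) + lam * grad_tv)
    return f

g = np.concatenate([np.zeros(20), np.ones(20)])  # piecewise-constant target
f = optimize_signal(g)
```

The parameter eps makes the TV term differentiable so plain gradient descent applies; the optimized signal tends to keep the jump of the target rather than blurring it, which is the qualitative effect the numerical examples in the paper exploit.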
Convergence of the k-Means Minimization Problem using Γ-Convergence
The k-means method is an iterative clustering algorithm which associates
each observation with one of k clusters. It traditionally employs cluster
centers in the same space as the observed data. By relaxing this requirement,
it is possible to apply the k-means method to infinite-dimensional problems,
for example multiple target tracking and smoothing problems in the presence of
unknown data association. Via a Γ-convergence argument, the associated
optimization problem is shown to converge in the sense that both the k-means
minimum and minimizers converge in the large data limit to quantities which
depend upon the observed data only through its distribution. The theory is
supplemented with two examples to demonstrate the range of problems now
accessible by the k-means method. The first example combines a non-parametric
smoothing problem with unknown data association. The second addresses tracking
using sparse data from a network of passive sensors.
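For reference, the classical finite-dimensional k-means iteration that the abstract generalizes can be sketched as follows (Lloyd's algorithm; the synthetic two-cluster data and all names are illustrative):

```python
import numpy as np

# Classical k-means (Lloyd's algorithm): alternate between associating each
# observation with the nearest of the k cluster centers and moving each
# center to the mean of its cluster. All data below are synthetic.

def kmeans(X, k, iters=50, seed=0):
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)].copy()
    labels = np.zeros(len(X), dtype=int)
    for _ in range(iters):
        dists = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)                     # association step
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)  # update step
    return centers, labels

data_rng = np.random.default_rng(1)
X = np.vstack([data_rng.normal(0.0, 0.1, (25, 2)),   # cluster near (0, 0)
               data_rng.normal(5.0, 0.1, (25, 2))])  # cluster near (5, 5)
centers, labels = kmeans(X, k=2)
```

The infinite-dimensional relaxation studied in the paper replaces these Euclidean centers by elements of a more general space; the alternating structure above is what the Γ-convergence argument analyzes in the large data limit.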
Asymptotic behavior of gradient-like dynamical systems involving inertia and multiscale aspects
In a Hilbert space H, we study the asymptotic behaviour, as the time
variable t goes to +∞, of nonautonomous gradient-like dynamical
systems involving inertia and multiscale features.
Given a general Hilbert space H, two convex
differentiable functions Φ, Ψ : H → ℝ, a positive damping parameter γ, and a function ε(t) which tends to zero as t goes to +∞, we
consider the second-order differential equation

ẍ(t) + γẋ(t) + ∇Φ(x(t)) + ε(t)∇Ψ(x(t)) = 0.

This system models the emergence of various collective behaviors in game theory, as
well as the asymptotic control of coupled nonlinear oscillators. Assuming that
ε(t) tends to zero moderately slowly as t goes to infinity, we show
that the trajectories converge weakly in H. The limiting equilibria
are solutions of the hierarchical minimization problem which consists in
minimizing Ψ over the set C of minimizers of Φ. As key assumptions,
we suppose that ∫₀^{+∞} ε(t) dt = +∞ and that, for
every z belonging to a convex cone depending on the data Φ and Ψ,

∫₀^{+∞} ε(t) [Ψ*(z/ε(t)) − σ_C(z/ε(t))] dt < +∞,

where Ψ* is
the Fenchel conjugate of Ψ, and σ_C is the support function of
C. An application is given to coupled oscillators.
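A numerical sketch of the hierarchical selection effect described above, on a hypothetical example of our own choosing (Φ with a whole line of minimizers, Ψ the squared norm, and ε(t) = 1/√(1+t), which tends to zero while its integral diverges):

```python
import numpy as np

# Hypothetical example (ours, not from the paper): Phi has a whole line of
# minimizers {x1 + x2 = 2}; Psi(x) = |x|^2 / 2 singles out the point of
# minimal norm on that line, namely (1, 1).

def grad_phi(x):                      # Phi(x) = (x1 + x2 - 2)^2 / 2
    return (x[0] + x[1] - 2.0) * np.ones(2)

def grad_psi(x):                      # Psi(x) = |x|^2 / 2
    return x

def eps(t):                           # tends to 0, but its integral diverges
    return 1.0 / np.sqrt(1.0 + t)

gamma = 3.0                           # positive damping parameter
x = np.array([2.0, 0.0])              # minimizes Phi, but not Psi over argmin Phi
v = np.zeros(2)
dt, t, T = 0.05, 0.0, 2000.0
while t < T:                          # semi-implicit Euler on the second-order ODE
    v += dt * (-gamma * v - grad_phi(x) - eps(t) * grad_psi(x))
    x += dt * v
    t += dt
```

Although every point of the line x₁ + x₂ = 2 minimizes Φ, the slowly vanishing term ε(t)∇Ψ steers the trajectory toward (1, 1), the minimizer of Ψ over that set, which is the hierarchical selection the abstract describes.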
lp-Recovery of the Most Significant Subspace among Multiple Subspaces with Outliers
We assume data sampled from a mixture of d-dimensional linear subspaces with
spherically symmetric distributions within each subspace and an additional
outlier component with spherically symmetric distribution within the ambient
space (for simplicity we may assume that all distributions are uniform on their
corresponding unit spheres). We also assume mixture weights for the different
components. We say that one of the underlying subspaces of the model is most
significant if its mixture weight is higher than the sum of the mixture weights
of all other subspaces. We study the recovery of the most significant subspace
by minimizing the lp-averaged distances of data points from d-dimensional
subspaces, where p>0. Unlike other lp minimization problems, this minimization
is non-convex for all p>0 and thus requires different methods for its analysis.
We show that if 0<p<=1, then for any fraction of outliers the most significant
subspace can be recovered by lp minimization with overwhelming probability
(which depends on the generating distribution and its parameters). We show that
when adding small noise around the underlying subspaces the most significant
subspace can be nearly recovered by lp minimization for any 0<p<=1 with an
error proportional to the noise level. On the other hand, if p>1 and there is
more than one underlying subspace, then with overwhelming probability the most
significant subspace cannot be recovered or nearly recovered. This last result
does not require spherically symmetric outliers.

Comment: This is a revised version of the part of 1002.1994 that deals with
single subspace recovery. V3: Improved estimates (in particular for Lemma 3.1
and for estimates relying on it), asymptotic dependence of probabilities and
constants on D and d, and further clarifications; for simplicity it assumes
uniform distributions on spheres. V4: minor revision for the published
version.
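A toy instance of the recovery problem for d = 1 in ambient dimension D = 2, sketched under the abstract's simplifying assumption of uniform distributions on the relevant unit spheres (the sample sizes, the grid search over directions, and the choice p = 1 are ours):

```python
import numpy as np

# Toy instance, d = 1, D = 2: inliers lie on the unit sphere of a 1-dimensional
# subspace (the two antipodal unit vectors of a line through the origin);
# outliers are uniform on the ambient unit circle. The inlier weight (120)
# exceeds the outlier weight (60), so the line is "most significant".

rng = np.random.default_rng(0)
theta_star = 0.7
u_star = np.array([np.cos(theta_star), np.sin(theta_star)])
inliers = np.outer(rng.choice([-1.0, 1.0], size=120), u_star)
angles = rng.uniform(0.0, 2.0 * np.pi, size=60)
outliers = np.stack([np.cos(angles), np.sin(angles)], axis=1)
X = np.vstack([inliers, outliers])

def lp_energy(theta, X, p):
    u = np.array([np.cos(theta), np.sin(theta)])
    normal = np.array([-u[1], u[0]])
    return np.sum(np.abs(X @ normal) ** p)   # lp-averaged distances to the line

# crude grid search over candidate directions, with p = 1 (i.e. 0 < p <= 1)
thetas = np.linspace(0.0, np.pi, 2000, endpoint=False)
best = min(thetas, key=lambda th: lp_energy(th, X, p=1.0))
```

With p = 1 the inlier term dominates near the true direction, so the grid search lands close to theta_star even with a third of the points being outliers; the abstract's negative result says this robustness fails for p > 1 when several subspaces are present.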