Sampling and Reconstruction of Shapes with Algebraic Boundaries
We present a sampling theory for a class of binary images with finite rate of
innovation (FRI). Every image in our model is the restriction of
\mathds{1}_{\{p\leq0\}} to the image plane, where \mathds{1} denotes the
indicator function and p is some real bivariate polynomial. In particular,
the boundaries in the image form a subset of an algebraic curve with
the implicit polynomial p. We show that the image parameters, i.e., the
polynomial coefficients, satisfy a set of linear annihilation equations whose
coefficients are the image moments. The inherent sensitivity of the
moments to noise makes the reconstruction process numerically unstable and
narrows the choice of sampling kernels to polynomial-reproducing kernels.
As a remedy to these problems, we replace conventional moments with more stable
\emph{generalized moments} that are adjusted to the given sampling kernel. The
benefits are threefold: (1) the requirements on the sampling kernels are
relaxed, (2) the resulting annihilation equations are numerically robust, and
(3) the results extend to images with unbounded boundaries. We further reduce
the sensitivity of the reconstruction process to noise by taking into account
the sign of the polynomial at certain points and by sequentially enforcing
measurement consistency. Various numerical experiments demonstrate the
performance of our algorithm in reconstructing binary images under low to
moderate noise levels and for a range of realistic sampling kernels.
Comment: 12 pages, 14 figures
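The image model and the raw moments that drive the annihilation equations can be illustrated with a small sketch. This is our own construction, not the paper's algorithm: it merely generates a binary image as the restriction of \mathds{1}_{\{p\leq0\}} to a pixel grid for a hypothetical polynomial p (a disc) and computes raw geometric moments from it.

```python
import numpy as np

def grid(n):
    # pixel grid over [-1, 1] x [-1, 1]
    return np.meshgrid(np.linspace(-1.0, 1.0, n), np.linspace(-1.0, 1.0, n))

def binary_image(coeffs, n=64):
    # coeffs maps (j, k) -> coefficient of x^j y^k in the bivariate polynomial p
    x, y = grid(n)
    p = sum(c * x**j * y**k for (j, k), c in coeffs.items())
    return (p <= 0).astype(float)   # restriction of 1_{p <= 0} to the grid

def moment(img, j, k):
    # raw geometric moment m_{jk} = sum over pixels of x^j y^k * image value
    x, y = grid(img.shape[0])
    return float(np.sum(x**j * y**k * img))

# example boundary: disc of radius 0.5, p(x, y) = x^2 + y^2 - 0.25
img = binary_image({(2, 0): 1.0, (0, 2): 1.0, (0, 0): -0.25})
```

For this symmetric example the odd moments vanish, while m_{00} counts the pixels inside the disc; in the paper these moments (in their generalized form) become the coefficients of the annihilation system.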
A multi-level algorithm for the solution of moment problems
We study numerical methods for the solution of general linear moment
problems, where the solution belongs to a family of nested subspaces of a
Hilbert space. Multi-level algorithms, based on the conjugate gradient method
and the Landweber--Richardson method, are proposed that determine the "optimal"
reconstruction level a posteriori from quantities that arise during the
numerical calculations. As an important example we discuss the reconstruction
of band-limited signals from irregularly spaced noisy samples, when the actual
bandwidth of the signal is not available. Numerical examples show the
usefulness of the proposed algorithms.
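The a posteriori choice of a stopping index can be sketched with a plain Landweber--Richardson iteration combined with a discrepancy-based stopping rule. This is a hedged, simplified sketch of the general idea only; the paper's nested multi-level structure and conjugate-gradient variant are not reproduced here, and all names are ours.

```python
import numpy as np

def landweber(A, b, noise_level=0.0, omega=None, max_iter=500):
    # classical Landweber iteration x_{k+1} = x_k + omega * A^T (b - A x_k)
    if omega is None:
        omega = 1.0 / np.linalg.norm(A, 2) ** 2   # safe step size < 2 / ||A||^2
    x = np.zeros(A.shape[1])
    for _ in range(max_iter):
        r = b - A @ x
        # discrepancy principle: stop once the residual reaches the noise level,
        # i.e. before the iteration starts fitting the noise
        if np.linalg.norm(r) <= 1.1 * noise_level:
            break
        x = x + omega * (A.T @ r)
    return x
```

On noise-free data the rule never triggers and the iteration simply converges to the least-squares solution.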
Exact and approximate Strang-Fix conditions to reconstruct signals with finite rate of innovation from samples taken with arbitrary kernels
In the last few years, several new methods have been developed for the sampling and
exact reconstruction of specific classes of non-bandlimited signals known as signals with finite rate of innovation (FRI). This is achieved by using adequate sampling kernels and
reconstruction schemes. An example of valid kernels, which we use throughout the thesis,
is given by the family of exponential reproducing functions. These satisfy the generalised
Strang-Fix conditions, which ensure that proper linear combinations of the kernel with its
shifted versions reproduce polynomials or exponentials exactly.
The first contribution of the thesis is to analyse the behaviour of these kernels in the
case of noisy measurements in order to provide clear guidelines on how to choose the exponential
reproducing kernel that leads to the most stable reconstruction when estimating
FRI signals from noisy samples. We then depart from the situation in which we can choose
the sampling kernel and develop a new strategy that is universal in that it works with any
kernel. We do so by noting that meeting the exact exponential reproduction condition is
too stringent a constraint. We thus allow for a controlled error in the reproduction formula
in order to use the exponential reproduction idea with arbitrary kernels and develop
a universal reconstruction method which is stable and robust to noise.
Numerical results validate the various contributions of the thesis and in particular show
that the approximate exponential reproduction strategy leads to more stable and accurate
reconstruction results than those obtained when using the exact recovery methods.
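The reproduction property behind the Strang-Fix conditions can be illustrated in its simplest polynomial form. The sketch below is ours and uses the linear B-spline (hat function), whose integer shifts reproduce all polynomials of degree at most one; the thesis works with the more general family of exponential reproducing kernels.

```python
import numpy as np

def hat(t):
    # linear B-spline: triangle supported on [-1, 1]
    return np.maximum(0.0, 1.0 - np.abs(t))

t = np.linspace(0.0, 10.0, 501)

# linear combination sum_n c_n * hat(t - n) with c_n = n reproduces t exactly,
# provided enough shifts cover the evaluation interval
recon = sum(n * hat(t - n) for n in range(-2, 14))

# with c_n = 1 the shifts form a partition of unity (degree-0 reproduction)
ones = sum(hat(t - n) for n in range(-2, 14))
```

The thesis's "approximate" strategy relaxes exactly this kind of identity: a controlled reproduction error is tolerated so that arbitrary kernels, not just those satisfying the conditions exactly, can be used.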
Slepian functions and their use in signal estimation and spectral analysis
It is a well-known fact that mathematical functions that are timelimited (or
spacelimited) cannot be simultaneously bandlimited (in frequency). Yet the
finite precision of measurement and computation unavoidably bandlimits our
observation and modeling of scientific data, and we often only have access to, or
are only interested in, a study area that is temporally or spatially bounded.
In the geosciences we may be interested in spectrally modeling a time series
defined only on a certain interval, or we may want to characterize a specific
geographical area observed using an effectively bandlimited measurement device.
It is clear that analyzing and representing scientific data of this kind will
be facilitated if a basis of functions can be found that are "spatiospectrally"
concentrated, i.e. "localized" in both domains at the same time. Here, we give
a theoretical overview of one particular approach to this "concentration"
problem, as originally proposed for time series by Slepian and coworkers, in
the 1960s. We show how this framework leads to practical algorithms and
statistically performant methods for the analysis of signals and their power
spectra in one and two dimensions, and on the surface of a sphere.
Comment: Submitted to the Handbook of Geomathematics, edited by Willi Freeden,
Zuhair M. Nashed and Thomas Sonar, to be published by Springer Verlag.
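In the one-dimensional discrete setting, the concentration problem has a well-known solution: the discrete prolate spheroidal (Slepian) sequences arise as eigenvectors of a sinc concentration matrix. The sketch below is a simplified, naive version of this construction (our own; production code would use a more stable formulation such as the tridiagonal one).

```python
import numpy as np

def dpss_naive(N, W, K):
    # concentration matrix: entry (m, n) is sin(2*pi*W*(m - n)) / (pi*(m - n)),
    # with 2W on the diagonal; its top eigenvectors are the Slepian sequences
    n = np.arange(N)
    d = n[:, None] - n[None, :]
    safe = np.where(d == 0, 1, d)          # avoid 0/0 on the diagonal
    A = np.where(d == 0, 2.0 * W,
                 np.sin(2.0 * np.pi * W * d) / (np.pi * safe))
    vals, vecs = np.linalg.eigh(A)
    order = np.argsort(vals)[::-1]         # most concentrated first
    return vals[order][:K], vecs[:, order[:K]]

# N = 64 samples, half-bandwidth W = 0.1, keep the K = 4 best sequences
lam, v = dpss_naive(64, 0.1, 4)
```

The eigenvalues lam give the fraction of each sequence's energy inside the band; roughly 2NW of them are close to one, after which they fall off sharply, which is what makes these functions useful as a concentrated basis.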
Learning Theory and Approximation
The main goal of this workshop, the third of its type at the MFO, has been to blend mathematical results from statistical learning theory and approximation theory in order to strengthen both disciplines and to exploit synergies in current research questions. Learning theory aims at modeling unknown function relations and data structures from samples in an automatic manner. Approximation theory is naturally used for, and closely connected to, the further development of learning theory, in particular for the exploration of new useful algorithms and for the theoretical understanding of existing methods. Conversely, the study of learning theory also gives rise to interesting theoretical problems for approximation theory, such as the approximation and sparse representation of functions or the construction of rich reproducing kernel Hilbert spaces on general metric spaces. This workshop concentrated on the following recent topics: pitchfork bifurcation of dynamical systems arising from mathematical foundations of cell development; regularized kernel-based learning in the Big Data situation; deep learning; convergence rates of learning and online learning algorithms; numerical refinement algorithms for learning; and statistical robustness of regularized kernel-based learning.
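One of the workshop topics, regularized kernel-based learning, can be made concrete with a minimal kernel ridge regression sketch. This is purely illustrative and entirely our own; the report describes the topic only in general terms, and all names, kernels, and parameters below are assumptions.

```python
import numpy as np

def gauss_kernel(a, b, gamma=1.0):
    # Gaussian kernel matrix between two sets of 1-D points
    return np.exp(-gamma * (a[:, None] - b[None, :]) ** 2)

def krr_fit(X, y, gamma=1.0, lam=1e-3):
    # solve (K + lam * I) alpha = y for the dual coefficients;
    # lam is the regularization strength
    K = gauss_kernel(X, X, gamma)
    return np.linalg.solve(K + lam * np.eye(len(X)), y)

def krr_predict(X_train, alpha, X_new, gamma=1.0):
    # prediction f(x) = sum_i alpha_i * k(x, x_i)
    return gauss_kernel(X_new, X_train, gamma) @ alpha
```

The regularization term lam is what the learning-theory results quantify: it trades data fit against the norm of the estimate in the reproducing kernel Hilbert space, and the convergence rates discussed at the workshop govern how it should be chosen as the sample size grows.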