Combinatorial and Asymptotical Results on the Neighborhood Grid
In 2009, Joselli et al. introduced the Neighborhood Grid data structure for
fast computation of neighborhood estimates in point clouds. Even though the
data structure has been used in several applications and shown to be
practically relevant, it is theoretically not yet well understood. The purpose
of this paper is to present a polynomial-time algorithm to build the data
structure. Furthermore, it is investigated whether the presented algorithm is
optimal. This investigation leads to several combinatorial questions for which
partial results are given. Finally, we present several limits and experiments
regarding the quality of the obtained neighborhood relation.
Comment: 33 pages, 18 figures
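For illustration, a stable state of such a grid can be reached by a simple sorting-based, polynomial-time construction: sort all points by x, slice the sorted list into columns, and sort each column by y. The sketch below is an assumed construction for illustration only, not necessarily the algorithm the paper presents:

```python
import math

def build_neighborhood_grid(points):
    """Build an n-by-n neighborhood grid for n*n points (sketch).

    Assumed polynomial-time strategy (illustrative, not the paper's
    algorithm): sort all points by x, slice into n columns, then sort
    each column by y. The result is a grid in which every row is
    sorted by x and every column is sorted by y -- a stable state.
    """
    n = int(math.isqrt(len(points)))
    assert n * n == len(points), "expects a square number of points"
    by_x = sorted(points)  # lexicographic: by x, then y
    cols = [by_x[i * n:(i + 1) * n] for i in range(n)]
    # grid[i][j]: i-th column (increasing x), j-th cell (increasing y)
    return [sorted(col, key=lambda p: p[1]) for col in cols]
```

Neighborhood estimates then come from inspecting the cells adjacent to a point's cell, rather than searching the whole cloud.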
First order algorithms in variational image processing
Variational methods in imaging have nowadays developed into a quite
universal and flexible tool, allowing for highly successful approaches to tasks
like denoising, deblurring, inpainting, segmentation, super-resolution,
disparity, and optical flow estimation. The overall structure of such
approaches is of the form $\mathcal{D}(Ku) + \alpha \mathcal{R}(u) \to \min_u$, where the functional $\mathcal{D}$ is a data fidelity term also
depending on some input data $f$ and measuring the deviation of $Ku$ from such $f$,
and $\mathcal{R}$ is a regularization functional. Moreover, $K$ is a (often linear)
forward operator modeling the dependence of the data on an underlying image, and
$\alpha$ is a positive regularization parameter. While $\mathcal{D}$ is often
smooth and (strictly) convex, the current practice almost exclusively uses
nonsmooth regularization functionals. The majority of successful techniques
uses nonsmooth and convex functionals like the total variation and
generalizations thereof, or $\ell_1$-norms of coefficients arising from scalar
products with some frame system. The efficient solution of such variational
problems in imaging calls for appropriate algorithms. Taking into account the
specific structure as a sum of two very different terms to be minimized,
splitting algorithms are a quite canonical choice. Consequently this field has
revived the interest in techniques like operator splittings or augmented
Lagrangians. Here we shall provide an overview of methods currently developed
and recent results as well as some computational studies providing a comparison
of different methods and also illustrating their success in applications.
Comment: 60 pages, 33 figures
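The splitting idea can be illustrated with the simplest such scheme, forward-backward splitting (ISTA) for the model $\frac12\|Ku-f\|^2 + \alpha\|u\|_1$: a gradient step on the smooth data term alternates with the proximal (shrinkage) step of the nonsmooth regularizer. This is a minimal sketch of the general technique, not one of the implementations benchmarked in the paper:

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1: componentwise shrinkage.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista(K, f, alpha, step, iters=500):
    """Forward-backward splitting for 0.5*||Ku - f||^2 + alpha*||u||_1.

    `step` should satisfy step <= 1 / ||K||^2 for convergence.
    """
    u = np.zeros(K.shape[1])
    for _ in range(iters):
        grad = K.T @ (K @ u - f)  # gradient of the smooth data fidelity
        u = soft_threshold(u - step * grad, step * alpha)  # prox of alpha*||.||_1
    return u
```

For $K = I$ this reduces to plain soft-thresholding of the data, which gives a quick sanity check of the iteration.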
Cosmological constraints from multiple tracers in spectroscopic surveys
We use the Fisher matrix formalism to study the expansion and growth history
of the Universe using galaxy clustering with 2D angular cross-correlation
tomography in spectroscopic or high resolution photometric redshift surveys.
The radial information is contained in the cross correlations between narrow
redshift bins. We show how multiple tracers with redshift space distortions
cancel sample variance and arbitrarily improve the constraints on the dark
energy equation of state $w$ and the growth parameter $\gamma$ in the
noiseless limit. The improvement for multiple tracers quickly increases with
the bias difference between the tracers. We model a magnitude-limited survey
with realistic density and bias using a conditional luminosity function,
finding a factor 1.3-9.0 improvement in the figure of merit -- depending on
global density -- with a split in a halo mass proxy. Partly overlapping
redshift bins further improve the constraints in multiple-tracer surveys.
These findings also apply to photometric surveys, where the effect of using
multiple tracers is magnified. We also show a large improvement in the FoM
with increasing density, which could be used as a trade-off to compensate for
some possible loss of radial resolution.
Comment: 20 pages, 15 figures
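Sample-variance cancellation can be seen in a toy Monte Carlo: two tracers with different biases share the same random density mode, and the growth rate is recovered from their ratio, in which the mode cancels. The single-mode setup, Gaussian noise, and parameter values below are illustrative assumptions, not the paper's Fisher-matrix computation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed illustrative values: two tracer biases, growth rate, and
# line-of-sight angle mu entering the redshift-space factor (b + f*mu^2).
b1, b2, f_true, mu = 1.0, 2.0, 0.8, 0.6

def estimate_f(noise, n_modes=2000):
    """Scatter of the growth rate recovered from the two-tracer ratio.

    Each mode shares one random density delta (the sample variance);
    the ratio of the two tracers cancels delta, so in the noiseless
    limit f is recovered exactly, mode by mode.
    """
    estimates = []
    for _ in range(n_modes):
        delta = rng.normal()  # shared density mode
        d1 = (b1 + f_true * mu**2) * delta + noise * rng.normal()
        d2 = (b2 + f_true * mu**2) * delta + noise * rng.normal()
        r = d1 / d2  # delta cancels in the ratio
        estimates.append((r * b2 - b1) / (mu**2 * (1.0 - r)))
    return float(np.std(estimates))
```

In the noiseless limit the scatter vanishes up to rounding, while any shot noise leaves a finite scatter -- the qualitative behaviour behind constraints that improve without bound as the noise goes to zero.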