Scalable Kernelization for Maximum Independent Sets
The most efficient algorithms for finding maximum independent sets in both
theory and practice use reduction rules to obtain a much smaller problem
instance called a kernel. The kernel can then be solved quickly using exact or
heuristic algorithms---or by repeatedly kernelizing recursively in the
branch-and-reduce paradigm. It is of critical importance for these algorithms
that kernelization is fast and returns a small kernel. Current algorithms are
either slow but produce a small kernel, or fast and give a large kernel. We
attempt to accomplish both of these goals simultaneously, by giving an
efficient parallel kernelization algorithm based on graph partitioning and
parallel bipartite maximum matching. We combine our parallelization techniques
with two techniques to accelerate kernelization further: dependency checking
that prunes reductions that cannot be applied, and reduction tracking that
allows us to stop kernelization when reductions become less fruitful. Our
algorithm produces kernels that are orders of magnitude smaller than the
fastest kernelization methods, while having a similar execution time.
Furthermore, our algorithm is able to compute kernels with size comparable to
the smallest known kernels, but up to two orders of magnitude faster than
previously possible. Finally, we show that our kernelization algorithm can be
used to accelerate existing state-of-the-art heuristic algorithms, allowing us
to find larger independent sets faster on large real-world networks and
synthetic instances.
Comment: Extended version
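To make the reduction-rule idea concrete, the following is a minimal sketch of a kernelization pass that applies only the classic degree-0 and degree-1 reductions. It is an illustrative assumption, not the paper's parallel algorithm (which uses many more reductions, dependency checking, and reduction tracking); it only shows how safe reductions shrink the instance while preserving an optimal independent set.

```python
def kernelize(adj):
    """Kernelize for maximum independent set with degree-0/1 rules only.

    adj: dict mapping vertex -> set of neighbors (undirected graph).
    Returns (kernel, forced): the reduced graph and the vertices that can
    safely be placed in some maximum independent set.
    """
    adj = {v: set(ns) for v, ns in adj.items()}  # defensive copy
    forced = []
    changed = True
    while changed:
        changed = False
        for v in list(adj):
            if v not in adj:
                continue  # already removed this sweep
            if len(adj[v]) <= 1:
                # Degree-0/1 rule: v belongs to some maximum independent
                # set, so take v and delete v plus its (at most one) neighbor.
                forced.append(v)
                for u in list(adj[v]) + [v]:
                    for w in adj.pop(u, set()):
                        adj[w].discard(u)
                changed = True
    return adj, forced
```

On a path 1-2-3, the rule fires twice and the kernel is empty with {1, 3} forced into the solution; on a triangle, no rule applies and the kernel equals the input, which is where exact or heuristic solvers (or further reductions) take over.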
Numerical Approaches for Linear Left-invariant Diffusions on SE(2), their Comparison to Exact Solutions, and their Applications in Retinal Imaging
Left-invariant PDE-evolutions on the roto-translation group (and
their resolvent equations) have been widely studied in the fields of cortical
modeling and image analysis. They include hypo-elliptic diffusion (for contour
enhancement) proposed by Citti & Sarti, and Petitot, and they include the
direction process (for contour completion) proposed by Mumford. This paper
presents a thorough study and comparison of the many numerical approaches,
which, remarkably, is missing in the literature. Existing numerical approaches
can be classified into 3 categories: finite difference methods, Fourier-based
methods (equivalent to SE(2)-Fourier methods), and stochastic methods (Monte
Carlo simulations). There are also 3 types of exact solutions to the
PDE-evolutions that were derived explicitly (in the spatial Fourier domain) in
previous works by Duits and van Almsick in 2005. Here we provide an overview of
these 3 types of exact solutions and explain how they relate to each of the 3
numerical approaches. We compute relative errors of all numerical approaches to
the exact solutions, and the Fourier based methods show us the best performance
with smallest relative errors. We also provide an improvement of Mathematica
algorithms for evaluating Mathieu-functions, crucial in implementations of the
exact solutions. Furthermore, we include an asymptotical analysis of the
singularities within the kernels and we propose a probabilistic extension of
underlying stochastic processes that overcomes the singular behavior at the
origin of time-integrated kernels. Finally, we show retinal imaging
applications of combining left-invariant PDE-evolutions with invertible
orientation scores.
Comment: A final and corrected version of the manuscript is published in
Numerical Mathematics: Theory, Methods and Applications (NM-TMA), vol. 9,
p. 1-50, 201
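To make the "stochastic methods (Monte Carlo simulations)" category concrete, here is a minimal sketch of sampling Mumford's direction process on SE(2): a particle is transported at unit speed along its orientation while the orientation undergoes Brownian motion. The Euler discretization and the parameter names (`sigma`, `T`, `n_steps`) are illustrative assumptions, not the paper's implementation; histogramming many endpoints approximates the contour-completion kernel.

```python
import math
import random

def sample_endpoint(sigma, T, n_steps=100):
    """One sample path of the direction process, started at (x, y, theta) = (0, 0, 0).

    sigma: diffusion strength of the orientation (Brownian motion on theta).
    T: traveling time (unit-speed spatial transport).
    Returns the endpoint (x, y, theta) in SE(2) coordinates.
    """
    dt = T / n_steps
    x = y = theta = 0.0
    for _ in range(n_steps):
        # Orientation diffuses as Brownian motion ...
        theta += sigma * math.sqrt(dt) * random.gauss(0.0, 1.0)
        # ... while position is transported along the current orientation.
        x += math.cos(theta) * dt
        y += math.sin(theta) * dt
    return x, y, theta
```

Averaging indicator functions of the endpoint over many sampled paths gives a Monte Carlo estimate of the kernel, which is exactly the class of numerical approach the abstract compares against finite-difference and Fourier-based methods.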
Langevin and Hamiltonian based Sequential MCMC for Efficient Bayesian Filtering in High-dimensional Spaces
Nonlinear non-Gaussian state-space models arise in numerous applications in
statistics and signal processing. In this context, one of the most successful
and popular approximation techniques is the Sequential Monte Carlo (SMC)
algorithm, also known as particle filtering. Nevertheless, this method tends to
be inefficient when applied to high dimensional problems. In this paper, we
focus on another class of sequential inference methods, namely the Sequential
Markov Chain Monte Carlo (SMCMC) techniques, which represent a promising
alternative to SMC methods. After providing a unifying framework for the class
of SMCMC approaches, we propose novel efficient strategies based on the
principle of Langevin diffusion and Hamiltonian dynamics in order to cope with
the increasing number of high-dimensional applications. Simulation results show
that the proposed algorithms achieve significantly better performance compared
to existing algorithms.
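As a minimal sketch of the Langevin ingredient, the following implements one Metropolis-adjusted Langevin (MALA) update for a scalar state: the proposal drifts along the gradient of the log-target, which is what helps steer moves toward high-probability regions in high dimensions. The paper embeds such gradient-informed proposals inside SMCMC for filtering; this standalone step, with user-supplied `log_target` and `grad_log_target`, only illustrates the proposal and acceptance mechanics.

```python
import math
import random

def mala_step(x, log_target, grad_log_target, eps):
    """One MALA update for a scalar state x; returns the next state."""
    g = grad_log_target(x)
    # Langevin proposal: gradient drift plus Gaussian noise.
    y = x + 0.5 * eps**2 * g + eps * random.gauss(0.0, 1.0)
    gy = grad_log_target(y)
    # The proposal is asymmetric, so both directions enter the ratio.
    log_q_xy = -((y - x - 0.5 * eps**2 * g) ** 2) / (2 * eps**2)
    log_q_yx = -((x - y - 0.5 * eps**2 * gy) ** 2) / (2 * eps**2)
    log_alpha = log_target(y) - log_target(x) + log_q_yx - log_q_xy
    if math.log(random.random()) < log_alpha:
        return y  # accept
    return x      # reject, keep current state
```

Within an SMCMC filter, steps like this replace the blind random-walk kernel; the Hamiltonian variants discussed in the paper go further by simulating several leapfrog steps of gradient-driven dynamics per proposal.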