On the General Chain Pair Simplification Problem
The Chain Pair Simplification problem (CPS) was posed by Bereg et al., who were motivated by the problem of efficiently computing and visualizing the structural resemblance between a pair of protein backbones. In this problem, given two polygonal chains of lengths n and m, the goal is to simplify both of them simultaneously, so that the lengths of the resulting simplifications as well as the discrete Fréchet distance between them are bounded. When the vertices of the simplifications are arbitrary (i.e., not necessarily from the original chains), the problem is called General CPS (GCPS).
In this paper we consider for the first time the complexity of GCPS under both the discrete Fréchet distance (GCPS-3F) and the Hausdorff distance (GCPS-2H). (In the former version, the quality of the two simplifications is measured by the discrete Fréchet distance, and in the latter version it is measured by the Hausdorff distance.) We prove that GCPS-3F is polynomially solvable, by presenting an Õ((n+m)^6 min{n,m})-time algorithm for the corresponding minimization problem. We also present an O((n+m)^4)-time 2-approximation algorithm for the problem. On the other hand, we show that GCPS-2H is NP-complete, and present an approximation algorithm for the problem.
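The discrete Fréchet distance that bounds the quality of the simplifications can itself be computed by the classic Eiter–Mannila dynamic program in O(nm) time. A minimal sketch (function names are ours, not from the paper):

```python
from math import dist  # Euclidean distance between points (Python 3.8+)

def discrete_frechet(p, q):
    """Discrete Frechet distance between polygonal chains p and q
    (lists of 2D points), via the standard O(n*m) dynamic program:
    ca[i][j] is the distance between the prefixes p[:i+1] and q[:j+1]."""
    n, m = len(p), len(q)
    ca = [[0.0] * m for _ in range(n)]
    for i in range(n):
        for j in range(m):
            d = dist(p[i], q[j])
            if i == 0 and j == 0:
                ca[i][j] = d
            elif i == 0:
                ca[i][j] = max(ca[i][j - 1], d)
            elif j == 0:
                ca[i][j] = max(ca[i - 1][j], d)
            else:
                # both frogs may advance, or only one of them
                ca[i][j] = max(min(ca[i - 1][j], ca[i][j - 1],
                                   ca[i - 1][j - 1]), d)
    return ca[n - 1][m - 1]
```

For two parallel unit segments at distance 1, `discrete_frechet([(0, 0), (1, 0)], [(0, 1), (1, 1)])` returns 1.0.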
SAT Solving for Argument Filterings
This paper introduces a propositional encoding for lexicographic path orders
in connection with dependency pairs. This facilitates the application of SAT
solvers for termination analysis of term rewrite systems based on the
dependency pair method. We address two main inter-related issues and encode
them as satisfiability problems of propositional formulas that can be
efficiently handled by SAT solving: (1) the combined search for a lexicographic
path order together with an \emph{argument filtering} to orient a set of
inequalities; and (2) how the choice of the argument filtering influences the
set of inequalities that have to be oriented. We have implemented our
contributions in the termination prover AProVE. Extensive experiments show that
by our encoding and the application of SAT solvers one obtains speedups in
orders of magnitude as well as increased termination proving power.
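For intuition about the order being searched for, the following is a hedged sketch of the strict lexicographic path order (LPO) over first-order terms; the term representation and names are ours, and the paper's SAT encoding searches for a precedence (and argument filtering) rather than checking a fixed one as done here:

```python
def occurs(x, s):
    """Does variable x occur in term s? A variable is a plain string;
    an application is (fun_name, (arg1, ..., argk)). Arities are fixed."""
    if isinstance(s, str):
        return s == x
    return any(occurs(x, a) for a in s[1])

def lpo_gt(s, t, prec):
    """Strict lexicographic path order s >_lpo t, where prec maps each
    function symbol to its precedence rank (higher rank wins)."""
    if isinstance(t, str):              # t is a variable
        return s != t and occurs(t, s)
    if isinstance(s, str):              # a variable is never > a non-variable
        return False
    f, ss = s
    g, ts = t
    # case 1: some argument of s already dominates (or equals) t
    if any(si == t or lpo_gt(si, t, prec) for si in ss):
        return True
    # case 2: f > g in the precedence and s dominates every argument of t
    if prec[f] > prec[g]:
        return all(lpo_gt(s, tj, prec) for tj in ts)
    # case 3: f = g; compare the argument tuples lexicographically
    if f == g:
        for si, ti in zip(ss, ts):
            if si == ti:
                continue
            return lpo_gt(si, ti, prec) and all(lpo_gt(s, tj, prec) for tj in ts)
    return False
```

With the precedence * > +, the order proves the distributivity rule decreasing: `lpo_gt(('*', ('x', ('+', ('y', 'z')))), ('+', (('*', ('x', 'y')), ('*', ('x', 'z')))), {'*': 2, '+': 1})` is True.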
Context unification is in PSPACE
Contexts are terms with one `hole', i.e. a place in which we can substitute
an argument. In context unification we are given an equation over terms with
variables representing contexts and ask about the satisfiability of this
equation. Context unification is a natural subvariant of second-order
unification, which is undecidable, and a generalization of word equations,
which are decidable, at the same time. It is the unique problem between those two whose decidability has remained open (for almost two decades now). In this paper we show that context unification is in PSPACE. The result holds under the (usual) assumption that the first-order signature is finite.
This result is obtained by extending the recompression technique, recently developed by the author and used in particular to obtain a new PSPACE algorithm for satisfiability of word equations, to context unification. Recompression is based on performing simple compression rules (replacing pairs of neighbouring function symbols), which are (conceptually) applied to the solution of the context equation; the equation is modified so that these compression steps can in fact be performed directly on the equation, without knowledge of the actual solution.
Comment: 27 pages, submitted, small notation changes and small improvements over the previous text
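As a toy illustration of the compression step, here is pair compression on a plain string rather than on terms or equations; it omits the block compression of repeated letters that the full technique also performs, and all names are ours:

```python
from itertools import count

def compress_pairs(s):
    """Toy pair compression: repeatedly pick a pair of distinct adjacent
    letters and replace every occurrence with a fresh letter, until no
    such pair remains. In recompression the same steps are (conceptually)
    applied to the solution of an equation, while the equation itself is
    modified so the steps can be performed on it directly."""
    fresh = (chr(c) for c in count(0x100))  # fresh letters outside ASCII
    rounds = []
    while True:
        # pick the first adjacent pair of *distinct* letters, if any
        pair = next(((a, b) for a, b in zip(s, s[1:]) if a != b), None)
        if pair is None:
            break
        s = s.replace(pair[0] + pair[1], next(fresh))
        rounds.append(s)
    return rounds
```

Each round strictly shortens the string, which is the source of the technique's complexity bounds: "abab" compresses to a two-letter string in one round, and "abc" to a single letter in two.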
On Embeddability of Buses in Point Sets
Set membership of points in the plane can be visualized by connecting corresponding points via graphical features such as paths, trees, polygons, or ellipses. In this paper we study the \emph{bus embeddability problem} (BEP):
given a set of colored points we ask whether there exists a planar realization
with one horizontal straight-line segment per color, called bus, such that all
points with the same color are connected with vertical line segments to their
bus. We present an ILP and an FPT algorithm for the general problem. For
restricted versions of this problem, such as when the relative order of buses
is predefined, or when a bus must be placed above all its points, we provide
efficient algorithms. We show that another restricted version of the problem
can be solved using 2-stack pushall sorting. On the negative side we prove the
NP-completeness of a special case of BEP.
Comment: 19 pages, 9 figures, conference version at GD 201
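To make the drawing convention concrete, the sketch below checks one necessary planarity condition for a candidate bus placement: no vertical point-to-bus connector may cross a bus of another color. This is our own illustration, not the paper's ILP or FPT algorithm, and it ignores connector–connector and bus–bus intersections:

```python
def valid_placement(points, buses):
    """One necessary condition for a planar BEP drawing.
    points: list of (x, y, color); buses: color -> (x_left, x_right, y_bus),
    each bus being a horizontal segment. A point is wired to its bus by a
    vertical segment, which must not cross any other color's bus."""
    for (x, y, color) in points:
        xl, xr, yb = buses[color]
        if not (xl <= x <= xr):          # point must project onto its own bus
            return False
        lo, hi = sorted((y, yb))
        for other, (oxl, oxr, oy) in buses.items():
            if other == color:
                continue
            # a foreign bus strictly between point and bus, spanning x, crosses
            if lo < oy < hi and oxl <= x <= oxr:
                return False
    return True
```

For example, lowering a bus so that another color's bus separates it from its points makes the check fail.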
The persistent cosmic web and its filamentary structure I: Theory and implementation
We present DisPerSE, a novel approach to the coherent multi-scale
identification of all types of astrophysical structures, and in particular the
filaments, in the large scale distribution of matter in the Universe. This
method and the corresponding piece of software allow a genuinely scale-free and parameter-free identification of the voids, walls, filaments, clusters and
their configuration within the cosmic web, directly from the discrete
distribution of particles in N-body simulations or galaxies in sparse
observational catalogues. To achieve that goal, the method works directly over
the Delaunay tessellation of the discrete sample and uses the DTFE density
computed at each tracer particle; no further sampling, smoothing or processing
of the density field is required.
The idea is based on recent advances in distinct sub-domains of computational
topology, which allow a rigorous application of topological principles to
astrophysical data sets, taking into account uncertainties and Poisson noise.
Practically, the user can define a given persistence level in terms of
robustness with respect to noise (defined as a "number of sigmas") and the
algorithm returns the structures with the corresponding significance as sets of
critical points, lines, surfaces and volumes corresponding to the clusters, filaments, walls and voids, with filaments connected at the cluster nodes and crawling along the edges of the walls that bound the voids. The method is also interesting as
it allows for a robust quantification of the topological properties of a
discrete distribution in terms of Betti numbers or Euler characteristics,
without having to resort to smoothing or having to define a particular scale.
In this paper, we introduce the necessary mathematical background and describe the method and its implementation, deferring the application to 3D simulated and observed data sets to the companion paper.
Comment: A higher resolution version is available at http://www.iap.fr/users/sousbie together with complementary material. Submitted to MNRA
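The persistence-cut idea can be illustrated in the simplest possible setting, 0-dimensional persistence of the maxima of a 1D sampled field (DisPerSE itself works on the Delaunay tessellation in 3D); this union-find sketch is ours:

```python
def peak_persistence(density):
    """0-dim persistence of the maxima of a 1D sampled field: sweep values
    from high to low, merging neighbouring components with union-find; when
    two components meet, the one with the lower-valued peak dies there.
    Returns (birth, death) per component; the global maximum never dies
    (death=None). Thresholding birth - death mimics the persistence cut."""
    order = sorted(range(len(density)), key=lambda i: -density[i])
    parent = {}

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path halving
            i = parent[i]
        return i

    birth, pairs = {}, []
    for i in order:
        parent[i] = i
        birth[i] = density[i]
        for j in (i - 1, i + 1):
            if j in parent:
                ri, rj = find(i), find(j)
                if ri != rj:
                    # the component with the lower peak dies at this level
                    lo, hi = sorted((ri, rj), key=lambda r: birth[r])
                    pairs.append((birth[lo], density[i]))
                    parent[lo] = hi
    pairs.append((birth[find(order[0])], None))
    return pairs
```

Non-peak samples produce zero-persistence pairs, so filtering by a positive persistence threshold keeps only the genuine maxima, e.g. two peaks for the profile [0, 3, 1, 2, 0].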
A Logical Approach to Efficient Max-SAT solving
Weighted Max-SAT is the optimization version of SAT and many important
problems can be naturally encoded as such. Solving weighted Max-SAT is an
important problem from both a theoretical and a practical point of view. In
recent years, there has been considerable interest in finding efficient solving
techniques. Most of this work focuses on the computation of good-quality lower bounds to be used within a branch-and-bound DPLL-like algorithm. Most often, these lower bounds are described in a procedural way, which makes it difficult to discern the {\em logic} behind them.
In this paper we introduce an original framework for Max-SAT that stresses
the parallelism with classical SAT. Then, we extend the two basic SAT solving
techniques: {\em search} and {\em inference}. We show that many algorithmic
{\em tricks} used in state-of-the-art Max-SAT solvers are easily expressible in
{\em logic} terms with our framework in a unified manner.
Besides, we introduce an original search algorithm that performs a restricted
amount of {\em weighted resolution} at each visited node. We empirically
compare our algorithm with a variety of solving alternatives on several
benchmarks. Our experiments, which constitute, to the best of our knowledge, the most comprehensive Max-SAT evaluation ever reported, show that our algorithm is generally orders of magnitude faster than any competitor.
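The branch-and-bound skeleton such solvers share can be sketched in a few lines; the lower bound below is just the weight already lost on the current partial assignment, a naive stand-in for the weighted-resolution-based bounds the paper develops (all names are ours):

```python
def max_sat(clauses, n_vars):
    """Tiny branch-and-bound for weighted Max-SAT.
    clauses: list of (weight, [lits]) with literals as +/-var (1-based).
    Returns the minimum total weight of falsified clauses."""
    best = sum(w for w, _ in clauses)  # trivial upper bound: falsify all

    def cost(assign):
        # weight of clauses already falsified by the partial assignment
        # (a clause counts only once all of its literals are decided false)
        total = 0
        for w, lits in clauses:
            if all(abs(l) in assign and assign[abs(l)] != (l > 0) for l in lits):
                total += w
        return total

    def branch(var, assign):
        nonlocal best
        lb = cost(assign)          # monotone: can only grow as we extend
        if lb >= best:             # prune: cannot improve the incumbent
            return
        if var > n_vars:
            best = lb              # leaf: lb is the exact cost
            return
        for value in (True, False):
            assign[var] = value
            branch(var + 1, assign)
            del assign[var]

    branch(1, {})
    return best
```

For the contradictory instance {(1, x1), (1, ¬x1), (2, x1)} the optimum loses weight 1 by setting x1 true, and a satisfiable instance yields 0.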