Truth Table Invariant Cylindrical Algebraic Decomposition by Regular Chains
A new algorithm to compute cylindrical algebraic decompositions (CADs) is
presented, building on two recent advances. Firstly, the output is truth table
invariant (a TTICAD) meaning given formulae have constant truth value on each
cell of the decomposition. Secondly, the computation uses regular chains theory
to first build a cylindrical decomposition of complex space (CCD) incrementally
by polynomial. Significant modification of the regular chains technology was
used to achieve the more sophisticated invariance criteria. Experimental
results on an implementation in the RegularChains Library for Maple verify that
combining these advances gives an algorithm superior to its individual
components and competitive with the state of the art.
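Truth table invariance can be illustrated in one dimension, where a cylindrical decomposition is simply a partition of the real line at polynomial roots. The sketch below is a hypothetical one-dimensional analogue, not the paper's algorithm: it decomposes the line at the real roots of two polynomials and checks that each formula keeps a constant truth value on every open cell.

```python
import math

# Hypothetical 1-D analogue of truth-table invariance: split the real
# line at the roots of the polynomials, so each formula (here f(x) > 0)
# has a constant truth value on every open cell of the decomposition.
polys = [lambda x: x**2 - 2, lambda x: x - 1]
roots = sorted([-math.sqrt(2), 1.0, math.sqrt(2)])

# Open cells between consecutive roots, plus the two unbounded ends
# (truncated at +/-10 for sampling purposes).
bounds = [-10.0] + roots + [10.0]
cells = list(zip(bounds, bounds[1:]))

def truth_table(x):
    # Truth values of the formulae f1(x) > 0 and f2(x) > 0 at x.
    return tuple(p(x) > 0 for p in polys)

# Sampling several points inside each cell shows the truth table
# is invariant on that cell.
for lo, hi in cells:
    samples = [lo + (hi - lo) * t for t in (0.25, 0.5, 0.75)]
    tables = {truth_table(x) for x in samples}
    assert len(tables) == 1
```

In higher dimensions the real work lies in choosing projection polynomials (or, as here, regular chains over complex space) so that this cell-wise invariance is guaranteed rather than checked by sampling.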
Algorithmic Thomas Decomposition of Algebraic and Differential Systems
In this paper, we consider systems of algebraic and non-linear partial
differential equations and inequations. We decompose these systems into
so-called simple subsystems and thereby partition the set of solutions. For
algebraic systems, simplicity means triangularity, square-freeness and
non-vanishing initials. Differential simplicity extends algebraic simplicity
with involutivity. We build upon the constructive ideas of J. M. Thomas and
develop them into a new algorithm for disjoint decomposition. This paper
is a revised version of an earlier one and includes proofs of correctness
and termination of our decomposition algorithm. In addition, we illustrate the
algorithm with further instructive examples and describe its Maple
implementation together with an experimental comparison to some other
triangular decomposition algorithms.
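One ingredient of algebraic simplicity, square-freeness, is easy to test in the univariate case: a polynomial p is square-free iff gcd(p, p') is constant. The following is a minimal self-contained sketch of that test over the rationals (the helper names are ours, not from the paper's Maple implementation); polynomials are coefficient lists, lowest degree first.

```python
from fractions import Fraction

def derivative(p):
    # p is a list of coefficients, lowest degree first.
    return [Fraction(i) * c for i, c in enumerate(p)][1:]

def polydiv_rem(a, b):
    # Remainder of a divided by b over the rationals (b has a
    # nonzero leading coefficient).
    a = [Fraction(c) for c in a]
    while len(a) >= len(b) and any(a):
        while a and a[-1] == 0:     # strip trailing zero coefficients
            a.pop()
        if len(a) < len(b):
            break
        factor = a[-1] / b[-1]
        shift = len(a) - len(b)
        for i, c in enumerate(b):
            a[shift + i] -= factor * c
        a.pop()                     # leading term is now zero
    return a

def poly_gcd(a, b):
    # Euclidean algorithm on coefficient lists.
    while any(c != 0 for c in b):
        a, b = b, polydiv_rem(a, b)
        while b and b[-1] == 0:
            b.pop()
    return a

def is_square_free(p):
    g = poly_gcd([Fraction(c) for c in p], derivative(p))
    return len(g) <= 1              # constant gcd => square-free

# (x - 1)^2 = x^2 - 2x + 1 is not square-free; x^2 - 2 is.
assert not is_square_free([1, -2, 1])
assert is_square_free([-2, 0, 1])
```

In a triangular system this test is applied modulo the lower equations of the chain, and the other two simplicity conditions, triangularity and non-vanishing initials, are structural properties of the chain itself.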
The complexity of the normal surface solution space
Normal surface theory is a central tool in algorithmic three-dimensional
topology, and the enumeration of vertex normal surfaces is the computational
bottleneck in many important algorithms. However, it is not well understood how
the number of such surfaces grows in relation to the size of the underlying
triangulation. Here we address this problem in both theory and practice. In
theory, we tighten the exponential upper bound substantially; furthermore, we
construct pathological triangulations that prove an exponential bound to be
unavoidable. In practice, we undertake a comprehensive analysis of millions of
triangulations and find that in general the number of vertex normal surfaces is
remarkably small, with strong evidence that our pathological triangulations may
in fact be the worst-case scenarios. This analysis is the first of its kind,
and the striking behaviour that we observe has important implications for the
feasibility of topological algorithms in three dimensions.