Fast minimal triangulation algorithm using minimum degree criterion
We propose an algorithm for minimal triangulation which, using a simple and efficient strategy, subdivides the input graph into different, almost non-overlapping subgraphs. Using the technique of matrix multiplication for saturating the minimal separators, we show that the partition of the graph can be computed in time O(n^α), where n^α is the time required by binary matrix multiplication. After saturating the minimal separators, the same procedure is applied recursively on each subgraph. We also present a variant of the algorithm in which the minimum degree criterion is used. In this way, we obtain an algorithm that uses the minimum degree criterion and at the same time produces a minimal triangulation, thus shedding new light on the effectiveness of the minimum degree heuristic.
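For orientation, a minimal Python sketch of the classic minimum-degree elimination heuristic the abstract refers to. This is not the paper's algorithm: plain minimum-degree elimination produces a triangulation (the graph plus the returned fill edges is chordal) but, unlike the algorithm above, does not guarantee that the triangulation is minimal.

```python
from itertools import combinations

def min_degree_fill_in(adj):
    """Eliminate vertices in minimum-degree order, collecting fill edges.

    adj: dict mapping each vertex to the set of its neighbours (undirected).
    Returns the set of fill edges whose addition makes the graph chordal.
    """
    adj = {v: set(nbrs) for v, nbrs in adj.items()}  # work on a copy
    fill = set()
    while adj:
        # the heuristic: pick a vertex of minimum current degree
        v = min(adj, key=lambda u: len(adj[u]))
        nbrs = adj.pop(v)
        # saturate v's neighbourhood, i.e. turn it into a clique
        for a, b in combinations(sorted(nbrs), 2):
            if b not in adj[a]:
                adj[a].add(b)
                adj[b].add(a)
                fill.add((a, b))
        for u in nbrs:
            adj[u].discard(v)
    return fill
```

On a 4-cycle a-b-c-d the heuristic adds exactly one chord; on a triangle it adds none.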
Tree decompositions with small cost
The f-cost of a tree decomposition ({X_i | i ∈ I}, T = (I, F))
for a function f : N → R+ is defined as Σ_{i∈I} f(|X_i|). This measure
corresponds to the running time or memory use of some algorithms
that use the tree decomposition. In this paper we investigate the
problem of finding tree decompositions of minimum f-cost.
A function f : N → R+ is fast if for every i ∈ N: f(i+1) ≥ 2·f(i).
We show that for fast functions f, every graph G has a tree decomposition
of minimum f-cost that corresponds to a minimal triangulation
of G; if f is not fast, this does not hold. We give polynomial-time
algorithms for the problem, assuming f is a fast function, for graphs
that have a polynomial number of minimal separators, for graphs of
treewidth at most two, and for cographs, and we show that the problem
is NP-hard for bipartite graphs and for cobipartite graphs.
We also discuss results for a weighted variant of the problem derived
from an application in probabilistic networks.
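The two definitions above are easy to make concrete. A minimal Python sketch (helper names are my own, not the paper's): the f-cost sums f over the bag sizes, and the "fast" condition can be checked pointwise on an initial range.

```python
def f_cost(bags, f):
    """f-cost of a tree decomposition: sum of f(|X_i|) over all bags X_i.

    bags: iterable of vertex sets (the bags X_i); the tree structure T
    does not enter the cost, so it is omitted here.
    """
    return sum(f(len(bag)) for bag in bags)

def is_fast(f, limit=20):
    """Check the 'fast' condition f(i+1) >= 2*f(i) for i = 1..limit-1."""
    return all(f(i + 1) >= 2 * f(i) for i in range(1, limit))
```

For example, f(k) = 2^k is fast (each step exactly doubles), whereas f(k) = k^2 is not, since f(4) = 16 < 2·f(3) = 18.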
Computing hypergraph width measures exactly
Hypergraph width measures are a class of hypergraph invariants important in
studying the complexity of constraint satisfaction problems (CSPs). We present
a general exact exponential algorithm for a large variety of these measures. A
connection between these measures and tree decompositions is established. This enables
us to almost seamlessly adapt the combinatorial and algorithmic results known
for tree decompositions of graphs to the case of hypergraphs and obtain fast
exact algorithms.
As a consequence, we provide algorithms which, given a hypergraph H on n
vertices and m hyperedges, compute the generalized hypertree-width of H in time
O*(2^n) and compute the fractional hypertree-width of H in time
O(m*1.734601^n).
Comment: 12 pages, 1 figure
Euclidean Dynamical Triangulation revisited: is the phase transition really 1st order? (extended version)
The transition between the two phases of 4D Euclidean Dynamical Triangulation
[1] was long believed to be of second order until in 1996 first order behavior
was found for sufficiently large systems [5,9]. However, one may wonder if this
finding was affected by the numerical methods used: to control volume
fluctuations, in both studies [5,9] an artificial harmonic potential was added
to the action; in [9] measurements were taken after a fixed number of accepted
instead of attempted moves which introduces an additional error. Finally the
simulations suffer from strong critical slowing down which may have been
underestimated. In the present work, we address the above weaknesses: we allow
the volume to fluctuate freely within a fixed interval; we take measurements
after a fixed number of attempted moves; and we overcome critical slowing down
by using an optimized parallel tempering algorithm [12]. With these improved
methods, on systems of size up to 64k 4-simplices, we confirm that the phase
transition is first order.
In addition, we discuss a local criterion to decide whether parts of a
triangulation are in the elongated or crumpled state and describe a new
correspondence between EDT and the balls in boxes model. The latter gives rise
to a modified partition function with an additional, third coupling. Finally,
we propose and motivate a class of modified path-integral measures that might
remove the metastability of the Markov chain and turn the phase transition into
second order.
Comment: 26 pages, 21 figures, extended version of arXiv:1311.471
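The parallel tempering (replica exchange) step used to overcome critical slowing down has a standard form, sketched below for a generic energy function; this is the textbook swap criterion, not the paper's EDT-specific implementation, and the function name is my own.

```python
import math
import random

def attempt_swaps(betas, energies, states):
    """One sweep of replica-exchange swap attempts between neighbouring temperatures.

    A swap between replicas at inverse temperatures beta_i and beta_{i+1}
    is accepted with probability min(1, exp((beta_i - beta_{i+1}) * (E_i - E_{i+1}))),
    which preserves detailed balance for the joint distribution of all replicas.
    """
    for i in range(len(betas) - 1):
        delta = (betas[i] - betas[i + 1]) * (energies[i] - energies[i + 1])
        if delta >= 0 or random.random() < math.exp(delta):
            states[i], states[i + 1] = states[i + 1], states[i]
            energies[i], energies[i + 1] = energies[i + 1], energies[i]
    return states, energies
```

A swap with delta >= 0 (a hotter replica holding a lower-energy state) is always accepted, which is what lets configurations tunnel between the phases.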
Pre-processing for Triangulation of Probabilistic Networks
The currently most efficient algorithm for inference with a probabilistic
network builds upon a triangulation of a network's graph. In this paper, we
show that pre-processing can help in finding good triangulations
for probabilistic networks, that is, triangulations with a minimal maximum
clique size. We provide a set of rules for stepwise reducing a graph, without
losing optimality. This reduction allows us to solve the triangulation problem
on a smaller graph. From the smaller graph's triangulation, a triangulation of
the original graph is obtained by reversing the reduction steps. Our
experimental results show that the graphs of some well-known real-life
probabilistic networks can be triangulated optimally just by preprocessing; for
other networks, huge reductions in their graphs' size are obtained.
Comment: Appears in Proceedings of the Seventeenth Conference on Uncertainty in Artificial Intelligence (UAI2001)
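One well-known reduction rule of the kind described is the simplicial rule: a vertex whose neighbourhood is a clique can be removed without losing optimality, while a lower bound on the triangulation's maximum clique size is recorded. A minimal Python sketch of that single rule (the paper provides a larger rule set):

```python
def remove_simplicial(adj):
    """Repeatedly remove simplicial vertices (vertices whose neighbourhood is a clique).

    adj: dict mapping each vertex to the set of its neighbours (undirected).
    Returns (reduced graph, lower bound on the maximum clique size of any
    triangulation of the original graph).
    """
    adj = {v: set(n) for v, n in adj.items()}  # work on a copy
    low = 0
    changed = True
    while changed:
        changed = False
        for v in list(adj):
            nbrs = adj[v]
            # v is simplicial iff every pair of its neighbours is adjacent
            if all(b in adj[a] for a in nbrs for b in nbrs if a != b):
                low = max(low, len(nbrs) + 1)  # v together with its clique neighbourhood
                for u in nbrs:
                    adj[u].discard(v)
                del adj[v]
                changed = True
    return adj, low
```

A tree reduces to the empty graph this way (every leaf is simplicial), whereas a chordless 4-cycle has no simplicial vertex and is left untouched.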
Efficient path consistency algorithm for large qualitative constraint networks
We propose a new algorithm called DPC+ to enforce partial path consistency (PPC) on qualitative constraint networks. PPC restricts path consistency (PC) to a triangulation of the underlying constraint graph of a network. As PPC retains the sparseness of a constraint graph, it can make reasoning tasks such as consistency checking and minimal labelling of large qualitative constraint networks much easier to tackle than PC. For qualitative constraint networks defined over any distributive subalgebra of well-known spatio-temporal calculi, such as the Region Connection Calculus and the Interval Algebra, we show that DPC+ can achieve PPC very fast. Indeed, the algorithm enforces PPC on a qualitative constraint network by processing each triangle in a triangulation of its underlying constraint graph at most three times. Our experiments demonstrate significant improvements of DPC+ over the state-of-the-art PPC-enforcing algorithm.
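The core PPC operation is easy to illustrate: for each triangle (i, j, k) of the triangulation, refine R_ij by intersecting it with the composition R_ik ∘ R_kj. A minimal Python sketch over the simple point algebra (base relations <, =, >), not the paper's DPC+ algorithm or its spatio-temporal calculi:

```python
B = ('<', '=', '>')
INV = {'<': '>', '=': '=', '>': '<'}
FULL = set(B)
# composition table for the point algebra
COMP = {
    ('<', '<'): {'<'}, ('<', '='): {'<'}, ('<', '>'): FULL,
    ('=', '<'): {'<'}, ('=', '='): {'='}, ('=', '>'): {'>'},
    ('>', '<'): FULL,  ('>', '='): {'>'}, ('>', '>'): {'>'},
}

def compose(r1, r2):
    """Compose two relations, each given as a set of base relations."""
    out = set()
    for a in r1:
        for b in r2:
            out |= COMP[(a, b)]
    return out

def ppc(rel, triangles):
    """Refine relations by path consistency restricted to the given triangles.

    rel: dict mapping each ordered pair (i, j) to a set of base relations;
    triangles: triples (i, j, k) taken from a triangulation of the constraint graph.
    """
    queue = list(triangles)
    while queue:
        i, j, k = queue.pop()
        for a, b, c in ((i, j, k), (i, k, j), (j, k, i)):
            new = rel[(a, b)] & compose(rel[(a, c)], rel[(c, b)])
            if new != rel[(a, b)]:
                rel[(a, b)] = new
                rel[(b, a)] = {INV[r] for r in new}
                queue.append((i, j, k))  # revisit the triangle until it stabilises
    return rel
```

With x < y and y < z known and x-z unconstrained, processing the single triangle (x, y, z) tightens x-z to {<}, which is the sparseness payoff: only triangulation edges ever need storing and refining.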
Improved Incremental Randomized Delaunay Triangulation
We propose a new data structure to compute the Delaunay triangulation of a
set of points in the plane. It combines good worst case complexity, fast
behavior on real data, and small memory occupation.
The location structure is organized into several levels. The lowest level
just consists of the triangulation, then each level contains the triangulation
of a small sample of the levels below. Point location is done by marching in a
triangulation to determine the nearest neighbor of the query at that level,
then the march restarts from that neighbor at the level below. Using a small
sample (3%) allows a small memory occupation; the march and the use of the
nearest neighbor to change levels quickly locate the query.
Comment: 19 pages, 7 figures. Proc. 14th Annu. ACM Sympos. Comput. Geom., 106--115, 199
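The levelled structure can be sketched in a few lines of Python. This is only the skeleton of the hierarchy under stated simplifications: function names are my own, and a plain scan of each (small) level stands in for the march through that level's triangulation.

```python
import random

def dist2(p, q):
    """Squared Euclidean distance between two points in the plane."""
    return (p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2

def build_levels(points, ratio=0.03, min_top=8):
    """Level 0 keeps every point; each higher level samples ~3% of the one below."""
    levels = [list(points)]
    while int(len(levels[-1]) * ratio) >= min_top:
        levels.append(random.sample(levels[-1], int(len(levels[-1]) * ratio)))
    return levels

def locate(levels, q):
    """Descend the hierarchy, using each level's nearest point to seed the next.

    In the real structure the descent marches through each level's
    triangulation starting from the seed found above; here the march is
    replaced by a scan of the level, so only the level layout is illustrated.
    """
    best = min(levels[-1], key=lambda p: dist2(p, q))  # top level is tiny
    for level in reversed(levels[:-1]):
        best = min(level, key=lambda p: dist2(p, q))  # sketch: march replaced by scan
    return best
```

Because the sample ratio is small, the upper levels stay tiny and cheap to search, while the seed from each level keeps the march at the level below short; that is the source of both the small memory occupation and the fast location.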