The Convex Hull Problem in Practice: Improving the Running Time of the Double Description Method
The double description method is a simple but widely used algorithm for computing the extreme points of polyhedral sets. One key aspect of its implementation is how to efficiently test extreme points for adjacency. This dissertation presents two significant contributions related to adjacency testing. First, the currently used data structures are revisited and various optimizations are proposed; empirical evidence demonstrates their competitiveness. Second, a new adjacency test is introduced. It refines the well-known algebraic test with a technique for avoiding redundant computations; its correctness is formally proven, and experimental results demonstrate its superiority in multiple degenerate scenarios. Parallel computation is a further aspect of the double description method covered in this work: a recently introduced divide-and-conquer technique is revisited and considerable practical limitations are demonstrated.
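To make the adjacency-testing question concrete, here is a minimal Python sketch of the standard combinatorial adjacency test on a toy example (the unit cube). The inequality indexing and helper names are illustrative, not taken from the dissertation's implementation.

```python
from itertools import combinations

# Toy setup: the unit cube [0,1]^3 described by 6 inequalities,
# x_i >= 0 (indices 0..2) and x_i <= 1 (indices 3..5).
vertices = [(a, b, c) for a in (0, 1) for b in (0, 1) for c in (0, 1)]

def tight_set(v):
    """Indices of the inequalities satisfied with equality at vertex v."""
    zeros = {i for i in range(3) if v[i] == 0}      # x_i >= 0 is tight
    ones = {i + 3 for i in range(3) if v[i] == 1}   # x_i <= 1 is tight
    return zeros | ones

def adjacent(u, v, all_vertices):
    """Combinatorial adjacency test: u and v are adjacent iff no third
    vertex w satisfies tight_set(u) & tight_set(v) <= tight_set(w)."""
    common = tight_set(u) & tight_set(v)
    return not any(common <= tight_set(w)
                   for w in all_vertices if w not in (u, v))

# Cube vertices are adjacent exactly when they differ in one coordinate.
edges = [(u, v) for u, v in combinations(vertices, 2)
         if adjacent(u, v, vertices)]
```

The point of the dissertation's optimizations is precisely that this test is executed very many times, so the data structures backing `tight_set` and the containment checks dominate the running time.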
Expansive Motions and the Polytope of Pointed Pseudo-Triangulations
We introduce the polytope of pointed pseudo-triangulations of a point set in
the plane, defined as the polytope of infinitesimal expansive motions of the
points subject to certain constraints on the increase of their distances. Its
1-skeleton is the graph whose vertices are the pointed pseudo-triangulations of
the point set and whose edges are flips of interior pseudo-triangulation edges.
For points in convex position we obtain a new realization of the
associahedron, i.e., a geometric representation of the set of triangulations of
an n-gon, or of the set of binary trees on n vertices, or of many other
combinatorial objects that are counted by the Catalan numbers. By considering
the 1-dimensional version of the polytope of constrained expansive motions we
obtain a second distinct realization of the associahedron as a perturbation of
the positive cell in a Coxeter arrangement.
Our methods produce as a by-product a new proof that every simple polygon or
polygonal arc in the plane has expansive motions, a key step in the proofs of
the Carpenter's Rule Theorem by Connelly, Demaine and Rote (2000) and by
Streinu (2000). This version appeared in "Discrete and Computational Geometry -- The Goodman-Pollack Festschrift" (B. Aronov, S. Basu, J. Pach, M. Sharir, eds.), series "Algorithms and Combinatorics", Springer Verlag, Berlin.
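The Catalan count mentioned above can be checked directly. The following sketch (function name is ours, not from the paper) counts triangulations of a convex n-gon via the standard recurrence: fix one edge of the polygon and sum over the possible apexes of the triangle containing that edge.

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def triangulations(n):
    """Number of triangulations of a convex n-gon, equal to the
    Catalan number C_{n-2}.  Fixing edge (v_1, v_n) and choosing the
    apex v_k of its triangle splits the polygon into a k-gon and an
    (n-k+1)-gon, giving the Catalan recurrence."""
    if n <= 3:
        return 1  # a segment or a triangle has exactly one triangulation
    return sum(triangulations(k) * triangulations(n - k + 1)
               for k in range(2, n))
```

For points in convex position these counts are exactly the numbers of vertices of the associahedron realized in the paper: e.g. a hexagon has 14 triangulations, matching C_4.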
An Efficient Algorithm for Vertex Enumeration of Arrangements
This paper presents a state-of-the-art algorithm for the vertex enumeration
problem of arrangements, based on a newly proposed pivot rule called
the Zero rule. The Zero rule possesses several desirable properties: i) it dispenses
with the objective function; ii) its terminal is unique; iii) we
establish the if-and-only-if condition between the Zero rule and its valid
reverse, which earlier rules do not enjoy; iv) applying the Zero rule
recursively is guaranteed to terminate in as many steps as the dimension of the
input variables. Consequently, given an arbitrary arrangement with
vertices of hyperplanes in , the algorithm's complexity is at
most and can be as low as if it is
a simple arrangement, while Moss' algorithm takes , and
Avis and Fukuda's algorithm goes into a loop or skips vertices because the
if-and-only-if condition between their chosen rule and its valid reverse is
not fulfilled. Systematic and comprehensive experiments confirm that the Zero
rule not only does not fail but is also the most efficient.
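As a baseline for what vertex enumeration of an arrangement means, here is an illustrative brute-force sketch in the plane, where every vertex is the intersection of two non-parallel lines. This is not the paper's Zero-rule algorithm, just the naive two-dimensional reference against which pivot-based methods are an improvement.

```python
from itertools import combinations
from fractions import Fraction

def arrangement_vertices(lines):
    """Enumerate the vertices of a line arrangement in the plane by
    intersecting every pair a1*x + b1*y = c1, a2*x + b2*y = c2.
    Exact rational arithmetic avoids spurious near-duplicate vertices."""
    verts = set()
    for (a1, b1, c1), (a2, b2, c2) in combinations(lines, 2):
        det = a1 * b2 - a2 * b1
        if det == 0:
            continue  # parallel lines contribute no vertex
        x = Fraction(c1 * b2 - c2 * b1, det)
        y = Fraction(a1 * c2 - a2 * c1, det)
        verts.add((x, y))
    return verts

# Four lines in general position: a simple arrangement of n lines
# has exactly n*(n-1)/2 vertices, here C(4,2) = 6.
lines = [(1, 0, 0), (0, 1, 0), (1, 1, 1), (1, -1, 2)]
```

In higher dimensions the analogous brute force solves every d-subset of hyperplanes, which is exactly the combinatorial explosion that pivot rules such as the one proposed here are designed to traverse more cleverly.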
Geometry of the set of quantum correlations
It is well known that correlations predicted by quantum mechanics cannot be
explained by any classical (local-realistic) theory. The relative strength of
quantum and classical correlations is usually studied in the context of Bell
inequalities, but this tells us little about the geometry of the quantum set of
correlations. In other words, we do not have good intuition about what the
quantum set actually looks like. In this paper we study the geometry of the
quantum set using standard tools from convex geometry. We find explicit
examples of rather counter-intuitive features in the simplest non-trivial Bell
scenario (two parties, two inputs and two outputs) and illustrate them using
2-dimensional slice plots. We also show that even more complex features appear
in Bell scenarios with more inputs or more parties. Finally, we discuss the
limitations that the geometry of the quantum set imposes on the task of
self-testing.
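The local bound of the simplest Bell scenario described above can be reproduced by brute force: the classical set is the convex hull of the 16 deterministic strategies, so the maximum of the CHSH expression over the classical set is attained at one of them. A small sketch with our own helper names (not the paper's code):

```python
from itertools import product

def chsh_value(a, b):
    """CHSH expression S = E(0,0) + E(0,1) + E(1,0) - E(1,1) for a
    deterministic strategy with outputs a[x], b[y] in {-1, +1}."""
    E = lambda x, y: a[x] * b[y]
    return E(0, 0) + E(0, 1) + E(1, 0) - E(1, 1)

# Maximize over all 16 deterministic strategies (2 inputs per party).
local_bound = max(chsh_value(a, b)
                  for a in product((-1, 1), repeat=2)
                  for b in product((-1, 1), repeat=2))
# Quantum correlations reach 2*sqrt(2) (Tsirelson's bound), strictly
# above this classical value of 2, which is why slices of the quantum
# set bulge outside the local polytope.
```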
Complexity of some polyhedral enumeration problems
In this thesis we consider the problem of converting the halfspace representation of a polytope to its vertex representation - the Vertex Enumeration problem - as well as various other basic and closely related computational problems about polytopes. The converse problem of converting the vertex representation to the halfspace representation - the Convex Hull problem - is equivalent to vertex enumeration. In Chapter 3 we prove that enumerating the vertices of an unbounded H-polyhedron P is NP-hard even if P has only 0/1 vertices, strengthening a previous result of Khachiyan et al. [KBB+06]. In Chapters 4 to 6 we prove that many other operations on polytopes, such as computing the Minkowski sum, intersection, and projection, are NP-hard for many representations and equivalent to vertex enumeration in many others. In Chapter 7 we prove various hardness results about a cone covering problem, where one wants to check whether a given set of polyhedral cones covers another given set; in general this is an NP-complete problem, and it includes important problems like vertex enumeration and hypergraph transversal as special cases. Finally, in Chapter 8 we relate the complexity of vertex enumeration to graph isomorphism by proving that a certain graph-isomorphism-hard problem is graph-isomorphism-easy if and only if vertex enumeration is graph-isomorphism-easy. We also answer a question of Kaibel and Schwartz about the complexity of checking self-duality of a polytope.
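For intuition about the Vertex Enumeration problem studied here, a minimal pure-Python sketch of the naive H-to-V conversion in two dimensions follows: it solves every 2x2 equality subsystem of the inequality description and keeps the feasible solutions. This exponential enumeration is only a baseline for illustration, not one of the thesis's algorithms.

```python
from itertools import combinations
from fractions import Fraction

def h_to_v_2d(A, b):
    """Vertices of the 2-D H-polytope {x : A x <= b}: for each pair of
    inequalities, solve the 2x2 system with equality (Cramer's rule)
    and keep the solutions satisfying all remaining inequalities."""
    verts = set()
    for i, j in combinations(range(len(A)), 2):
        (a1, b1), (a2, b2) = A[i], A[j]
        det = a1 * b2 - a2 * b1
        if det == 0:
            continue  # parallel constraint normals: no unique solution
        x = Fraction(b[i] * b2 - b[j] * b1, det)
        y = Fraction(a1 * b[j] - a2 * b[i], det)
        if all(p * x + q * y <= r for (p, q), r in zip(A, b)):
            verts.add((x, y))
    return verts

# Unit square: -x <= 0, -y <= 0, x <= 1, y <= 1.
A = [(-1, 0), (0, -1), (1, 0), (0, 1)]
b = [0, 0, 1, 1]
```

The hardness results of the thesis say, roughly, that for unbounded polyhedra no output-efficient replacement for this kind of enumeration exists unless P = NP.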
Characterizing coherence, correcting incoherence
Lower previsions defined on a finite set of gambles can be viewed as points in a finite-dimensional real vector space. Within that vector space, the sets of sure-loss-avoiding and coherent lower previsions form convex polyhedra. We present procedures for obtaining characterizations of these polyhedra in terms of a minimal, finite number of linear constraints. Compared to the previously known procedure, these procedures are more efficient and much more straightforward. Next, we look at a procedure for correcting incoherent lower previsions based on pointwise dominance. This procedure can be formulated as a multi-objective linear program, and the availability of the finite characterizations provides an avenue for making these programs computationally feasible.
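The sure-loss-avoiding polyhedron can be made concrete in the smallest case: on a two-element possibility space, a lower prevision P avoids sure loss iff some probability p = (t, 1-t) satisfies E_p[f] >= P(f) for every assessed gamble f, and each such constraint is linear in the single parameter t, so feasibility reduces to intersecting intervals. The following sketch (function name and example assessments are ours, not from the paper) illustrates this; the general finite case is the linear-programming picture underlying the procedures above.

```python
def avoids_sure_loss_2(gambles, lower_prev):
    """Check 'avoiding sure loss' for a lower prevision on a two-element
    possibility space.  Each gamble is a pair (f(w1), f(w2)); the
    constraint t*f(w1) + (1-t)*f(w2) >= P(f) carves an interval out of
    [0, 1], and P avoids sure loss iff the intersection is nonempty."""
    lo, hi = 0.0, 1.0
    for (f1, f2), p in zip(gambles, lower_prev):
        # t*f1 + (1-t)*f2 >= p  <=>  t*(f1 - f2) >= p - f2
        a, c = f1 - f2, p - f2
        if a > 0:
            lo = max(lo, c / a)
        elif a < 0:
            hi = min(hi, c / a)
        elif c > 0:
            return False  # constant gamble already violated
    return lo <= hi

# Lower probabilities 0.4 on an event and its complement: consistent.
ok = avoids_sure_loss_2([(1, 0), (0, 1)], [0.4, 0.4])
# Lower probabilities 0.6 on both: buying both gambles at these prices
# loses money in every outcome, i.e. incurs sure loss.
bad = avoids_sure_loss_2([(1, 0), (0, 1)], [0.6, 0.6])
```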