
    Rebuilding convex sets in graphs

    Get PDF
    The usual distance between pairs of vertices in a graph naturally gives rise to the notion of an interval between a pair of vertices. This in turn allows us to extend the notions of convex sets, convex hull, and extreme points in Euclidean space to the vertex set of a graph. The extreme vertices of a graph are known to be precisely the simplicial vertices, i.e., the vertices whose neighborhoods are complete graphs. It is known that the class of graphs with the Minkowski–Krein–Milman property, i.e., the property that every convex set is the convex hull of its extreme points, is precisely the class of chordal graphs without induced 3-fans. We define a vertex to be a contour vertex if the eccentricity of every neighbor is at most that of the vertex itself. In this paper we show that every convex set of vertices in a graph is the convex hull of its contour vertices. We characterize those graphs in which, for every convex set, the contour vertices coincide with the extreme points. A set of vertices in a graph is a geodetic set if the union of the intervals between all pairs of vertices in the set is the entire vertex set. We show that the contour vertices of a distance-hereditary graph form a geodetic set.
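
    The definitions above are concrete enough to compute directly. Below is a minimal sketch (not code from the paper; the adjacency-dict representation, function names, and example graph are illustrative assumptions) that finds the contour vertices and the simplicial vertices of a small undirected graph using BFS eccentricities.

```python
# Illustrative sketch: contour vertices are those whose eccentricity is at
# least as large as every neighbor's; simplicial vertices are those whose
# neighborhood induces a complete graph. Connected graph assumed.
from collections import deque

def eccentricities(adj):
    """Eccentricity of each vertex via one BFS per vertex."""
    ecc = {}
    for source in adj:
        dist = {source: 0}
        queue = deque([source])
        while queue:
            u = queue.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    queue.append(v)
        ecc[source] = max(dist.values())
    return ecc

def contour_vertices(adj):
    ecc = eccentricities(adj)
    return {v for v in adj if all(ecc[w] <= ecc[v] for w in adj[v])}

def simplicial_vertices(adj):
    # Every two distinct neighbors of v must be adjacent.
    return {v for v in adj
            if all(b in adj[a] for a in adj[v] for b in adj[v] if a != b)}

if __name__ == "__main__":
    # Example: a path 0-1-2-3 with a triangle 3-4-5 attached at vertex 3.
    graph = {0: {1}, 1: {0, 2}, 2: {1, 3}, 3: {2, 4, 5}, 4: {3, 5}, 5: {3, 4}}
    print("contour:   ", sorted(contour_vertices(graph)))
    print("simplicial:", sorted(simplicial_vertices(graph)))
```

    On this small example the two sets happen to coincide; in general the contour can be strictly larger than the set of simplicial (extreme) vertices, which is the gap the paper's characterization addresses.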

    On geodesic and monophonic convexity

    Get PDF
    In this paper we deal with two types of graph convexities, which are the most natural path convexities in a graph and which are defined by a system P of paths in a connected graph G: the geodesic convexity (also called metric convexity), which arises when we consider shortest paths, and the monophonic convexity (also called minimal-path convexity), which arises when we consider chordless paths. First, we present a realization theorem proving that there is no general relationship between monophonic and geodetic hull sets. Second, we study the contour of a graph, showing that the contour must be monophonic. Finally, we consider the so-called edge Steiner sets. We prove that every edge Steiner set is edge monophonic. (Funding: Ministerio de Ciencia y Tecnología, Fondo Europeo de Desarrollo Regional, Generalitat de Catalunya.)
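
    To make the two path systems concrete, here is a hedged brute-force sketch (not taken from the paper; the function names and example graph are illustrative assumptions) that computes the geodesic interval between two vertices from BFS distances and the monophonic interval by enumerating chordless paths. The enumeration is exponential, so it is only meant for tiny graphs.

```python
# Geodesic interval I[u, v]: vertices lying on some shortest u-v path.
# Monophonic interval J[u, v]: vertices lying on some chordless (induced) u-v path.
from collections import deque

def bfs_dist(adj, source):
    dist = {source: 0}
    queue = deque([source])
    while queue:
        x = queue.popleft()
        for y in adj[x]:
            if y not in dist:
                dist[y] = dist[x] + 1
                queue.append(y)
    return dist

def geodesic_interval(adj, u, v):
    du, dv = bfs_dist(adj, u), bfs_dist(adj, v)
    return {w for w in adj if du[w] + dv[w] == du[v]}

def monophonic_interval(adj, u, v):
    """Union of vertex sets of all chordless u-v paths (exhaustive search)."""
    result = set()

    def is_chordless(path):
        # No two non-consecutive vertices of the path may be adjacent.
        return all(b not in adj[a]
                   for i, a in enumerate(path) for b in path[i + 2:])

    def extend(path):
        last = path[-1]
        if last == v:
            if is_chordless(path):
                result.update(path)
            return
        for w in adj[last]:
            if w not in path:
                extend(path + [w])

    extend([u])
    return result

if __name__ == "__main__":
    # Example: the 5-cycle 0-1-2-3-4-0 with one chord 1-4.
    g = {0: {1, 4}, 1: {0, 2, 4}, 2: {1, 3}, 3: {2, 4}, 4: {0, 1, 3}}
    print("I[0, 3] =", sorted(geodesic_interval(g, 0, 3)))
    print("J[0, 3] =", sorted(monophonic_interval(g, 0, 3)))
```

    On this example the geodesic interval is a proper subset of the monophonic one, which reflects the general containment: every shortest path is chordless, so geodesic convexity is never coarser than monophonic convexity.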

    A Unified View of Piecewise Linear Neural Network Verification

    Full text link
    The success of Deep Learning and its potential use in many safety-critical applications has motivated research on formal verification of Neural Network (NN) models. Despite the reputation of learned NN models for behaving as black boxes, and the theoretical hardness of proving their properties, researchers have been successful in verifying some classes of models by exploiting their piecewise linear structure and taking insights from formal methods such as Satisfiability Modulo Theories. These methods are however still far from scaling to realistic neural networks. To facilitate progress in this crucial area, we make two key contributions. First, we present a unified framework that encompasses previous methods. This analysis leads to the identification of new methods that combine the strengths of multiple existing approaches, achieving a speedup of two orders of magnitude compared to the previous state of the art. Second, we propose a new dataset of benchmarks that includes a collection of previously released test cases. We use the benchmarks to provide the first experimental comparison of existing algorithms and to identify the factors impacting the hardness of verification problems. Comment: Updated version of "Piecewise Linear Neural Network verification: A comparative study".
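
    As an illustration of the kind of piecewise-linear reasoning such verifiers build on, the sketch below (a simplified assumption-laden example, not one of the paper's algorithms) propagates interval bounds through a small ReLU network over a box of inputs. Incomplete bounds like these are typically refined by branch-and-bound or exact MILP/SMT encodings of the ReLU pieces in complete verifiers.

```python
# Interval bound propagation through a ReLU network: a cheap, sound but
# incomplete way to bound the network output over a box of inputs.
import numpy as np

def interval_affine(lower, upper, W, b):
    """Propagate a box through y = W x + b using interval arithmetic."""
    center = (upper + lower) / 2.0
    radius = (upper - lower) / 2.0
    out_center = W @ center + b
    out_radius = np.abs(W) @ radius
    return out_center - out_radius, out_center + out_radius

def interval_relu_net(lower, upper, layers):
    """layers: list of (W, b) pairs; ReLU after every layer except the last."""
    for i, (W, b) in enumerate(layers):
        lower, upper = interval_affine(lower, upper, W, b)
        if i < len(layers) - 1:
            lower, upper = np.maximum(lower, 0.0), np.maximum(upper, 0.0)
    return lower, upper

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    layers = [(rng.standard_normal((4, 2)), rng.standard_normal(4)),
              (rng.standard_normal((1, 4)), rng.standard_normal(1))]
    lo, hi = interval_relu_net(np.array([-0.1, -0.1]), np.array([0.1, 0.1]), layers)
    # Property "output < 2 for every input in the box" is proved if hi < 2;
    # otherwise the bound is inconclusive and a complete method must branch.
    print("output bounds:", lo, hi, "verified:", bool(hi[0] < 2.0))
```

    When the bound alone is inconclusive, complete methods split the input region or the ReLU activation cases, which is where the branch-and-bound view of verification comes in.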

    Fast Hierarchical Clustering and Other Applications of Dynamic Closest Pairs

    Full text link
    We develop data structures for dynamic closest-pair problems with arbitrary distance functions that do not necessarily come from any geometric structure on the objects. Based on a technique previously used by the author for Euclidean closest pairs, we show how to insert and delete objects from an n-object set, maintaining the closest pair, in O(n log^2 n) time per update and O(n) space. With quadratic space, we can instead use a quadtree-like structure to achieve an optimal time bound, O(n) per update. We apply these data structures to hierarchical clustering, greedy matching, and TSP heuristics, and discuss other potential applications in machine learning, Groebner bases, and local improvement algorithms for partition and placement problems. Experiments show our new methods to be faster in practice than previously used heuristics. Comment: 20 pages, 9 figures. A preliminary version of this paper appeared at the 9th ACM-SIAM Symp. on Discrete Algorithms, San Francisco, 1998, pp. 619-628. For source code and experimental results, see http://www.ics.uci.edu/~eppstein/projects/pairs
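
    To show where a dynamic closest-pair structure plugs in, here is a naive agglomerative-clustering sketch (illustrative only; the function names and single-linkage choice are assumptions, and it uses the brute-force closest-pair step that the paper's data structures are designed to replace).

```python
# Greedy agglomerative clustering driven by repeated closest-pair queries over
# an arbitrary symmetric dissimilarity. Each merge deletes two objects and
# inserts one, exactly the insert/delete pattern a dynamic closest-pair
# structure accelerates relative to this O(n^3) baseline.
def agglomerate(items, dist, target_clusters=1):
    clusters = [frozenset([x]) for x in items]

    def cluster_dist(a, b):
        # Single linkage: distance between closest members.
        return min(dist(x, y) for x in a for y in b)

    merges = []
    while len(clusters) > target_clusters:
        # Closest-pair query: the step a dynamic closest-pair structure replaces.
        i, j = min(((i, j) for i in range(len(clusters))
                    for j in range(i + 1, len(clusters))),
                   key=lambda ij: cluster_dist(clusters[ij[0]], clusters[ij[1]]))
        merged = clusters[i] | clusters[j]
        merges.append((clusters[i], clusters[j]))
        clusters = [c for k, c in enumerate(clusters) if k not in (i, j)] + [merged]
    return clusters, merges

if __name__ == "__main__":
    points = [0.0, 0.1, 0.2, 5.0, 5.1, 9.0]
    clusters, _ = agglomerate(points, dist=lambda a, b: abs(a - b), target_clusters=3)
    print(sorted(sorted(c) for c in clusters))
```

    Each iteration removes two clusters and inserts their union, so maintaining the closest pair under insertions and deletions is exactly the dynamic problem the abstract describes.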