136 research outputs found

    A minor-monotone graph parameter based on oriented matroids

    For an undirected graph G = (V, E), let λ′(G) be the largest d for which there exists an oriented matroid M on V of corank d such that for each nonzero vector (x⁺, x⁻) of M, x⁺ is nonempty and induces a connected subgraph of G. We show that λ′(G) is monotone under taking minors and clique sums. Moreover, we show that λ′(G) ⩽ 3 if and only if G has no K₅- or V₈-minor; that is, if and only if G arises from planar graphs by taking clique sums and subgraphs.
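    As a rough illustration of the connectivity condition in this definition, the Python sketch below checks whether the positive part of a candidate sign vector is nonempty and induces a connected subgraph of G; representing the positive part as a plain vertex set, and the function name, are illustrative assumptions rather than the paper's construction.

        # A minimal sketch of the connectivity condition in the definition
        # of λ′(G): the positive part x⁺ of a sign vector must be nonempty
        # and induce a connected subgraph of G.  Representing x⁺ as a set
        # of vertices is an illustrative choice, not the paper's encoding
        # of oriented-matroid sign vectors.
        import networkx as nx

        def positive_part_induces_connected(G: nx.Graph, x_plus: set) -> bool:
            """True if x_plus is nonempty and G[x_plus] is connected."""
            return bool(x_plus) and nx.is_connected(G.subgraph(x_plus))

        # On the 4-cycle 0-1-2-3-0, {0, 1} induces an edge (connected),
        # while {0, 2} induces two isolated vertices (not connected).
        G = nx.cycle_graph(4)
        print(positive_part_induces_connected(G, {0, 1}))   # True
        print(positive_part_induces_connected(G, {0, 2}))   # False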

    Tangle-tree duality: in graphs, matroids and beyond

    We apply a recent duality theorem for tangles in abstract separation systems to derive tangle-type duality theorems for width parameters in graphs and matroids. We further derive a duality theorem for the existence of clusters in large data sets. Our applications to graphs include new, tangle-type duality theorems for tree-width, path-width, and tree-decompositions of small adhesion. Conversely, we show that carving width is dual to edge-tangles. For matroids we obtain a duality theorem for tree-width. Our results can be used to derive short proofs of all the classical duality theorems for width parameters in graph minor theory, such as path-width, tree-width, branch-width and rank-width.
    Comment: arXiv admin note: text overlap with arXiv:1406.379
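    For a concrete handle on one of the width parameters involved, the Python sketch below computes a heuristic tree decomposition of a small grid with networkx; it only shows the decomposition side of the duality, not the dual tangle or bramble witness, and the choice of graph is an arbitrary assumption for illustration.

        # A heuristic tree decomposition of the 3x3 grid, computed with
        # networkx.  treewidth_min_degree returns an upper bound on the
        # tree-width together with a decomposition tree whose nodes are
        # bags of vertices.  The dual certificate for the matching lower
        # bound would be a tangle/bramble of high order (not computed here).
        import networkx as nx
        from networkx.algorithms.approximation import treewidth_min_degree

        G = nx.grid_2d_graph(3, 3)        # the exact tree-width of this grid is 3
        width, decomposition = treewidth_min_degree(G)
        print("heuristic upper bound on tree-width:", width)
        for bag in decomposition.nodes:
            print("bag:", sorted(bag))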

    Combinatorial geometry of neural codes, neural data analysis, and neural networks

    This dissertation explores applications of discrete geometry in mathematical neuroscience. We begin with convex neural codes, which model the activity of hippocampal place cells and other neurons with convex receptive fields. In Chapter 4, we introduce order-forcing, a tool for constraining convex realizations of codes, and use it to construct new examples of non-convex codes with no local obstructions. In Chapter 5, we relate oriented matroids to convex neural codes, showing that a code has a realization with convex polytopes if and only if it is the image of a representable oriented matroid under a neural code morphism. We also show that determining whether a code is convex is at least as difficult as determining whether an oriented matroid is representable, implying that deciding convexity of a code is NP-hard. Next, we turn to the underlying rank of a matrix, motivated by the task of determining the dimensionality of (neural) data that has been corrupted by an unknown monotone transformation. In Chapter 6, we introduce two tools for computing underlying rank, the minimal nodes and the Radon rank, and apply them to analyze calcium imaging data from a larval zebrafish. In Chapter 7, we explore underlying rank in more detail, establish connections to oriented matroid theory, and show that computing underlying rank is also NP-hard. Finally, we study the dynamics of threshold-linear networks (TLNs), a simple model of the activity of neural circuits. In Chapter 9, we describe the nullcline arrangement of a threshold-linear network and show that a subset of its chambers forms an attracting set. In Chapter 10, we focus on combinatorial threshold-linear networks (CTLNs), which are TLNs defined from a directed graph. We prove that if the graph of a CTLN is a directed acyclic graph, then all trajectories of the CTLN approach a fixed point.
    Comment: 193 pages, 69 figures
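    As a small illustration of the setting of the last two chapters, the Python sketch below integrates threshold-linear dynamics dx/dt = −x + [Wx + b]₊ with a weight matrix built from a directed acyclic graph in a CTLN-like way; the parameter values and the construction of W are illustrative assumptions, not the dissertation's exact definitions, but on a DAG the trajectory settles at a fixed point, consistent with the abstract's final result.

        # A minimal sketch of threshold-linear network (TLN) dynamics,
        #     dx/dt = -x + [W x + b]_+ ,
        # integrated with forward Euler.  W is built from a small directed
        # acyclic graph in a CTLN-like way; eps, delta, theta and this
        # construction are illustrative assumptions, not the dissertation's
        # exact definitions.
        import numpy as np

        def tln_final_state(W, b, x0, dt=0.01, steps=5000):
            """Integrate dx/dt = -x + [W x + b]_+ and return the final state."""
            x = np.array(x0, dtype=float)
            for _ in range(steps):
                x += dt * (-x + np.maximum(W @ x + b, 0.0))
            return x

        # Directed acyclic graph on 3 neurons: 0 -> 1 -> 2.
        edges = {(0, 1), (1, 2)}
        n, eps, delta, theta = 3, 0.25, 0.5, 1.0
        W = np.zeros((n, n))
        for i in range(n):
            for j in range(n):
                if i != j:
                    W[i, j] = -1 + eps if (j, i) in edges else -1 - delta
        b = theta * np.ones(n)

        x_star = tln_final_state(W, b, x0=np.full(n, 0.1))
        print("approximate fixed point:", np.round(x_star, 3))   # ≈ (0, 0, 1) here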

    Finite Volume Spaces and Sparsification

    We introduce and study finite d-volumes, the high-dimensional generalization of finite metric spaces. Having developed a suitable combinatorial machinery, we define ℓ₁-volumes and show that they contain Euclidean volumes and hypertree volumes. We show that they can approximate any d-volume with O(n^d) multiplicative distortion. On the other hand, contrary to Bourgain's theorem for d = 1, there exists a 2-volume on n vertices that cannot be approximated by any ℓ₁-volume with distortion smaller than Ω̃(n^{1/5}). We further address the problem of ℓ₁-dimension reduction in the context of ℓ₁-volumes, and show that this phenomenon does occur, although not to the same striking degree as it does for Euclidean metrics and volumes. In particular, we show that any ℓ₁ metric on n points can be (1 + ε)-approximated by a sum of O(n/ε²) cut metrics, improving over the best previously known bound of O(n log n) due to Schechtman. In order to deal with dimension reduction, we extend the techniques and ideas introduced by Karger and Benczúr, and by Spielman et al. in the context of graph sparsification, and develop general methods with a wide range of applications.
    Comment: previous revision was the wrong file; the new revision changed (extended considerably) the treatment of finite volumes (see revised abstract) and inserted new applications for the sparsification technique
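    To make the cut-metric terminology concrete, the Python sketch below builds a weighted sum of cut semimetrics on four points and checks that it coincides with the ℓ₁ metric of the corresponding weighted-indicator embedding; the point set, cuts, and weights are arbitrary illustrative choices, not data from the paper.

        # A cut semimetric on V is delta_S(u, v) = |1_S(u) - 1_S(v)| for some
        # S ⊆ V, and a nonnegative weighted sum of cut semimetrics is exactly
        # an l1 metric (each cut contributes one coordinate).  The cuts and
        # weights below are arbitrary illustrative choices.
        import numpy as np
        from itertools import combinations

        def cut_semimetric(n, S):
            """n x n matrix of delta_S over the points 0, ..., n-1."""
            ind = np.array([1.0 if i in S else 0.0 for i in range(n)])
            return np.abs(ind[:, None] - ind[None, :])

        n = 4
        cuts = [({0}, 2.0), ({0, 1}, 1.0), ({0, 1, 2}, 0.5)]   # pairs (S, weight)
        D = sum(w * cut_semimetric(n, S) for S, w in cuts)

        # The same distances arise as l1 distances of the embedding sending
        # point i to the weighted indicator vector (w * 1_S(i)) over the cuts.
        emb = np.array([[w * (1.0 if i in S else 0.0) for S, w in cuts] for i in range(n)])
        for i, j in combinations(range(n), 2):
            assert np.isclose(D[i, j], np.abs(emb[i] - emb[j]).sum())
        print(np.round(D, 2))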