136 research outputs found

    An SDP-Based Algorithm for Linear-Sized Spectral Sparsification

    For any undirected and weighted graph G=(V,E,w) with n vertices and m edges, we call a sparse subgraph H of G, with proper reweighting of the edges, a (1+ε)-spectral sparsifier if (1−ε) x⊺L_G x ≤ x⊺L_H x ≤ (1+ε) x⊺L_G x holds for any x ∈ ℝ^n, where L_G and L_H are the respective Laplacian matrices of G and H. Noticing that Ω(m) time is needed for any algorithm to construct a spectral sparsifier, and that a spectral sparsifier of G requires Ω(n) edges, a natural question is to investigate, for any constant ε, whether a (1+ε)-spectral sparsifier of G with O(n) edges can be constructed in Õ(m) time, where the Õ notation suppresses polylogarithmic factors. All previous constructions of spectral sparsifiers require either a super-linear number of edges or m^{1+Ω(1)} time. In this work we answer this question affirmatively by presenting an algorithm that, for any undirected graph G and ε > 0, outputs a (1+ε)-spectral sparsifier of G with O(n/ε²) edges in Õ(m/ε^{O(1)}) time. Our algorithm is based on three novel techniques: (1) a new potential function which is much easier to compute yet has similar guarantees to the potential functions used in previous references; (2) an efficient reduction from a two-sided spectral sparsifier to a one-sided spectral sparsifier; (3) constructing a one-sided spectral sparsifier by a semi-definite program. Comment: To appear at STOC'17.
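
    The inequality above can be checked directly on small graphs. The sketch below is only a check of the definition, not the paper's SDP-based construction; the function names (`laplacian`, `is_spectral_sparsifier`) and the dense-matrix approach are illustrative assumptions, and it assumes G is connected so that the only shared null space of L_G and L_H is the all-ones vector.

```python
# Minimal sketch: verify the (1+eps)-spectral-sparsifier condition on small,
# dense examples. This is NOT the paper's algorithm, only a definition check
# via the eigenvalues of pinv(L_G)^{1/2} L_H pinv(L_G)^{1/2}.
import numpy as np

def laplacian(n, edges):
    """Weighted graph Laplacian from (u, v, w) triples on vertices 0..n-1."""
    L = np.zeros((n, n))
    for u, v, w in edges:
        L[u, u] += w; L[v, v] += w
        L[u, v] -= w; L[v, u] -= w
    return L

def is_spectral_sparsifier(L_G, L_H, eps, tol=1e-9):
    """True iff (1-eps) x^T L_G x <= x^T L_H x <= (1+eps) x^T L_G x for all x."""
    P = np.linalg.pinv(L_G)
    w, U = np.linalg.eigh(P)                       # symmetric square root of pinv(L_G)
    S = U @ np.diag(np.sqrt(np.clip(w, 0, None))) @ U.T
    evals = np.linalg.eigvalsh(S @ L_H @ S)
    evals = evals[np.abs(evals) > tol]             # drop the shared all-ones null space
    return bool(np.all(evals >= 1 - eps - tol) and np.all(evals <= 1 + eps + tol))
```

    For instance, a unit-weight triangle compared against the same triangle with every weight scaled by 1.05 passes this check with ε = 0.05, since the relative quadratic forms differ by exactly a factor of 1.05.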

    Augmenting the algebraic connectivity of graphs

    For any undirected graph G=(V,E) and a set E_W of candidate edges with E∩E_W=∅, the (k,γ)-spectral augmentability problem is to find a set F of k edges from E_W, with appropriate weighting, such that the algebraic connectivity of the resulting graph H=(V,E∪F) is at least γ. Because of a tight connection between the algebraic connectivity and many other graph parameters, including the graph's conductance and the mixing time of random walks in a graph, maximising the resulting graph's algebraic connectivity by adding a small number of edges has been studied over the past 15 years. In this work we present an approximate and efficient algorithm for the (k,γ)-spectral augmentability problem, and our algorithm runs in almost-linear time under a wide regime of parameters. Our main algorithm is based on the following two novel techniques developed in the paper, which might have applications beyond the (k,γ)-spectral augmentability problem. (1) We present a fast algorithm for solving a feasibility version of an SDP for the algebraic connectivity maximisation problem from [GB06]. Our algorithm is based on the classic primal-dual framework for solving SDPs, which in turn uses the multiplicative weight update algorithm. We present a novel approach to unifying SDP constraints over different matrix and vector variables, and give a suitable separation oracle accordingly. (2) We present an efficient algorithm for the subgraph sparsification problem; for a wide range of parameters our algorithm runs in almost-linear time, in contrast to the previously best known algorithm, which requires at least Ω(n²mk) time [KMST10]. Our analysis shows how the randomised BSS framework can be generalised to the setting of subgraph sparsification, and how potential functions can be used to approximately keep track of different subspaces.
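
    For intuition, the brute-force baseline below scores each candidate edge by the algebraic connectivity (the second-smallest Laplacian eigenvalue) that adding it would yield and greedily picks k edges. It is a hypothetical illustration with a cubic-time eigensolve per candidate, not the almost-linear-time SDP-based algorithm of the paper; the names `algebraic_connectivity`, `add_edge` and `greedy_augment` are assumptions.

```python
# Greedy brute-force baseline for edge augmentation (illustrative only):
# repeatedly add the unit-weight candidate edge that most increases lambda_2.
import numpy as np

def algebraic_connectivity(L):
    """Second-smallest eigenvalue of a dense Laplacian matrix."""
    return np.sort(np.linalg.eigvalsh(L))[1]

def add_edge(L, u, v, w=1.0):
    """Return a copy of L with an edge (u, v) of weight w added."""
    L = L.copy()
    L[u, u] += w; L[v, v] += w
    L[u, v] -= w; L[v, u] -= w
    return L

def greedy_augment(L, candidates, k):
    """Pick k edges from `candidates` (pairs (u, v)), one at a time."""
    chosen = []
    for _ in range(k):
        remaining = [e for e in candidates if e not in chosen]
        if not remaining:
            break
        best = max(remaining, key=lambda e: algebraic_connectivity(add_edge(L, *e)))
        L = add_edge(L, *best)
        chosen.append(best)
    return chosen, algebraic_connectivity(L)
```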

    Oracle-Based Primal-Dual Algorithms for Packing and Covering Semidefinite Programs

    Packing and covering semidefinite programs (SDPs) appear in natural relaxations of many combinatorial optimization problems, as well as in a number of other applications. Recently, several techniques that exploit the particular structure of this class of problems have been proposed to obtain more efficient algorithms than those offered by general SDP solvers. For certain applications, such as those described in this paper, it may be required to deal with SDPs with exponentially or infinitely many constraints, which are accessible only via an oracle. In this paper, we give an efficient primal-dual algorithm to solve the problem in this case, which is an extension of a logarithmic-potential based algorithm of Grigoriadis, Khachiyan, Porkolab and Villavicencio (SIAM Journal on Optimization 41 (2001)) for packing/covering linear programs.
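
    As a rough illustration of the oracle-based primal-dual idea (written here in a matrix multiplicative-weights style rather than the paper's specific logarithmic-potential method), the sketch below maintains a density matrix and queries a user-supplied oracle for a violated packing constraint; the callback name `find_violation`, the step size and the iteration count are all assumptions.

```python
# Schematic oracle-based primal-dual loop for a packing SDP over density
# matrices (a sketch of the general flavour, not the paper's algorithm).
import numpy as np
from scipy.linalg import expm

def oracle_primal_dual(n, find_violation, eta=0.1, iters=200):
    """find_violation(rho) returns a symmetric A with tr(A @ rho) too large,
    or None once rho is (approximately) feasible."""
    total = np.zeros((n, n))              # running sum of violated constraints
    rho = np.eye(n) / n                   # start from the uniform density matrix
    for _ in range(iters):
        A = find_violation(rho)
        if A is None:                     # oracle certifies approximate feasibility
            return rho
        total += A
        M = expm(-eta * total)            # matrix multiplicative-weights update
        rho = M / np.trace(M)
    return rho
```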

    Similarity-Aware Spectral Sparsification by Edge Filtering

    In recent years, spectral graph sparsification techniques that can compute ultra-sparse graph proxies have been extensively studied for accelerating various numerical and graph-related applications. Prior nearly-linear-time spectral sparsification methods first extract a low-stretch spanning tree from the original graph to form the backbone of the sparsifier, and then recover a small portion of spectrally-critical off-tree edges back into the spanning tree to significantly improve the approximation quality. However, it is not clear how many off-tree edges should be recovered to achieve a desired spectral similarity level within the sparsifier. Motivated by recent graph signal processing techniques, this paper proposes a similarity-aware spectral graph sparsification framework that leverages efficient spectral off-tree edge embedding and filtering schemes to construct spectral sparsifiers with a guaranteed spectral similarity (relative condition number) level. An iterative graph densification scheme is introduced to facilitate efficient and effective filtering of off-tree edges for highly ill-conditioned problems. The proposed method has been validated on various kinds of graphs obtained from public-domain sparse matrix collections relevant to VLSI CAD and finite element analysis, as well as on social and data networks frequently studied in machine learning and data mining applications.
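
    A toy version of the tree-plus-off-tree-edge idea is sketched below: start from a given spanning tree, score each off-tree edge by weight times effective resistance (computed exactly here through the pseudo-inverse), and keep only the top-scoring ones. The scoring, the function name `filter_off_tree_edges` and the absence of reweighting are simplifications assumed for illustration, not a faithful implementation of the paper's embedding-and-filtering scheme.

```python
# Toy similarity-aware-style filter: keep a spanning tree plus the off-tree
# edges with the largest weight * effective-resistance scores.
import numpy as np

def filter_off_tree_edges(n, edges, tree_edges, keep):
    """edges/tree_edges: lists of (u, v, w) triples; keep: # off-tree edges kept."""
    L = np.zeros((n, n))
    for u, v, w in edges:
        L[u, u] += w; L[v, v] += w
        L[u, v] -= w; L[v, u] -= w
    P = np.linalg.pinv(L)                     # exact; the paper uses fast approximations
    eff_res = lambda u, v: P[u, u] + P[v, v] - 2 * P[u, v]
    tree_set = {frozenset((u, v)) for u, v, _ in tree_edges}
    off_tree = [e for e in edges if frozenset((e[0], e[1])) not in tree_set]
    off_tree.sort(key=lambda e: e[2] * eff_res(e[0], e[1]), reverse=True)
    return list(tree_edges) + off_tree[:keep]  # spectrally-critical edges first
```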

    Spectral Sparsification and Regret Minimization Beyond Matrix Multiplicative Updates

    In this paper, we provide a novel construction of the linear-sized spectral sparsifiers of Batson, Spielman and Srivastava [BSS14]. While previous constructions required Ω(n⁴) running time [BSS14, Zou12], our sparsification routine can be implemented in almost-quadratic running time O(n^{2+ε}). The fundamental conceptual novelty of our work is the leveraging of a strong connection between sparsification and a regret minimization problem over density matrices. This connection was known to provide an interpretation of the randomized sparsifiers of Spielman and Srivastava [SS11] via the application of matrix multiplicative weight updates (MWU) [CHS11, Vis14]. In this paper, we explain how matrix MWU naturally arises as an instance of the Follow-the-Regularized-Leader framework and generalize this approach to yield a larger class of updates. This new class allows us to accelerate the construction of linear-sized spectral sparsifiers, and give novel insights on the motivation behind Batson, Spielman and Srivastava [BSS14].
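
    For context, the randomized construction of Spielman and Srivastava [SS11] mentioned above can be sketched in a few lines: sample edges with probability proportional to weight times effective resistance and reweight by importance. The sketch below (with the assumed name `ss_sample_sparsify` and exact pseudo-inverse-based resistances) illustrates that randomized baseline, not the almost-quadratic-time deterministic construction of this paper.

```python
# Sketch of effective-resistance sampling in the spirit of [SS11]; resistances
# are computed exactly via the pseudo-inverse, so this is illustrative, not fast.
import numpy as np

def ss_sample_sparsify(n, edges, q, seed=None):
    """Sample q edges with prob. ~ w * R_eff and reweight by 1/(q * prob)."""
    rng = np.random.default_rng(seed)
    L = np.zeros((n, n))
    for u, v, w in edges:
        L[u, u] += w; L[v, v] += w
        L[u, v] -= w; L[v, u] -= w
    P = np.linalg.pinv(L)
    scores = np.array([w * (P[u, u] + P[v, v] - 2 * P[u, v]) for u, v, w in edges])
    probs = scores / scores.sum()
    picks = rng.choice(len(edges), size=q, p=probs)
    H = {}
    for i in picks:                        # importance-reweight repeated samples
        u, v, w = edges[i]
        H[(u, v)] = H.get((u, v), 0.0) + w / (q * probs[i])
    return [(u, v, wt) for (u, v), wt in H.items()]
```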