
    New graph invariants based on p-Laplacian eigenvalues

    We present monotonicity inequalities for certain functions involving eigenvalues of $p$-Laplacians on signed graphs with respect to $p$. Inspired by such monotonicity, we propose new spectrum-based graph invariants, called (variational) cut-off adjacency eigenvalues, that are relevant to a certain eigenvector-dependent nonlinear eigenvalue problem. Using these invariants, we obtain new lower bounds for the $p$-Laplacian variational eigenvalues, essentially giving the state-of-the-art spectral asymptotics for these eigenvalues. Moreover, based on such invariants, we establish two inertia bounds regarding the cardinalities of a maximum independent set and a minimum edge cover, respectively. The first inertia bound enhances the classical Cvetković bound, and the second one implies that the $k$-th $p$-Laplacian variational eigenvalue is of the order $2^p$ as $p$ tends to infinity whenever $k$ is larger than the cardinality of a minimum edge cover of the underlying graph. We further discover an interesting connection between graph $p$-Laplacian eigenvalues and tensor eigenvalues and discuss applications of our invariants to spectral problems of tensors. Comment: 30 pages
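
    For reference, the classical Cvetković inertia bound that the abstract says is enhanced reads $\alpha(G) \le n - \max(n_+, n_-)$, where $n_+$ and $n_-$ count the positive and negative adjacency eigenvalues. Below is a minimal numpy sketch of this $p = 2$ baseline only; the paper's cut-off adjacency eigenvalues and $p$-Laplacian invariants are not reproduced here.

        import numpy as np

        def cvetkovic_bound(A, tol=1e-9):
            # alpha(G) <= n - max(#positive, #negative) adjacency eigenvalues
            eigs = np.linalg.eigvalsh(A)
            n_plus = int(np.sum(eigs > tol))
            n_minus = int(np.sum(eigs < -tol))
            return A.shape[0] - max(n_plus, n_minus)

        # 5-cycle: adjacency eigenvalues are 2, 0.618 (x2), -1.618 (x2),
        # so the bound is 5 - max(3, 2) = 2, matching alpha(C_5) = 2.
        n = 5
        A = np.zeros((n, n))
        for i in range(n):
            A[i, (i + 1) % n] = A[(i + 1) % n, i] = 1.0
        print(cvetkovic_bound(A))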

    Probabilistic Spectral Sparsification In Sublinear Time

    In this paper, we introduce a variant of spectral sparsification, called probabilistic $(\varepsilon,\delta)$-spectral sparsification. Roughly speaking, it preserves the cut value of any cut $(S,S^{c})$ with a $1\pm\varepsilon$ multiplicative error and a $\delta\left|S\right|$ additive error. We show how to produce a probabilistic $(\varepsilon,\delta)$-spectral sparsifier with $O(n\log n/\varepsilon^{2})$ edges in $\tilde{O}(n/\varepsilon^{2}\delta)$ time for unweighted undirected graphs. This gives the fastest known sub-linear time algorithms for several cut problems on unweighted undirected graphs, such as:
    - an $\tilde{O}(n/\mathrm{OPT}+n^{3/2+t})$-time $O(\sqrt{\log n/t})$-approximation algorithm for the sparsest cut problem and the balanced separator problem;
    - an $n^{1+o(1)}/\varepsilon^{4}$-time approximate minimum s-t cut algorithm with an $\varepsilon n$ additive error.
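
    As context for the quantity being preserved: for an unweighted graph with Laplacian $L$ and the 0/1 indicator vector $x$ of a vertex set $S$, the quadratic form $x^{T}Lx$ equals the cut value $w(S,S^{c})$. The numpy sketch below only checks this identity on a random graph; it is not the sublinear-time sparsifier construction of the paper.

        import numpy as np

        rng = np.random.default_rng(0)
        n, p = 30, 0.2
        A = (rng.random((n, n)) < p).astype(float)
        A = np.triu(A, 1); A = A + A.T               # random undirected simple graph
        L = np.diag(A.sum(axis=1)) - A               # combinatorial Laplacian

        S = rng.random(n) < 0.5                      # random vertex subset
        x = S.astype(float)                          # indicator vector of S
        cut_value = A[np.ix_(S, ~S)].sum()           # edges crossing (S, S^c)
        print(cut_value, x @ L @ x)                  # the two numbers agree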

    Large-scale Binary Quadratic Optimization Using Semidefinite Relaxation and Applications

    In computer vision, many problems such as image segmentation, pixel labelling, and scene parsing can be formulated as binary quadratic programs (BQPs). For submodular problems, cut-based methods can be employed to solve large-scale instances efficiently. However, general non-submodular problems are significantly more challenging to solve. Finding a solution when the problem is of a size large enough to be of practical interest typically requires relaxation. Two standard relaxation methods are widely used for solving general BQPs: spectral methods and semidefinite programming (SDP), each with their own advantages and disadvantages. Spectral relaxation is simple and easy to implement, but its bound is loose. Semidefinite relaxation has a tighter bound, but its computational complexity is high, especially for large-scale problems. In this work, we present a new SDP formulation for BQPs with two desirable properties. First, it has a similar relaxation bound to conventional SDP formulations. Second, compared with conventional SDP methods, the new SDP formulation leads to a significantly more efficient and scalable dual optimization approach, which has the same degree of complexity as spectral methods. We then propose two solvers, namely quasi-Newton and smoothing-Newton methods, for the dual problem. Both of them are significantly more efficient than standard interior-point methods. In practice, the smoothing-Newton solver is faster than the quasi-Newton solver for dense or medium-sized problems, while the quasi-Newton solver is preferable for large sparse/structured problems. Our experiments on a few computer vision applications, including clustering, image segmentation, co-segmentation, and registration, show the potential of our SDP formulation for solving large-scale BQPs. Comment: Fixed some typos. 18 pages. Accepted to IEEE Transactions on Pattern Analysis and Machine Intelligence
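
    As a point of reference for the spectral-versus-SDP discussion, here is a minimal numpy sketch of the spectral relaxation baseline for a BQP of the form $\max_{x \in \{-1,+1\}^n} x^{T}Ax$: relax the binary constraint to $\|x\|^2 = n$, solve with the leading eigenvector, and round by taking signs. The paper's new SDP formulation and its quasi-Newton / smoothing-Newton dual solvers are not implemented here, and the random objective matrix is purely illustrative.

        import numpy as np

        rng = np.random.default_rng(1)
        n = 50
        A = rng.standard_normal((n, n)); A = (A + A.T) / 2    # symmetric objective matrix

        eigvals, eigvecs = np.linalg.eigh(A)
        v = eigvecs[:, -1]                                    # leading eigenvector
        x = np.sign(v); x[x == 0] = 1                         # round to a feasible binary point

        print("spectral relaxation bound:", n * eigvals[-1])  # upper bound on the BQP optimum
        print("rounded objective:", x @ A @ x)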

    Many Sparse Cuts via Higher Eigenvalues

    Cheeger's fundamental inequality states that any edge-weighted graph has a vertex subset $S$ such that its expansion (a.k.a. conductance) is bounded as follows: $\phi(S) := \frac{w(S,\bar{S})}{\min\{w(S), w(\bar{S})\}} \leq 2\sqrt{\lambda_2}$, where $w$ is the total edge weight of a subset or a cut and $\lambda_2$ is the second smallest eigenvalue of the normalized Laplacian of the graph. Here we prove the following natural generalization: for any integer $k \in [n]$, there exist $ck$ disjoint subsets $S_1, \ldots, S_{ck}$ such that $\max_i \phi(S_i) \leq C \sqrt{\lambda_{k} \log k}$, where $\lambda_i$ is the $i^{th}$ smallest eigenvalue of the normalized Laplacian and $c, C > 0$ are suitable absolute constants. Our proof is via a polynomial-time algorithm to find such subsets, consisting of a spectral projection and a randomized rounding. As a consequence, we get the same upper bound for the small-set expansion problem, namely for any $k$, there is a subset $S$ whose weight is at most an $O(1/k)$ fraction of the total weight and $\phi(S) \le C \sqrt{\lambda_k \log k}$. Both results are the best possible up to constant factors. The underlying algorithmic problem, namely finding $k$ subsets such that the maximum expansion is minimized, besides extending sparse cuts to more than one subset, appears to be a natural clustering problem in its own right.
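
    A toy illustration of the statement, assuming a synthetic graph with $k$ planted clusters (the graph model and parameters below are illustrative, not from the paper): when $\lambda_k$ of the normalized Laplacian is small, one can exhibit roughly $k$ disjoint subsets of small conductance $\phi(S)$. The planted clusters stand in for the subsets; the paper's spectral projection and randomized rounding are not implemented.

        import numpy as np

        rng = np.random.default_rng(2)
        k, m = 4, 20                                  # 4 planted clusters of 20 vertices
        n = k * m
        A = np.zeros((n, n))
        for c in range(k):                            # dense within clusters
            idx = np.arange(c * m, (c + 1) * m)
            A[np.ix_(idx, idx)] = rng.random((m, m)) < 0.5
        A[rng.random((n, n)) < 0.01] = 1              # sparse noise between clusters
        A = np.triu(A, 1); A = A + A.T

        d = A.sum(axis=1)
        L_norm = np.eye(n) - A / np.sqrt(np.outer(d, d))   # normalized Laplacian
        lam = np.sort(np.linalg.eigvalsh(L_norm))

        def conductance(A, S):
            cut = A[np.ix_(S, ~S)].sum()                   # weight crossing (S, S_bar)
            return cut / min(A[S].sum(), A[~S].sum())      # divide by the smaller volume

        phis = [conductance(A, np.isin(np.arange(n), np.arange(c * m, (c + 1) * m)))
                for c in range(k)]
        print("max_i phi(S_i):", max(phis))
        print("sqrt(lambda_k log k):", np.sqrt(lam[k - 1] * np.log(k)))  # bound up to the constant C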

    Improved Cheeger's Inequality: Analysis of Spectral Partitioning Algorithms through Higher Order Spectral Gap

    Let $\phi(G)$ be the minimum conductance of an undirected graph $G$, and let $0=\lambda_1 \le \lambda_2 \le \cdots \le \lambda_n \le 2$ be the eigenvalues of the normalized Laplacian matrix of $G$. We prove that for any graph $G$ and any $k \ge 2$, $\phi(G) = O(k)\,\lambda_2 / \sqrt{\lambda_k}$, and this performance guarantee is achieved by the spectral partitioning algorithm. This improves Cheeger's inequality, and the bound is optimal up to a constant factor for any $k$. Our result shows that the spectral partitioning algorithm is a constant-factor approximation algorithm for finding a sparse cut if $\lambda_k$ is a constant for some constant $k$. This provides some theoretical justification for its empirical performance in image segmentation and clustering problems. We extend the analysis to other graph partitioning problems, including multi-way partition, balanced separator, and maximum cut.
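
    The spectral partitioning algorithm whose analysis is improved here is the classical sweep cut: sort vertices by the second eigenvector of the normalized Laplacian and return the prefix cut of minimum conductance. A compact numpy sketch (dense adjacency matrix, no performance tuning) follows; call it as phi, S = sweep_cut(A) on any symmetric 0/1 adjacency matrix A.

        import numpy as np

        def sweep_cut(A):
            d = A.sum(axis=1)
            L_norm = np.eye(len(d)) - A / np.sqrt(np.outer(d, d))  # normalized Laplacian
            _, vecs = np.linalg.eigh(L_norm)
            order = np.argsort(vecs[:, 1] / np.sqrt(d))            # sweep along the 2nd eigenvector
            vol_total = d.sum()
            best_phi, best_prefix = np.inf, None
            for i in range(1, len(d)):                             # every nontrivial prefix cut
                S = np.zeros(len(d), dtype=bool); S[order[:i]] = True
                cut = A[np.ix_(S, ~S)].sum()
                vol_S = d[S].sum()
                phi = cut / min(vol_S, vol_total - vol_S)
                if phi < best_phi:
                    best_phi, best_prefix = phi, order[:i].copy()
            return best_phi, best_prefix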

    Linear orderings of random geometric graphs (extended abstract)

    In random geometric graphs, vertices are randomly distributed on $[0,1]^2$ and pairs of vertices are connected by edges whenever they are sufficiently close together. Layout problems seek a linear ordering of the vertices of a graph such that a certain measure is minimized. In this paper, we study several layout problems on random geometric graphs: Bandwidth, Minimum Linear Arrangement, Minimum Cut, Minimum Sum Cut, Vertex Separation, and Bisection. We first prove that some of these problems remain NP-complete even for geometric graphs. Afterwards, we compute lower bounds that hold with high probability on random geometric graphs. Finally, we characterize the probabilistic behavior of the lexicographic ordering for our layout problems on the class of random geometric graphs. Postprint (published version)
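
    A small numpy sketch of the setup, with an arbitrary illustrative radius rather than the regime analyzed in the paper: sample a random geometric graph on $[0,1]^2$, take the lexicographic ordering of the vertices, and report its cost under two of the layout measures listed above (Bandwidth and Minimum Linear Arrangement).

        import numpy as np

        rng = np.random.default_rng(3)
        n, r = 200, 0.1                                        # illustrative size and radius
        pts = rng.random((n, 2))                               # vertices in [0,1]^2
        dist = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
        A = (dist <= r) & ~np.eye(n, dtype=bool)               # connect sufficiently close pairs

        order = np.lexsort((pts[:, 1], pts[:, 0]))             # lexicographic: by x, then y
        pos = np.empty(n, dtype=int); pos[order] = np.arange(n)

        edges = np.argwhere(np.triu(A, 1))
        stretch = np.abs(pos[edges[:, 0]] - pos[edges[:, 1]])
        print("bandwidth of this ordering:", stretch.max())            # max edge stretch
        print("linear arrangement cost of this ordering:", stretch.sum())  # total edge stretch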