
    Computational and Statistical Thresholds in Multi-layer Stochastic Block Models

    We study the problem of community recovery and detection in multi-layer stochastic block models, focusing on the critical network density threshold for consistent community structure inference. Using a prototypical two-block model, we reveal a computational barrier for such multi-layer stochastic block models that does not exist for their single-layer counterpart: when there are no computational constraints, the density threshold depends linearly on the number of layers; however, when restricted to polynomial-time algorithms, the density threshold scales with the square root of the number of layers, assuming the correctness of a low-degree polynomial hardness conjecture. Our results provide a nearly complete picture of optimal inference in multi-layer stochastic block models and partially settle the open question in Lei and Lin (2022) regarding the optimality of the bias-adjusted spectral method.
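
    The bias-adjusted spectral method is only named, not spelled out, in this abstract. The following is a rough sketch under the assumption that it aggregates squared adjacency matrices across layers with the degree-driven diagonal removed before spectral clustering; function and variable names are illustrative, and the exact estimator in Lei and Lin (2022) may differ in details.

    ```python
    import numpy as np
    from sklearn.cluster import KMeans

    def bias_adjusted_spectral_communities(adjacency_layers, n_blocks=2):
        """Aggregate squared adjacency matrices across layers, zero out the
        diagonal (the bias adjustment), and cluster the leading eigenvectors
        of the aggregate. Illustrative sketch, not the paper's exact estimator."""
        n = adjacency_layers[0].shape[0]
        aggregate = np.zeros((n, n))
        for A in adjacency_layers:
            A2 = A @ A
            np.fill_diagonal(A2, 0.0)   # remove the degree-driven diagonal bias
            aggregate += A2
        eigvals, eigvecs = np.linalg.eigh(aggregate)
        top = eigvecs[:, np.argsort(eigvals)[-n_blocks:]]   # leading eigenvectors
        return KMeans(n_clusters=n_blocks, n_init=10).fit_predict(top)
    ```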

    Training Input-Output Recurrent Neural Networks through Spectral Methods

    We consider the problem of training input-output recurrent neural networks (RNNs) for sequence labeling tasks. We propose a novel spectral approach for learning the network parameters, based on a decomposition of the cross-moment tensor between the output and a score-function transformation of the input. We guarantee consistent learning with polynomial sample and computational complexity under transparent conditions such as non-degeneracy of the model parameters, polynomial activations for the neurons, and a Markovian evolution of the input sequence. We also extend our results to bidirectional RNNs, which use both past and future information to output the label at each time point and are employed in many NLP tasks such as POS tagging.
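
    The spectral approach is only summarized above; the toy sketch below shows two ingredients such methods build on: an empirical third-order cross-moment tensor between labels and inputs, and tensor power iteration to extract a rank-one component. The score-function corrections and the full parameter-recovery pipeline of the paper are omitted, and all names are illustrative.

    ```python
    import numpy as np

    def empirical_cross_moment(y, x):
        """Empirical third-order cross moment E[y * x (x) x (x) x] for labels y
        of shape (T,) and inputs x of shape (T, d). A faithful score-function
        estimator would subtract lower-order correction terms; omitted here."""
        T_len = x.shape[0]
        return np.einsum('t,ti,tj,tk->ijk', y, x, x, x) / T_len

    def tensor_power_iteration(T, n_iter=100, seed=0):
        """Extract one approximate rank-one component of a symmetric 3-tensor."""
        rng = np.random.default_rng(seed)
        v = rng.normal(size=T.shape[0])
        v /= np.linalg.norm(v)
        for _ in range(n_iter):
            v = np.einsum('ijk,j,k->i', T, v, v)   # one tensor power step
            v /= np.linalg.norm(v)
        weight = np.einsum('ijk,i,j,k->', T, v, v, v)
        return weight, v
    ```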

    Super-resolution Line Spectrum Estimation with Block Priors

    We address the problem of super-resolution line spectrum estimation of an undersampled signal with block prior information. The component frequencies of the signal are assumed to take arbitrary continuous values in known frequency blocks. We formulate a general semidefinite program to recover these continuous-valued frequencies using theories of positive trigonometric polynomials. The proposed semidefinite program achieves super-resolution frequency recovery by taking advantage of the known structure of the frequency blocks. Numerical experiments show substantial performance improvements with our method.
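
    For context, here is a minimal cvxpy sketch of the classical atomic-norm semidefinite program for line-spectrum estimation from partial samples. The block-prior constraints built from positive trigonometric polynomials, which are the contribution of this paper, are not included, and names are illustrative. Frequencies would then be read off the recovered Toeplitz block (e.g. via a Vandermonde decomposition) or from the dual polynomial.

    ```python
    import numpy as np
    import cvxpy as cp

    def atomic_norm_line_spectrum(y, observed_idx, n):
        """Classical atomic-norm SDP for a length-n signal observed at
        observed_idx with values y. No block-prior constraints included."""
        M = cp.Variable((n + 1, n + 1), hermitian=True)
        x = M[:n, n]       # full signal to be recovered
        T = M[:n, :n]      # top-left block, constrained to be Toeplitz below
        constraints = [M >> 0]
        # enforce Toeplitz structure on the top-left block
        for i in range(1, n):
            for j in range(1, n):
                constraints.append(T[i, j] == T[i - 1, j - 1])
        # consistency with the observed samples
        constraints += [x[k] == y_k for k, y_k in zip(observed_idx, y)]
        objective = cp.Minimize(0.5 * (cp.real(cp.trace(T)) / n + cp.real(M[n, n])))
        cp.Problem(objective, constraints).solve()
        return np.asarray(x.value)
    ```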

    Graph learning under sparsity priors

    Graph signals offer a very generic and natural representation for data that live on networks or irregular structures. The underlying data structure is, however, often unknown a priori; it can sometimes be estimated from knowledge of the application domain, but if this is not possible, the structure has to be inferred from the signal observations alone. This is exactly the problem that we address in this paper, under the assumption that the graph signals can be represented as a sparse linear combination of a few atoms of a structured graph dictionary. The dictionary is constructed from polynomials of the graph Laplacian, which can sparsely represent a general class of graph signals composed of localized patterns on the graph. We formulate a graph learning problem whose solution provides an ideal fit between the signal observations and the sparse graph signal model. As the problem is non-convex, we propose to solve it by alternating between a signal sparse coding step and a graph update step. We provide experimental results that demonstrate the good graph recovery performance of our method, which generally compares favourably with other recent network inference algorithms.
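
    A schematic of this kind of alternating scheme is sketched below, using a degree-1 polynomial dictionary [I, L], orthogonal matching pursuit for the sparse coding step, and a projected gradient step on the Laplacian for the graph update. This is only a stand-in under those assumptions; the paper uses higher-degree polynomial dictionaries and a different graph update, and all names are illustrative.

    ```python
    import numpy as np
    from sklearn.linear_model import OrthogonalMatchingPursuit

    def project_to_laplacian(M):
        """Map a symmetric matrix to a valid combinatorial Laplacian:
        nonnegative edge weights, zero row sums."""
        M = 0.5 * (M + M.T)
        W = np.maximum(-(M - np.diag(np.diag(M))), 0.0)   # recover edge weights
        np.fill_diagonal(W, 0.0)
        return np.diag(W.sum(axis=1)) - W

    def learn_graph(X, sparsity=3, n_iter=20, step=1e-3, seed=0):
        """Toy alternating scheme for X of shape (n_nodes, n_signals)."""
        n, m = X.shape
        rng = np.random.default_rng(seed)
        W0 = rng.random((n, n))
        W0 = 0.5 * (W0 + W0.T)
        np.fill_diagonal(W0, 0.0)
        L = np.diag(W0.sum(axis=1)) - W0
        for _ in range(n_iter):
            # 1) sparse coding of every signal over the current dictionary [I, L]
            D = np.hstack([np.eye(n), L])
            omp = OrthogonalMatchingPursuit(n_nonzero_coefs=sparsity)
            C = np.column_stack([omp.fit(D, X[:, j]).coef_ for j in range(m)])
            C0, C1 = C[:n, :], C[n:, :]
            # 2) projected gradient step on L for the fit term ||X - C0 - L C1||_F^2
            residual = X - C0 - L @ C1
            grad = -2.0 * residual @ C1.T
            L = project_to_laplacian(L - step * grad)
        return L
    ```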

    Projected Power Iteration for Network Alignment

    The network alignment problem asks for the best correspondence between two given graphs, so that the largest possible number of edges are matched. This problem appears in many scientific settings (such as the study of protein-protein interactions) and is very closely related to the quadratic assignment problem, which has graph isomorphism, traveling salesman and minimum bisection as particular cases. The graph matching problem is NP-hard in general. However, under some restrictive models for the graphs, algorithms can approximate the alignment efficiently. In that spirit, recent work by Feizi and collaborators introduces EigenAlign, a fast spectral method with convergence guarantees for Erdős-Rényi graphs. In this work we propose the algorithm Projected Power Alignment, which is a projected power iteration version of EigenAlign. We numerically show that it improves the recovery rates of EigenAlign, and we describe the theory that may be used to provide performance guarantees for Projected Power Alignment.
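
    The sketch below illustrates the general idea of a projected power iteration for graph matching, assuming two graphs on the same number of nodes, with the Hungarian algorithm (scipy.optimize.linear_sum_assignment) as the projection step. It is an illustrative reconstruction, not the authors' exact Projected Power Alignment algorithm.

    ```python
    import numpy as np
    from scipy.optimize import linear_sum_assignment

    def projected_power_alignment(A, B, n_iter=30, seed=0):
        """Each iteration applies the alignment operator X -> A X B^T (a power
        step for the Kronecker similarity operator) and then projects the
        iterate onto permutation matrices with the Hungarian algorithm."""
        n = A.shape[0]
        rng = np.random.default_rng(seed)
        X = np.abs(rng.normal(size=(n, n)))      # positive random initialization
        for _ in range(n_iter):
            Y = A @ X @ B.T                      # one power step
            row, col = linear_sum_assignment(-Y) # projection: maximize similarity
            X = np.zeros_like(Y)
            X[row, col] = 1.0
        return X.argmax(axis=1)                  # matched node in B for each node in A
    ```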

    Consistency of spectral clustering in stochastic block models

    We analyze the performance of spectral clustering for community extraction in stochastic block models. We show that, under mild conditions, spectral clustering applied to the adjacency matrix of the network can consistently recover hidden communities even when the order of the maximum expected degree is as small as log n, with n the number of nodes. This result applies to some popular polynomial-time spectral clustering algorithms and is further extended to degree-corrected stochastic block models using a spherical k-median spectral clustering method. A key component of our analysis is a combinatorial bound on the spectrum of binary random matrices, which is sharper than the conventional matrix Bernstein inequality and may be of independent interest. Published in the Annals of Statistics (http://dx.doi.org/10.1214/14-AOS1274) by the Institute of Mathematical Statistics (http://www.imstat.org).
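
    For reference, the standard adjacency spectral clustering recipe analyzed in this line of work can be sketched as follows with numpy and scikit-learn; this is the generic procedure, not a line-by-line reproduction of the paper's algorithm. For the degree-corrected variant mentioned above, the rows of the embedding would additionally be normalized to the unit sphere before a k-median step.

    ```python
    import numpy as np
    from sklearn.cluster import KMeans

    def adjacency_spectral_clustering(A, n_communities):
        """Embed nodes with the eigenvectors of the adjacency matrix belonging
        to its n_communities largest-magnitude eigenvalues, then cluster the
        rows of the embedding with k-means."""
        eigvals, eigvecs = np.linalg.eigh(A)
        top = np.argsort(np.abs(eigvals))[::-1][:n_communities]
        embedding = eigvecs[:, top]
        return KMeans(n_clusters=n_communities, n_init=10).fit_predict(embedding)
    ```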