Spectral Methods from Tensor Networks
A tensor network is a diagram that specifies a way to "multiply" a collection
of tensors together to produce another tensor (or matrix). Many existing
algorithms for tensor problems (such as tensor decomposition and tensor PCA),
although they are not presented this way, can be viewed as spectral methods on
matrices built from simple tensor networks. In this work we leverage the full
power of this abstraction to design new algorithms for certain continuous
tensor decomposition problems.
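For intuition, the simplest matrix one can build from a third-order tensor via a tensor network is its mode-1 unfolding, and the top singular vector of that matrix already solves tensor PCA when the signal is strong enough. A minimal NumPy sketch (the dimension and signal strength are illustrative choices, not values from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
n, lam = 30, 100.0   # dimension and signal strength (illustrative values)

# Planted rank-one signal lam * v ⊗ v ⊗ v plus i.i.d. Gaussian noise.
v = rng.standard_normal(n)
v /= np.linalg.norm(v)
T = lam * np.einsum("i,j,k->ijk", v, v, v) + rng.standard_normal((n, n, n))

# The simplest "tensor network matrix": the mode-1 unfolding of T, an
# n x n^2 matrix whose top left singular vector approximates v.
M = T.reshape(n, n * n)
U, s, Vt = np.linalg.svd(M, full_matrices=False)
v_hat = U[:, 0]

corr = abs(v_hat @ v)
print(f"correlation with planted vector: {corr:.3f}")
```

More elaborate tensor networks trade off the same ingredients: which contractions to perform, and which matrix to feed to the spectral step.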
An important and challenging family of tensor problems comes from orbit
recovery, a class of inference problems involving group actions (inspired by
applications such as cryo-electron microscopy). Orbit recovery problems over
finite groups can often be solved via standard tensor methods. However, for
infinite groups, no general algorithms are known. We give a new spectral
algorithm based on tensor networks for one such problem: continuous
multi-reference alignment over the infinite group SO(2). Our algorithm extends
to the more general heterogeneous case.
Comment: 30 pages, 8 figures
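To see the shift-invariant structure such methods exploit, here is a toy discrete analogue of multi-reference alignment (a hedged stand-in, not the paper's tensor-network algorithm): the power spectrum is invariant to cyclic shifts, so it can be averaged over unaligned noisy copies. All sizes and noise levels below are made up.

```python
import numpy as np

rng = np.random.default_rng(3)
L_sig, m, sigma = 16, 20_000, 0.5   # signal length, sample count, noise level

x = rng.standard_normal(L_sig)

# Each observation is a random cyclic shift of x plus noise: a discrete
# stand-in for continuous multi-reference alignment over SO(2).
shifts = rng.integers(0, L_sig, size=m)
obs = np.stack([np.roll(x, s) for s in shifts])
obs += sigma * rng.standard_normal((m, L_sig))

# The power spectrum |x_hat|^2 is invariant to cyclic shifts, so it can be
# averaged over unaligned observations; subtract the additive noise bias.
F = np.fft.fft(obs, axis=1)
ps_est = (np.abs(F) ** 2).mean(axis=0) - L_sig * sigma**2

ps_true = np.abs(np.fft.fft(x)) ** 2
err = np.linalg.norm(ps_est - ps_true) / np.linalg.norm(ps_true)
print(f"relative error in estimated power spectrum: {err:.3f}")
```

Recovering the Fourier phases as well requires higher-order invariants such as the bispectrum, which is where tensor methods enter.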
Tensor Spectral Clustering for Partitioning Higher-order Network Structures
Spectral graph theory-based methods represent an important class of tools for
studying the structure of networks. Spectral methods are based on a first-order
Markov chain derived from a random walk on the graph and thus they cannot take
advantage of important higher-order network substructures such as triangles,
cycles, and feed-forward loops. Here we propose a Tensor Spectral Clustering
(TSC) algorithm that allows for modeling higher-order network structures in a
graph partitioning framework. Our TSC algorithm allows the user to specify
which higher-order network structures (cycles, feed-forward loops, etc.) should
be preserved by the network clustering. Higher-order network structures of
interest are represented using a tensor, which we then partition by developing
a multilinear spectral method. Our framework can be applied to discovering
layered flows in networks as well as graph anomaly detection, which we
illustrate on synthetic networks. In directed networks, a higher-order
structure of particular interest is the directed 3-cycle, which captures
feedback loops in networks. We demonstrate that our TSC algorithm produces
large partitions that cut fewer directed 3-cycles than standard spectral
clustering algorithms.
Comment: SDM 201
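A simplified motif-Laplacian variant in the spirit of TSC (not the exact algorithm from the paper) can be sketched in a few lines; the toy graph and the small first-order mixing weight are illustrative assumptions.

```python
import numpy as np

# Toy directed graph (made up): two directed 3-cycles, 0->1->2->0 and
# 3->4->5->3, joined by the single edge 2->3.
n = 6
A = np.zeros((n, n))
for i, j in [(0, 1), (1, 2), (2, 0), (3, 4), (4, 5), (5, 3), (2, 3)]:
    A[i, j] = 1

# C[i, j] counts directed 3-cycles through edge (i, j): (A @ A)[j, i] is the
# number of 2-step paths j -> k -> i closing the triangle i -> j -> k -> i.
C = A * (A @ A).T
W = C + C.T + 0.01 * (A + A.T)   # tiny first-order term keeps W connected

# Ordinary spectral bipartition of the motif-weighted graph W.
Lap = np.diag(W.sum(axis=1)) - W
vals, vecs = np.linalg.eigh(Lap)
fiedler = vecs[:, 1]             # eigenvector of the second-smallest eigenvalue
labels = (fiedler > np.median(fiedler)).astype(int)
print(labels)
```

Because the bridge edge 2->3 belongs to no directed 3-cycle, it carries almost no motif weight, so the spectral cut falls across it rather than through either cycle.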
A framework for second-order eigenvector centralities and clustering coefficients
We propose and analyse a general tensor-based framework for incorporating second-order features into network measures. This approach allows us to combine traditional pairwise links with information that records whether triples of nodes are involved in wedges or triangles. Our treatment covers classical spectral methods and recently proposed cases from the literature, but we also identify many interesting extensions. In particular, we define a mutually reinforcing (spectral) version of the classical clustering coefficient. The underlying object of study is a constrained nonlinear eigenvalue problem associated with a cubic tensor. Using recent results from nonlinear Perron–Frobenius theory, we establish existence and uniqueness under appropriate conditions, and show that the new spectral measures can be computed efficiently with a nonlinear power method. To illustrate the added value of the new formulation, we analyse the measures on a class of synthetic networks. We also give computational results on centrality and link prediction for real-world networks.
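To make the nonlinear power method concrete, here is a minimal sketch of one common scheme of this type, an x <- normalize((T x x)^(1/2)) iteration on a triangle tensor. The graph is a made-up example, and the constrained formulation in the paper is richer than this bare iteration.

```python
import numpy as np

# Toy undirected graph (a made-up example): triangles {0,1,2}, {2,3,4},
# and {0,2,4}, so node 2 sits in the most triangles.
n = 5
edges = [(0, 1), (1, 2), (0, 2), (2, 3), (3, 4), (2, 4), (0, 4)]
A = np.zeros((n, n))
for i, j in edges:
    A[i, j] = A[j, i] = 1

# Cubic tensor T[i, j, k] = 1 when {i, j, k} is a triangle.
T = np.zeros((n, n, n))
for i in range(n):
    for j in range(n):
        for k in range(n):
            if i != j and j != k and i != k and A[i, j] and A[j, k] and A[i, k]:
                T[i, j, k] = 1

# Nonlinear power method: x <- normalize( (T x x)^(1/2) ). The square root
# makes the map 1-homogeneous, the setting in which nonlinear
# Perron-Frobenius theory yields convergence for nonnegative tensors.
x = np.ones(n) / n
for _ in range(200):
    y = np.einsum("ijk,j,k->i", T, x, x)
    x = np.sqrt(y)
    x /= x.sum()

print(np.round(x, 3))
```

The iteration assigns the largest second-order centrality to node 2, the node involved in the most triangles, which is the mutually reinforcing behaviour the measure is designed to capture.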
Training Input-Output Recurrent Neural Networks through Spectral Methods
We consider the problem of training input-output recurrent neural networks
(RNN) for sequence labeling tasks. We propose a novel spectral approach for
learning the network parameters. It is based on decomposition of the
cross-moment tensor between the output and a non-linear transformation of the
input, based on score functions. We guarantee consistent learning with
polynomial sample and computational complexity under transparent conditions
such as non-degeneracy of model parameters, polynomial activations for the
neurons, and a Markovian evolution of the input sequence. We also extend our
results to Bidirectional RNNs, which use both previous and future information
to output the label at each time point and are employed in many NLP tasks such
as POS tagging.
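The core idea of decomposing a cross-moment between the output and a score-function transform of the input can be illustrated in its simplest first-order, linear form; the paper itself uses higher-order score functions and tensor decomposition. Everything below (sizes, noise level, the linear stand-in model) is a hedged toy construction, not the authors' method.

```python
import numpy as np

rng = np.random.default_rng(2)
d_in, d_out, n = 5, 3, 200_000   # made-up sizes

# Ground-truth linear map, standing in for one layer's parameters.
W = rng.standard_normal((d_out, d_in))

# Gaussian inputs: for x ~ N(0, I) the first-order score function is -x,
# so Stein's identity gives E[y x^T] = W exactly when y = W x + noise.
X = rng.standard_normal((n, d_in))
Y = X @ W.T + 0.1 * rng.standard_normal((n, d_out))

W_hat = (Y.T @ X) / n   # empirical cross-moment between output and input
err = np.linalg.norm(W_hat - W) / np.linalg.norm(W)
print(f"relative error: {err:.3f}")
```

Replacing the first-order score with higher-order scores turns the cross-moment into a tensor, whose decomposition is what delivers the consistency guarantees for nonlinear networks.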
Tensor Analysis and Fusion of Multimodal Brain Images
Current high-throughput data acquisition technologies probe dynamical systems
with different imaging modalities, generating massive data sets at different
spatial and temporal resolutions, posing challenging problems in multimodal data
fusion. A case in point is the attempt to parse out the brain structures and
networks that underpin human cognitive processes by analysis of different
neuroimaging modalities (functional MRI, EEG, NIRS etc.). We emphasize that the
multimodal, multi-scale nature of neuroimaging data is well reflected by a
multi-way (tensor) structure where the underlying processes can be summarized
by a relatively small number of components or "atoms". We introduce
Markov-Penrose diagrams, an integration of Bayesian DAG and tensor network
notation, to analyze these models. These diagrams not only clarify
matrix and tensor EEG and fMRI time/frequency analysis and inverse problems,
but also help understand multimodal fusion via Multiway Partial Least Squares
and Coupled Matrix-Tensor Factorization. We show here, for the first time, that
Granger causal analysis of brain networks is a tensor regression problem, thus
allowing the atomic decomposition of brain networks. Analysis of EEG and fMRI
recordings shows the potential of the methods and suggests their use in other
scientific domains.
Comment: 23 pages, 15 figures, submitted to Proceedings of the IEE
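To illustrate the "small number of atoms" summary the authors describe, here is a minimal CP (PARAFAC) decomposition by alternating least squares on a synthetic three-way array. The sizes, rank, and data are made up, and real multimodal pipelines use dedicated tensor toolboxes rather than this bare sketch.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic channels x frequencies x time array built from two planted
# "atoms" (a made-up stand-in for a multiway neuroimaging summary).
I, J, K, R = 8, 6, 10, 2
A0, B0, C0 = (rng.standard_normal((d, R)) for d in (I, J, K))
X = np.einsum("ir,jr,kr->ijk", A0, B0, C0)

# Rank-R CP decomposition by alternating least squares: each factor solves
# a linear least-squares problem against an unfolding of X whose design
# matrix is the Khatri-Rao product of the other two factors.
A, B, C = (rng.standard_normal((d, R)) for d in (I, J, K))
for _ in range(100):
    KR = np.einsum("jr,kr->jkr", B, C).reshape(J * K, R)
    A = X.reshape(I, J * K) @ KR @ np.linalg.pinv(KR.T @ KR)
    KR = np.einsum("ir,kr->ikr", A, C).reshape(I * K, R)
    B = X.transpose(1, 0, 2).reshape(J, I * K) @ KR @ np.linalg.pinv(KR.T @ KR)
    KR = np.einsum("ir,jr->ijr", A, B).reshape(I * J, R)
    C = X.transpose(2, 0, 1).reshape(K, I * J) @ KR @ np.linalg.pinv(KR.T @ KR)

X_hat = np.einsum("ir,jr,kr->ijk", A, B, C)
err = np.linalg.norm(X - X_hat) / np.linalg.norm(X)
print(f"relative reconstruction error: {err:.2e}")
```

Coupled matrix-tensor factorization extends the same alternating scheme by sharing one factor between a tensor (e.g. EEG) and a matrix (e.g. fMRI), which is the fusion mechanism the abstract refers to.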