Optimal Query Complexity for Reconstructing Hypergraphs
In this paper we consider the problem of reconstructing a hidden weighted
hypergraph of constant rank using additive queries. We prove the following: let
$G$ be a weighted hidden hypergraph of constant rank with $n$ vertices and $m$
hyperedges. For any $m$ there exists a non-adaptive algorithm that finds the
edges of the hypergraph and their weights using $O\!\left(\frac{m\log n}{\log m}\right)$
additive queries. This solves the open problem in [S. Choi, J. H. Kim. Optimal
Query Complexity Bounds for Finding Graphs. {\em STOC}, 749--758, 2008].
When the weights of the hypergraph are integers that are less than
$O(\mathrm{poly}(n^d/m))$, where $d$ is the rank of the hypergraph (and therefore
in particular for unweighted hypergraphs), there exists a non-adaptive
algorithm that finds the edges of the hypergraph and their weights using
$O\!\left(\frac{m\log\frac{n^d}{m}}{\log m}\right)$ additive queries.
By the information-theoretic bound, the above query complexities are tight.
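As a toy illustration of the additive-query model used above (a hypothetical sketch with invented edge weights; the paper's algorithm is also non-adaptive but uses far fewer queries than this brute-force approach): the oracle, given a vertex set, returns the total weight of hyperedges fully contained in it.

```python
from itertools import combinations

# Hidden rank-3 weighted hypergraph on 5 vertices (unknown to the learner).
HIDDEN = {frozenset({0, 1, 2}): 4, frozenset({1, 3, 4}): 7}

def additive_query(S):
    """Oracle: total weight of hyperedges fully contained in S."""
    S = frozenset(S)
    return sum(w for e, w in HIDDEN.items() if e <= S)

# Naive non-adaptive reconstruction: query every 3-subset. Since every hidden
# edge here has exactly 3 vertices, each query isolates at most one edge's
# weight. This takes C(5,3) = 10 queries, far above the paper's optimal bound.
recovered = {frozenset(S): additive_query(S)
             for S in combinations(range(5), 3) if additive_query(S) != 0}
```

The point of the paper is precisely that a clever choice of (larger, overlapping) query sets drives the query count down to the information-theoretic minimum.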
Spectral Detection on Sparse Hypergraphs
We consider the problem of the assignment of nodes into communities from a
set of hyperedges, where every hyperedge is a noisy observation of the
community assignment of the adjacent nodes. We focus in particular on the
sparse regime where the number of edges is of the same order as the number of
vertices. We propose a spectral method based on a generalization of the
non-backtracking Hashimoto matrix to hypergraphs. We analyze its performance
on a planted generative model and compare it with other spectral methods and
with Bayesian belief propagation (which was conjectured to be asymptotically
optimal for this model). We conclude that the proposed spectral method detects
communities whenever belief propagation does, while having the important
advantages of being simpler, entirely nonparametric, and able to learn the
rule according to which the hyperedges were generated without prior
information.
Comment: 8 pages, 5 figures
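For readers unfamiliar with the Hashimoto operator on ordinary graphs, which is the object being generalized here, a minimal sketch (illustrative only; the paper's hypergraph version is more involved): the non-backtracking matrix is indexed by directed edges, with entry $((u\to v),(x\to y)) = 1$ exactly when $v = x$ and $y \neq u$, and on a $d$-regular graph its spectral radius is $d - 1$.

```python
import numpy as np

# Directed edges of the complete graph K4 (a 3-regular graph).
edges = [(u, v) for u in range(4) for v in range(4) if u != v]
idx = {e: i for i, e in enumerate(edges)}

# Hashimoto matrix: a walk may continue from (u -> v) to (x -> y)
# when v == x, as long as it does not immediately backtrack (y != u).
B = np.zeros((len(edges), len(edges)))
for (u, v) in edges:
    for (x, y) in edges:
        if v == x and y != u:
            B[idx[(u, v)], idx[(x, y)]] = 1.0

# On a d-regular graph every row sums to d - 1, so the spectral radius is d - 1.
spectral_radius = max(abs(np.linalg.eigvals(B)))
```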
Provable Bounds for Learning Some Deep Representations
We give algorithms with provable guarantees that learn a class of deep nets
in the generative model view popularized by Hinton and others. Our generative
model is an $n$-node multilayer neural net that has degree at most $n^{\gamma}$
for some $\gamma < 1/8$, and each edge has a random edge weight in $[-1, 1]$. Our
algorithm learns {\em almost all} networks in this class with polynomial
running time. The sample complexity is quadratic or cubic depending upon the
details of the model.
The algorithm uses layerwise learning. It is based upon a novel idea of
observing correlations among features and using these to infer the underlying
edge structure via a global graph recovery procedure. The analysis of the
algorithm reveals interesting structure of neural networks with random edge
weights.
Comment: The first 18 pages serve as an extended abstract; a 36-page technical
appendix follows.
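The correlation idea can be illustrated on a toy single-layer generative net (an illustrative simulation with an invented parent structure, not the paper's model or algorithm): observed units that share a hidden parent co-fire more often than independent ones, so thresholding the empirical correlation matrix recovers which pairs of observed units share an edge-parent.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical bipartite net: observed unit j fires iff any of its hidden
# parents fires (an OR gate); hidden units fire independently with prob 0.3.
parents = {0: [0], 1: [0, 1], 2: [1], 3: [1, 2], 4: [2], 5: [2, 0]}
N = 20000
h = rng.random((N, 3)) < 0.3
x = np.stack([h[:, parents[j]].any(axis=1) for j in range(6)], axis=1)

# Pairs sharing a hidden parent have correlation bounded away from 0;
# pairs with disjoint parent sets are independent, so correlation ~ 0.
corr = np.corrcoef(x.astype(float), rowvar=False)
inferred = {(i, j) for i in range(6) for j in range(i + 1, 6)
            if corr[i, j] > 0.1}
true_shared = {(i, j) for i in range(6) for j in range(i + 1, 6)
               if set(parents[i]) & set(parents[j])}
```

In this sketch the structure is read off pairwise; the paper's contribution is making such an inference work layerwise through a deep net with provable guarantees.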
Consistency of Spectral Hypergraph Partitioning under Planted Partition Model
Hypergraph partitioning lies at the heart of a number of problems in machine
learning and network sciences. Many algorithms for hypergraph partitioning have
been proposed that extend standard approaches for graph partitioning to the
case of hypergraphs. However, theoretical aspects of such methods have seldom
received attention in the literature as compared to the extensive studies on
the guarantees of graph partitioning. For instance, consistency results of
spectral graph partitioning under the stochastic block model are well known. In
this paper, we present a planted partition model for sparse random non-uniform
hypergraphs that generalizes the stochastic block model. We derive an error
bound for a spectral hypergraph partitioning algorithm under this model using
matrix concentration inequalities. To the best of our knowledge, this is the
first consistency result related to partitioning non-uniform hypergraphs.
Comment: 35 pages, 2 figures, 1 table
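As a concrete, much simplified illustration of spectral hypergraph partitioning (one common recipe, not necessarily the exact algorithm analyzed in the paper): expand each hyperedge into a weighted clique, form the graph Laplacian, and split vertices by the sign of the Fiedler vector.

```python
import numpy as np

# Non-uniform hypergraph with two planted groups {0,1,2} and {3,4,5}
# and a single crossing hyperedge {2,3}.
hyperedges = [{0, 1, 2}, {0, 1}, {1, 2}, {3, 4, 5}, {3, 4}, {4, 5}, {2, 3}]
n = 6

# Clique expansion: each hyperedge contributes weight 1/(|e|-1) per vertex pair.
A = np.zeros((n, n))
for e in hyperedges:
    for u in e:
        for v in e:
            if u != v:
                A[u, v] += 1.0 / (len(e) - 1)

# Unnormalized graph Laplacian; eigh returns eigenvalues in ascending order,
# so column 1 is the Fiedler vector (eigenvector of the 2nd-smallest eigenvalue).
L = np.diag(A.sum(axis=1)) - A
vals, vecs = np.linalg.eigh(L)
fiedler = vecs[:, 1]
part = {i for i in range(n) if fiedler[i] > 0}   # sign split = 2-partition
```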
Learning and Testing Variable Partitions
Let be a multivariate function from a product set to an
Abelian group . A -partition of with cost is a partition of
the set of variables into non-empty subsets such that is -close to
for some with
respect to a given error metric. We study algorithms for agnostically learning
partitions and testing -partitionability over various groups and error
metrics given query access to . In particular we show that
Given a function that has a -partition of cost , a partition
of cost can be learned in time
for any .
In contrast, for and learning a partition of cost is NP-hard.
When is real-valued and the error metric is the 2-norm, a
2-partition of cost can be learned in time
.
When is -valued and the error metric is Hamming
weight, -partitionability is testable with one-sided error and
non-adaptive queries. We also show that even
two-sided testers require queries when .
This work was motivated by reinforcement learning control tasks in which the
set of control variables can be partitioned. The partitioning reduces the task
into multiple lower-dimensional ones that are relatively easier to learn. Our
second algorithm empirically increases the scores attained over previous
heuristic partitioning methods applied in this context.
Comment: Innovations in Theoretical Computer Science (ITCS) 2020
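For intuition about the query model, the exact (zero-cost) version of 2-partitionability for a real-valued $f$ admits a simple randomized check (a hypothetical simplification for illustration, not the paper's tester): if $f(x, y) = g(x) + h(y)$, then every "rectangle" of inputs must cancel exactly.

```python
import random

def is_additive_pair(f, xs, ys, trials=200, seed=1):
    """Randomized check that f(x, y) decomposes as g(x) + h(y):
    any separable f satisfies f(x1,y1) + f(x2,y2) - f(x1,y2) - f(x2,y1) == 0
    for every choice of x1, x2, y1, y2, so one violated rectangle refutes it."""
    rnd = random.Random(seed)
    for _ in range(trials):
        x1, x2 = rnd.choice(xs), rnd.choice(xs)
        y1, y2 = rnd.choice(ys), rnd.choice(ys)
        if f(x1, y1) + f(x2, y2) - f(x1, y2) - f(x2, y1) != 0:
            return False
    return True

xs = ys = list(range(10))
assert is_additive_pair(lambda x, y: 3 * x + y * y, xs, ys)   # separable
assert not is_additive_pair(lambda x, y: x * y, xs, ys)       # genuinely coupled
```

The paper's testers handle the agnostic setting, where $f$ is only close to partitionable, which is what makes the problem substantially harder than this exact check.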
Duality of Graphical Models and Tensor Networks
In this article we show the duality between tensor networks and undirected
graphical models with discrete variables. We study tensor networks on
hypergraphs, which we call tensor hypernetworks. We show that the tensor
hypernetwork on a hypergraph exactly corresponds to the graphical model given
by the dual hypergraph. We translate various notions under duality. For
example, marginalization in a graphical model is dual to contraction in the
tensor network. Algorithms also translate under duality. We show that belief
propagation corresponds to a known algorithm for tensor network contraction.
This article is a reminder that the research areas of graphical models and
tensor networks can benefit from interaction.
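The marginalization/contraction correspondence can be seen in a minimal two-factor example (a sketch using NumPy's einsum; the factor shapes and names are illustrative): summing variables out of an unnormalized graphical model is exactly contracting the shared indices of the corresponding tensor network.

```python
import numpy as np

rng = np.random.default_rng(2)
phi1 = rng.random((2, 3))   # factor over variables (a, b): shape |a| x |b|
phi2 = rng.random((3, 4))   # factor over variables (b, c): shape |b| x |c|

# Unnormalized joint p(a, b, c) = phi1[a, b] * phi2[b, c].
# The marginal over a is obtained by contracting the shared index b
# and summing out the free index c.
marg_a = np.einsum('ab,bc->a', phi1, phi2)

# Brute-force marginal for comparison: explicit sum over b and c.
brute = np.array([sum(phi1[a, b] * phi2[b, c]
                      for b in range(3) for c in range(4))
                  for a in range(2)])
```

Here `einsum` plays the role of the tensor-network contraction; belief propagation organizes the same sums along the (hyper)graph structure instead of all at once.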