6 research outputs found
Concentration inequalities for random tensors
We show how to extend several basic concentration inequalities for simple random tensors X = x_1 ⊗ ... ⊗ x_d where all x_k are independent random vectors in R^n with independent coefficients. The new results have optimal dependence on the dimension n and the degree d. As an application, we show that random tensors are well conditioned: (1 − o(1)) n^d independent copies of the simple random tensor X are far from being linearly dependent with high probability. We prove this fact for any degree d = o(√(n / log n)) and conjecture that it is true for any d = O(n).
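The well-conditioning claim can be probed numerically. The sketch below (an illustration, not the paper's proof technique) draws independent copies of a simple random tensor X = x_1 ⊗ ... ⊗ x_d, flattens each to a vector in R^(n^d), and uses the smallest singular value of the stacked matrix as a proxy for distance from linear dependence. The Gaussian coefficients and the choices n = 4, d = 2, copies = 12 are assumptions made for the demo.

```python
import numpy as np

def simple_random_tensor(n, d, rng):
    """Flattening of x_1 ⊗ ... ⊗ x_d for independent Gaussian x_k in R^n."""
    t = np.ones(1)
    for _ in range(d):
        # the tensor product of flattened factors is a Kronecker product
        t = np.kron(t, rng.standard_normal(n))
    return t  # shape (n**d,)

def smallest_singular_value(n, d, copies, seed=0):
    """Stack `copies` independent simple random tensors; return sigma_min."""
    rng = np.random.default_rng(seed)
    A = np.stack([simple_random_tensor(n, d, rng) for _ in range(copies)])
    return np.linalg.svd(A, compute_uv=False)[-1]

# 12 copies in R^16: fewer than n^d = 16, so they are generically
# linearly independent, i.e. the smallest singular value stays positive.
s_min = smallest_singular_value(n=4, d=2, copies=12)
print(s_min)
```

A smallest singular value bounded away from zero is exactly what "far from being linearly dependent" means for the stacked copies; rerunning with different seeds gives a crude empirical picture of the phenomenon the paper quantifies.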
Smoothed analysis of discrete tensor decomposition and assemblies of neurons
© 2018 Curran Associates Inc. All rights reserved. We analyze linear independence of rank one tensors produced by tensor powers of randomly perturbed vectors. This enables efficient decomposition of sums of high-order tensors. Our analysis builds upon Bhaskara et al. [3] but allows for a wider range of perturbation models, including discrete ones. We give an application to recovering assemblies of neurons. Assemblies are large sets of neurons representing specific memories or concepts. The size of the intersection of two assemblies has been shown in experiments to represent the extent to which these memories co-occur or these concepts are related; the phenomenon is called association of assemblies. This suggests that an animal's memory is a complex web of associations, and poses the problem of recovering this representation from cognitive data. Motivated by this problem, we study the following more general question: Can we reconstruct the Venn diagram of a family of sets, given the sizes of their ℓ-wise intersections? We show that as long as the family of sets is randomly perturbed, it is enough for the number of measurements to be polynomially larger than the number of nonempty regions of the Venn diagram to fully reconstruct the diagram.
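The link between ℓ-wise intersection sizes and tensor decomposition can be made concrete. For sets S_1, ..., S_k over a universe U, let w_u ∈ {0,1}^k be the membership pattern of element u; then the tensor T = Σ_u w_u^{⊗ℓ} satisfies T[i_1, ..., i_ℓ] = |S_{i_1} ∩ ... ∩ S_{i_ℓ}|, so the measurements form a sum of ℓ-th tensor powers, the kind of object whose decomposition the paper analyzes. The sketch below (with illustrative sets and ℓ = 2) verifies this identity; it does not implement the paper's recovery algorithm.

```python
import numpy as np

def intersection_tensor(sets, universe, ell):
    """T = sum over u of w_u^(tensor ell), where w_u is u's membership pattern."""
    k = len(sets)
    T = np.zeros((k,) * ell)
    for u in universe:
        w = np.array([float(u in S) for S in sets])  # membership pattern of u
        P = w
        for _ in range(ell - 1):
            P = np.tensordot(P, w, axes=0)           # build w^{⊗ℓ} by outer products
        T += P
    return T

sets = [{1, 2, 3}, {2, 3, 4}, {3, 4, 5}]
T = intersection_tensor(sets, universe=range(1, 6), ell=2)
# Each entry is a pairwise intersection size, e.g. S_1 ∩ S_2 = {2, 3}:
print(T[0, 1])  # → 2.0
```

For ℓ = 2 this is just the Gram matrix of the indicator vectors; higher ℓ gives higher-order tensors, and recovering the region sizes of the Venn diagram from T is a (structured) tensor decomposition problem.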