In this paper, we resolve many of the key algorithmic questions regarding
robustness, memory efficiency, and differential privacy of tensor
decomposition. We propose simple variants of the tensor power method which
enjoy these strong properties. We present the first guarantees for an online
tensor power method, which has a linear memory requirement. Moreover, we present
a noise-calibrated tensor power method with efficient privacy guarantees. At
the heart of all these guarantees lies a careful perturbation analysis derived
in this paper which improves upon the existing results significantly.
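As background for the variants discussed above, the following is a minimal sketch of the classical tensor power iteration on a symmetric third-order tensor. The function name, defaults, and structure here are illustrative assumptions rather than the paper's exact algorithm; the robust, online, and private variants can be viewed as modifications of this basic loop.

```python
# A minimal sketch of the classical tensor power iteration on a symmetric
# third-order tensor; names and defaults are illustrative, not the paper's
# exact algorithm.
import numpy as np

def tensor_power_iteration(T, n_iter=100, seed=0):
    """Estimate one eigenvalue/eigenvector pair of a symmetric tensor T of shape (d, d, d)."""
    d = T.shape[0]
    rng = np.random.default_rng(seed)
    u = rng.normal(size=d)
    u /= np.linalg.norm(u)
    for _ in range(n_iter):
        # Multilinear contraction T(I, u, u): contract the last two modes of T with u.
        v = np.einsum('ijk,j,k->i', T, u, u)
        u = v / np.linalg.norm(v)
    lam = np.einsum('ijk,i,j,k->', T, u, u, u)  # estimated eigenvalue T(u, u, u)
    return lam, u
```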