Non-recursive equivalent of the conjugate gradient method without the need to restart
A simple alternative to the conjugate gradient (CG) method is presented; it is
developed as a special case of the more general iterated Ritz method (IRM) for
solving a system of linear equations. This novel algorithm is not based on
conjugacy; that is, it is not necessary to maintain orthogonality between
vectors from distant steps. The method is more stable than CG, and restarting
techniques are not required. As in CG, only one matrix-vector multiplication is
required per step, with appropriate transformations. The algorithm is easily
explained by energy considerations, without appealing to A-orthogonality in
n-dimensional space. Finally, a relaxation factor and preconditioning-like
techniques can be adopted easily.
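The abstract does not spell out the IRM-based algorithm itself, but for reference the classical CG method it is contrasted with can be sketched as below. Note the feature the abstract highlights: CG maintains conjugacy (A-orthogonality) of successive search directions, which is exactly what the proposed method avoids. This is a standard textbook CG, not the paper's algorithm.

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10, max_iter=1000):
    """Classical CG for a symmetric positive-definite matrix A.
    Each iteration needs exactly one matrix-vector product."""
    x = np.zeros_like(b)
    r = b - A @ x          # initial residual
    p = r.copy()           # initial search direction
    rs_old = r @ r
    for _ in range(max_iter):
        Ap = A @ p                      # the single mat-vec per step
        alpha = rs_old / (p @ Ap)       # exact line search along p
        x = x + alpha * p
        r = r - alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        # This update keeps p A-orthogonal to previous directions;
        # in finite precision this conjugacy degrades, motivating restarts.
        p = r + (rs_new / rs_old) * p
        rs_old = rs_new
    return x

# Small SPD example
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x = conjugate_gradient(A, b)
```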
Recovery Guarantees for Quadratic Tensors with Limited Observations
We consider the tensor completion problem of predicting the missing entries of
a tensor. The commonly used CP model has a triple-product form, but an
alternative family of quadratic models, which are sums of pairwise products
instead of a triple product, has emerged from applications such as
recommendation systems. Non-convex methods are the method of choice for
learning quadratic models, and this work examines their sample complexity and
error guarantees. Our main result is that, with a number of samples only linear
in the dimension, all local minima of the mean squared error objective are
global minima and recover the original tensor accurately. The techniques lead
to simple proofs showing that convex relaxation can also recover quadratic
tensors given a linear number of samples. We substantiate our theoretical
results with experiments on synthetic and real-world data, showing that
quadratic models outperform CP models when only a limited number of
observations is available.
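The abstract describes quadratic models only as "sums of pairwise products"; one plausible rank-r parametrization of a single tensor entry (an illustrative assumption, not necessarily the paper's exact model) contrasts the CP triple product with pairwise inner products of the factor vectors:

```python
import numpy as np

def cp_entry(u, v, w):
    # CP model: T[i,j,k] = sum_r u_r * v_r * w_r,
    # a triple product over the rank index.
    return float(np.sum(u * v * w))

def quadratic_entry(u, v, w):
    # Quadratic model (assumed form): sum of the three
    # pairwise inner products <u,v> + <v,w> + <u,w>.
    return float(u @ v + v @ w + u @ w)

# Factor vectors for one (i, j, k) entry, rank r = 2
u = np.array([1.0, 2.0])
v = np.array([3.0, 4.0])
w = np.array([5.0, 6.0])
```

Each model fills in a missing entry from learned per-index factor vectors; the quadratic form involves only degree-2 interactions, which is what makes its mean-squared-error landscape more tractable.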