Efficient Inexact Proximal Gradient Algorithm for Nonconvex Problems
The proximal gradient algorithm has been popularly used for convex
optimization. Recently, it has also been extended for nonconvex problems, and
the current state-of-the-art is the nonmonotone accelerated proximal gradient
algorithm. However, it typically requires two exact proximal steps in each
iteration, and can be inefficient when the proximal step is expensive. In this
paper, we propose an efficient proximal gradient algorithm that requires only
one inexact (and thus less expensive) proximal step in each iteration.
Convergence to a critical point of the nonconvex problem is still guaranteed,
at the best known convergence rate for nonconvex problems with first-order
methods. Experiments on a number of problems demonstrate that the proposed
algorithm has performance comparable to the state-of-the-art, but is much
faster.
Expectile Matrix Factorization for Skewed Data Analysis
Matrix factorization is a popular approach to solving matrix estimation
problems based on partial observations. Existing matrix factorization is based
on least squares and aims to yield a low-rank matrix to interpret the
conditional sample means given the observations. However, in many real
applications with skewed and extreme data, least squares cannot explain their
central tendency or tail distributions, yielding undesired estimates. In this
paper, we propose expectile matrix factorization by introducing
asymmetric least squares, a key concept in expectile regression analysis, into
the matrix factorization framework. We propose an efficient algorithm to solve
the new problem based on alternating minimization and quadratic programming. We
prove that our algorithm converges to a global optimum and exactly recovers the
true underlying low-rank matrices when noise is zero. For synthetic data with
skewed noise and a real-world dataset containing web service response times,
the proposed scheme achieves lower recovery errors than the existing matrix
factorization method based on least squares in a wide range of settings.
Comment: 8-page main text with 5-page supplementary document, published in
AAAI 201
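The asymmetric least-squares (expectile) loss at the core of this entry weights positive and negative residuals differently, so that tau = 0.5 recovers ordinary least squares while other tau values track the tails of a skewed distribution. A minimal sketch, assuming illustrative names (`expectile_loss`, `expectile`) and a simple iteratively-reweighted-least-squares solver rather than the paper's alternating-minimization algorithm:

```python
import numpy as np

def expectile_loss(residual, tau):
    # Asymmetric squared loss: weight tau for nonnegative residuals,
    # (1 - tau) for negative ones; tau = 0.5 is ordinary least squares.
    w = np.where(residual >= 0, tau, 1.0 - tau)
    return np.sum(w * residual ** 2)

def expectile(y, tau, iters=50):
    # The tau-expectile of a sample: the scalar m minimizing
    # expectile_loss(y - m, tau), found by iterative reweighting.
    m = y.mean()
    for _ in range(iters):
        w = np.where(y >= m, tau, 1.0 - tau)
        m = np.sum(w * y) / np.sum(w)
    return m

# Skewed toy sample, e.g. response times with one heavy-tailed outlier.
y = np.array([0.0, 1.0, 2.0, 10.0])
```

With tau = 0.5 the expectile equals the sample mean; pushing tau toward 1 moves the estimate toward the upper tail, which is the behavior the paper exploits to model skewed data such as web service response times.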