Scalable sparse covariance estimation via self-concordance
We consider the class of convex minimization problems composed of a
self-concordant function, such as the $\log\det$ metric, a convex data
fidelity term, and a regularizing -- possibly non-smooth -- function.
This type of problem has recently attracted a great deal of interest,
mainly due to its omnipresence in top-notch applications. Under
this \emph{locally} Lipschitz continuous gradient setting, we analyze the
convergence behavior of proximal Newton schemes with the added twist of a
probable presence of inexact evaluations. We prove attractive convergence rate
guarantees and enhance state-of-the-art optimization schemes to accommodate
such developments. Experimental results on sparse covariance estimation show
the merits of our algorithm, both in terms of recovery efficiency and
complexity.
Comment: 7 pages, 1 figure, Accepted at AAAI-1
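The composite objective described above (a smooth data-fidelity term plus a possibly non-smooth regularizer) can be illustrated with a plain proximal-gradient sketch on a toy $\ell_1$ problem. This is a simpler first-order relative of the proximal Newton schemes analyzed in the paper, not the authors' method; the data, step size, and parameter values below are illustrative assumptions.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1: shrink each entry toward zero by t.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def prox_gradient_lasso(A, b, lam, step, n_iter=500):
    """Minimize 0.5*||Ax - b||^2 + lam*||x||_1 by proximal gradient (ISTA)."""
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)                    # gradient of the smooth part
        x = soft_threshold(x - step * grad, step * lam)  # prox step on the non-smooth part
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((50, 20))
x_true = np.zeros(20)
x_true[:3] = [1.0, -2.0, 1.5]                       # sparse ground truth
b = A @ x_true
step = 1.0 / np.linalg.norm(A, 2) ** 2              # 1/L ensures convergence
x_hat = prox_gradient_lasso(A, b, lam=0.1, step=step)
print(np.round(x_hat[:5], 2))
```

The same splitting (smooth gradient or Newton step, followed by a proximal step on the regularizer) is the skeleton of the proximal Newton schemes the abstract refers to.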
Regularized Principal Component Analysis for Spatial Data
In many atmospheric and earth sciences, it is of interest to identify
dominant spatial patterns of variation based on data observed at a set of
locations and time points, with the possibility that the locations outnumber
the time points. While principal component analysis (PCA) is commonly applied
to find the dominant patterns, the eigenimages produced from PCA may exhibit
patterns that are too noisy to be physically meaningful when the number of
locations is large relative to the number of time points. To obtain more precise
estimates of eigenimages, we propose a regularization approach incorporating
smoothness and sparseness of eigenimages, while accounting for their
orthogonality. Our method allows data taken at irregularly spaced or sparse
locations. In addition, the resulting optimization problem can be solved using
the alternating direction method of multipliers, which is easy to implement,
and applicable to a large spatial dataset. Furthermore, the estimated
eigenfunctions provide a natural basis for representing the underlying spatial
process in a spatial random-effects model, from which spatial covariance
function estimation and spatial prediction can be efficiently performed using a
regularized fixed-rank kriging method. Finally, the effectiveness of the
proposed method is demonstrated by several numerical examples.
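The alternating direction method of multipliers (ADMM) mentioned above splits an objective into subproblems with simple closed-form updates. A minimal sketch on a generic $\ell_1$-regularized least-squares problem (this is textbook ADMM, not the authors' eigenimage formulation; all names and parameter values are illustrative):

```python
import numpy as np

def admm_lasso(A, b, lam, rho=1.0, n_iter=200):
    """ADMM for 0.5*||Ax - b||^2 + lam*||z||_1 subject to x = z."""
    n = A.shape[1]
    x, z, u = np.zeros(n), np.zeros(n), np.zeros(n)
    AtA_rhoI = A.T @ A + rho * np.eye(n)   # system matrix for the x-update
    Atb = A.T @ b
    for _ in range(n_iter):
        # x-update: smooth quadratic subproblem, a linear solve
        x = np.linalg.solve(AtA_rhoI, Atb + rho * (z - u))
        # z-update: non-smooth subproblem, a closed-form soft-threshold
        z = np.sign(x + u) * np.maximum(np.abs(x + u) - lam / rho, 0.0)
        # dual update keeps x and z consistent
        u = u + x - z
    return z

rng = np.random.default_rng(0)
A = rng.standard_normal((50, 20))
x_true = np.zeros(20)
x_true[:3] = [1.0, -2.0, 1.5]
b = A @ x_true
z_hat = admm_lasso(A, b, lam=0.1)
print(np.round(z_hat[:5], 2))
```

The appeal noted in the abstract is visible even here: each update is either a linear solve or an elementwise operation, which is why the approach scales to large spatial datasets.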
Sparse Inverse Covariance Selection via Alternating Linearization Methods
Gaussian graphical models are of great interest in statistical learning.
Because the conditional independencies between different nodes correspond to
zero entries in the inverse covariance matrix of the Gaussian distribution, one
can learn the structure of the graph by estimating a sparse inverse covariance
matrix from sample data, by solving a convex maximum likelihood problem with an
$\ell_1$-regularization term. In this paper, we propose a first-order method
based on an alternating linearization technique that exploits the problem's
special structure; in particular, the subproblems solved in each iteration have
closed-form solutions. Moreover, our algorithm obtains an $\epsilon$-optimal
solution in $O(1/\epsilon)$ iterations. Numerical experiments on both synthetic
and real data from gene association networks show that a practical version of
this algorithm outperforms other competitive algorithms.
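The link between zero entries of the inverse covariance and conditional independence can be checked numerically: sampling from a Gaussian whose precision matrix encodes a chain graph, the naive inverse sample covariance hovers near zero exactly at the conditionally independent pairs, which an $\ell_1$ penalty would push to exact zeros. A minimal sketch (matrix, sample size, and seed are illustrative assumptions):

```python
import numpy as np

# Chain-graph precision matrix: variable i is conditionally independent of j
# given the rest whenever |i - j| > 1, so those entries are exactly zero.
theta = np.array([[ 2., -1.,  0.,  0.],
                  [-1.,  2., -1.,  0.],
                  [ 0., -1.,  2., -1.],
                  [ 0.,  0., -1.,  2.]])
sigma = np.linalg.inv(theta)              # corresponding covariance

rng = np.random.default_rng(0)
x = rng.multivariate_normal(np.zeros(4), sigma, size=5000)
s = x.T @ x / len(x)                      # sample covariance (known zero mean)
theta_hat = np.linalg.inv(s)              # naive, unregularized precision estimate

# Entries at conditionally independent pairs, e.g. (0, 2) and (0, 3),
# are small but not exactly zero; the l1-penalized maximum likelihood
# estimator described in the abstract zeroes them out.
print(np.round(theta_hat, 1))
```

With few samples this naive inverse becomes unusable (or undefined), which is precisely the regime where the sparse penalized formulation is needed.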