540,019 research outputs found
Convex relaxation of mixture regression with efficient algorithms
We develop a convex relaxation of maximum a posteriori estimation of a mixture of regression models. Although our relaxation involves a semidefinite matrix variable, we reformulate the problem to eliminate the need for general semidefinite programming. In particular, we provide two reformulations that admit fast algorithms. The first is a max-min spectral reformulation exploiting quasi-Newton descent. The second is a min-min reformulation consisting of fast alternating steps of closed-form updates. We evaluate the methods against Expectation-Maximization on a real motion-segmentation problem from video data.
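The abstract does not spell out the relaxation itself, but the Expectation-Maximization baseline it compares against is standard. Below is a minimal, illustrative Python sketch of EM for a mixture of linear regressions; the function name and defaults are our own, and the small ridge term added for numerical stability is an assumption, not part of the paper.

```python
import numpy as np

def em_mixture_regression(X, y, K=2, n_iter=100, seed=0):
    """EM for a K-component mixture of linear regressions (illustrative sketch)."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    W = rng.dirichlet(np.ones(K), size=n)       # random initial responsibilities
    betas = np.zeros((K, d))
    sigma2 = np.full(K, float(y.var()) + 1e-8)  # per-component noise variances
    mix = np.full(K, 1.0 / K)                   # mixing weights
    for _ in range(n_iter):
        # M-step: weighted least squares per component, then variance/weight updates
        for k in range(K):
            w = W[:, k]
            Xw = X * w[:, None]
            betas[k] = np.linalg.solve(Xw.T @ X + 1e-8 * np.eye(d), Xw.T @ y)
            resid = y - X @ betas[k]
            sigma2[k] = max((w * resid ** 2).sum() / w.sum(), 1e-8)
            mix[k] = w.mean()
        # E-step: posterior responsibilities under Gaussian residual noise
        log_post = np.stack(
            [np.log(mix[k])
             - 0.5 * np.log(2 * np.pi * sigma2[k])
             - 0.5 * (y - X @ betas[k]) ** 2 / sigma2[k]
             for k in range(K)], axis=1)
        log_post -= log_post.max(axis=1, keepdims=True)  # stabilize before exp
        W = np.exp(log_post)
        W /= W.sum(axis=1, keepdims=True)
    return betas, mix, W
```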
Sketch-based Randomized Algorithms for Dynamic Graph Regression
A well-known problem in data science and machine learning is {\em linear regression}, which has recently been extended to dynamic graphs. Existing exact algorithms for updating the solution of the dynamic graph regression problem require at least linear time in $n$, the size of the graph, which might be intractable in practice. In the current paper, we utilize the {\em subsampled randomized Hadamard transform} and \textsf{CountSketch} to propose the first randomized algorithms for this problem. Suppose that we are given an $n \times m$ matrix embedding $M$ of the graph, where $m \le n$, and let $r$ be the number of samples required for a guaranteed approximation error, which is a sublinear function of $n$. Our first algorithm reduces the time complexity of pre-processing; then, after an edge insertion or an edge deletion, it updates the approximate solution in sublinear time. Our second algorithm reduces the time complexity of pre-processing to a bound governed by $\mathrm{nnz}(M)$, the number of nonzero elements of $M$; then, after an edge insertion, an edge deletion, a node insertion, or a node deletion, it likewise updates the approximate solution in sublinear time. Finally, we show that under some assumptions there is a regime in which our first algorithm outperforms our second, and a complementary regime in which the second outperforms the first.
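The abstract names \textsf{CountSketch} as one of its two sketching tools. As a rough illustration of how a CountSketch compresses a regression problem in $O(\mathrm{nnz}(M))$ time, here is a minimal sketch-and-solve example in Python; the function name, matrix sizes, and sketch dimension are hypothetical choices, and this shows the generic technique, not the paper's specific update algorithms or bounds.

```python
import numpy as np

def countsketch(A, b, r, seed=0):
    """Apply a CountSketch matrix S with r rows to A and b in O(nnz(A)) time."""
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    rows = rng.integers(0, r, size=n)        # hash each input row to a sketch row
    signs = rng.choice([-1.0, 1.0], size=n)  # random sign per input row
    SA = np.zeros((r, A.shape[1]))
    Sb = np.zeros(r)
    for i in range(n):
        SA[rows[i]] += signs[i] * A[i]
        Sb[rows[i]] += signs[i] * b[i]
    return SA, Sb

# Sketch-and-solve: fit least squares on the compressed system.
rng = np.random.default_rng(1)
n, m = 10_000, 20
M = rng.standard_normal((n, m))              # stand-in for a graph embedding
y = M @ rng.standard_normal(m) + 0.1 * rng.standard_normal(n)
SM, Sy = countsketch(M, y, r=500)
beta_approx, *_ = np.linalg.lstsq(SM, Sy, rcond=None)
beta_exact, *_ = np.linalg.lstsq(M, y, rcond=None)
print(np.linalg.norm(beta_approx - beta_exact))  # small approximation error
```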
Regularization Paths for Generalized Linear Models via Coordinate Descent
We develop fast algorithms for estimation of generalized linear models with convex penalties. The models include linear regression, two-class logistic regression, and multinomial regression problems, while the penalties include $\ell_1$ (the lasso), $\ell_2$ (ridge regression) and mixtures of the two (the elastic net). The algorithms use cyclical coordinate descent, computed along a regularization path. The methods can handle large problems and can also deal efficiently with sparse features. In comparative timings we find that the new algorithms are considerably faster than competing methods.
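As a rough illustration of the cyclical coordinate-descent update the abstract describes, here is a minimal elastic-net solver in Python for the squared-error case. It assumes standardized predictors (each column of X with mean 0 and variance 1), uses the usual soft-thresholding update, and is a sketch rather than the glmnet implementation; the function names and tolerances are our own.

```python
import numpy as np

def soft_threshold(z, gamma):
    """S(z, gamma) = sign(z) * max(|z| - gamma, 0)."""
    return np.sign(z) * np.maximum(np.abs(z) - gamma, 0.0)

def elastic_net_cd(X, y, lam, alpha=1.0, n_sweeps=200, tol=1e-8):
    """Cyclical coordinate descent for
    (1/2n)||y - Xb||^2 + lam * (alpha * ||b||_1 + (1 - alpha)/2 * ||b||_2^2),
    assuming columns of X are standardized (mean 0, variance 1)."""
    n, p = X.shape
    beta = np.zeros(p)
    resid = y.copy()                 # residual y - X @ beta (beta starts at 0)
    for _ in range(n_sweeps):
        max_delta = 0.0
        for j in range(p):
            # inner product with the partial residual; since (1/n)||x_j||^2 = 1,
            # folding beta[j] back in gives the univariate target for feature j
            rho = (X[:, j] @ resid) / n + beta[j]
            new_bj = soft_threshold(rho, lam * alpha) / (1.0 + lam * (1 - alpha))
            delta = new_bj - beta[j]
            if delta != 0.0:
                resid -= delta * X[:, j]   # keep the residual up to date
                beta[j] = new_bj
                max_delta = max(max_delta, abs(delta))
        if max_delta < tol:
            break
    return beta
```

A regularization path in the sense of the abstract would then be obtained by calling `elastic_net_cd` over a decreasing sequence of `lam` values, warm-starting each solve from the previous solution.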
- …
