5,342 research outputs found
Laplacian Mixture Modeling for Network Analysis and Unsupervised Learning on Graphs
Laplacian mixture models identify overlapping regions of influence in
unlabeled graph and network data in a scalable and computationally efficient
way, yielding useful low-dimensional representations. By combining Laplacian
eigenspace and finite mixture modeling methods, they provide probabilistic or
fuzzy dimensionality reductions or domain decompositions for a variety of input
data types, including mixture distributions, feature vectors, and graphs or
networks. Optimal recovery by the algorithm is proven analytically for a
nontrivial class of cluster graphs. Heuristic approximations for scalable
high-performance implementations are described and empirically tested.
Connections to PageRank and community detection in network analysis demonstrate
the wide applicability of this approach. The origins of fuzzy spectral methods,
beginning with generalized heat or diffusion equations in physics, are reviewed
and summarized. Comparisons to other dimensionality reduction and clustering
methods for challenging unsupervised machine learning problems are also
discussed.
Comment: 13 figures, 35 references
Recovery of Low-Rank Matrices under Affine Constraints via a Smoothed Rank Function
In this paper, the problem of matrix rank minimization under affine
constraints is addressed. State-of-the-art algorithms can recover matrices
only when the rank is much lower than what uniqueness of the solution of this
optimization problem would allow. We propose an algorithm based on a
smooth approximation of the rank function, which practically improves recovery
limits on the rank of the solution. This approximation leads to a non-convex
program; thus, to avoid getting trapped in local solutions, we use a
continuation scheme: initially, a coarse approximation of the rank function is
optimized subject to the affine constraints; as the algorithm proceeds,
successively finer approximations are optimized, each initialized with the
solution of the previous stage, until the desired accuracy is reached.
On the theoretical side, using the spherical section property, we show that
the sequence of solutions of the approximating problems converges to the
minimum-rank solution. On the experimental side, we show that the proposed
algorithm, termed SRF for Smoothed Rank Function, can recover matrices that
are unique solutions of the rank minimization problem yet are not recoverable
by nuclear norm minimization.
Furthermore, it is demonstrated that, in completing partially observed
matrices, the accuracy of SRF is considerably and consistently better than
that of several well-known algorithms when the number of revealed entries is
close to the minimum number of parameters that uniquely represent a low-rank
matrix.
Comment: Accepted in IEEE TSP on December 4th, 201
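One natural reading of the scheme, for the matrix completion case, is sketched below. The Gaussian surrogate rank(X) ≈ n − Σᵢ exp(−sᵢ²/(2δ²)) over singular values sᵢ, the specific δ schedule, the step size, and the projection onto observed entries are illustrative assumptions, not the paper's tuned parameters; each inner step shrinks small singular values along the surrogate's gradient (a step size of μδ² is folded in) and re-imposes the affine constraints, while δ decreases across stages.

```python
import numpy as np

def srf_complete(M_obs, mask, deltas=(1.0, 0.5, 0.25, 0.1, 0.05),
                 steps=200, mu=1.0):
    """Sketch of smoothed-rank matrix completion in the spirit of SRF.
    rank(X) is replaced by n - sum_i exp(-s_i^2 / (2*delta^2)), with s_i the
    singular values of X. delta shrinks stage by stage (graduated
    non-convexity); the schedule and step size here are illustrative."""
    X = np.where(mask, M_obs, 0.0)       # start from the zero-filled data
    for delta in deltas:
        for _ in range(steps):
            U, s, Vt = np.linalg.svd(X, full_matrices=False)
            # Gradient step on the surrogate (step size mu*delta^2 folded in):
            # singular values with s << delta are driven toward 0, while
            # s >> delta are left nearly unchanged.
            s = s * (1.0 - mu * np.exp(-s**2 / (2.0 * delta**2)))
            X = (U * s) @ Vt
            X = np.where(mask, M_obs, X)  # re-project onto observed entries
    return X
```

At large δ the surrogate is nearly quadratic in the singular values, so early stages behave like a smooth, well-conditioned problem; as δ shrinks, the shrinkage sharpens toward hard rank truncation, which is the continuation idea the abstract describes.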