Constrained low-rank representation for robust subspace clustering
Subspace clustering aims to partition data points drawn from a union of subspaces according to their underlying subspaces. For accurate semi-supervised subspace clustering, all data points that share a must-link constraint or the same label should be grouped into the same underlying subspace. However, existing approaches do not guarantee this, and they require additional parameters to incorporate supervision information. In this paper, we propose a constrained low-rank representation (CLRR) for robust semi-supervised subspace clustering, based on a novel constraint matrix constructed in this paper. While seeking the low-rank representation of the data, CLRR explicitly incorporates supervision information as hard constraints to enhance the discriminating power of the optimal representation. This strategy can be further extended to other state-of-the-art methods, such as sparse subspace clustering. We theoretically prove that the optimal representation matrix has a block-diagonal structure for clean data and a semi-supervised grouping effect for noisy data. We have also developed an efficient optimization algorithm for CLRR based on the alternating direction method of multipliers. Our experimental results demonstrate that CLRR outperforms existing methods.
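As a rough illustration of the optimization machinery the abstract refers to, the sketch below solves the *unconstrained* low-rank representation problem min ||Z||_* + λ||E||_{2,1} s.t. X = XZ + E by ADMM with an auxiliary variable J = Z. This is plain LRR, not the authors' CLRR; the paper's constraint matrix and hard supervision constraints are not reproduced here, and all parameter values (`lam`, `mu`, `rho`) are illustrative assumptions.

```python
import numpy as np

def svt(M, tau):
    """Singular value thresholding: proximal operator of tau * nuclear norm."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def l21_shrink(M, tau):
    """Column-wise shrinkage: proximal operator of tau * ||.||_{2,1}."""
    out = np.zeros_like(M)
    for j in range(M.shape[1]):
        nrm = np.linalg.norm(M[:, j])
        if nrm > tau:
            out[:, j] = (1.0 - tau / nrm) * M[:, j]
    return out

def lrr_admm(X, lam=0.1, mu=1e-2, rho=1.5, mu_max=1e6, iters=200):
    """Plain LRR via ADMM (a sketch; the paper's CLRR additionally
    enforces supervision through a constraint matrix, omitted here)."""
    n = X.shape[1]
    Z = np.zeros((n, n)); J = Z.copy(); E = np.zeros_like(X)
    Y1 = np.zeros_like(X); Y2 = np.zeros((n, n))  # Lagrange multipliers
    XtX = X.T @ X
    for _ in range(iters):
        # J-step: nuclear-norm proximal update
        J = svt(Z + Y2 / mu, 1.0 / mu)
        # Z-step: quadratic subproblem, solved in closed form
        Z = np.linalg.solve(np.eye(n) + XtX,
                            X.T @ (X - E) + J + (X.T @ Y1 - Y2) / mu)
        # E-step: column-sparse noise term
        E = l21_shrink(X - X @ Z + Y1 / mu, lam / mu)
        # Dual ascent and penalty growth
        Y1 += mu * (X - X @ Z - E)
        Y2 += mu * (Z - J)
        mu = min(rho * mu, mu_max)
    return Z, E
```

On clean data sampled from a union of independent subspaces, the learned Z is (approximately) block diagonal, and an affinity matrix |Z| + |Z|ᵀ can be fed to spectral clustering.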
Multilevel quasiseparable matrices in PDE-constrained optimization
Optimization problems with constraints in the form of a partial differential equation arise frequently in engineering design. The discretization of PDE-constrained optimization problems results in large-scale linear systems of saddle-point type. In this paper we propose and develop a novel approach to solving such systems by exploiting so-called quasiseparable matrices. One may think of a quasiseparable matrix as a discrete analog of the Green's function of a one-dimensional differential operator. A nice feature of such matrices is that almost every algorithm that employs them has linear complexity. We extend the application of quasiseparable matrices to problems in higher dimensions. Namely, we construct a class of preconditioners which can be computed and applied at linear computational cost. Their use with appropriate Krylov methods leads to algorithms of nearly linear complexity.
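To make the linear-complexity claim concrete: a tridiagonal matrix is the simplest quasiseparable matrix (order 1), and it can be solved in O(n) by the classical Thomas algorithm, versus O(n³) for a dense solve. The sketch below is an illustrative assumption-free example of this class of algorithms, not the preconditioner construction of the paper.

```python
import numpy as np

def thomas_solve(sub, diag, sup, rhs):
    """Solve a tridiagonal system A x = rhs in O(n) operations.
    sub:  n-1 subdiagonal entries, diag: n diagonal entries,
    sup:  n-1 superdiagonal entries.
    Assumes no pivoting is needed (e.g. A diagonally dominant)."""
    n = len(diag)
    c = np.empty(n - 1)   # modified superdiagonal
    d = np.empty(n)       # modified right-hand side
    # Forward sweep: eliminate the subdiagonal.
    c[0] = sup[0] / diag[0]
    d[0] = rhs[0] / diag[0]
    for i in range(1, n):
        denom = diag[i] - sub[i - 1] * c[i - 1]
        if i < n - 1:
            c[i] = sup[i] / denom
        d[i] = (rhs[i] - sub[i - 1] * d[i - 1]) / denom
    # Back substitution.
    x = np.empty(n)
    x[-1] = d[-1]
    for i in range(n - 2, -1, -1):
        x[i] = d[i] - c[i] * x[i + 1]
    return x
```

Higher-order quasiseparable matrices (off-diagonal blocks of low rank) admit solvers, multiplications, and factorizations with the same linear scaling, which is what makes them attractive as preconditioner building blocks.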