A Generalized Kernel Risk Sensitive Loss for Robust Two-Dimensional Singular Value Decomposition
Two-dimensional singular value decomposition (2DSVD) has been widely used for
image processing tasks such as image reconstruction, classification, and
clustering. However, the traditional 2DSVD algorithm is based on the mean
square error (MSE) loss, which is sensitive to outliers. To overcome this
problem, we propose a robust 2DSVD framework based on a generalized kernel
risk sensitive loss (GKRSL-2DSVD), which is more robust to noise and outliers. Since the
proposed objective function is non-convex, a majorization-minimization
algorithm is developed to solve it efficiently with guaranteed convergence. The
proposed framework inherently handles non-centered data, is rotationally
invariant, and extends easily to higher-order spaces.
Experimental results on public databases demonstrate that the proposed method
significantly outperforms all the benchmarks across different applications.
Comment: Under consideration by "Pattern Recognition"
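For context, the classical MSE-based 2DSVD that the abstract builds on can be sketched as follows. This is a minimal NumPy sketch of the standard two-sided projection (not the proposed GKRSL variant, whose loss and majorization-minimization solver are specific to the paper); the function names `twodsvd` and `reconstruct` are illustrative.

```python
import numpy as np

def twodsvd(images, k1, k2):
    """Classical (MSE-based) 2DSVD: a shared two-sided low-rank
    projection for a set of 2-D arrays (e.g. images)."""
    A = np.asarray(images, dtype=float)        # shape (n, r, c)
    Abar = A.mean(axis=0)
    C = A - Abar                               # centered images
    F = np.einsum('nij,nkj->ik', C, C)         # row-row covariance, (r, r)
    G = np.einsum('nji,njk->ik', C, C)         # column-column covariance, (c, c)
    # top-k eigenvectors of the symmetric covariances (eigh is ascending)
    U = np.linalg.eigh(F)[1][:, ::-1][:, :k1]  # (r, k1)
    V = np.linalg.eigh(G)[1][:, ::-1][:, :k2]  # (c, k2)
    M = np.einsum('ri,nrc,cj->nij', U, C, V)   # compressed cores, (n, k1, k2)
    return U, V, M, Abar

def reconstruct(U, V, M, Abar):
    """Low-rank reconstruction of each image from its core."""
    return np.einsum('ri,nij,cj->nrc', U, M, V) + Abar
```

Replacing the implicit squared-error fit above with the GKRSL loss is what makes the proposed variant robust to outliers; the non-convexity that replacement introduces is why the paper needs a majorization-minimization solver.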
Decomposition into Low-rank plus Additive Matrices for Background/Foreground Separation: A Review for a Comparative Evaluation with a Large-Scale Dataset
Recent research shows that problem formulations based on decomposition into
low-rank plus sparse matrices provide a suitable framework for separating
moving objects from the background. The most representative problem
formulation is Robust Principal Component Analysis (RPCA) solved via Principal
Component Pursuit (PCP), which decomposes a data matrix into a low-rank matrix
and a sparse matrix.
However, similar robust implicit or explicit decompositions can be made in the
following problem formulations: Robust Non-negative Matrix Factorization
(RNMF), Robust Matrix Completion (RMC), Robust Subspace Recovery (RSR), Robust
Subspace Tracking (RST), and Robust Low-Rank Minimization (RLRM). The main goal
of these similar problem formulations is to obtain, explicitly or implicitly, a
decomposition into a low-rank matrix plus additive matrices. In this context,
this work aims to initiate a rigorous and comprehensive review of the similar
problem formulations in robust subspace learning and tracking based on
decomposition into low-rank plus additive matrices for testing and ranking
existing algorithms for background/foreground separation. For this, we first
provide a preliminary review of the recent developments in the different
problem formulations, which allows us to define a unified view that we call
Decomposition into Low-rank plus Additive Matrices (DLAM). Then, we carefully
examine each method in each robust subspace learning/tracking framework, with
its decomposition, loss function, optimization problem, and solvers.
Furthermore, we investigate whether incremental algorithms and real-time
implementations can be achieved for background/foreground separation. Finally,
experimental results on a large-scale dataset called Background Models
Challenge (BMC 2012) show the comparative performance of 32 different robust
subspace learning/tracking methods.
Comment: 121 pages, 5 figures, submitted to Computer Science Review. arXiv
admin note: text overlap with arXiv:1312.7167, arXiv:1109.6297,
arXiv:1207.3438, arXiv:1105.2126, arXiv:1404.7592, arXiv:1210.0805,
arXiv:1403.8067 by other authors. Computer Science Review, November 201
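The RPCA/PCP decomposition at the heart of this review can be sketched compactly. The following is a minimal NumPy sketch of the widely used inexact augmented Lagrangian (ADMM-style) scheme for PCP, solving min ||L||_* + lam*||S||_1 subject to M = L + S; the function names and the `mu` heuristic are illustrative choices, not the reviewed implementations.

```python
import numpy as np

def shrink(X, tau):
    """Soft-thresholding: proximal operator of the l1 norm."""
    return np.sign(X) * np.maximum(np.abs(X) - tau, 0.0)

def svt(X, tau):
    """Singular value thresholding: proximal operator of the nuclear norm."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return (U * shrink(s, tau)) @ Vt

def rpca_pcp(M, mu=None, tol=1e-7, max_iter=500):
    """PCP via an augmented Lagrangian iteration:
    alternate a nuclear-norm prox (low-rank L), an l1 prox (sparse S),
    and a dual ascent step on the residual M - L - S."""
    m, n = M.shape
    lam = 1.0 / np.sqrt(max(m, n))             # standard PCP weight
    if mu is None:
        mu = 0.25 * m * n / np.sum(np.abs(M))  # common heuristic (assumption)
    L = np.zeros_like(M)
    S = np.zeros_like(M)
    Y = np.zeros_like(M)                       # dual variable
    for _ in range(max_iter):
        L = svt(M - S + Y / mu, 1.0 / mu)
        S = shrink(M - L + Y / mu, lam / mu)
        R = M - L - S
        Y = Y + mu * R
        if np.linalg.norm(R) <= tol * np.linalg.norm(M):
            break
    return L, S
```

In the background/foreground setting each video frame is vectorized into a column of M, so L recovers the (low-rank) background and S the (sparse) moving objects; the RNMF, RMC, RSR, RST, and RLRM formulations surveyed in the review swap in different constraints and loss functions around the same decomposition idea.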