Learning distance to subspace for the nearest subspace methods in high-dimensional data classification
The nearest subspace methods (NSM) are a category of classification methods widely applied to high-dimensional data. In this paper, we propose to improve the classification performance of NSM by learning tailored distance metrics from samples to class subspaces. The learned distance metric is termed 'learned distance to subspace' (LD2S). Using LD2S in the classification rule of NSM makes samples closer to their correct class subspaces and farther away from the wrong class subspaces. In this way, the classification task becomes easier and the classification performance of NSM can be improved. The superior classification performance of using LD2S for NSM is demonstrated on three real-world high-dimensional spectral datasets.
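As background for the abstract above, the baseline NSM rule assigns a sample to the class whose subspace yields the smallest projection residual. The sketch below shows this standard (unlearned) distance-to-subspace classifier; the function names and the truncated-SVD basis construction are illustrative assumptions, not the paper's LD2S method, which replaces this plain Euclidean residual with a learned metric.

```python
import numpy as np

def class_basis(samples, k):
    """Orthonormal basis (d x k) for one class's subspace, via truncated SVD.

    samples: d x n matrix whose columns are training vectors of one class.
    (Illustrative choice; any orthonormal basis of the class subspace works.)
    """
    U, _, _ = np.linalg.svd(samples, full_matrices=False)
    return U[:, :k]

def nearest_subspace_classify(x, class_bases):
    """Assign x to the class whose subspace has the smallest projection residual."""
    best_label, best_dist = None, np.inf
    for label, U in class_bases.items():
        residual = x - U @ (U.T @ x)      # component of x outside the class subspace
        dist = np.linalg.norm(residual)   # standard Euclidean distance to subspace
        if dist < best_dist:
            best_label, best_dist = label, dist
    return best_label
```

A sample lying exactly in a class's subspace has zero residual for that class, so it is always assigned correctly by this rule; LD2S aims to enlarge this margin for noisy samples as well.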
A Novel Separating Hyperplane Classification Framework to Unify Nearest-class-model Methods for High-dimensional Data
In this paper, we establish a novel separating hyperplane classification (SHC) framework to unify three nearest-class-model methods for high-dimensional data: the nearest subspace method (NSM), the nearest convex hull method (NCHM) and the nearest convex cone method (NCCM). Nearest-class-model methods are an important paradigm for classification of high-dimensional data. We first introduce the three nearest-class-model methods and then conduct a dual analysis to investigate them theoretically and understand their underlying classification mechanisms in depth. A new theorem for the dual analysis of NCCM is proposed in this paper, based on the relationship between a convex cone and its polar cone. We then establish the new SHC framework to unify the nearest-class-model methods based on these theoretical results. One important application of this new SHC framework is to help explain empirical classification results: why one class model performs better than others on certain datasets. Finally, we propose a new nearest-class-model method, the soft NCCM, under the novel SHC framework to solve the overlapping class model problem. For illustrative purposes, we empirically demonstrate the significance of our SHC framework and the soft NCCM on two types of typical real-world high-dimensional data: spectroscopic data and face image data.
PETRELS: Parallel Subspace Estimation and Tracking by Recursive Least Squares from Partial Observations
Many real-world data sets exhibit an embedding of low-dimensional structure in a high-dimensional manifold. Examples include images, videos and internet traffic data. It is of great significance to reduce the storage requirements and computational complexity when the data dimension is high. We therefore consider the problem of reconstructing a data stream from a small subset of its entries, where the data is assumed to lie in a low-dimensional linear subspace, possibly corrupted by noise. We further consider tracking the change of the underlying subspace, which can be applied to applications such as video denoising, network monitoring and anomaly detection. Our problem can be viewed as a sequential low-rank matrix completion problem in which the subspace is learned in an online fashion. The proposed algorithm, dubbed Parallel Estimation and Tracking by REcursive Least Squares (PETRELS), first identifies the underlying low-dimensional subspace via a recursive procedure for each row of the subspace matrix in parallel, with discounting for previous observations, and then reconstructs the missing entries via least-squares estimation if required. Numerical examples are provided for direction-of-arrival estimation and matrix completion, comparing PETRELS with state-of-the-art batch algorithms.
Comment: Submitted to IEEE Trans. Signal Processing. Part of the result was reported at ICASSP 2012 and won the best student paper award.
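The PETRELS abstract describes a two-step loop per observation: estimate the coefficients of the partial sample in the current subspace by least squares, then update each row of the subspace matrix with a discounted recursive-least-squares step. A minimal sketch of that structure follows; the class name, initialization constants, and forgetting factor are illustrative assumptions, and the paper's version uses a rank-one inverse update rather than the direct solve shown here.

```python
import numpy as np

class PetrelsSketch:
    """Minimal PETRELS-style subspace tracker from partial observations (illustrative)."""

    def __init__(self, dim, rank, forget=0.98, seed=0):
        rng = np.random.default_rng(seed)
        self.U = rng.standard_normal((dim, rank))   # subspace estimate, updated row-wise
        self.forget = forget                        # discounting factor for past data
        # one RLS statistic matrix per row of U (solved directly here for clarity)
        self.R = np.stack([np.eye(rank) * 1e-3 for _ in range(dim)])

    def step(self, x, mask):
        """x: full-length vector; mask: boolean array marking observed entries."""
        Uo = self.U[mask]                           # rows of U at observed coordinates
        # least-squares coefficients fit only on the observed entries
        a, *_ = np.linalg.lstsq(Uo, x[mask], rcond=None)
        self.R *= self.forget                       # discount every row's statistics
        for m in np.flatnonzero(mask):              # parallelizable per-row RLS update
            self.R[m] += np.outer(a, a)
            err = x[m] - self.U[m] @ a
            self.U[m] += err * np.linalg.solve(self.R[m], a)
        return self.U @ a                           # reconstruction incl. missing entries
```

Because each row of the subspace matrix has its own statistics and is updated independently, the per-row loop can run in parallel, which is the "Parallel" in the algorithm's name.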