Classification via Incoherent Subspaces
This article presents a new classification framework that can extract
individual features per class. The scheme is based on a model of incoherent
subspaces, each associated with one class, and a model of how the elements of
a class are represented in that subspace. After the theoretical analysis, an
alternating projection algorithm to find such a collection is developed. The
classification performance and speed of the proposed method are tested on the AR
and YaleB databases and compared to those of Fisher's LDA and a recent approach
based on ℓ1 minimisation. Finally, connections of the presented scheme
to existing work are discussed and possible extensions are pointed out.
Comment: 22 pages, 2 figures, 4 tables
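The abstract's core decision rule — represent each class by a subspace and assign a sample to the class whose subspace captures it best — can be sketched as follows. This is a minimal illustration with synthetic data: the per-class bases here are plain principal directions (an assumption), and the paper's alternating projection step that enforces incoherence between the subspaces is omitted.

```python
import numpy as np

def class_subspaces(X, y, dim=2):
    """Fit an orthonormal basis (d x dim) per class from the centred class data."""
    bases = {}
    for c in np.unique(y):
        Xc = X[y == c] - X[y == c].mean(axis=0)
        # Left singular vectors of the centred class data span the subspace.
        U, _, _ = np.linalg.svd(Xc.T, full_matrices=False)
        bases[c] = U[:, :dim]
    return bases

def classify(x, bases):
    """Assign x to the class whose subspace projection has the largest norm."""
    return max(bases, key=lambda c: np.linalg.norm(bases[c].T @ x))

rng = np.random.default_rng(0)
# Toy data: two classes lying exactly in different 2-D subspaces of R^5,
# so the fitted bases recover the true spans and classification is easy.
A, B = rng.normal(size=(5, 2)), rng.normal(size=(5, 2))
X = np.vstack([rng.normal(size=(100, 2)) @ A.T,
               rng.normal(size=(100, 2)) @ B.T])
y = np.array([0] * 100 + [1] * 100)

bases = class_subspaces(X, y)
preds = np.array([classify(x, bases) for x in X])
acc = (preds == y).mean()
```

On this idealised data the projection onto the correct subspace preserves the full norm of the sample, so the rule separates the classes cleanly; real data would only lie near its class subspace.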
Dimension Reduction by Mutual Information Discriminant Analysis
In the past few decades, researchers have proposed many discriminant analysis
(DA) algorithms for the study of high-dimensional data in a variety of
problems. Most DA algorithms for feature extraction are based on
transformations that simultaneously maximize the between-class scatter and
minimize the within-class scatter. This paper presents a novel DA
algorithm for feature extraction using mutual information (MI). Since it is
not always easy to obtain an accurate estimate of high-dimensional MI, we
propose an efficient method for feature extraction that is based on
one-dimensional MI estimates. We refer to this algorithm as mutual
information discriminant analysis (MIDA). The performance of the proposed
method was evaluated using UCI databases. The results indicate that MIDA
provides robust performance over data sets with different characteristics
and that MIDA always performs better than, or at least comparably to, the
best-performing algorithms.
Comment: 13 pages, 3 tables, International Journal of Artificial Intelligence &
Applications
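The key trick the abstract describes — sidestepping high-dimensional MI estimation by working with one-dimensional estimates — can be illustrated with a histogram estimator of I(feature; class). The sketch below only scores individual features this way; the projection-learning part of MIDA is not reproduced, and the data are synthetic.

```python
import numpy as np

def mi_1d(x, y, bins=10):
    """Histogram estimate of the mutual information I(X;Y)
    between a scalar feature x and discrete class labels y."""
    joint, _, _ = np.histogram2d(x, y, bins=(bins, len(np.unique(y))))
    p = joint / joint.sum()
    px = p.sum(axis=1, keepdims=True)   # marginal of the feature
    py = p.sum(axis=0, keepdims=True)   # marginal of the label
    nz = p > 0                          # avoid log(0) on empty cells
    return float((p[nz] * np.log(p[nz] / (px @ py)[nz])).sum())

rng = np.random.default_rng(1)
y = rng.integers(0, 2, size=500)
informative = y + 0.3 * rng.normal(size=500)   # strongly class-dependent
noise = rng.normal(size=500)                   # independent of the class

mi_inf = mi_1d(informative, y)
mi_noise = mi_1d(noise, y)
```

The one-dimensional estimate cleanly ranks the class-dependent feature above the independent one, which is all the scoring step needs; note that histogram MI estimates carry a small positive bias on finite samples, so the noise score will not be exactly zero.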
Global and Local Two-Sample Tests via Regression
Two-sample testing is a fundamental problem in statistics. Despite its long
history, there has been renewed interest in this problem with the advent of
high-dimensional and complex data. Specifically, in the machine learning
literature, there have been recent methodological developments such as
classification accuracy tests. The goal of this work is to present a regression
approach to comparing multivariate distributions of complex data. Depending on
the chosen regression model, our framework can efficiently handle different
types of variables and various structures in the data, with competitive power
under many practical scenarios. Whereas previous work has been largely limited
to global tests which conceal much of the local information, our approach
naturally leads to a local two-sample testing framework in which we identify
local differences between multivariate distributions with statistical
confidence. We demonstrate the efficacy of our approach both theoretically and
empirically, under some well-known parametric and nonparametric regression
methods. Our proposed methods are applied to simulated data as well as a
challenging astronomy data set to assess their practical usefulness.
Joint cross-domain classification and subspace learning for unsupervised adaptation
Domain adaptation aims at adapting the knowledge acquired on a source domain
to a new, different but related target domain. Several approaches have
been proposed for classification tasks in the unsupervised scenario, where no
labeled target data are available. Most of the attention has been devoted to
finding a new domain-invariant representation, leaving the definition of the
prediction function to a second stage. Here we propose to learn both jointly.
Specifically we learn the source subspace that best matches the target subspace
while at the same time minimizing a regularized misclassification loss. We
provide an alternating optimization technique based on stochastic sub-gradient
descent to solve the learning problem and we demonstrate its performance on
several domain adaptation tasks.
Comment: Paper is under consideration at Pattern Recognition Letters
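The two ingredients the abstract combines — matching the source subspace to the target one and minimising a regularised misclassification loss by stochastic sub-gradient descent — can be sketched as below. For simplicity this stand-in stages them (a subspace-alignment step, then hinge-loss SGD) rather than optimising both jointly as the paper does; all data and parameter choices are synthetic assumptions.

```python
import numpy as np

def pca_basis(X, d):
    """Top-d principal directions (columns) of the centred data."""
    U, _, _ = np.linalg.svd((X - X.mean(0)).T, full_matrices=False)
    return U[:, :d]

rng = np.random.default_rng(3)
# Source: labels +/-1 separated along the first coordinate.
ys = rng.choice([-1, 1], size=200)
Xs = 0.5 * rng.normal(size=(200, 5)); Xs[:, 0] += 1.5 * ys
# Target: same class structure plus a constant domain shift (labels
# yt are used only for evaluation, never for training).
yt = rng.choice([-1, 1], size=200)
Xt = 0.5 * rng.normal(size=(200, 5)); Xt[:, 0] += 1.5 * yt; Xt += 0.5

Bs, Bt = pca_basis(Xs, 3), pca_basis(Xt, 3)
Zs = Xs @ Bs @ (Bs.T @ Bt)      # source data aligned to the target basis
Zt = Xt @ Bt                    # target data in its own basis

w, lam = np.zeros(3), 1e-3
for t in range(500):            # stochastic sub-gradient steps on the hinge loss
    i = rng.integers(len(Zs))
    g = lam * w - (ys[i] * Zs[i] if ys[i] * (Zs[i] @ w) < 1 else 0)
    w -= 0.1 / np.sqrt(t + 1) * g

source_acc = (np.sign(Zs @ w) == ys).mean()
target_acc = (np.sign(Zt @ w) == yt).mean()
```

Because the classifier is trained in the aligned coordinates, it transfers to the shifted target domain far better than chance, which is the behaviour the joint formulation is designed to strengthen.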