Discriminative Features via Generalized Eigenvectors
Representing examples in a way that is compatible with the underlying
classifier can greatly enhance the performance of a learning system. In this
paper we investigate scalable techniques for inducing discriminative features
by taking advantage of simple second order structure in the data. We focus on
multiclass classification and show that features extracted from the generalized
eigenvectors of the class conditional second moments lead to classifiers with
excellent empirical performance. Moreover, these features have attractive
theoretical properties, such as inducing representations that are invariant to
linear transformations of the input. We evaluate classifiers built from these
features on three different tasks, obtaining state-of-the-art results.
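The feature construction described above can be sketched concretely: for each ordered pair of classes, the top generalized eigenvector of the two class-conditional second-moment matrices gives a direction along which the classes' second-order structure differs most, and projecting onto it yields a discriminative feature. A minimal sketch with synthetic data (the toy class covariances, ridge regularizer, and all-pairs scheme are illustrative assumptions, not the paper's exact recipe):

```python
import numpy as np
from scipy.linalg import eigh

rng = np.random.default_rng(0)

# Toy 3-class data: each class has a different second-moment structure.
def sample_class(cov, n=200):
    return rng.multivariate_normal(np.zeros(3), cov, size=n)

X = {0: sample_class(np.diag([4.0, 1.0, 1.0])),
     1: sample_class(np.diag([1.0, 4.0, 1.0])),
     2: sample_class(np.diag([1.0, 1.0, 4.0]))}

def second_moment(Z, reg=1e-3):
    # Class-conditional (uncentered) second moment with a small ridge term.
    return Z.T @ Z / len(Z) + reg * np.eye(Z.shape[1])

# For each ordered class pair (i, j), the top generalized eigenvector v of
#     C_i v = lambda * C_j v
# maximizes the ratio of second moments between the two classes; v^T x then
# serves as a discriminative feature.
features = []
for i in X:
    for j in X:
        if i == j:
            continue
        w, V = eigh(second_moment(X[i]), second_moment(X[j]))
        features.append(V[:, -1])  # eigenvector with the largest eigenvalue

F = np.stack(features)  # one projection direction per ordered class pair
print(F.shape)          # (6, 3)
```

For the first pair (class 0 vs. class 1), the recovered direction concentrates on the first coordinate, where class 0 has four times the variance of class 1, matching the intuition that these eigenvectors pick out directions of maximal second-order contrast.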
KCRC-LCD: Discriminative Kernel Collaborative Representation with Locality Constrained Dictionary for Visual Categorization
We consider the image classification problem via kernel collaborative
representation classification with locality constrained dictionary (KCRC-LCD).
Specifically, we propose a kernel collaborative representation classification
(KCRC) approach in which the kernel method is used to improve the discrimination
ability of collaborative representation classification (CRC). We then measure
the similarities between the query and atoms in the global dictionary in order
to construct a locality constrained dictionary (LCD) for KCRC. In addition, we
discuss several similarity measure approaches in LCD and further present a
simple yet effective unified similarity measure whose superiority is validated
in experiments. There are several appealing aspects associated with LCD. First,
LCD can be nicely incorporated under the framework of KCRC. The LCD similarity
measure can be kernelized under KCRC, which theoretically links CRC and LCD
under the kernel method. Second, KCRC-LCD scales well in both training set
size and feature dimension. A worked example shows that KCRC can perfectly
classify data with certain distributions on which conventional CRC fails
completely. Comprehensive experiments on many public datasets also show that
KCRC-LCD is a robust discriminative classifier with both excellent performance
and good scalability, comparable to or outperforming many other
state-of-the-art approaches.
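The collaborative-representation idea behind KCRC can be sketched in a few lines: code the query jointly over all dictionary atoms with ridge regularization, then assign the class whose atoms reconstruct it best, with every inner product replaced by a kernel evaluation. A minimal sketch (RBF kernel, toy two-ring data where a linear coding would struggle; the locality-constrained dictionary of LCD is omitted, and all function names and parameters are illustrative assumptions):

```python
import numpy as np

def rbf(A, B, gamma=1.0):
    # RBF kernel matrix between the rows of A and the rows of B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def kcrc_predict(D, labels, q, lam=1e-2, gamma=1.0):
    """Kernel collaborative representation classification (sketch).

    Codes the query over all dictionary atoms jointly in the kernel-induced
    feature space, then assigns the class whose atoms best reconstruct it.
    """
    K = rbf(D, D, gamma)                   # atom-atom Gram matrix
    k = rbf(D, q[None, :], gamma).ravel()  # atom-query kernel vector
    alpha = np.linalg.solve(K + lam * np.eye(len(D)), k)
    best, best_res = None, np.inf
    for c in np.unique(labels):
        a = np.where(labels == c, alpha, 0.0)  # keep class-c coefficients only
        # Feature-space residual expanded via the kernel trick:
        #   ||phi(q) - Phi a||^2 = k(q, q) - 2 a^T k + a^T K a,  k(q, q) = 1 for RBF
        res = 1.0 - 2 * a @ k + a @ K @ a
        if res < best_res:
            best, best_res = c, res
    return best

# Toy data: class 0 clustered at the origin, class 1 on a ring of radius 3.
rng = np.random.default_rng(1)
inner = 0.3 * rng.standard_normal((20, 2))
theta = rng.uniform(0, 2 * np.pi, 20)
outer = np.c_[3 * np.cos(theta), 3 * np.sin(theta)] + 0.1 * rng.standard_normal((20, 2))
D = np.vstack([inner, outer])
labels = np.array([0] * 20 + [1] * 20)

print(kcrc_predict(D, labels, np.array([0.1, -0.2])))  # query near the origin
print(kcrc_predict(D, labels, outer[0]))               # query on the ring
```

A purely linear coding cannot separate these concentric classes, while the kernelized coding does, which mirrors the abstract's claim that KCRC handles distributions on which conventional CRC fails.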
Non-Gaussian Discriminative Factor Models via the Max-Margin Rank-Likelihood
We consider the problem of discriminative factor analysis for data that are
in general non-Gaussian. A Bayesian model based on the ranks of the data is
proposed. We first introduce a new {\em max-margin} version of the
rank-likelihood. A discriminative factor model is then developed, integrating
the max-margin rank-likelihood and (linear) Bayesian support vector machines,
which are also built on the max-margin principle. The discriminative factor
model is further extended to the {\em nonlinear} case through mixtures of local
linear classifiers, via Dirichlet processes. Fully local conjugacy of the model
yields efficient inference with both Markov Chain Monte Carlo and variational
Bayes approaches. Extensive experiments on benchmark and real data demonstrate
superior performance of the proposed model and its potential for applications
in computational biology. Comment: 14 pages, 7 figures, ICML 201
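The rank-likelihood underlying this model uses only the ordering of the observations, so the marginal distribution of the data can be arbitrarily non-Gaussian. A minimal sketch of the classical normal-scores (rank) transform that motivates this construction (it illustrates the rank idea only, not the paper's max-margin version or its Bayesian inference):

```python
import numpy as np
from scipy.stats import norm, rankdata

def normal_scores(x):
    """Map a 1-D sample to latent Gaussian scores via its ranks.

    Only the ordering of x is used, so the transform is invariant to any
    monotone change of the marginal distribution -- the same invariance
    the rank-likelihood exploits for non-Gaussian data.
    """
    r = rankdata(x)  # ranks 1..n (ties averaged)
    return norm.ppf(r / (len(x) + 1))

rng = np.random.default_rng(0)
heavy_tailed = rng.standard_cauchy(1000)  # far from Gaussian
z = normal_scores(heavy_tailed)
# The latent scores have approximately standard-normal moments.
print(round(float(z.mean()), 2), round(float(z.std()), 2))
```

Because any strictly increasing transformation of the data leaves the ranks, and hence the scores, unchanged, models built on these latent variables need no assumption about the data's marginal distribution.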