Semi-supervised cross-entropy clustering with information bottleneck constraint
In this paper, we propose a semi-supervised clustering method, CEC-IB, that
models data with a set of Gaussian distributions and that retrieves clusters
based on a partial labeling provided by the user (partition-level side
information). By combining the ideas from cross-entropy clustering (CEC) with
those from the information bottleneck method (IB), our method trades off between
three conflicting goals: the accuracy with which the data set is modeled, the
simplicity of the model, and the consistency of the clustering with side
information. Experiments demonstrate that CEC-IB has a performance comparable
to Gaussian mixture models (GMM) in a classical semi-supervised scenario, but
is faster, more robust to noisy labels, automatically determines the optimal
number of clusters, and performs well when not all classes are present in the
side information. Moreover, in contrast to other semi-supervised models, it can
be successfully applied in discovering natural subgroups if the partition-level
side information is derived from the top levels of a hierarchical clustering.
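As a rough illustration of the trade-off the abstract names, the sketch below scores a candidate clustering by a weighted sum of model accuracy (per-cluster Gaussian log-likelihood), model simplicity (number of clusters), and consistency with partition-level side information. This is a minimal sketch under assumed choices, not the authors' CEC-IB algorithm: the function `clustering_score`, the weights `alpha` and `beta`, and the use of KMeans to produce candidate partitions are all hypothetical.

```python
# Illustrative sketch only: NOT the CEC-IB cost function from the paper.
# It combines three terms the abstract describes: model accuracy,
# model simplicity, and consistency with partial labels.
import numpy as np
from scipy.stats import multivariate_normal
from sklearn.cluster import KMeans
from sklearn.metrics import adjusted_rand_score

def clustering_score(X, labels, side_idx, side_labels, alpha=1.0, beta=1.0):
    n, d = X.shape
    ks = np.unique(labels)
    # Model accuracy: average negative log-likelihood under per-cluster Gaussians.
    nll = 0.0
    for k in ks:
        Xk = X[labels == k]
        mu = Xk.mean(axis=0)
        cov = np.cov(Xk, rowvar=False) + 1e-6 * np.eye(d)  # regularized covariance
        nll -= multivariate_normal.logpdf(Xk, mean=mu, cov=cov).sum()
    nll /= n
    # Model simplicity: penalize the number of clusters (weight alpha is hypothetical).
    simplicity = alpha * len(ks)
    # Consistency with side information: disagreement with the partial labeling.
    consistency = beta * (1.0 - adjusted_rand_score(side_labels, labels[side_idx]))
    return nll + simplicity + consistency

# Toy usage: two Gaussian blobs, a handful of labelled points as side information.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (100, 2)), rng.normal(5, 1, (100, 2))])
side_idx = np.array([0, 1, 2, 100, 101, 102])
side_labels = np.array([0, 0, 0, 1, 1, 1])
for k in (2, 3, 4):
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X)
    print(k, round(clustering_score(X, labels, side_idx, side_labels), 3))
```

With these toy weights the score favours the two-cluster partition that matches the side information; the actual paper balances the three terms through a cross-entropy/information-bottleneck formulation rather than an ad hoc weighted sum.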
Relative Information Loss in the PCA
In this work we analyze principal component analysis (PCA) as a deterministic
input-output system. We show that the relative information loss induced by
reducing the dimensionality of the data after performing the PCA is the same as
in dimensionality reduction without PCA. Finally, we analyze the case where the
PCA uses the sample covariance matrix to compute the rotation. If the rotation
matrix is not available at the output, we show that an infinite amount of
information is lost. The relative information loss is shown to decrease with
increasing sample size.
Comment: 9 pages, 4 figures; extended version of a paper accepted for publication
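The following is a small numerical sketch of the setting the abstract describes, assuming standard sample-covariance PCA in NumPy: keeping only the first k rotated coordinates is a deterministic map, the discarded eigenvalue mass equals the reconstruction error, and mapping the reduced coordinates back to the original basis requires the rotation matrix. It illustrates the setup only, not the paper's relative-information-loss results.

```python
# Numerical illustration only: not the paper's information-theoretic analysis.
import numpy as np

rng = np.random.default_rng(0)
n, d, k = 1000, 5, 2

# Correlated Gaussian data, centered.
A = rng.normal(size=(d, d))
X = rng.normal(size=(n, d)) @ A
X -= X.mean(axis=0)

# PCA via the sample covariance matrix; the eigenvector matrix is the rotation.
cov = np.cov(X, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# Dimensionality reduction: keep only the first k rotated coordinates.
Z = X @ eigvecs[:, :k]
X_hat = Z @ eigvecs[:, :k].T  # reconstruction is only possible given the rotation

explained = eigvals[:k].sum() / eigvals.sum()
residual = np.mean(np.sum((X - X_hat) ** 2, axis=1)) / np.mean(np.sum(X ** 2, axis=1))
print(f"fraction of variance kept:     {explained:.3f}")
print(f"fraction of energy discarded:  {residual:.3f}")  # equals 1 - explained
```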