Minor Constraint Disturbances for Deep Semi-supervised Learning
In high-dimensional data spaces, semi-supervised feature learning based on
Euclidean distance is unstable under a broad range of conditions.
Furthermore, the scarcity and high cost of labels prompt us to explore new
semi-supervised learning methods that require as few labels as possible. In this paper, we
develop a novel Minor Constraint Disturbances-based Deep Semi-supervised
Feature Learning framework (MCD-DSFL) from the perspective of probability
distribution for feature representation. There are two fundamental modules in
the proposed framework: one is a Minor Constraint Disturbances-based restricted
Boltzmann machine with Gaussian visible units (MCDGRBM) for modelling
continuous data and the other is a Minor Constraint Disturbances-based
restricted Boltzmann machine (MCDRBM) for modelling binary data. The Minor
Constraint Disturbances (MCD) consist of a small set of instance-level constraints,
produced from only two randomly selected labeled instances per class. The
Kullback-Leibler (KL) divergences of the MCD are fused into the Contrastive
Divergence (CD) learning for training the proposed MCDGRBM and MCDRBM models.
As a result, the probability distributions of hidden-layer features become as similar
as possible within the same class and, simultaneously, as dissimilar as possible
across different classes. Although the MCD exert only a weak influence on the
individual shallow models (MCDGRBM and MCDRBM), the proposed deep MCD-DSFL framework
improves the representation capability significantly through their leverage effect.
The semi-supervised strategy based on the KL divergence of the MCD
significantly reduces the reliance on labels and, at the same time, improves the
stability of semi-supervised feature learning in high-dimensional space.
Comment: 14 pages