Statistical Mechanics of Time Domain Ensemble Learning
Conventional ensemble learning combines students in the space domain. In this
paper, by contrast, we combine students in the time domain and call this time
domain ensemble learning. We analyze the generalization performance of time
domain ensemble learning in the framework of online learning using a
statistical mechanical method. We treat a model in which both the teacher and
the student are linear perceptrons with noise. Time domain ensemble learning is
shown to be twice as effective as conventional space domain ensemble learning.
Comment: 10 pages, 10 figures
Analysis of ensemble learning using simple perceptrons based on online learning theory
Ensemble learning of nonlinear perceptrons, which determine their outputs
by sign functions, is discussed within the framework of online learning and
statistical mechanics. One purpose of statistical learning theory is to
theoretically obtain the generalization error. This paper shows that the
ensemble generalization error can be calculated using two order parameters,
namely the similarity between the teacher and a student and the similarity
among students. The differential equations that describe the dynamical behavior
of these order parameters are derived for general learning rules, and their
concrete forms are obtained analytically for three well-known rules: Hebbian
learning, perceptron learning, and AdaTron learning. The ensemble
generalization errors of the three rules are then calculated from the solutions
of these differential equations. As a result, the three rules show different
characteristics in their affinity for ensemble learning, that is, in
"maintaining variety among students." The results show that AdaTron learning is
superior to the other two rules with respect to this affinity.
Comment: 30 pages, 17 figures
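As a rough illustration of the quantities named in this abstract, the sketch below trains a small committee of sign-perceptron students online under each of the three rules and measures the two order parameters, the teacher-student similarity R and the student-student similarity q, together with the single-student generalization error arccos(R)/pi. The update conventions, normalizations, number of students, and parameter values are assumptions made for illustration and are not taken from the paper.

    import numpy as np

    rng = np.random.default_rng(1)

    # Illustrative parameters; normalizations and learning rates are assumed.
    N = 500            # input dimension
    steps = 20 * N     # number of online examples
    K = 3              # number of students in the ensemble

    B = rng.standard_normal(N)
    B *= np.sqrt(N) / np.linalg.norm(B)             # teacher weights with |B|^2 = N

    def hebbian(t_sign, u):
        return t_sign                               # always update toward the teacher label

    def perceptron_rule(t_sign, u):
        return t_sign if t_sign * u <= 0 else 0.0   # update only on misclassification

    def adatron(t_sign, u):
        return -u if t_sign * u <= 0 else 0.0       # update proportional to the wrong local field

    def train_students(rule):
        """Train K sign-perceptron students online on a common example stream."""
        J = [rng.standard_normal(N) for _ in range(K)]    # independent random initial weights
        for _ in range(steps):
            x = rng.standard_normal(N)
            t_sign = np.sign(B @ x / np.sqrt(N))          # teacher label
            for k in range(K):
                u = J[k] @ x / np.sqrt(N)                 # student local field
                J[k] += rule(t_sign, u) * x / np.sqrt(N)  # online update
        return J

    def overlap(a, b):
        return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

    for name, rule in [("Hebbian", hebbian), ("Perceptron", perceptron_rule), ("AdaTron", adatron)]:
        J = train_students(rule)
        R = np.mean([overlap(B, Jk) for Jk in J])         # teacher-student similarity
        q = np.mean([overlap(J[i], J[j])
                     for i in range(K) for j in range(i + 1, K)])  # similarity among students
        eps = np.arccos(R) / np.pi                        # single-student generalization error
        print(f"{name:10s} R={R:.3f}  q={q:.3f}  eps={eps:.3f}")

Comparing how q evolves under the three rules gives an empirical view of "maintaining variety among students": a rule that keeps q well below 1 preserves more diversity for the ensemble to exploit.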
