In this thesis we study the problem of learning under uncertainty using the statistical learning paradigm. We first propose a linear maximum margin classifier that deals with uncertainty in the input data. More specifically, we reformulate the standard Support Vector Machine (SVM) framework such that each training example is modeled by a multi-dimensional Gaussian distribution described by its mean vector and its covariance matrix, with the latter modeling the uncertainty. We address the classification problem and define a cost function that is the expected value of the classical SVM cost when data samples are drawn from the multi-dimensional Gaussian distributions that form the set of training examples. Our formulation approximates the classical SVM formulation when the training examples are isotropic Gaussians with variance tending to zero.
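In assumed notation, with weight vector $\mathbf{w}$, bias $b$, labels $y_i \in \{-1,+1\}$, regularization parameter $\lambda$, and the $i$-th training example modeled as $\mathcal{N}(\boldsymbol{\mu}_i, \boldsymbol{\Sigma}_i)$, this expected cost can be sketched as

\[
\min_{\mathbf{w},\, b} \;\; \frac{\lambda}{2}\,\|\mathbf{w}\|^2
\;+\; \frac{1}{N}\sum_{i=1}^{N}
\mathbb{E}_{\mathbf{x}\sim\mathcal{N}(\boldsymbol{\mu}_i,\,\boldsymbol{\Sigma}_i)}
\Big[\max\!\big(0,\; 1 - y_i\,(\mathbf{w}^{\top}\mathbf{x} + b)\big)\Big],
\]

where the exact regularization and weighting conventions are illustrative and may differ from those used in the thesis.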
We arrive at a convex optimization problem, which we solve efficiently in the primal form using a stochastic gradient descent approach. The resulting classifier, which we name SVM with Gaussian Sample Uncertainty (SVM-GSU), is tested on synthetic data and on five publicly available and popular datasets, namely the MNIST, WDBC, DEAP, TV News Channel Commercial Detection, and TRECVID MED datasets. Experimental results verify the effectiveness of the proposed method.
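As a loose illustration of primal stochastic training on Gaussian-modeled examples, the sketch below approximates the expected hinge loss by Monte Carlo sampling and applies plain subgradient steps. It is not the closed-form gradient derived in the thesis; the function name, hyperparameters, and sampling scheme are assumptions made for this example.

```python
import numpy as np

def sgd_expected_hinge_mc(mu, Sigma, y, lam=0.01, epochs=20, n_draws=5, lr0=0.1, seed=0):
    """Monte Carlo SGD sketch for an expected hinge-loss objective.

    mu    : (N, d) mean vectors of the training examples
    Sigma : (N, d, d) covariance matrices modeling their uncertainty
    y     : (N,) labels in {-1, +1}
    """
    rng = np.random.default_rng(seed)
    N, d = mu.shape
    w, b = np.zeros(d), 0.0
    t = 0
    for _ in range(epochs):
        for i in rng.permutation(N):
            t += 1
            lr = lr0 / (1.0 + lr0 * lam * t)  # decaying step size
            # Approximate the expected hinge loss of example i by sampling its Gaussian.
            x = rng.multivariate_normal(mu[i], Sigma[i], size=n_draws)
            margins = y[i] * (x @ w + b)
            active = margins < 1.0            # draws violating the margin
            grad_w = lam * w
            grad_b = 0.0
            if active.any():
                grad_w = grad_w - y[i] * x[active].sum(axis=0) / n_draws
                grad_b = -y[i] * active.sum() / n_draws
            w -= lr * grad_w
            b -= lr * grad_b
    return w, b
```

Prediction then follows the usual linear rule, i.e., the sign of $\mathbf{w}^{\top}\mathbf{x} + b$.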
Next, we extend the aforementioned linear classifier so as to obtain non-linear decision boundaries using the RBF kernel. This extension, which uses isotropic input uncertainty and which we name Kernel SVM with Isotropic Gaussian Sample Uncertainty (KSVM-iGSU), is applied to the problems of video event detection and video aesthetic quality assessment. The experimental results show that exploiting input uncertainty, especially in problems where only a limited number of positive training examples are provided, can lead to better classification, detection, or retrieval performance.
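For reference, the standard RBF kernel underlying this non-linear extension is

\[
k(\mathbf{x}, \mathbf{x}') = \exp\!\big(-\gamma\,\|\mathbf{x} - \mathbf{x}'\|^2\big), \qquad \gamma > 0,
\]

where $\gamma$ is the kernel scale parameter; the way the isotropic uncertainty enters this kernel in KSVM-iGSU is not reproduced here.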
Finally, we present a preliminary study on how the above ideas can be used within the deep convolutional neural network learning paradigm, so as to exploit inherent sources of uncertainty, such as the spatial pooling operations that are commonly used in deep networks.
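As a loose illustration only, and not the formulation studied in the thesis, average pooling can be read as producing a per-window mean, while the within-window variance provides a crude uncertainty estimate for that output; the sketch below computes both for non-overlapping windows of a single feature map.

```python
import numpy as np

def pool_mean_var(feature_map, k=2):
    """Non-overlapping k x k pooling returning per-window mean and variance.

    feature_map : (H, W) array with H and W divisible by k.
    The mean is ordinary average pooling; the variance is one crude
    per-window uncertainty estimate (illustrative only).
    """
    H, W = feature_map.shape
    windows = feature_map.reshape(H // k, k, W // k, k)
    mean = windows.mean(axis=(1, 3))
    var = windows.var(axis=(1, 3))
    return mean, var
```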