Kernel methods are powerful learning techniques with excellent generalization capability. This thesis develops three advanced approaches within the general support vector machine (SVM) framework in the application domain of time series data.
The first contribution presents a new methodology for incorporating privileged information about the future evolution of time series, which is available only in the training phase. The task is to predict the ordered categories of future time series movements. This is achieved by directly extending support vector ordinal regression with implicit constraints to the learning using privileged information (LUPI) paradigm.
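As a schematic illustration of the underlying idea (not the thesis's exact formulation), the slack variables of support vector ordinal regression with implicit constraints can be replaced by a correcting function defined on the privileged features $x^{*}_{i}$, in the spirit of the SVM+ approach to LUPI; the symbols $\phi^{*}$, $w^{*}_{j}$, $d^{*}_{j}$ and the trade-off parameters $C$, $\gamma$ are illustrative:
\[
\min_{w,\,b_{j},\,w^{*}_{j},\,d^{*}_{j}}\;
\frac{1}{2}\|w\|^{2}
+ \frac{\gamma}{2}\sum_{j=1}^{r-1}\|w^{*}_{j}\|^{2}
+ C\sum_{j=1}^{r-1}\sum_{i=1}^{n}\Big(\langle w^{*}_{j},\phi^{*}(x^{*}_{i})\rangle + d^{*}_{j}\Big)
\]
subject to, for every threshold $j=1,\dots,r-1$ and every training series $i$,
\[
\langle w,\phi(x_{i})\rangle - b_{j} \le -1 + \langle w^{*}_{j},\phi^{*}(x^{*}_{i})\rangle + d^{*}_{j}
\quad \text{if } y_{i}\le j,
\qquad
\langle w,\phi(x_{i})\rangle - b_{j} \ge +1 - \langle w^{*}_{j},\phi^{*}(x^{*}_{i})\rangle - d^{*}_{j}
\quad \text{if } y_{i}> j,
\]
with the correcting terms constrained to be non-negative.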
The second contribution presents a novel methodology for constructing efficient kernels for time series classification. Each time series is represented by a linear readout model fitted on top of a high-dimensional state space model whose dynamic part is fixed and deterministically constructed; learning is then performed in the space of these linear readout models.
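The following is a minimal sketch of such a model-space kernel, assuming a simple cycle reservoir as the fixed, deterministically constructed dynamic part and a ridge-regression readout predicting the next observation; the reservoir size, input weights, ridge penalty and kernel width are illustrative choices rather than the thesis's actual settings.

```python
import numpy as np

def reservoir_states(series, n_res=50, r=0.9, v=0.5):
    """Drive a fixed cycle reservoir (deterministic dynamic part) with a univariate series."""
    W = np.zeros((n_res, n_res))
    W[np.arange(1, n_res), np.arange(n_res - 1)] = r   # ring of identical weights
    W[0, n_res - 1] = r                                 # close the cycle
    w_in = v * np.sign(np.sin(np.arange(1, n_res + 1)))  # deterministic input weights (illustrative)
    X = np.zeros((len(series), n_res))
    x = np.zeros(n_res)
    for t, u in enumerate(series):
        x = np.tanh(W @ x + w_in * u)
        X[t] = x
    return X

def readout_weights(series, lam=1e-2, **kw):
    """Represent a series by the ridge-regression readout that predicts its next value."""
    X = reservoir_states(series[:-1], **kw)
    y = np.asarray(series[1:])
    A = X.T @ X + lam * np.eye(X.shape[1])
    return np.linalg.solve(A, X.T @ y)

def model_space_kernel(series_a, series_b, gamma=1.0):
    """RBF kernel between two series, computed in the linear readout model space."""
    wa, wb = readout_weights(series_a), readout_weights(series_b)
    return np.exp(-gamma * np.sum((wa - wb) ** 2))
```

The resulting Gram matrix over a set of training series can then be plugged into any kernel machine, e.g. an SVM classifier.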
Finally, in the same context, we introduce another novel time series kernel by co-learning the dynamic part and a global metric in the linear readout model space, encouraging time series from the same class to be represented by close model representations, while model representations of time series from different classes are encouraged to be well separated.
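Schematically (with $w_{\theta}(s)$ denoting the readout-model representation of series $s$ under dynamic-part parameters $\theta$, $M$ a global Mahalanobis metric, and $\lambda$ a trade-off parameter, all of which are assumed notation rather than taken from the thesis), the co-learning can be viewed as jointly optimizing $\theta$ and $M \succeq 0$ so that within-class model distances are small and between-class distances are large:
\[
\min_{\theta,\; M \succeq 0}\;
\sum_{y_{i}=y_{j}} d_{M}\big(w_{\theta}(s_{i}), w_{\theta}(s_{j})\big)^{2}
\;-\;\lambda \sum_{y_{i}\neq y_{j}} d_{M}\big(w_{\theta}(s_{i}), w_{\theta}(s_{j})\big)^{2},
\qquad
d_{M}(w, w')^{2} = (w - w')^{\top} M\,(w - w').
\]
The learned metric then induces a time series kernel, for example $k(s_{i}, s_{j}) = \exp\!\big(-d_{M}(w_{\theta}(s_{i}), w_{\theta}(s_{j}))^{2}\big)$.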