High-dimensional data in the form of tensors are challenging for kernel
classification methods. To both reduce the computational complexity and extract
informative features, kernels based on low-rank tensor decompositions have been
proposed. However, which features of the tensors these kernels exploit is
often unclear. In this paper we propose a novel kernel based on the Tucker
decomposition. For this kernel, the Tucker factors are computed by
re-weighting the factor matrices of the higher-order singular value
decomposition (HOSVD) with tuneable powers of the corresponding singular
values. This provides a mechanism to balance the
contribution of the Tucker core and factors of the data. We benchmark support
tensor machines equipped with this new kernel on several datasets. First, we
generate synthetic data in which the two classes differ in either the Tucker
factors or the Tucker core, and compare the proposed kernel with existing
ones. The new kernel proves robust in both classification scenarios. We
further test the
new method on real-world datasets. The proposed kernel achieves higher test
accuracy than the state-of-the-art tensor train multi-way multi-level kernel,
at a significantly lower computational cost.