Complex-valued Neural Networks with Non-parametric Activation Functions
Complex-valued neural networks (CVNNs) are a powerful modeling tool for
domains where data can be naturally interpreted in terms of complex numbers.
However, several analytical properties of the complex domain (e.g.,
holomorphicity) make the design of CVNNs a more challenging task than that of
their real counterparts. In this paper, we consider the problem of flexible activation
functions (AFs) in the complex domain, i.e., AFs endowed with sufficient
degrees of freedom to adapt their shape given the training data. While this
problem has received considerable attention in the real case, a very limited
literature exists for CVNNs, where most activation functions are generally
developed in a split fashion (i.e., by considering the real and imaginary parts
of the activation separately) or with simple phase-amplitude techniques.
Leveraging the recently proposed kernel activation functions (KAFs) and
related advances in the design of complex-valued kernels, we propose the first
fully complex, non-parametric activation function for CVNNs, which is based on
a kernel expansion with a fixed dictionary that can be implemented efficiently
on vectorized hardware. Several experiments on common use cases, including
prediction and channel equalization, validate our proposal when compared to
real-valued neural networks and CVNNs with fixed activation functions.

Comment: Submitted to IEEE Transactions on Emerging Topics in Computational
Intelligence
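
To make the idea concrete, a kernel activation function models the AF as a weighted kernel expansion over a fixed dictionary, with only the mixing coefficients trained. The sketch below is a minimal NumPy illustration under assumed choices: a Gaussian kernel on the complex plane and a dictionary sampled on a regular grid (the paper's fully complex-valued kernel and dictionary construction may differ).

```python
import numpy as np

def complex_kaf(s, alpha, dictionary, gamma=1.0):
    """Sketch of a kernel activation function for complex inputs.

    s          : complex pre-activations, shape (batch,)
    alpha      : trainable complex mixing coefficients, shape (D,)
    dictionary : fixed complex dictionary elements, shape (D,)
    gamma      : kernel bandwidth (assumed hyperparameter)

    Uses a Gaussian kernel on the complex plane; this is an
    illustrative choice, not necessarily the paper's kernel.
    """
    # Pairwise kernel evaluations against the fixed dictionary: (batch, D).
    # Because the dictionary is fixed, this is a single dense matmul-friendly
    # operation, which is what makes the scheme efficient on vectorized hardware.
    diff = s[:, None] - dictionary[None, :]
    K = np.exp(-gamma * np.abs(diff) ** 2)
    # The AF output is a linear combination of kernel values.
    return K @ alpha

# Hypothetical usage: dictionary on a 5x5 grid in the complex plane.
grid = np.linspace(-2.0, 2.0, 5)
d = (grid[:, None] + 1j * grid[None, :]).ravel()          # shape (25,)
rng = np.random.default_rng(0)
alpha = rng.standard_normal(d.size) + 1j * rng.standard_normal(d.size)

s = np.array([0.5 + 0.5j, -1.0 + 0.2j])
out = complex_kaf(s, alpha, d)                             # complex, shape (2,)
```

Because the dictionary is fixed, the only per-neuron trainable parameters are the coefficients `alpha`, giving the activation adjustable shape at modest cost.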