Neural Networks with Complex-Valued Weights Have No Spurious Local Minima
We study the benefits of complex-valued weights for neural networks. We prove
that shallow complex neural networks with quadratic activations have no
spurious local minima. In contrast, shallow real neural networks with quadratic
activations have infinitely many spurious local minima under the same
conditions. In addition, we provide specific examples to demonstrate that
complex-valued weights turn poor local minima into saddle points. The
activation function CReLU is also discussed to illustrate the superiority of
analytic activations in complex-valued neural networks.
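To make the setting concrete, the following is a minimal sketch (not taken from the paper) of a shallow network with quadratic activation, f(x) = sum_j a_j (w_j^T x)^2, evaluated with real versus complex weights; the exact architecture, loss, and data model used in the paper are assumptions here. The CReLU helper uses its standard definition, ReLU applied separately to the real and imaginary parts, which is non-analytic in contrast to the quadratic activation.

```python
import numpy as np

rng = np.random.default_rng(0)

def quadratic_net(W, a, X):
    """Shallow network with quadratic activation: a^T (W X)^2 (assumed model)."""
    pre = W @ X            # hidden pre-activations, shape (k, n)
    return a @ pre**2      # elementwise square, then linear output layer

def squared_loss(W, a, X, y):
    """Mean squared error; |.|^2 handles complex-valued outputs."""
    r = quadratic_net(W, a, X) - y
    return np.mean(np.abs(r) ** 2)

def crelu(z):
    """Standard CReLU: ReLU applied separately to real and imaginary parts."""
    return np.maximum(z.real, 0) + 1j * np.maximum(z.imag, 0)

d, k, n = 5, 3, 200                      # input dim, hidden width, samples
X = rng.standard_normal((d, n))

# Planted teacher with real weights and a fixed output layer.
W_star = rng.standard_normal((k, d))
a_star = np.ones(k)
y = quadratic_net(W_star, a_star, X)

# Real-valued student vs. complex-valued student (real + imaginary parts).
W_real = rng.standard_normal((k, d))
W_cplx = rng.standard_normal((k, d)) + 1j * rng.standard_normal((k, d))

print("loss with real weights:   ", squared_loss(W_real, a_star, X, y))
print("loss with complex weights:", squared_loss(W_cplx, a_star, X, y))
```

The sketch only illustrates the model class and loss being compared; the paper's results concern the landscape of this loss, namely that every local minimum is global when the weights are complex, whereas spurious local minima can persist in the real-valued case.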