
    Theory and Implementation of Complex-Valued Neural Networks

    This work explains in detail the theory behind Complex-Valued Neural Networks (CVNNs), including Wirtinger calculus, complex backpropagation, and basic modules such as complex layers, complex activation functions, and complex weight initialization. We also show the impact of not adapting the weight initialization correctly to the complex domain. This work places a strong focus on the implementation of such modules in Python using the cvnn toolbox. We also perform simulations on real-valued data, cast to the complex domain by means of the Hilbert transform, verifying the potential interest of CVNNs even for non-complex data. Comment: 42 pages, 18 figures
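    The Hilbert-transform casting mentioned above can be sketched with SciPy: the analytic signal of a real input has the original signal as its real part and the Hilbert transform as its imaginary part, yielding a natural complex-valued representation. A minimal illustration (the chirp signal and sampling rate are arbitrary placeholders, not taken from the paper):

    ```python
    import numpy as np
    from scipy.signal import hilbert

    # A real-valued test signal: a short chirp sampled at 1 kHz.
    fs = 1000
    t = np.arange(0, 1, 1 / fs)
    x = np.cos(2 * np.pi * (50 + 30 * t) * t)

    # hilbert() returns the analytic signal x + j*H(x), whose imaginary
    # part is the Hilbert transform of x. This complex array can be fed
    # directly to a complex-valued network.
    z = hilbert(x)

    print(z.dtype)                 # complex128
    print(np.allclose(z.real, x))  # True: the real part is the original signal
    ```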

    The universal approximation theorem for complex-valued neural networks

    We generalize the classical universal approximation theorem for neural networks to the case of complex-valued neural networks. Precisely, we consider feedforward networks with a complex activation function $\sigma : \mathbb{C} \to \mathbb{C}$ in which each neuron performs the operation $\mathbb{C}^N \to \mathbb{C},\ z \mapsto \sigma(b + w^T z)$ with weights $w \in \mathbb{C}^N$ and a bias $b \in \mathbb{C}$, and with $\sigma$ applied componentwise. We completely characterize those activation functions $\sigma$ for which the associated complex networks have the universal approximation property, meaning that they can uniformly approximate any continuous function on any compact subset of $\mathbb{C}^d$ arbitrarily well. Unlike the classical case of real networks, the set of "good activation functions" which give rise to networks with the universal approximation property differs significantly depending on whether one considers deep networks or shallow networks: for deep networks with at least two hidden layers, the universal approximation property holds as long as $\sigma$ is neither a polynomial, nor a holomorphic function, nor an antiholomorphic function. Shallow networks, on the other hand, are universal if and only if the real part or the imaginary part of $\sigma$ is not a polyharmonic function.
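    The neuron operation $z \mapsto \sigma(b + w^T z)$ can be sketched in NumPy. The split-type activation below (tanh applied separately to real and imaginary parts) is a hypothetical choice for illustration; it is neither a polynomial, nor holomorphic, nor antiholomorphic, so it falls in the "good" class for deep networks in the characterization above:

    ```python
    import numpy as np

    def complex_neuron(z, w, b, sigma):
        """One complex neuron: z -> sigma(b + w^T z) (no conjugation on w)."""
        return sigma(b + w @ z)

    rng = np.random.default_rng(0)
    N = 4
    z = rng.standard_normal(N) + 1j * rng.standard_normal(N)
    w = rng.standard_normal(N) + 1j * rng.standard_normal(N)
    b = 0.5 - 0.2j

    # Split-type activation: tanh on real and imaginary parts separately.
    # A holomorphic choice (e.g. sigma = np.tanh on the complex argument)
    # would break universality for deep networks per the theorem above.
    sigma = lambda u: np.tanh(u.real) + 1j * np.tanh(u.imag)

    out = complex_neuron(z, w, b, sigma)
    print(out)  # a complex scalar with |Re|, |Im| < 1 (tanh is bounded)
    ```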

    Global μ-stability of complex-valued neural networks

    Impulsive complex-valued neural networks with three kinds of time delays, including leakage delay, discrete delay, and distributed delay, are considered. Based on the homeomorphism mapping principle of the complex domain, a sufficient condition for the existence and uniqueness of the equilibrium point of the addressed complex-valued neural networks is proposed in terms of a linear matrix inequality (LMI). By constructing appropriate Lyapunov-Krasovskii functionals and employing the free weighting matrix method, several delay-dependent criteria for checking the global μ-stability of the complex-valued neural networks are established in terms of LMIs. As direct applications of these results, several criteria on exponential stability, power-stability, and log-stability are obtained. Two examples with simulations are provided to demonstrate the effectiveness of the proposed criteria.
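    A standard delayed CVNN model of the kind studied in such stability work can be simulated with a forward-Euler sketch. The dynamics, parameters, and delay below are illustrative assumptions (a single discrete delay, no impulses or distributed/leakage delays), not the paper's examples; with a sufficiently dominant self-feedback matrix the trajectory settles to a unique equilibrium:

    ```python
    import numpy as np

    # Hypothetical 2-neuron complex-valued network with one discrete delay:
    #   z'(t) = -D z(t) + A f(z(t)) + B f(z(t - tau)) + u
    D = np.diag([2.0, 2.0])
    A = np.array([[0.1 + 0.1j, -0.2j], [0.2, 0.1 - 0.1j]])
    B = np.array([[0.1j, 0.1], [-0.1, 0.2j]])
    u = np.array([1.0 + 1.0j, -0.5 + 0.5j])
    f = lambda z: np.tanh(z.real) + 1j * np.tanh(z.imag)  # split-type activation

    tau, h, T = 0.5, 0.01, 20.0
    steps, delay_steps = int(T / h), int(tau / h)
    hist = [np.zeros(2, dtype=complex)] * (delay_steps + 1)  # zero initial history
    for _ in range(steps):
        z, z_del = hist[-1], hist[-1 - delay_steps]
        hist.append(z + h * (-D @ z + A @ f(z) + B @ f(z_del) + u))

    # With -D dominating the (bounded, Lipschitz) activation terms,
    # the state converges to the unique equilibrium point.
    print(np.round(hist[-1], 4))
    ```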

    Complex-valued neural networks for fully-temporal micro-Doppler classification

    Micro-Doppler analysis commonly makes use of the log-scaled, real-valued spectrogram, and recent works involving deep learning architectures for classification are no exception. Some works in neighboring fields of research directly exploit the raw temporal signal but do not handle complex numbers, which are inherent to radar IQ signals. In this paper, we propose a complex-valued, fully temporal neural network which simultaneously exploits the raw signal and the spectrogram by introducing a Fourier-like layer suitable for deep architectures. We show improved results under certain conditions on synthetic radar data compared to a real-valued counterpart.
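    One natural reading of a "Fourier-like layer" is a complex linear layer whose weights are initialized to a DFT matrix, so the network starts from a spectrogram-like representation but can refine it during training. The sketch below is a hypothetical design along those lines, not the paper's exact architecture:

    ```python
    import numpy as np

    def fourier_like_layer(iq, n_freq):
        """Map a raw complex IQ window to n_freq complex spectral coefficients
        using DFT-initialized weights (learnable in a real network)."""
        n = iq.shape[-1]
        k = np.arange(n_freq)[:, None]
        t = np.arange(n)[None, :]
        W = np.exp(-2j * np.pi * k * t / n) / np.sqrt(n)  # DFT-initialized weights
        return iq @ W.T

    # A toy IQ signal: a complex exponential at bin 5 of a 64-sample window.
    n = 64
    sig = np.exp(2j * np.pi * 5 * np.arange(n) / n)
    spec = fourier_like_layer(sig, n_freq=n)
    print(np.argmax(np.abs(spec)))  # 5: energy concentrates at the tone's bin
    ```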

    Towards Understanding Theoretical Advantages of Complex-Reaction Networks

    Complex-valued neural networks have attracted increasing attention in recent years, yet it remains an open question what advantages complex-valued neural networks have in comparison with real-valued networks. This work takes one step in this direction by introducing the complex-reaction network with a fully-connected feed-forward architecture. We prove the universal approximation property for complex-reaction networks, and show that a class of radial functions can be approximated by a complex-reaction network using a polynomial number of parameters, whereas real-valued networks need at least exponentially many parameters to reach the same approximation level. For empirical risk minimization, our theoretical results show that the critical point set of complex-reaction networks is a proper subset of that of real-valued networks, which may offer some insight into why optimal solutions can be found more easily for complex-reaction networks.
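    An intuition for why radial functions suit complex arithmetic (an illustration of the general idea, not the paper's construction): a radial target depends on the input only through its norm, and after pairing real coordinates into complex numbers, the squared norm is a single multiplication by the conjugate, $z\bar{z}$:

    ```python
    import numpy as np

    # Pair up real coordinates (x1, x2, x3, x4) -> complex (x1 + i x2, x3 + i x4).
    x = np.array([0.6, -0.8, 1.0, 0.0])
    z = x[0::2] + 1j * x[1::2]

    # The squared Euclidean norm is recovered by one conjugate multiplication:
    sq_norm = np.sum(z * np.conj(z)).real
    print(np.isclose(sq_norm, np.sum(x**2)))  # True

    # A radial target such as a Gaussian bump is then one scalar nonlinearity away.
    radial_value = np.exp(-sq_norm)
    print(radial_value)
    ```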

    Ensemble of Single‐Layered Complex‐Valued Neural Networks for Classification Tasks

    This paper presents ensemble approaches with single-layered complex-valued neural networks (CVNNs) to solve real-valued classification problems. Each component CVNN of an ensemble uses a recently proposed activation function for its complex-valued neurons (CVNs). A gradient-descent-based learning algorithm was used to train the component CVNNs. We applied two ensemble methods, negative correlation learning and bagging, to create the ensembles. Experimental results on a number of real-world benchmark problems showed a substantial performance improvement of the ensembles over the individual single-layered CVNN classifiers. Furthermore, the generalization performances were nearly equivalent to those obtained by ensembles of real-valued multilayer neural networks.
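    The bagging mechanics can be sketched as follows. The single-layer forward pass and the magnitude-based class scores are hypothetical design choices for illustration (the paper's proposed activation is not specified here), and the component networks are left untrained to keep the sketch short:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def cnormal(shape):
        """Random complex Gaussian parameters (illustrative initialization)."""
        return rng.standard_normal(shape) + 1j * rng.standard_normal(shape)

    def single_layer_cvnn(x, W, b):
        """Single-layer CVNN sketch: real inputs feed complex neurons;
        class scores are the magnitudes of the complex outputs."""
        u = x @ W + b
        out = np.tanh(u.real) + 1j * np.tanh(u.imag)  # split-type activation
        return np.abs(out)  # one magnitude score per class

    def bagged_predict(x, models):
        """Bagging-style aggregation: majority vote over component networks."""
        votes = np.array([np.argmax(single_layer_cvnn(x, W, b), axis=1)
                          for W, b in models])
        return np.apply_along_axis(lambda v: np.bincount(v).argmax(), 0, votes)

    n_features, n_classes, n_models = 4, 3, 5
    X = rng.standard_normal((10, n_features))
    models = [(cnormal((n_features, n_classes)), cnormal(n_classes))
              for _ in range(n_models)]

    pred = bagged_predict(X, models)
    print(pred.shape)  # one class label per sample
    ```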