
    Widely Linear Kernels for Complex-Valued Kernel Activation Functions

    Complex-valued neural networks (CVNNs) have been shown to be powerful nonlinear approximators when the input data can be properly modeled in the complex domain. One of the major challenges in scaling up CVNNs in practice is the design of complex activation functions. Recently, we proposed a novel framework for learning these activation functions neuron-wise in a data-dependent fashion, based on a cheap one-dimensional kernel expansion and the idea of kernel activation functions (KAFs). In this paper we argue that, despite its flexibility, this framework is still limited in the class of functions that can be modeled in the complex domain. We leverage the idea of widely linear complex kernels to extend the formulation, allowing for a richer expressiveness without an increase in the number of adaptable parameters. We test the resulting model on a set of complex-valued image classification benchmarks. Experimental results show that the resulting CVNNs can achieve higher accuracy while at the same time converging faster.
    Comment: Accepted at ICASSP 201
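The widely linear extension can be sketched numerically. The snippet below is a minimal, hypothetical illustration (not the paper's implementation): a standard complex KAF expands each activation over a fixed dictionary with one adaptable coefficient per element, and the widely linear variant adds a conjugated (pseudo-kernel) branch. Note that the paper obtains the richer function class without increasing the parameter count; the second coefficient vector `beta` here is used purely for clarity.

```python
import numpy as np

def complex_gaussian_kernel(z, d, gamma=1.0):
    # Complex Gaussian kernel exp(-gamma (z - conj(d))^2); note it is
    # complex-valued, unlike the real Gaussian kernel exp(-gamma |z - d|^2).
    return np.exp(-gamma * (z - np.conj(d)) ** 2)

def kaf(z, dictionary, alpha, gamma=1.0):
    # Standard complex KAF: one adaptable coefficient per dictionary element.
    K = complex_gaussian_kernel(z[:, None], dictionary[None, :], gamma)
    return K @ alpha

def widely_linear_kaf(z, dictionary, alpha, beta, gamma=1.0):
    # Widely linear KAF: an extra conjugated (pseudo-kernel) branch, so the
    # learned activation need not lie in the span of the kernel alone.
    K = complex_gaussian_kernel(z[:, None], dictionary[None, :], gamma)
    return K @ alpha + np.conj(K) @ beta
```

In practice the dictionary is a fixed grid over a region of the complex plane, and only the mixing coefficients are trained with the rest of the network.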

    Learning ground states of gapped quantum Hamiltonians with Kernel Methods

    Neural network approaches to approximating the ground state of quantum Hamiltonians require the numerical solution of a highly nonlinear optimization problem. We introduce a statistical learning approach that makes the optimization trivial by using kernel methods. Our scheme is an approximate realization of the power method, where supervised learning is used to learn the next step of the power iteration. We show that the ground-state properties of arbitrary gapped quantum Hamiltonians can be reached with polynomial resources under the assumption that the supervised learning is efficient. Using kernel ridge regression, we provide numerical evidence that this learning assumption holds by applying our scheme to find the ground states of several prototypical interacting many-body quantum systems, in both one and two dimensions, showing the flexibility of our approach.
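The scheme can be illustrated on a small toy system. The sketch below makes two simplifying assumptions not taken from the paper: the Hamiltonian is a transverse-field Ising chain with 8 spins (dense 256 x 256 matrix), and the kernel ridge regression is trained on every basis configuration, so the learned power step reduces to a smoothing projector applied to the exact step; the paper's point is that a subsample of configurations suffices.

```python
import numpy as np
from itertools import product

# Toy stand-in Hamiltonian: transverse-field Ising chain, L = 8 spins,
# periodic boundaries, written in the sigma^z product basis.
L, g = 8, 1.0
states = np.array(list(product([1, -1], repeat=L)))
dim = states.shape[0]                       # 2**L = 256 configurations
index = {tuple(s): i for i, s in enumerate(states)}

H = np.zeros((dim, dim))
H[np.arange(dim), np.arange(dim)] = -np.sum(states * np.roll(states, -1, axis=1), axis=1)
for a, s in enumerate(states):
    for i in range(L):
        t = s.copy()
        t[i] = -t[i]
        H[a, index[tuple(t)]] -= g          # transverse-field spin flip

# Power method on the shifted operator M = Lambda*I - H, whose dominant
# eigenvector is the ground state (Lambda from a Gershgorin bound on H).
lam = np.max(np.sum(np.abs(H), axis=1))
M = lam * np.eye(dim) - H

# Kernel ridge regression "learns" each power step. Trained on the full
# basis, the KRR prediction is the smoothing projector P = K (K + r I)^{-1}
# applied to the exact next iterate.
sq_dists = np.sum((states[:, None, :] - states[None, :, :]) ** 2, axis=-1)
K = np.exp(-1.0 * sq_dists)                 # RBF kernel on spin configurations
P = K @ np.linalg.inv(K + 1e-6 * np.eye(dim))

rng = np.random.default_rng(1)
psi = rng.standard_normal(dim)
for _ in range(1000):
    psi = P @ (M @ psi)                     # learned power step
    psi /= np.linalg.norm(psi)

energy = float(psi @ H @ psi)               # variational energy of the iterate
e0_exact = float(np.linalg.eigvalsh(H)[0])  # exact ground energy for comparison
```

With a well-conditioned kernel matrix the smoothing is mild and the iteration converges to the ground state; the interesting regime studied in the paper is when the regression must generalize from far fewer samples than basis states.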

    Widely-Linear MMSE Estimation of Complex-Valued Graph Signals

    In this paper, we consider the problem of recovering random graph signals with complex values. For general Bayesian estimation of complex-valued vectors, it is known that the widely-linear minimum mean-squared-error (WLMMSE) estimator can achieve a lower mean-squared error (MSE) than the linear minimum MSE (LMMSE) estimator. Inspired by the WLMMSE estimator, we develop the graph signal processing (GSP)-WLMMSE estimator, which minimizes the MSE among estimators that are represented as a two-channel output of a graph filter, i.e., widely-linear GSP estimators. We discuss the properties of the proposed GSP-WLMMSE estimator. In particular, we show that the MSE of the GSP-WLMMSE estimator is always equal to or lower than the MSE of the GSP-LMMSE estimator. The GSP-WLMMSE estimator is based on diagonal covariance matrices in the graph frequency domain, and thus has reduced complexity compared with the WLMMSE estimator. This property is especially important when using the sample-mean versions of these estimators, which are based on a training dataset. We then state conditions under which the low-complexity GSP-WLMMSE estimator coincides with the WLMMSE estimator. In the simulations, we investigate two synthetic estimation problems (with linear and nonlinear models) and the problem of state estimation in power systems. For these problems, the GSP-WLMMSE estimator outperforms the GSP-LMMSE estimator and achieves performance similar to that of the WLMMSE estimator.
    Comment: This work has been submitted to the IEEE for possible publication. Copyright may be transferred without notice, after which this version may no longer be accessible.
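The LMMSE/WLMMSE gap described above is easy to reproduce with sample-mean estimators. The sketch below uses a hypothetical toy model, not the paper's GSP setting: a real-valued source (hence improper as a complex variable) observed through a complex channel with proper noise. The widely linear estimator is simply the LMMSE estimator applied to the augmented observation [y; conj(y)].

```python
import numpy as np

rng = np.random.default_rng(0)
n, T = 4, 20000

# Real-valued source through a complex channel with proper complex noise.
# The source has nonzero pseudo-covariance, the regime where widely linear
# estimation outperforms strictly linear estimation.
A = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
x = rng.standard_normal((n, T)).astype(complex)
noise = 0.3 * (rng.standard_normal((n, T)) + 1j * rng.standard_normal((n, T)))
y = A @ x + noise

def sample_lmmse(obs, target):
    # Strictly linear sample-mean LMMSE (zero-mean data):
    # x_hat = C_xy C_yy^{-1} y, with sample covariances.
    num = obs.shape[1]
    C_ty = target @ obs.conj().T / num
    C_yy = obs @ obs.conj().T / num
    return C_ty @ np.linalg.inv(C_yy) @ obs

def sample_wlmmse(obs, target):
    # Widely linear estimator: LMMSE on the augmented vector [y; conj(y)].
    obs_aug = np.vstack([obs, obs.conj()])
    return sample_lmmse(obs_aug, target)

mse_l = np.mean(np.abs(x - sample_lmmse(y, x)) ** 2)
mse_wl = np.mean(np.abs(x - sample_wlmmse(y, x)) ** 2)
```

Because the widely linear family contains the strictly linear one, the sample WLMMSE fit is never worse in-sample, and for this improper source it is strictly better. The paper's GSP-WLMMSE additionally constrains the two channels to be graph-filter outputs, which is what reduces the complexity relative to this dense augmented solve.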

    The Generalized Complex Kernel Least-Mean-Square Algorithm

    We propose a novel adaptive kernel-based regression method for complex-valued signals: the generalized complex-valued kernel least-mean-square (gCKLMS) algorithm. We borrow from recent results on widely linear reproducing kernel Hilbert spaces (WL-RKHS) for nonlinear regression with complex-valued signals, previously proposed by the authors. This paper shows that the adaptive version of kernel regression for complex-valued signals needs to include another kernel term, the so-called pseudo-kernel. The new solution is endowed with better representation capabilities in the complex field, since it can efficiently decouple the learning of the real and imaginary parts. Also, we review previous realizations of the complex KLMS algorithm and its augmented version to prove that they can be rewritten as particular cases of the gCKLMS. Furthermore, important conclusions on kernel design are drawn that help to greatly improve the convergence of the algorithms. In the experiments, we revisit the nonlinear channel equalization problem to highlight the better convergence of the gCKLMS compared to previous solutions. The flexibility of the proposed generalized approach is also tested in a second experiment with non-independent real and imaginary parts. The results illustrate the significant performance improvements of the gCKLMS when the complex-valued signals have different properties for the real and imaginary parts.
    Comment: Submitted to IEEE Transactions on Signal Processing
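The decoupling of real and imaginary learning can be sketched in a simplified "split" form: a KLMS-style online filter with separate real Gaussian kernels (and possibly different widths) for the real and imaginary channels. This is an illustrative special case of the widely linear idea, not the paper's exact gCKLMS update; the class name and update rule below are assumptions for the sketch.

```python
import numpy as np

def rbf(x, centers, gamma):
    # Real Gaussian kernel evaluated on complex arguments.
    return np.exp(-gamma * np.abs(np.asarray(centers) - x) ** 2)

class SplitCKLMS:
    """Sketch of a complex KLMS filter that decouples the real and imaginary
    channels with separate kernels (hypothetical simplification of gCKLMS)."""

    def __init__(self, mu=0.5, gamma_r=1.0, gamma_i=1.0):
        self.mu, self.gamma_r, self.gamma_i = mu, gamma_r, gamma_i
        self.centers, self.wr, self.wi = [], [], []

    def predict(self, x):
        if not self.centers:
            return 0j
        kr = rbf(x, self.centers, self.gamma_r)
        ki = rbf(x, self.centers, self.gamma_i)
        # Independent kernel expansions for the real and imaginary parts.
        return np.dot(self.wr, kr) + 1j * np.dot(self.wi, ki)

    def step(self, x, d):
        # KLMS-style update: store the sample as a new center, with
        # coefficients proportional to the per-channel error.
        e = d - self.predict(x)
        self.centers.append(x)
        self.wr.append(self.mu * e.real)
        self.wi.append(self.mu * e.imag)
        return e
```

Using different widths `gamma_r` and `gamma_i` is one concrete way a complex filter can exploit signals whose real and imaginary parts have different properties, which is the setting of the paper's second experiment.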