158 research outputs found
Distributed Adaptive Learning with Multiple Kernels in Diffusion Networks
We propose an adaptive scheme for distributed learning of nonlinear functions
by a network of nodes. The proposed algorithm consists of a local adaptation
stage utilizing multiple kernels with projections onto hyperslabs and a
diffusion stage to achieve consensus on the estimates over the whole network.
Multiple kernels are incorporated to enhance the approximation of functions
with both high- and low-frequency components, which are common in practical scenarios.
We provide a thorough convergence analysis of the proposed scheme based on the
metric of the Cartesian product of multiple reproducing kernel Hilbert spaces.
To this end, we introduce a modified consensus matrix considering this specific
metric and prove its equivalence to the ordinary consensus matrix. Moreover,
the use of hyperslabs significantly reduces the computational demand
with only a minor loss in performance. Numerical evaluations with synthetic
and real data show the efficacy of the proposed algorithm compared to
state-of-the-art schemes.
Comment: Double-column 15 pages, 10 figures, submitted to IEEE Trans. Signal Processing
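The two stages described in this abstract, a local hyperslab projection of the kernel coefficients followed by a diffusion (consensus) combination across nodes, can be sketched as follows. This is a minimal illustration under assumed choices (a Gaussian kernel, a fixed dictionary, a doubly stochastic combination matrix `W`); function names are hypothetical and not the paper's exact formulation:

```python
import numpy as np

def gauss_kernel(x, d, bw):
    # Gaussian kernel between input x and dictionary atoms d (rows), bandwidth bw
    return np.exp(-np.sum((d - x) ** 2, axis=1) / (2 * bw ** 2))

def hyperslab_project(a, k, y, eps):
    """Project coefficients a onto the hyperslab {a : |y - a.k| <= eps}."""
    err = y - a @ k
    if abs(err) <= eps:
        return a                          # already inside the hyperslab: no update
    shift = (err - np.sign(err) * eps) / (k @ k)
    return a + shift * k

def diffusion_average(coeffs, W):
    """Combine neighbors' coefficient rows with a doubly stochastic matrix W."""
    return W @ coeffs
```

The hyperslab tolerance `eps` is what yields the computational savings the abstract mentions: samples whose error already lies within the slab trigger no update at all.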
Recovering Latent Signals from a Mixture of Measurements using a Gaussian Process Prior
In sensing applications, sensors cannot always measure the latent quantity of
interest at the required resolution; sometimes they can only acquire a blurred
version of it due to the sensor's transfer function. To recover latent signals
when only noisy mixed measurements of the signal are available, we propose the
Gaussian process mixture of measurements (GPMM), which models the latent signal
as a Gaussian process (GP) and allows us to perform Bayesian inference on the
signal conditioned on a set of noisy mixed measurements. We describe how
to train GPMM, that is, to find the hyperparameters of the GP and the mixing
weights, and how to perform inference on the latent signal under GPMM;
additionally, we identify the solution to the underdetermined linear system
resulting from a sensing application as a particular case of GPMM. The proposed
model is validated in the recovery of three signals: a smooth synthetic signal,
a real-world heart-rate time series and a step function, where GPMM
outperformed the standard GP in terms of estimation error, uncertainty
representation, and recovery of the spectral content of the latent signal.
Comment: Published in IEEE Signal Processing Letters, Dec. 201
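The inference step behind a model of this kind can be sketched with standard GP algebra: if the latent signal is a GP and the measurements are a linear mixture of its samples plus noise, the posterior over the latent signal is Gaussian and available in closed form. This is a generic sketch under an assumed RBF kernel and mixing matrix `A`, not the paper's GPMM implementation:

```python
import numpy as np

def rbf(x1, x2, ell=1.0, sf=1.0):
    # Squared-exponential covariance between two 1-D input grids
    d2 = (x1[:, None] - x2[None, :]) ** 2
    return sf ** 2 * np.exp(-0.5 * d2 / ell ** 2)

def gp_mixture_posterior(x_lat, x_obs, A, y, noise=1e-2, ell=1.0):
    """Posterior mean/cov of a latent GP f at x_lat, given y = A f(x_obs) + noise.

    A is the mixing matrix (e.g., a blur induced by the sensor's transfer function).
    """
    Kff = rbf(x_lat, x_lat, ell)
    Kfo = rbf(x_lat, x_obs, ell)
    Koo = rbf(x_obs, x_obs, ell)
    S = A @ Koo @ A.T + noise * np.eye(A.shape[0])   # measurement covariance
    G = np.linalg.solve(S, A @ Kfo.T).T              # gain: K_fo A^T S^{-1}
    mean = G @ y
    cov = Kff - G @ (A @ Kfo.T)
    return mean, cov
```

With `A` equal to the identity this reduces to standard GP regression, which is consistent with the abstract's framing of the underdetermined linear system as a particular case.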
A New Adaptive LSSVR with Online Multikernel RBF Tuning to Evaluate Analog Circuit Performance
To meet the demand for fast online evaluation of analog circuit performance, this paper proposes a novel evaluation strategy based on adaptive Least Squares Support Vector Regression (LSSVR) with a multikernel RBF. The multikernel RBF offers greater online flexibility in the kernel function, such as bandwidth tuning; its kernel parameters then determine how input signals are mapped to the feature space, yielding a well-behaved plant model that discards redundant features. Experiments on a typical Sallen-Key low-pass filter circuit validate the proposed strategy across eight performance indexes. Simulation results show that both the evaluation performance and, in particular, the testing speed of the proposed method are superior to those of traditional LSSVR and ε-SVR, making it suitable for online deployment.
Analyzing sparse dictionaries for online learning with kernels
Many signal processing and machine learning methods share essentially the
same linear-in-the-parameter model, with as many parameters as available
samples, as in kernel-based machines. Sparse approximation is essential in many
disciplines, with new challenges emerging in online learning with kernels. To
this end, several sparsity measures have been proposed in the literature to
quantify sparse dictionaries and to construct relevant ones, the most prominent
ones being the distance, the approximation, the coherence and the Babel
measures. In this paper, we analyze sparse dictionaries based on these
measures. By conducting an eigenvalue analysis, we show that these sparsity
measures share many properties, including the linear independence condition and
inducing a well-posed optimization problem. Furthermore, we prove that there
exists a quasi-isometry between the parameter (i.e., dual) space and the
dictionary's induced feature space.
Comment: 10 pages
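One of the measures this abstract analyzes, the coherence criterion, is commonly used online as an admission rule: a new sample joins the dictionary only if its kernel similarity to every existing atom stays below a threshold. A minimal sketch, assuming a Gaussian kernel (function names are illustrative, not from the paper):

```python
import numpy as np

def gaussian_k(x, d, bw=1.0):
    # Unit-norm Gaussian kernel between two sample vectors
    return np.exp(-np.sum((np.asarray(x) - np.asarray(d)) ** 2) / (2 * bw ** 2))

def coherence_admit(x, dictionary, mu0=0.5, bw=1.0):
    """Admit x only if its coherence with every current atom is at most mu0."""
    return all(abs(gaussian_k(x, d, bw)) <= mu0 for d in dictionary)

def build_dictionary(stream, mu0=0.5, bw=1.0):
    """Stream samples and keep only those passing the coherence test."""
    D = []
    for x in stream:
        if coherence_admit(x, D, mu0, bw):
            D.append(x)
    return D
```

Keeping the coherence below `mu0 < 1` bounds the Gram matrix's eigenvalues away from zero, which is the kind of linear-independence / well-posedness property the eigenvalue analysis in this paper establishes for such measures.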
Sensorimotor coding of vermal granule neurons in the developing mammalian cerebellum
The vermal cerebellum is a hub of sensorimotor integration critical for postural control and locomotion, but the nature and developmental organization of afferent information to this region have remained poorly understood.
A stochastic behavior analysis of stochastic restricted-gradient descent algorithm in reproducing kernel Hilbert spaces
This paper presents a stochastic behavior analysis of a kernel-based
stochastic restricted-gradient descent method. The restricted gradient gives a
steepest ascent direction within the so-called dictionary subspace. The
analysis provides the transient and steady-state performance under the mean
squared error criterion. It also includes stability conditions in the mean and
mean-square sense. The present study is based on the analysis of the kernel
normalized least mean square (KNLMS) algorithm initially proposed by Chen et
al. Simulation results validate the analysis.
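The KNLMS recursion whose behavior is analyzed here (in the form originally proposed by Chen et al.) reduces, for a fixed dictionary, to a normalized LMS update on the kernelized regressor. A minimal sketch under that assumption; the variable names are illustrative:

```python
import numpy as np

def knlms_step(alpha, k, y, eta=0.5, eps=1e-6):
    """One KNLMS update on dual coefficients alpha.

    k   : vector of kernel evaluations between the new input and the dictionary
    y   : desired output for the new sample
    eta : step size; eps regularizes the normalization
    """
    e = y - alpha @ k                          # a priori estimation error
    alpha = alpha + eta * e / (eps + k @ k) * k  # normalized gradient correction
    return alpha, e
```

Under a fixed regressor the a priori error contracts geometrically for step sizes in the stable range, which is the kind of mean / mean-square stability condition the analysis in this paper makes precise for the stochastic setting.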