A globally convergent matricial algorithm for multivariate spectral estimation
In this paper, we first describe a matricial Newton-type algorithm designed
to solve the multivariable spectrum approximation problem. We then prove its
global convergence. Finally, we apply this approximation procedure to
multivariate spectral estimation and test its effectiveness through
simulation. The simulations show that, in the case of short observation
records, this method may provide a valid alternative to standard multivariable
identification techniques such as MATLAB's PEM and N4SID.
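The general scheme behind globally convergent Newton-type methods of this kind (a Newton direction combined with a damping step that enforces progress) can be sketched in a few lines. This is only an illustrative sketch of that generic scheme, not the paper's matricial algorithm; all names and constants here are assumptions.

```python
import numpy as np

def damped_newton(grad, hess, x0, tol=1e-10, max_iter=100):
    """Generic damped Newton iteration for solving grad(x) = 0, with a
    backtracking step size that enforces a decrease of the gradient norm.
    Illustrative sketch only, not the paper's matricial algorithm."""
    x = np.asarray(x0, float)
    for _ in range(max_iter):
        g = grad(x)
        gnorm = np.linalg.norm(g)
        if gnorm < tol:
            break
        step = np.linalg.solve(hess(x), -g)      # Newton direction
        t = 1.0
        # backtrack: halve t until the gradient norm decreases sufficiently
        while t > 1e-12 and np.linalg.norm(grad(x + t * step)) > (1 - 0.25 * t) * gnorm:
            t *= 0.5
        x = x + t * step
    return x
```

On a strictly convex quadratic the full step is accepted and the method converges in a single iteration; the damping only activates far from the solution, which is what turns local quadratic convergence into a globally convergent scheme.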
Online Learning for Time Series Prediction
In this paper we address the problem of predicting a time series using the
ARMA (autoregressive moving average) model, under minimal assumptions on the
noise terms. Using regret minimization techniques, we develop effective online
learning algorithms for the prediction problem, without assuming that the noise
terms are Gaussian, identically distributed or even independent. Furthermore,
we show that our algorithm's performance asymptotically approaches the
performance of the best ARMA model in hindsight.
Comment: 17 pages, 6 figures
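The idea of online prediction without distributional assumptions on the noise can be illustrated with a short sketch: learn the coefficients of an autoregressive predictor by online gradient descent on the squared loss, updating after each observation. The model order, step size, and normalized update below are illustrative choices, not the paper's exact algorithm.

```python
import numpy as np

def arma_ogd_predict(x, m=5, eta=0.1):
    """One-step-ahead prediction with an AR(m) model whose coefficients are
    learned by online gradient descent on the squared loss.  A sketch of the
    regret-minimization idea (learning an AR approximation online instead of
    fitting the MA part); m, eta, and the normalized step are illustrative."""
    x = np.asarray(x, float)
    n = len(x)
    w = np.zeros(m)                      # AR coefficients, updated online
    preds = np.zeros(n)
    for t in range(m, n):
        past = x[t - m:t][::-1]          # the m most recent observations
        preds[t] = w @ past              # predict before seeing x[t]
        grad = 2.0 * (preds[t] - x[t]) * past   # gradient of (pred - x[t])^2
        w -= eta * grad / (1.0 + past @ past)   # normalized (NLMS-style) step
    return preds, w
```

No Gaussianity or independence of the noise is used anywhere in the update: the learner only sees past observations and its own squared prediction error, which is exactly what makes the regret-minimization viewpoint applicable.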
Extrinsic Methods for Coding and Dictionary Learning on Grassmann Manifolds
Sparsity-based representations have recently led to notable results in
various visual recognition tasks. In a separate line of research, Riemannian
manifolds have been shown useful for dealing with features and models that do
not lie in Euclidean spaces. With the aim of building a bridge between the two
realms, we address the problem of sparse coding and dictionary learning over
the space of linear subspaces, which form Riemannian structures known as
Grassmann manifolds. To this end, we propose to embed Grassmann manifolds into
the space of symmetric matrices by an isometric mapping. This in turn enables
us to extend two sparse coding schemes to Grassmann manifolds. Furthermore, we
propose closed-form solutions for learning a Grassmann dictionary, atom by
atom. Lastly, to handle non-linearity in data, we extend the proposed Grassmann
sparse coding and dictionary learning algorithms through embedding into Hilbert
spaces.
Experiments on several classification tasks (gender recognition, gesture
classification, scene analysis, face recognition, action recognition and
dynamic texture classification) show that the proposed approaches achieve
considerable improvements in discrimination accuracy, in comparison to
state-of-the-art methods such as kernelized Affine Hull Method and
graph-embedding Grassmann discriminant analysis.
Comment: Appearing in the International Journal of Computer Vision
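The embedding idea in this abstract can be illustrated in a few lines: a subspace is represented by the projection matrix of any of its orthonormal bases, which is a symmetric matrix and induces a distance between subspaces. A minimal sketch, where the 1/sqrt(2) scaling is one common convention and not necessarily the paper's:

```python
import numpy as np

def grassmann_embed(U):
    """Map the subspace span(U), with U an n x p matrix having orthonormal
    columns, to the symmetric projection matrix P = U U^T.  P depends only
    on the subspace, not on the particular basis U."""
    return U @ U.T

def projection_distance(U1, U2):
    """Distance between subspaces induced by the Frobenius norm of the
    embedded projections (the projection/chordal metric)."""
    return np.linalg.norm(grassmann_embed(U1) - grassmann_embed(U2), 'fro') / np.sqrt(2)
```

Orthonormal bases can be obtained from a QR factorization, and rotating a basis by any orthogonal p x p matrix leaves the embedding unchanged; that basis invariance is what makes sparse coding on the embedded points well defined.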
A new family of high-resolution multivariate spectral estimators
In this paper, we extend the Beta divergence family to multivariate power
spectral densities. Similarly to the scalar case, we show that it smoothly
connects the multivariate Kullback-Leibler divergence with the multivariate
Itakura-Saito distance. We subsequently study a spectrum approximation problem,
based on the Beta divergence family, which is related to a multivariate
extension of the THREE spectral estimation technique. It is then possible to
characterize a family of solutions to the problem. An upper bound on the
complexity of these solutions is also provided. Simulations suggest that
the most suitable solution in this family depends on the specific features
required of the estimation problem.
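In the scalar case, the Beta divergence family mentioned above has a standard closed form that interpolates between the two named distances. A minimal sketch, where array-valued x and y stand for sampled (positive) spectra and the thresholds for the limiting cases are implementation choices:

```python
import numpy as np

def beta_divergence(x, y, beta):
    """Scalar Beta divergence between positive arrays x and y.
    beta -> 1 recovers the Kullback-Leibler divergence and beta -> 0 the
    Itakura-Saito distance; the paper's multivariate version generalizes
    this to matrix-valued spectral densities."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    if abs(beta - 1.0) < 1e-12:
        return np.sum(x * np.log(x / y) - x + y)        # Kullback-Leibler limit
    if abs(beta) < 1e-12:
        return np.sum(x / y - np.log(x / y) - 1.0)      # Itakura-Saito limit
    return np.sum((x**beta + (beta - 1.0) * y**beta
                   - beta * x * y**(beta - 1.0)) / (beta * (beta - 1.0)))
```

Varying beta between 0 and 1 thus moves smoothly from the Itakura-Saito distance to the Kullback-Leibler divergence, which in the scalar case is the interpolation property the abstract attributes to the multivariate family.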
Statistical and Computational Tradeoff in Genetic Algorithm-Based Estimation
When a Genetic Algorithm (GA), or a stochastic algorithm in general, is
employed in a statistical problem, the result is affected both by sampling
variability, which arises because only a sample of the population is observed,
and by variability due to the stochastic elements of the algorithm. This
topic fits naturally into the framework of the statistical and computational
tradeoff, a question crucial in recent problems, in which statisticians must
carefully balance the statistical and computational parts of the analysis
under resource or time constraints. In the present work we analyze
estimation problems tackled by GAs, for which the variability of the estimates
can be decomposed into these two sources, under constraints in the form of
cost functions related both to data acquisition and to the runtime of the
algorithm. Simulation studies are presented to discuss the statistical and
computational tradeoff question.
Comment: 17 pages, 5 figures
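The two-source decomposition described above can be illustrated with a toy nested simulation: draw several samples (sampling variability), run a stochastic optimizer several times on each sample (algorithmic variability), and split the total variance of the estimates with the law of total variance. The optimizer below is a plain random search standing in for a GA; all names and settings are illustrative, not the paper's setup.

```python
import numpy as np

rng = np.random.default_rng(0)

def stochastic_estimate(sample, rng, iters=200):
    """Toy stand-in for a GA: estimate the population mean by random search,
    keeping the candidate with the lowest squared loss on the sample."""
    best, best_loss = 0.0, np.inf
    for _ in range(iters):
        cand = rng.normal(np.mean(sample), 1.0)   # random candidate
        loss = np.sum((sample - cand) ** 2)
        if loss < best_loss:
            best, best_loss = cand, loss
    return best

S, R, n = 30, 20, 50                 # samples, runs per sample, sample size
est = np.empty((S, R))
for s in range(S):
    sample = rng.normal(5.0, 2.0, size=n)             # sampling variability
    for r in range(R):
        est[s, r] = stochastic_estimate(sample, rng)  # algorithmic variability

within = est.var(axis=1).mean()      # variability due to the algorithm
between = est.mean(axis=1).var()     # variability due to sampling
total = est.var()                    # law of total variance: within + between
```

With equal numbers of runs per sample the decomposition total = within + between holds exactly, so the relative sizes of the two terms show directly how much of the estimate's variability is attributable to the algorithm rather than to the data.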