Data-adaptive harmonic spectra and multilayer Stuart-Landau models
Harmonic decompositions of multivariate time series are considered for which
we adopt an integral operator approach with periodic semigroup kernels.
Spectral decomposition theorems are derived that cover the important cases of
two-time statistics drawn from a mixing invariant measure.
The corresponding eigenvalues can be grouped per Fourier frequency, and are
actually given, at each frequency, as the singular values of a cross-spectral
matrix depending on the data. These eigenvalues obey furthermore a variational
principle that allows us to define naturally a multidimensional power spectrum.
The eigenmodes, for their part, exhibit a data-adaptive character
manifested in their phase, which in turn allows us to define a
multidimensional phase spectrum.
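The per-frequency construction described above can be illustrated with a minimal sketch: estimate lagged cross-covariances of a multivariate series, Fourier-transform them to obtain a cross-spectral matrix at each frequency, and take that matrix's singular values. This is a deliberate simplification of the DAH construction (function name, lag count, and estimator details are illustrative, not from the paper):

```python
import numpy as np

def cross_spectral_singular_values(X, n_lags=64):
    """Illustrative sketch: singular values of an estimated
    cross-spectral matrix, grouped per Fourier frequency.

    X : array of shape (n_channels, n_samples).
    Returns an array of shape (n_freqs, n_channels) whose rows hold
    the singular values at each frequency, sorted in decreasing order.
    """
    d, n = X.shape
    Xc = X - X.mean(axis=1, keepdims=True)
    # Lagged cross-covariance estimates C[k] ~ E[x(t) x(t+k)^T]
    C = np.empty((n_lags, d, d))
    for k in range(n_lags):
        C[k] = Xc[:, : n - k] @ Xc[:, k:].T / (n - k)
    # Cross-spectral matrix S(f): DFT of the lag sequence, entrywise
    S = np.fft.rfft(C, axis=0)                    # (n_freqs, d, d), complex
    # Singular values per frequency define a power-spectrum analogue
    return np.linalg.svd(S, compute_uv=False)     # (n_freqs, d)
```

For a two-channel series dominated by a single sinusoid, the leading singular value peaks near the sinusoid's frequency bin, mimicking how the DAH eigenvalues concentrate per Fourier frequency.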
The resulting data-adaptive harmonic (DAH) modes allow for reducing the
data-driven modeling effort to elemental models stacked per frequency, only
coupled at different frequencies by the same noise realization. In particular,
the DAH decomposition extracts time-dependent coefficients stacked by Fourier
frequency which can be efficiently modeled---provided the decay of temporal
correlations is sufficiently well-resolved---within a class of multilayer
stochastic models (MSMs) tailored here on stochastic Stuart-Landau oscillators.
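A single stochastic Stuart-Landau oscillator, the elemental building block mentioned above, can be sketched with a standard Euler-Maruyama integration. The equation form, parameter names, and values here are generic textbook choices, not the paper's calibrated MSLM:

```python
import numpy as np

def simulate_stuart_landau(mu=0.2, gamma=1.0, sigma=0.05,
                           dt=0.01, n_steps=20000, seed=0):
    """Euler-Maruyama integration of a stochastic Stuart-Landau
    oscillator,  dz = ((mu + i*gamma) z - |z|^2 z) dt + sigma dW,
    with z complex and dW a complex Wiener increment.
    Illustrative parameters; not the article's fitted model.
    """
    rng = np.random.default_rng(seed)
    z = np.empty(n_steps, dtype=complex)
    z[0] = 0.1 + 0.0j
    for k in range(n_steps - 1):
        drift = (mu + 1j * gamma) * z[k] - abs(z[k]) ** 2 * z[k]
        # Independent real/imaginary Gaussian increments
        dW = np.sqrt(dt) * (rng.standard_normal()
                            + 1j * rng.standard_normal())
        z[k + 1] = z[k] + drift * dt + sigma * dW
    return z
```

For mu > 0 the trajectory settles onto a noisy limit cycle of radius roughly sqrt(mu); stacking such oscillators per frequency, coupled only through shared noise, mirrors the MSLM idea.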
Applications to the Lorenz 96 model and to a stochastic heat equation driven
by space-time white noise are considered. In both cases, the DAH
decomposition allows for an extraction of spatio-temporal modes revealing key
features of the dynamics in the embedded phase space. The multilayer
Stuart-Landau models (MSLMs) are shown to successfully model the typical
patterns of the corresponding time-evolving fields, as well as their statistics
of occurrence.
Comment: 26 pages, double columns; 15 figures
Tensor Decompositions for Signal Processing Applications: From Two-way to Multiway Component Analysis
The widespread use of multi-sensor technology and the emergence of big
datasets has highlighted the limitations of standard flat-view matrix models
and the necessity to move towards more versatile data analysis tools. We show
that higher-order tensors (i.e., multiway arrays) enable such a fundamental
paradigm shift towards models that are essentially polynomial and whose
uniqueness, unlike the matrix methods, is guaranteed under very mild and natural
conditions. Benefiting from the power of multilinear algebra as their mathematical
backbone, data analysis techniques using tensor decompositions are shown to
have great flexibility in the choice of constraints that match data properties,
and to find more general latent components in the data than matrix-based
methods. A comprehensive introduction to tensor decompositions is provided from
a signal processing perspective, starting from the algebraic foundations, via
basic Canonical Polyadic and Tucker models, through to advanced cause-effect
and multi-view data analysis schemes. We show that tensor decompositions enable
natural generalizations of some commonly used signal processing paradigms, such
as canonical correlation and subspace techniques, signal separation, linear
regression, feature extraction and classification. We also cover computational
aspects, and point out how ideas from compressed sensing and scientific
computing may be used for addressing the otherwise unmanageable storage and
manipulation problems associated with big datasets. The concepts are supported
by illustrative real world case studies illuminating the benefits of the tensor
framework, as efficient and promising tools for modern signal processing, data
analysis and machine learning applications; these benefits also extend to
vector/matrix data through tensorization.
Keywords: ICA, NMF, CPD, Tucker decomposition, HOSVD, tensor networks, Tensor Train
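The Canonical Polyadic decomposition named above can be illustrated with the standard alternating-least-squares (ALS) scheme for a 3-way tensor. This is the textbook algorithm, sketched minimally, not the article's specific implementation:

```python
import numpy as np

def cpd_als(T, rank, n_iter=100, seed=0):
    """Rank-R canonical polyadic decomposition of a 3-way tensor by
    alternating least squares. Returns factor matrices A, B, C with
    T ~ sum_r A[:, r] (outer) B[:, r] (outer) C[:, r].
    Minimal sketch: no normalization, stopping rule, or regularization.
    """
    rng = np.random.default_rng(seed)
    I, J, K = T.shape
    A = rng.standard_normal((I, rank))
    B = rng.standard_normal((J, rank))
    C = rng.standard_normal((K, rank))
    # Mode-n unfoldings (C-order: mode-1 columns indexed by j*K + k, etc.)
    T1 = T.reshape(I, J * K)
    T2 = np.moveaxis(T, 1, 0).reshape(J, I * K)
    T3 = np.moveaxis(T, 2, 0).reshape(K, I * J)
    # Column-wise Khatri-Rao product of (m, R) and (n, R) -> (m*n, R)
    kr = lambda U, V: (U[:, None, :] * V[None, :, :]).reshape(-1, rank)
    for _ in range(n_iter):
        A = T1 @ np.linalg.pinv(kr(B, C)).T
        B = T2 @ np.linalg.pinv(kr(A, C)).T
        C = T3 @ np.linalg.pinv(kr(A, B)).T
    return A, B, C
```

On an exactly low-rank tensor, ALS typically recovers a factorization with near-zero reconstruction error; uniqueness of the factors (up to scaling and permutation) under mild conditions is the key property the abstract contrasts with matrix methods.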