A three domain covariance framework for EEG/MEG data
In this paper we introduce a covariance framework for the analysis of EEG and
MEG data that takes into account observed temporal stationarity on small time
scales and trial-to-trial variations. We formulate a model for the covariance
matrix, which is a Kronecker product of three components that correspond to
space, time and epochs/trials, and consider maximum likelihood estimation of
the unknown parameter values. An iterative algorithm that finds approximations
of the maximum likelihood estimates is proposed. We perform a simulation study
to assess the performance of the estimator and investigate the influence of
different assumptions about the covariance factors on the estimated covariance
matrix and on its components. Apart from that, we illustrate our method on real
EEG and MEG data sets.
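The three-factor Kronecker structure described above can be sketched in a few lines of NumPy. This is not the paper's iterative maximum likelihood estimator, only an illustration of the covariance model itself; the dimensions and the component matrices are hypothetical, chosen small for clarity:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: sensors (space), samples per epoch (time),
# and epochs/trials.
n_space, n_time, n_trial = 4, 5, 3

def random_spd(n, rng):
    """Random symmetric positive-definite matrix (a stand-in factor)."""
    a = rng.standard_normal((n, n))
    return a @ a.T + n * np.eye(n)

C_space = random_spd(n_space, rng)   # spatial covariance factor
C_time  = random_spd(n_time, rng)    # temporal covariance factor
C_trial = random_spd(n_trial, rng)   # epoch/trial covariance factor

# Covariance of the vectorized data: trials (x) time (x) space.
C_full = np.kron(C_trial, np.kron(C_time, C_space))

# The factorization also factorizes the Cholesky decomposition:
# kron of lower-triangular factors is a lower-triangular factor of
# the full covariance, so sampling never requires forming C_full.
L = np.kron(np.linalg.cholesky(C_trial),
            np.kron(np.linalg.cholesky(C_time),
                    np.linalg.cholesky(C_space)))
x = L @ rng.standard_normal(C_full.shape[0])
```

This separability is what makes estimation tractable: each factor can be updated in turn (as in the paper's iterative algorithm) instead of estimating one unstructured 60 x 60 matrix here.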
The proposed covariance model is applicable in a variety of cases where
spontaneous EEG or MEG acts as source of noise and realistic noise covariance
estimates are needed for accurate dipole localization, such as in evoked
activity studies, or where the properties of spontaneous EEG or MEG are
themselves the topic of interest, such as in combined EEG/fMRI experiments in
which the correlation between EEG and fMRI signals is investigated.Comment: 25 pages, 8 figures, 1 tabl
Do We Really Need Both BEKK and DCC? A Tale of Two Multivariate GARCH Models
The management and monitoring of very large portfolios of financial assets are routine for many individuals and organizations. The two most widely used models of conditional covariances and correlations in the class of multivariate GARCH models are BEKK and DCC. It is well known that BEKK suffers from the archetypal "curse of dimensionality", whereas DCC does not. It is argued in this paper that this is a misleading interpretation of the suitability of the two models for use in practice. The primary purpose of this paper is to analyze the similarities and dissimilarities between BEKK and DCC, both with and without targeting, on the basis of the structural derivation of the models, the availability of analytical forms for the sufficient conditions for existence of moments, sufficient conditions for consistency and asymptotic normality of the appropriate estimators, and computational tractability for ultra large numbers of financial assets. Based on theoretical considerations, the paper sheds light on how to discriminate between BEKK and DCC in practical applications.
Keywords: forecasting; conditional correlations; Hadamard models; conditional covariances; diagonal models; generalized models; scalar models; targeting
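To make the targeting idea concrete, here is a minimal sketch of the scalar BEKK recursion with covariance targeting, where the long-run target is set to the sample covariance. The parameter values are illustrative, not estimates, and the function name is hypothetical:

```python
import numpy as np

def scalar_bekk(returns, a=0.05, b=0.90):
    """Scalar BEKK recursion with covariance targeting:

        H_t = (1 - a - b) * S + a * r_{t-1} r_{t-1}' + b * H_{t-1},

    where S is the sample covariance (the targeting matrix).
    With targeting, only the two scalars (a, b) remain to be
    estimated, regardless of the number of assets."""
    T, n = returns.shape
    S = np.cov(returns, rowvar=False)   # targeting matrix
    H = np.empty((T, n, n))
    H[0] = S
    for t in range(1, T):
        r = returns[t - 1]
        H[t] = (1 - a - b) * S + a * np.outer(r, r) + b * H[t - 1]
    return H

# Simulated returns for three hypothetical assets.
rng = np.random.default_rng(0)
returns = 0.01 * rng.standard_normal((200, 3))
H = scalar_bekk(returns)
```

Since (1 - a - b) S is positive definite and the other two terms are positive semi-definite, every H_t is positive definite by construction, which is one reason the scalar-with-targeting form scales to large asset counts.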
Do We Really Need Both BEKK and DCC? A Tale of Two Covariance Models
Large and very large portfolios of financial assets are routine for many individuals and organizations. The two most widely used models of conditional covariances and correlations are BEKK and DCC. BEKK suffers from the archetypal "curse of dimensionality", whereas DCC does not; it is argued here that this is a misleading interpretation of the suitability of the two models for use in practice. The primary purposes of the paper are: to define targeting as an aid in estimating matrices associated with large numbers of financial assets; to analyze the similarities and dissimilarities between BEKK and DCC, both with and without targeting, on the basis of structural derivation, the analytical forms of the sufficient conditions for the existence of moments, the sufficient conditions for consistency and asymptotic normality, and computational tractability for very large (that is, ultra high) numbers of financial assets; to present a consistent two-step estimation method for the DCC model; and to determine whether BEKK or DCC should be preferred in practical applications.
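The two-step DCC logic mentioned above can be sketched as follows: first filter each asset's conditional variance with a univariate GARCH(1,1), then run the DCC recursion on the standardized residuals. All parameter values below are fixed illustrative numbers (a real application would estimate them by quasi-maximum likelihood), and the function names are hypothetical:

```python
import numpy as np

def garch11_filter(r, omega, alpha, beta):
    """Step 1 (per asset): GARCH(1,1) conditional-variance filter
    with fixed illustrative parameters."""
    h = np.empty_like(r)
    h[0] = r.var()
    for t in range(1, len(r)):
        h[t] = omega + alpha * r[t - 1] ** 2 + beta * h[t - 1]
    return h

def dcc_correlations(returns, a=0.02, b=0.95):
    """Step 2: DCC recursion on standardized residuals z_t:

        Q_t = (1 - a - b) Qbar + a z_{t-1} z_{t-1}' + b Q_{t-1},
        R_t = diag(Q_t)^{-1/2} Q_t diag(Q_t)^{-1/2}."""
    T, n = returns.shape
    h = np.column_stack([garch11_filter(returns[:, i], 5e-6, 0.05, 0.90)
                         for i in range(n)])
    z = returns / np.sqrt(h)            # standardized residuals
    Qbar = np.cov(z, rowvar=False)      # correlation target
    Q = Qbar.copy()
    R = np.empty((T, n, n))
    for t in range(T):
        if t > 0:
            zl = z[t - 1]
            Q = (1 - a - b) * Qbar + a * np.outer(zl, zl) + b * Q
        d = 1.0 / np.sqrt(np.diag(Q))
        R[t] = Q * np.outer(d, d)       # rescale Q to a correlation matrix
    return R

# Simulated returns for two hypothetical assets.
rng = np.random.default_rng(0)
returns = 0.01 * rng.standard_normal((300, 2))
R = dcc_correlations(returns)
```

Because each step involves only univariate filters plus a scalar recursion, the computation scales far more gently in the number of assets than an unrestricted BEKK, which is the practical trade-off the paper dissects.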
Neural Connectivity with Hidden Gaussian Graphical State-Model
Noninvasive procedures for estimating neural connectivity are under question.
Theoretical models hold that the electromagnetic field registered at external
sensors is elicited by currents in neural space. What we observe at the sensor
space, however, is a superposition of fields projected from the whole gray
matter. This is the reason for a major pitfall of noninvasive
electrophysiology methods: distorted reconstruction of neural activity and of
its connectivity, known as leakage. It has been shown that current methods
produce incorrect connectomes. Related to this incorrect connectivity
modelling, they disregard both Systems Theory and Bayesian Information Theory.
We introduce a new formalism that accounts for this, the Hidden Gaussian
Graphical State-Model (HIGGS): a neural Gaussian Graphical Model (GGM) hidden
by the observation equation of magneto/electroencephalographic (MEEG) signals.
HIGGS is equivalent to a frequency-domain Linear State Space Model (LSSM), but
with a sparse connectivity prior. The mathematical contribution here is the
theory for high-dimensional, frequency-domain HIGGS solvers. We demonstrate
that HIGGS can attenuate the leakage effect in the most critical case: the
distortion of the EEG signal due to head volume conduction heterogeneities.
Its application in EEG is illustrated with connectivity patterns retrieved
from human Steady State Visual Evoked Potentials (SSVEP). We provide, for the
first time, confirmatory evidence for noninvasive procedures of neural
connectivity estimation: concurrent EEG and Electrocorticography (ECoG)
recordings in monkeys. Open source packages are freely available online to
reproduce the results presented in this paper and to analyze external MEEG
databases.
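The leakage problem the abstract describes, a sparse source-level graphical model hidden behind a linear observation equation, can be illustrated with a toy simulation. This is not the HIGGS solver; the lead field, graph, and noise level below are all hypothetical, and the point is only that sensor-space covariance mixes the projected source fields:

```python
import numpy as np

rng = np.random.default_rng(1)
n_src, n_sens, n_samp = 6, 4, 5000

# Sparse source precision matrix: a chain graph, so only neighboring
# sources are conditionally dependent (a toy Gaussian graphical model).
K = 2.0 * np.eye(n_src)
for i in range(n_src - 1):
    K[i, i + 1] = K[i + 1, i] = -0.6
C_src = np.linalg.inv(K)

# Hypothetical lead field projecting source currents to sensors, plus
# sensor noise -- the observation equation that hides the GGM.
L = rng.standard_normal((n_sens, n_src))
x = rng.multivariate_normal(np.zeros(n_src), C_src, size=n_samp)
y = x @ L.T + 0.1 * rng.standard_normal((n_samp, n_sens))

# Sensor-space covariance is L C_src L' + noise: a superposition of
# projected fields that no longer reflects the sparse source graph.
C_sens = np.cov(y, rowvar=False)
```

Recovering the sparse precision K from C_sens is exactly the ill-posed inverse problem that motivates placing the sparsity prior on the hidden source graph rather than on sensor-space quantities.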
The cut-sky cosmic microwave background is not anomalous
The observed angular correlation function of the cosmic microwave background
has previously been reported to be anomalous, particularly when measured in
regions of the sky uncontaminated by Galactic emission. Recent work by
Efstathiou et al. presents a Bayesian comparison of isotropic theories, casting
doubt on the significance of the purported anomaly. We extend this analysis to
all anisotropic Gaussian theories with vanishing mean (⟨δT⟩ = 0), using
the much wider class of models to confirm that the anomaly is not likely to
point to new physics. On the other hand if there is any new physics to be
gleaned, it results from low-l alignments which will be better quantified by a
full-sky statistic.
We also consider quadratic maximum likelihood power spectrum estimators that
are constructed assuming isotropy. The underlying assumptions are therefore
false if the ensemble is anisotropic. Nonetheless we demonstrate that, for
theories compatible with the observed sky, these estimators (while no longer
optimal) remain statistically superior to pseudo-C_l power spectrum estimators.
Comment: PRD in press. Extremely minor updates, mirroring typographical changes made in proof
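The contrast between pseudo-spectrum and quadratic maximum likelihood estimators comes down to how a sky cut couples power between multipoles. A one-dimensional Fourier analogue (hypothetical numbers throughout, and a line mask standing in for the Galactic cut) shows the coupling that a pseudo-C_l estimator must deconvolve:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 256

# Toy 1-D analogue of the cut sky: a signal with a single sharp
# spectral line at k0, observed through a mask with a gap.
k0 = 20
phase = rng.uniform(0, 2 * np.pi)
signal = np.cos(2 * np.pi * k0 * np.arange(n) / n + phase)
mask = np.ones(n)
mask[100:140] = 0.0          # the "Galactic cut"

full_power = np.abs(np.fft.rfft(signal)) ** 2 / n
cut_power = np.abs(np.fft.rfft(signal * mask)) ** 2 / n

# Masking convolves the spectrum with the mask's spectrum: power
# leaks from k0 into neighbouring modes. A pseudo-spectrum estimator
# corrects this coupling after the fact, whereas a QML estimator
# builds the mask into its weighting from the start.
```

On the full "sky" all the power sits in the single mode k0; on the cut "sky" the neighbouring modes acquire spurious power, which is the 1-D counterpart of the low-l mode coupling discussed above.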