The MVGC multivariate Granger causality toolbox: a new approach to Granger-causal inference
Background: Wiener-Granger causality (“G-causality”) is a statistical notion of causality applicable to time series data, whereby cause precedes, and helps predict, effect. It is defined in both time and frequency domains, and allows for the conditioning out of common causal influences. Originally developed in the context of econometric theory, it has since achieved broad application in the neurosciences and beyond. Prediction in the G-causality formalism is based on VAR (Vector AutoRegressive) modelling.
New Method: The MVGC Matlab© Toolbox approach to G-causal inference is based on multiple equivalent representations of a VAR model by (i) regression parameters, (ii) the autocovariance sequence and (iii) the cross-power spectral density of the underlying process. It features a variety of algorithms for moving between these representations, enabling selection of the most suitable algorithms with regard to computational efficiency and numerical accuracy.
Results: In this paper we explain the theoretical basis, computational strategy and application to empirical G-causal inference of the MVGC Toolbox. We also show via numerical simulations the advantages of our Toolbox over previous methods in terms of computational accuracy and statistical inference.
Comparison with Existing Method(s): The standard method of computing G-causality involves estimation of parameters for both a full and a nested (reduced) VAR model. The MVGC approach, by contrast, avoids explicit estimation of the reduced model, thus eliminating a source of estimation error and improving statistical power, and in addition facilitates fast and accurate estimation of the computationally awkward case of conditional G-causality in the frequency domain.
Conclusions: The MVGC Toolbox implements a flexible, powerful and efficient approach to G-causal inference.
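As a hedged illustration of the baseline the abstract compares against (not the MVGC Toolbox's own algorithm, which deliberately avoids fitting the reduced model), the standard two-regression approach to time-domain G-causality can be sketched in Python. The toy data-generating process and all variable names are assumptions for the example:

```python
import numpy as np

def granger_causality(x, y, p=2):
    """Time-domain G-causality from y to x: GC = ln(var_reduced / var_full),
    comparing the residual variance of a model predicting x from its own past
    (reduced) against one that also uses the past of y (full)."""
    T = len(x)
    X_full, X_red, target = [], [], []
    for t in range(p, T):
        past_x = x[t - p:t][::-1]   # x_{t-1}, ..., x_{t-p}
        past_y = y[t - p:t][::-1]
        X_full.append(np.concatenate([past_x, past_y]))
        X_red.append(past_x)
        target.append(x[t])
    X_full, X_red, target = map(np.asarray, (X_full, X_red, target))
    b_full, *_ = np.linalg.lstsq(X_full, target, rcond=None)
    b_red, *_ = np.linalg.lstsq(X_red, target, rcond=None)
    var_full = np.var(target - X_full @ b_full)
    var_red = np.var(target - X_red @ b_red)
    return np.log(var_red / var_full)

# Toy system: y drives x with a one-step lag, but not vice versa.
rng = np.random.default_rng(0)
n = 2000
y = rng.standard_normal(n)
x = np.zeros(n)
for t in range(1, n):
    x[t] = 0.5 * x[t - 1] + 0.4 * y[t - 1] + 0.1 * rng.standard_normal()

gc_y_to_x = granger_causality(x, y)  # substantially positive
gc_x_to_y = granger_causality(y, x)  # near zero
```

Because the reduced model here is estimated separately, its estimation error enters the GC statistic directly; the abstract's point is that the MVGC representations make this second regression unnecessary.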
Keywords: Granger causality, vector autoregressive modelling, time series analysis
Constrained structure of ancient Chinese poetry facilitates speech content grouping
Ancient Chinese poetry is constituted by structured language that deviates from ordinary language usage [1, 2]; its poetic genres impose unique combinatory constraints on linguistic elements [3]. How does the constrained poetic structure facilitate speech segmentation when common linguistic [4, 5, 6, 7, 8] and statistical cues [5, 9] are unreliable to listeners in poems? We generated artificial Jueju, which arguably has the most constrained structure in ancient Chinese poetry, and presented each poem twice as an isochronous sequence of syllables to native Mandarin speakers while conducting magnetoencephalography (MEG) recording. We found that listeners deployed their prior knowledge of Jueju to build the line structure and to establish the conceptual flow of Jueju. Unprecedentedly, we found a phase precession phenomenon indicating predictive processes of speech segmentation—the neural phase advanced faster after listeners acquired knowledge of incoming speech. The statistical co-occurrence of monosyllabic words in Jueju negatively correlated with speech segmentation, which provides an alternative perspective on how statistical cues facilitate speech segmentation. Our findings suggest that constrained poetic structures serve as a temporal map for listeners to group speech contents and to predict incoming speech signals. Listeners can parse speech streams by using not only grammatical and statistical cues but also their prior knowledge of the form of language.
On a class of minimum contrast estimators for Gegenbauer random fields
The article introduces spatial long-range dependent models based on the fractional difference operators associated with the Gegenbauer polynomials. Results on the consistency and asymptotic normality of a class of minimum contrast estimators of the long-range dependence parameters of the models are obtained. A methodology to verify the assumptions for consistency and asymptotic normality of minimum contrast estimators is developed. Numerical results are presented to confirm the theoretical findings. Comment: 23 pages, 8 figures
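For readers unfamiliar with Gegenbauer-type long memory, here is a minimal sketch (not the paper's estimation method) of simulating such a process by truncating its MA(∞) representation. The filter (1 − 2uB + B²)^(−d) has coefficients given by the Gegenbauer polynomial recursion; the function names and simulation design are assumptions for illustration:

```python
import numpy as np

def gegenbauer_coeffs(d, u, n):
    """Coefficients psi_j in (1 - 2*u*z + z^2)^(-d) = sum_j psi_j z^j,
    computed with the standard Gegenbauer polynomial three-term recursion."""
    psi = np.zeros(n)
    psi[0] = 1.0
    if n > 1:
        psi[1] = 2.0 * d * u
    for j in range(2, n):
        psi[j] = (2.0 * u * (j + d - 1.0) * psi[j - 1]
                  - (j + 2.0 * d - 2.0) * psi[j - 2]) / j
    return psi

def simulate_gegenbauer(T, d, u, burn=500, seed=0):
    """Truncated MA(inf) simulation of a Gegenbauer process: long-range
    dependence with a spectral peak at frequency arccos(u)."""
    rng = np.random.default_rng(seed)
    psi = gegenbauer_coeffs(d, u, T + burn)
    eps = rng.standard_normal(2 * (T + burn))
    x = np.convolve(eps, psi, mode="valid")[:T + burn]
    return x[burn:burn + T]

# Sanity check: at u = 1 the filter collapses to (1 - z)^(-2d), i.e.
# ordinary fractional integration with parameter 2d, so for d = 0.2 the
# first coefficients match Gamma(j + 0.4) / (Gamma(0.4) * Gamma(j + 1)).
psi = gegenbauer_coeffs(0.2, 1.0, 3)   # [1.0, 0.4, 0.28]
```

The u = 1 special case is a convenient unit test, since it reduces the seasonal/cyclical filter to the familiar fractional difference operator.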
Nonfractional Memory: Filtering, Antipersistence, and Forecasting
The fractional difference operator remains the most popular mechanism for generating long memory, owing to the existence of efficient algorithms for its simulation and forecasting. Nonetheless, there is no theoretical argument linking the fractional difference operator with the presence of long memory in real data. In this regard, one of the most prominent theoretical explanations for the presence of long memory is the cross-sectional aggregation of persistent micro units. Yet the type of process obtained by cross-sectional aggregation differs from the one due to fractional differencing. This paper therefore develops fast algorithms to generate and forecast long memory by cross-sectional aggregation. Moreover, it is shown that the antipersistent phenomenon that arises for negative degrees of memory in the fractional difference literature is not present for cross-sectionally aggregated processes. Pointedly, while the autocorrelations for the fractional difference operator are negative for negative degrees of memory by construction, this restriction does not apply to the cross-sectional aggregation scheme. We show that this has implications for long memory tests in the frequency domain, which will be misspecified for cross-sectionally aggregated processes with negative degrees of memory. Finally, we assess the forecast performance of high-order AR and ARFIMA models when the long memory series are generated by cross-sectional aggregation. Our results are of interest to practitioners developing forecasts of long memory variables such as inflation, volatility, and climate data, where aggregation may be the source of long memory.
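The cross-sectional aggregation mechanism referred to above can be sketched in a few lines. This is the naive Granger (1980)-style construction, not the fast algorithms the paper develops; the Beta parameters, burn-in, and sample sizes are assumptions chosen for illustration:

```python
import numpy as np

def aggregate_ar1(T, N, a=1.0, b=1.5, burn=500, seed=0):
    """Cross-sectional aggregation: average N independent AR(1) micro units
    whose squared persistence coefficients are Beta(a, b) distributed.
    The aggregate exhibits long memory without any fractional differencing."""
    rng = np.random.default_rng(seed)
    alphas = np.sqrt(rng.beta(a, b, size=N))   # persistence in [0, 1)
    u = np.zeros(N)
    out = np.empty(T + burn)
    for t in range(T + burn):
        u = alphas * u + rng.standard_normal(N)  # one step for every unit
        out[t] = u.mean()
    return out[burn:]

def sample_acf(x, lag):
    """Biased sample autocorrelation at a given lag."""
    x = x - x.mean()
    return np.dot(x[:-lag], x[lag:]) / np.dot(x, x)

# Autocorrelations of the aggregate stay positive and decay slowly
# (hyperbolically), unlike the geometric decay of any single AR(1) unit.
x = aggregate_ar1(T=5000, N=200)
```

Because every micro unit here has a nonnegative autocorrelation function, the aggregate cannot be antipersistent by construction, which is the contrast with negative-d fractional differencing that the abstract highlights.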
Space exploration: The interstellar goal and Titan demonstration
Automated interstellar space exploration is reviewed. The Titan demonstration mission is discussed. Remote sensing and automated modeling are considered. Nuclear electric propulsion, a main orbiting spacecraft, a lander/rover, subsatellites, atmospheric probes, powered air vehicles, and a surface science network comprise the mission component concepts. Machine intelligence in space exploration is discussed.
Long memory and volatility dynamics in the US Dollar exchange rate
This paper focuses on nominal exchange rates, specifically the US dollar rate vis-à-vis the Euro and the Japanese Yen at a daily frequency. We model both absolute values of returns and squared returns using long-memory techniques, being particularly interested in volatility modelling and forecasting given their importance for FOREX dealers. Compared with previous studies using a standard fractional integration framework, such as Granger and Ding (1996), we estimate a more general model which allows for dependence not only at the zero frequency but also at other frequencies. The results show differences in the behaviour of the two series: a long-memory cyclical model and a standard I(d) model seem to be the most appropriate for the US dollar rate vis-à-vis the Euro and the Japanese Yen, respectively.
Selection of the number of frequencies using bootstrap techniques in log-periodogram regression
The choice of the bandwidth in the local log-periodogram regression is of crucial importance for estimation of the memory parameter of a long memory time series. Different choices may give rise to completely different estimates, which may lead to contradictory conclusions, for example about the stationarity of the series. We propose here a data-driven bandwidth selection strategy that is based on minimizing a bootstrap approximation of the mean squared error, and compare its performance with other existing techniques for optimal bandwidth selection in a mean squared error sense, revealing its better performance in a wider class of models. The empirical applicability of the proposed strategy is shown with two examples: the Nile river annual minimum levels, widely analyzed in the long memory context, and the input gas rate series of Box and Jenkins.
Keywords: bootstrap, long memory, log-periodogram regression, bandwidth selection