Recurrence Network Analysis of EEG Signals: A Geometric Approach

Abstract

Understanding the neuronal dynamics of dynamical diseases like epilepsy is of fundamental importance. For instance, establishing the presence of deterministic chaos can open up possibilities that may lead to potential medical applications, including the timely prevention of seizures. Additionally, understanding the dynamics of interictal activity can greatly aid the localization of epileptic foci without the need for recording seizures. Recurrences, a fundamental property of dynamical systems, are useful for characterizing nonlinear systems. Recurrence networks, obtained by reinterpreting the recurrence matrix as the adjacency matrix of a complex network, are useful for characterizing the structural or geometric properties of the underlying system. Recurrence network analysis has established itself as a versatile tool in the field of nonlinear time series analysis, yet its applicability to investigating neural dynamics remains unexplored. Certain recurrence network measures are particularly sensitive to the presence of unstable periodic orbits (UPOs), which are important for detecting determinism and form the backbone of chaotic attractors.

In this thesis, we introduce recurrence network analysis as a tool for nonlinear time series analysis of epileptic electroencephalographic (EEG) signals. We present novel results based on the application of recurrence network analysis, combined with surrogate testing, to intracranial and extracranial epileptic EEG signals. In addition, using paradigmatic examples of dynamical systems, we present theoretical results exploring the effect of increasing noise levels on recurrence network measures.

Using paradigmatic model systems, we first demonstrate that recurrence network measures can distinguish between deterministic (chaotic) and stochastic processes, even at short data lengths (≈ 200 samples). Specifically, our results from theoretical simulations show that transitivity, local clustering coefficient, assortativity, and betweenness centrality can successfully distinguish between deterministic chaotic and stochastic processes (after additional embedding), owing to their sensitivity to the presence of UPOs. Our results also show that recurrence network measures such as transitivity and average path length are robust against noise and perform better than the complexity-entropy plane method at short data lengths. Furthermore, our results show that the effect of noise on the recurrence network measures can be minimized by increasing the recurrence rate.
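As a point of reference (not taken from the thesis), the following minimal Python sketch illustrates how such measures can be computed: it delay-embeds a short scalar series, thresholds pairwise distances at a fixed recurrence rate to obtain the recurrence (adjacency) matrix, and evaluates transitivity and average path length with networkx. The logistic-map data, embedding parameters, and recurrence rate are illustrative assumptions rather than the settings used in this work.

```python
import numpy as np
import networkx as nx

def embed(x, dim=3, tau=1):
    """Time-delay embedding of a scalar series into dim-dimensional state vectors."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau:i * tau + n] for i in range(dim)])

def recurrence_network(x, dim=3, tau=1, recurrence_rate=0.05):
    """Epsilon-recurrence network: link state pairs closer than a threshold chosen
    so that a fixed fraction (the recurrence rate) of all pairs are connected."""
    states = embed(np.asarray(x, dtype=float), dim, tau)
    # Pairwise Euclidean distances between embedded state vectors
    dists = np.linalg.norm(states[:, None, :] - states[None, :, :], axis=-1)
    upper = dists[np.triu_indices_from(dists, k=1)]
    eps = np.quantile(upper, recurrence_rate)  # threshold fixed by the recurrence rate
    adjacency = (dists <= eps).astype(int)
    np.fill_diagonal(adjacency, 0)             # A_ij = Theta(eps - ||x_i - x_j||) - delta_ij
    return nx.from_numpy_array(adjacency)

# Illustrative data: ~200 samples of the fully chaotic logistic map
# (an assumption, not the thesis's data)
x = np.empty(200)
x[0] = 0.4
for i in range(1, len(x)):
    x[i] = 4.0 * x[i - 1] * (1.0 - x[i - 1])

G = recurrence_network(x, dim=3, tau=1, recurrence_rate=0.05)
print("transitivity:", nx.transitivity(G))
# Average path length is only defined on a connected graph, so use the largest component
giant = G.subgraph(max(nx.connected_components(G), key=len))
print("average path length:", nx.average_shortest_path_length(giant))
```

Fixing the recurrence rate rather than the threshold itself mirrors the observation above that the influence of noise on the network measures can be reduced by controlling the recurrence rate.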
For the analysis of real-world data such as EEG signals, we combined the recurrence network approach with surrogate data testing to assess the structural complexity of healthy and epileptic EEG signals. Here, our results point to an increasing complexity of EEG recordings when moving from healthy to epileptic conditions. Furthermore, we used both univariate network measures and bivariate cross-network measures to distinguish between the structural properties of interictal EEG signals recorded from epileptic and nonepileptic brain areas. These results clearly demonstrate that interictal EEG signals recorded from epileptic areas are more deterministic and interdependent than interictal activity recorded from nonepileptic areas.

Finally, we show that recurrence network analysis can be applied to uncover dynamical transitions in neural signals using short segments of data (≈ 150 to 500 samples). To demonstrate this, we used two kinds of neural data: epileptic EEG data and local field potential (LFP) signals recorded during a visuomotor task. We observed that the temporal fluctuations in the recurrence network measures are consistent with the dynamical transitions underlying the epileptic and task-based LFP signals.

To conclude, recurrence network analysis can capture the complexity in the organization of EEG data across different dynamical states in a more elaborate fashion than other approaches such as nonlinear prediction error or correlation dimension. By means of the recurrence network measures, this difference can be assessed not only qualitatively (as when using tests for nonlinearity) but also quantitatively. Thus, coupled with its ability to operate on short window sizes and its robustness to noise, recurrence network analysis can be a powerful tool for analyzing the dynamics of multi-scale neural signals.
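As a hedged illustration of the sliding-window idea mentioned above, the sketch below tracks transitivity over short, overlapping segments of a signal. It assumes the recurrence_network() helper from the previous sketch and a hypothetical 1-D `signal` array; the window length and step size are illustrative choices, not the parameters used in the thesis.

```python
import networkx as nx

def windowed_transitivity(signal, window=300, step=50, dim=3, tau=1, recurrence_rate=0.05):
    """Transitivity of the recurrence network in short, overlapping windows.
    Reuses the recurrence_network() helper defined in the previous sketch."""
    results = []
    for start in range(0, len(signal) - window + 1, step):
        G = recurrence_network(signal[start:start + window], dim, tau, recurrence_rate)
        results.append((start, nx.transitivity(G)))
    return results

# Hypothetical usage, assuming `signal` holds a 1-D EEG or LFP segment:
# for start, T in windowed_transitivity(signal, window=300, step=50):
#     print(start, T)
```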
