
    Review of analytical instruments for EEG analysis

    Since it was first used in 1926, EEG has been one of the most useful instruments of neuroscience. To begin working with EEG data, we need not only the EEG apparatus but also analytical tools and the skills to understand what our data mean. This article describes several classical analytical tools, as well as a new one that appeared only a few years ago. We hope it will be useful for researchers who have only recently started working in the field of cognitive EEG.

    On the methodological unification in electroencephalography

    BACKGROUND: This paper presents the results of a pursuit of a repeatable and objective methodology for the analysis of electroencephalographic (EEG) time series. METHODS: Adaptive time-frequency approximations of EEG are discussed in the light of the available experimental and theoretical evidence, and of their applicability in various experimental and clinical setups. RESULTS: Four lemmas and three conjectures support the following conclusion. CONCLUSION: Adaptive time-frequency approximations of signals unify most of the univariate computational approaches to EEG analysis, and offer compatibility with the traditional (visual) analysis used in clinical applications.
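
    Adaptive time-frequency approximations of this kind are commonly computed with the matching pursuit algorithm, which greedily decomposes a signal over a redundant dictionary of Gabor atoms. Below is a minimal sketch of that idea; the coarse parameter grid and the toy signal are illustrative assumptions, not the methodology the paper evaluates.

        import numpy as np

        def gabor_atom(n, center, width, freq):
            """Unit-norm Gabor atom: Gaussian envelope times a cosine."""
            t = np.arange(n)
            g = np.exp(-0.5 * ((t - center) / width) ** 2) * np.cos(2 * np.pi * freq * t)
            return g / np.linalg.norm(g)

        def matching_pursuit(signal, n_iter=5):
            """Greedily subtract the best-matching atom from the residual."""
            n = len(signal)
            # Coarse, illustrative grid of atom parameters.
            centers = np.linspace(0, n - 1, 16)
            widths = (8, 16, 32, 64)
            freqs = np.linspace(0.01, 0.25, 24)
            residual = signal.astype(float).copy()
            atoms = []
            for _ in range(n_iter):
                best = None
                for c in centers:
                    for w in widths:
                        for f in freqs:
                            atom = gabor_atom(n, c, w, f)
                            coeff = residual @ atom
                            if best is None or abs(coeff) > abs(best[0]):
                                best = (coeff, atom, (c, w, f))
                coeff, atom, params = best
                residual -= coeff * atom
                atoms.append((coeff, params))
            return atoms, residual

        rng = np.random.default_rng(0)
        sig = 3 * gabor_atom(256, 100, 20, 0.1) + 0.1 * rng.standard_normal(256)
        atoms, residual = matching_pursuit(sig)
        print("residual energy:", np.sum(residual ** 2))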

    P300-Based BCI Mouse With Genetically-Optimized Analogue Control

    In this paper we propose a brain-computer interface (BCI) mouse based on P300 waves in electroencephalogram (EEG) signals. The system is analogue in that at no point is a binary decision made as to whether or not a P300 was actually produced in response to the stimuli. Instead, the 2-D motion of the pointer on the screen, using a novel BCI paradigm, is controlled by directly combining the amplitudes of the output produced by a filter in the presence of different stimuli. This filter, and the features to be combined within it, are optimised by an evolutionary algorithm. © 2006 IEEE
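
    The analogue-control idea can be pictured as a weighted vector sum: each directional stimulus pulls the pointer in its direction, in proportion to the filter's scalar response to that stimulus. In the sketch below, the filter weights, epoch shapes and stimulus geometry are illustrative stand-ins for the evolutionarily optimised filter the paper describes.

        import numpy as np

        # Unit vectors for four directional stimuli: right, up, left, down.
        DIRECTIONS = np.array([[1, 0], [0, 1], [-1, 0], [0, -1]], dtype=float)

        def pointer_velocity(epochs, weights):
            """epochs: (4, n_channels, n_samples) EEG epochs, one per stimulus.
            weights: (n_channels, n_samples) linear filter (assumed pre-trained).
            No binary P300 decision is made: the pointer moves along the sum of
            stimulus directions, each weighted by the filter's response."""
            scores = np.array([np.sum(e * weights) for e in epochs])
            return scores @ DIRECTIONS

        rng = np.random.default_rng(1)
        weights = 0.1 * rng.standard_normal((8, 64))   # stand-in filter
        epochs = rng.standard_normal((4, 8, 64))       # stand-in EEG epochs
        print("pointer velocity (dx, dy):", pointer_velocity(epochs, weights))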

    A maximum likelihood based technique for validating detrended fluctuation analysis (ML-DFA)

    Detrended Fluctuation Analysis (DFA) is widely used to assess the presence of long-range temporal correlations in time series. Signals with long-range temporal correlations are typically defined as having a power-law decay in their autocorrelation function. The output of DFA is an exponent: the slope obtained by linear regression on a log-log plot of fluctuation magnitude against window size. However, if this fluctuation plot is not linear, then the underlying signal is not self-similar and the exponent has no meaning. There is currently no method for assessing the linearity of a DFA fluctuation plot. Here we present such a technique, called ML-DFA. We scale the DFA fluctuation plot to construct a likelihood function for a set of alternative models including polynomial, root, exponential, logarithmic and spline functions. We use this likelihood function to determine the maximum likelihood and thus to calculate values of the Akaike and Bayesian information criteria, which identify the best-fit model when the number of parameters involved is taken into account and over-fitting is penalised. This ensures that, of the models that fit well, the least complicated is selected as the best fit. We apply ML-DFA to synthetic data from FARIMA processes and sine curves whose DFA fluctuation plots have been determined analytically, and to experimentally collected neurophysiological data. ML-DFA assesses whether the hypothesis of a linear fluctuation plot should be rejected, and thus whether the exponent can be considered meaningful. We argue that ML-DFA is essential to obtaining trustworthy results from DFA.
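
    A reduced sketch of the procedure: compute the DFA fluctuation plot, then compare a linear fit against one alternative model (here a quadratic) using AIC. The full method fits the whole model family named above through a scaled likelihood function and also uses BIC; this two-model version, and its white-noise toy input, are illustrative simplifications.

        import numpy as np

        def dfa_fluctuations(x, windows):
            """First-order DFA: RMS fluctuation of the detrended profile per window size."""
            profile = np.cumsum(x - np.mean(x))
            fluct = []
            for w in windows:
                t = np.arange(w)
                sq = []
                for i in range(len(profile) // w):
                    seg = profile[i * w:(i + 1) * w]
                    trend = np.polyval(np.polyfit(t, seg, 1), t)
                    sq.append(np.mean((seg - trend) ** 2))
                fluct.append(np.sqrt(np.mean(sq)))
            return np.array(fluct)

        def aic(y, yhat, n_params):
            """Gaussian AIC (noise variance counted as an extra parameter)."""
            n = len(y)
            rss = np.sum((y - yhat) ** 2)
            return n * np.log(rss / n) + 2 * (n_params + 1)

        rng = np.random.default_rng(2)
        x = rng.standard_normal(4096)          # white noise: exponent near 0.5
        windows = np.unique(np.logspace(1.0, 2.8, 20).astype(int))
        logw = np.log(windows)
        logf = np.log(dfa_fluctuations(x, windows))
        lin = np.polyfit(logw, logf, 1)
        quad = np.polyfit(logw, logf, 2)
        aic_lin = aic(logf, np.polyval(lin, logw), n_params=2)
        aic_quad = aic(logf, np.polyval(quad, logw), n_params=3)
        print("DFA exponent:", lin[0])
        print("linear fit preferred:", aic_lin <= aic_quad)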

    The Surface Laplacian Technique in EEG: Theory and Methods

    This paper reviews the method of surface Laplacian differentiation for studying EEG. We focus on topics that are helpful for a clear understanding of the underlying concepts and their efficient implementation, which is especially important for EEG researchers unfamiliar with the technique. The popular methods of finite difference and splines are reviewed in detail. The former has the advantage of simplicity and low computational cost, but its estimates are prone to a variety of errors due to discretization. The latter eliminates all issues related to discretization and incorporates a regularization mechanism to reduce spatial noise, but at the cost of increased mathematical and computational complexity. These and several other issues deserving further development are highlighted, some of which we address to the extent possible. Here we develop a set of discrete approximations for Laplacian estimates at peripheral electrodes and a possible solution to the problem of multiple-frame regularization. We also provide the mathematical details of finite-difference approximations that are missing in the literature, and discuss the problem of computational performance, which is particularly important in the context of EEG splines, where data sets can be very large. Along this line, the matrix representation of the surface Laplacian operator is carefully discussed and some figures are given illustrating the advantages of this approach. In the final remarks, we briefly sketch a possible way to incorporate finite-size electrodes into Laplacian estimates that could guide further developments.
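
    As a taste of the finite-difference approach reviewed here, the sketch below implements the classic five-point Laplacian estimate for the idealised case of electrodes on a regular planar grid with spacing h; real montages need the irregular-grid corrections and spline estimators the paper details, and the grid and test potential are illustrative assumptions.

        import numpy as np

        def five_point_laplacian(v, h):
            """v: (rows, cols) potentials on a regular grid with spacing h.
            Returns the interior estimate
            (v_left + v_right + v_up + v_down - 4 * v_center) / h**2."""
            return (v[1:-1, :-2] + v[1:-1, 2:] + v[:-2, 1:-1] + v[2:, 1:-1]
                    - 4.0 * v[1:-1, 1:-1]) / h ** 2

        # On the quadratic potential v = x^2 + y^2 the Laplacian is exactly 4,
        # and the five-point formula reproduces it without discretization error.
        x, y = np.meshgrid(np.arange(6.0), np.arange(6.0))
        print(five_point_laplacian(x ** 2 + y ** 2, h=1.0))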

    Resting state MEG oscillations show long-range temporal correlations of phase synchrony that break down during finger movement

    The capacity of the human brain to interpret and respond to multiple temporal scales in its surroundings suggests that its internal interactions must also be able to operate over a broad temporal range. In this paper, we utilize a recently introduced method for characterizing the rate of change of the phase difference between MEG signals and use it to study the temporal structure of the phase interactions between MEG recordings from the left and right motor cortices during rest and during a finger-tapping task. We use the Hilbert transform to estimate moment-to-moment fluctuations of the phase difference between signals. After confirming the presence of scale-invariance, we estimate the Hurst exponent using detrended fluctuation analysis (DFA). An exponent greater than 0.5 is indicative of long-range temporal correlations (LRTCs) in the signal. We find that LRTCs are present in the α/μ and β frequency bands of resting-state MEG data. We demonstrate that finger movement disrupts these LRTCs, producing a phase relationship with a structure similar to that of Gaussian white noise. The results are validated by applying the same analysis to data with a Gaussian white noise phase difference, recordings from an empty scanner, and phase-shuffled time series. We interpret the findings through comparison with results we obtained in an earlier study, in which we adopted this method to characterize phase relationships within a Kuramoto model of oscillators in its sub-critical, critical, and super-critical synchronization states. We find that resting-state MEG from the left and right motor cortices shows moment-to-moment fluctuations of phase difference with a temporal structure similar to that of a system of Kuramoto oscillators just prior to its critical level of coupling, and that finger tapping moves the system away from this pre-critical state toward a more random state.
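
    One plausible reading of the pipeline, sketched below: band-pass both signals, extract Hilbert phases, form the moment-to-moment increments of the unwrapped phase difference, and estimate the Hurst exponent of their magnitudes with first-order DFA (as in the ML-DFA sketch above). The band edges, sampling rate and stand-in Gaussian signals are illustrative assumptions.

        import numpy as np
        from scipy.signal import butter, filtfilt, hilbert

        def dfa_exponent(x, windows):
            """Slope of the log-log DFA fluctuation plot (first-order DFA)."""
            profile = np.cumsum(x - np.mean(x))
            fluct = []
            for w in windows:
                t = np.arange(w)
                sq = []
                for i in range(len(profile) // w):
                    seg = profile[i * w:(i + 1) * w]
                    sq.append(np.mean((seg - np.polyval(np.polyfit(t, seg, 1), t)) ** 2))
                fluct.append(np.sqrt(np.mean(sq)))
            return np.polyfit(np.log(windows), np.log(fluct), 1)[0]

        fs = 250                                      # assumed sampling rate (Hz)
        rng = np.random.default_rng(3)
        left, right = rng.standard_normal((2, 60 * fs))    # stand-in channels
        b, a = butter(4, [8 / (fs / 2), 13 / (fs / 2)], btype="band")  # alpha/mu
        phase_l = np.angle(hilbert(filtfilt(b, a, left)))
        phase_r = np.angle(hilbert(filtfilt(b, a, right)))
        dphi = np.diff(np.unwrap(phase_l - phase_r))  # phase-difference increments
        windows = np.unique(np.logspace(1.2, 3.0, 15).astype(int))
        # Exponent > 0.5 indicates LRTCs; white noise gives roughly 0.5.
        print("DFA exponent:", dfa_exponent(np.abs(dphi), windows))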

    Mean field modelling of human EEG: application to epilepsy

    Aggregated electrical activity from brain regions recorded via the electroencephalogram (EEG) reveals that the brain is never at rest, producing a spectrum of ongoing oscillations that change with behavioural state and neurological condition. In particular, this thesis focusses on the pathological oscillations associated with absence seizures, which typically affect 2–16 year old children. Studies investigating the cellular and network mechanisms of absence seizures have implicated abnormalities of cortical and thalamic activity in their generation, providing much insight into the potential cause of this disease. A number of competing hypotheses have been suggested; however, the precise cause has yet to be determined. This work attempts to provide an explanation of these abnormal rhythms by considering a physiologically based, macroscopic continuum mean-field model of the brain's electrical activity. The methodology taken in this thesis is to assume that many of the physiological details of the involved brain structures can be aggregated into continuum state variables and parameters. This has the advantage of indirectly encapsulating, in those state variables and parameters, many known physiological mechanisms underlying the genesis of epilepsy, which permits a reduction in the complexity of the problem. That is, a macroscopic description of the brain structures involved in epilepsy is taken, and by scanning the parameters of the model, state changes in the system can be identified. This work thus demonstrates how changes in brain state observed in EEG can be understood via dynamical state changes in the model, providing an explanation of absence seizures. Furthermore, key observations from both the model and EEG data motivate a number of model reductions. These reductions yield approximate solutions for seizure oscillations and a better understanding of the periodic oscillations arising from the involved brain regions. Local analysis of the oscillations is performed using dynamical systems theory, which provides necessary and sufficient conditions for their appearance. Finally, local and global stability are proved for the reduced model over a reduced region of the parameter space. The results obtained in this thesis can be extended, and suggestions are provided for future progress in this area.
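
    The thesis's model equations are not reproduced in this abstract, so the sketch below illustrates only the generic workflow of scanning a parameter and identifying state changes, using the Hopf normal form as a stand-in for the corticothalamic mean-field model: its resting state loses stability at mu = 0, where a limit cycle appears, a caricature of the onset of seizure-like oscillations. All names and values are illustrative assumptions.

        import numpy as np

        def settled_amplitude(mu, omega=2 * np.pi * 0.25, steps=60000, dt=0.001):
            """Euler-integrate the Hopf normal form
            dz/dt = (mu + i*omega) * z - |z|^2 * z and return the final |z|."""
            z = 0.01 + 0.0j                    # small kick off the resting state
            for _ in range(steps):
                z += dt * ((mu + 1j * omega) * z - (abs(z) ** 2) * z)
            return abs(z)

        for mu in np.linspace(-0.4, 0.4, 9):   # scan the bifurcation parameter
            amp = settled_amplitude(mu)
            state = "oscillatory" if amp > 0.05 else "steady"
            print(f"mu = {mu:+.2f}   settled amplitude = {amp:.3f}   {state}")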

    Feature extraction with GMDH-type neural networks for EEG-based person identification

    The brain activity observed on EEG electrodes is influenced by both volume conduction and the functional connectivity of a person performing a task. When the task is a biometric test, the EEG signals represent a unique “brain print”, which is defined by the functional connectivity, represented by the interactions between electrodes, whilst the conduction components cause trivial correlations. Orthogonalization using autoregressive modeling minimizes the conduction components, and the residuals are then related to features correlated with the functional connectivity. However, orthogonalization can be unreliable for high-dimensional EEG data. We have found that the dimensionality can be significantly reduced if the baselines required for estimating the residuals are modeled using only the relevant electrodes. In our approach, the required models are learnt by a Group Method of Data Handling (GMDH) algorithm which we have made capable of discovering reliable models from multidimensional EEG data. In our experiments on the EEG-MMI benchmark data set, which includes 109 participants, the proposed method correctly identified all the subjects and provided a statistically significant (p < 0.01) improvement in identification accuracy. The experiments have shown that the proposed GMDH method can learn new features from multi-electrode EEG data that are capable of improving the accuracy of biometric identification.
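
    The orthogonalisation step can be pictured as follows: model each electrode's baseline from the other electrodes and keep the residuals as connectivity-related features. The sketch below uses a plain least-squares spatial regression over all remaining electrodes as a stand-in for the paper's autoregressive, GMDH-selected baseline models; the toy data and the shared "conduction" term are illustrative assumptions.

        import numpy as np

        def residual_features(eeg):
            """eeg: (n_electrodes, n_samples). For each electrode, regress out
            the part explained by the remaining electrodes (dominated by common
            conduction) and return the residuals."""
            residuals = np.empty_like(eeg, dtype=float)
            for i in range(eeg.shape[0]):
                others = np.delete(eeg, i, axis=0).T      # (n_samples, n_el - 1)
                coef, *_ = np.linalg.lstsq(others, eeg[i], rcond=None)
                residuals[i] = eeg[i] - others @ coef     # baseline removed
            return residuals

        rng = np.random.default_rng(4)
        conduction = rng.standard_normal(1000)    # shared volume-conduction term
        eeg = 0.8 * conduction + 0.3 * rng.standard_normal((6, 1000))
        res = residual_features(eeg)
        print("variance before vs after:", eeg.var(), res.var())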