11 research outputs found

    Validating and improving the correction of ocular artifacts in electro-encephalography

    Get PDF
    For modern applications of electro-encephalography, including brain-computer interfaces and single-trial event-related potential (ERP) detection, it is becoming increasingly important that artifacts are accurately removed from a recorded electro-encephalogram (EEG) without affecting the part of the EEG that reflects cerebral activity. Ocular artifacts are caused by movement of the eyes and the eyelids. They occur frequently in the raw EEG and are often the most prominent artifacts in EEG recordings. Their accurate removal is therefore an important procedure in nearly all electro-encephalographic research. As a result, a considerable number of ocular artifact correction methods have been introduced over the past decades. A selection of these methods, containing some of the most frequently used ones, is given in Section 1.5. When two different correction methods are applied to the same raw EEG, this usually results in two different corrected EEGs. A measure of correction accuracy should indicate how well each of these corrected EEGs recovers the part of the raw EEG that truly reflects cerebral activity. That this accuracy cannot be determined directly from a raw EEG is inherent in the need for artifact removal: if it were possible, based on a raw EEG, to derive an exact reference for what the corrected EEG should be, there would be no need for artifact correction methods. Estimating the accuracy of correction methods is mostly done either by using models to simulate EEGs and artifacts, or by manipulating experimental data in such a way that the effects of artifacts on the raw EEG can be isolated. In this thesis, modeling of EEG and artifact is used to validate correction methods based on simulated data. A new correction method is introduced which, unlike all existing methods, uses a camera to monitor eye(lid) movements as a basis for ocular artifact correction. 
The simulated data are used to estimate the accuracy of this new correction method and to compare it against the estimated accuracy of existing correction methods. The results of this comparison suggest that the new method significantly increases correction accuracy compared to the other methods. Next, an experiment is performed based on which the accuracy of correction can be estimated on raw EEGs. Results on these experimental data agree very well with the results on the simulated data. It is therefore concluded that using a camera during EEG recordings provides valuable extra information that can be used in the process of ocular artifact correction. In Chapter 2, a model is introduced that assists in estimating the accuracy of eye movement artifact correction for simulated EEG recordings. This model simulates EEG and eye movement artifacts simultaneously. For this, the model uses a realistic representation of the head, multiple dipoles to model cerebral and ocular electrical activity, and the boundary element method to calculate changes in electrical potential at different positions on the scalp. With the model, it is possible to simulate different data sets as if they were recorded using different electrode configurations. Signal-to-noise ratios (SNRs), computed before and after applying six different correction methods, are used to assess the accuracy of these methods for various electrode configurations. Results show that, out of the six methods, second-order blind identification (SOBI) and multiple linear regression (MLR) correct most accurately overall, as they achieve the highest rise in SNR. The occurrence of ocular artifacts is linked to changes in eyeball orientation. In Chapter 2, an eye tracker is used to record pupil position, which is closely linked to eyeball orientation. The pupil position information is used in the model to simulate eye movements. 
Recognizing the potential benefit of using an eye tracker not only for simulations but also for correction, Chapter 3 introduces an eye movement artifact correction method that exploits the pupil position information provided by an eye tracker. Other correction methods use the electro-oculogram (EOG) and/or the EEG to estimate ocular artifacts. Because both the EEG and the EOG recordings are susceptible to cerebral as well as ocular activity, these other methods are at risk of overcorrecting the raw EEG. Pupil position provides a reference that is linked to the ocular artifact in the EEG but that cannot be affected by cerebral activity; as a result, the new correction method avoids traditionally problematic issues such as forward/backward propagation and evaluating the accuracy of component extraction. Using both simulated and experimental data, it is determined how pupil position influences the raw EEG, and it is found that this relation is linear or quadratic. A Kalman filter is used to tune the parameters that specify this relation. On simulated data, the new method performs very well, resulting in an SNR after correction of over 10 dB for various patterns of eye movements. When compared to the three methods that performed best in the evaluation of Chapter 2, only SOBI shows similar results for some of the eye movement patterns. However, a serious limitation of the new correction method is its inability to correct blink artifacts. To increase the variety of applications for which the new method can be used, it should be improved in a way that also enables it to correct the raw EEG for blink artifacts. Chapter 4 deals with implementing such improvements, based on the idea that a more advanced eye tracker should be able to detect both the pupil position and the eyelid position. 
The improved eye tracker-based ocular artifact correction method is named EYE. Driven by some practical limitations of the eye tracking device currently available to us, an alternative way to estimate eyelid position is suggested, based on an EOG recorded above one eye. The EYE method can be used with either the eye tracker information or the EOG substitute. On simulated data, the accuracy of the EYE method is estimated using the EOG-based eyelid reference. This accuracy is again compared against the six other correction methods. Two different SNR-based measures of accuracy are proposed: one quantifies the correction of the entire simulated data set, and the other focuses on those segments containing simulated blink artifacts. After applying EYE, an average SNR of at least 9 dB is achieved for both measures. This implies that the power of the corrected signal is at least eight times the power of the remaining noise. The simulated data sets contain a wide range of eye movements and blink frequencies. For almost all of these data sets (16 out of 20), the correction results for EYE are better than for any of the other evaluated correction methods. On experimental data, the EYE method appears to correct adequately for ocular artifacts as well. As the detection of eyelid position from the EOG is in principle inferior to detection with an eye tracker, these results should also be considered an indication that even higher accuracies could be obtained with a more advanced eye tracker. Considering the simplicity of the MLR method, this method also performs remarkably well, which may explain why EOG-based regression is still often used for correction. In Chapter 5, the simulation model of Chapter 2 is set aside; instead, experimentally recorded data are manipulated in such a way that correction inaccuracies can be highlighted. 
Correction accuracies of eight correction methods, including EYE, are estimated based on data recorded during stop-signal tasks. In the analysis of these tasks, it is essential that ocular artifacts are adequately removed, because the task-related ERPs are located mostly at frontal electrode positions and have low amplitudes. These data are corrected and subsequently evaluated. For the eight methods, the overall ranking of estimated accuracy in Figure 5.3 corresponds very well with the correction accuracy of these methods on simulated data as found in Chapter 4. In a single-trial correction comparison, results suggest that the EYE-corrected EEG is not susceptible to overcorrection, whereas the other corrected EEGs are.
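Of the correction methods compared in this thesis, EOG-based multiple linear regression (MLR) is conceptually the simplest: propagation coefficients from the EOG reference channels to each EEG channel are estimated by least squares, and the scaled EOG is subtracted. A minimal sketch with synthetic data (channel counts, mixing matrix, and noise levels are illustrative assumptions, not the thesis's actual setup):

```python
import numpy as np

def mlr_correct(eeg, eog):
    """Remove ocular artifacts by multiple linear regression.

    eeg: (n_eeg_channels, n_samples) raw EEG
    eog: (n_eog_channels, n_samples) EOG reference channels
    The propagation matrix B solves eeg ~ B @ eog in the
    least-squares sense; the corrected EEG is eeg - B @ eog.
    """
    B = eeg @ eog.T @ np.linalg.inv(eog @ eog.T)
    return eeg - B @ eog

# Illustrative data: 4 EEG channels contaminated by 2 ocular sources.
rng = np.random.default_rng(0)
n = 1000
eog = rng.standard_normal((2, n))           # ocular reference signals
brain = 0.1 * rng.standard_normal((4, n))   # "true" cerebral activity
mixing = rng.standard_normal((4, 2))        # unknown propagation to scalp
raw = brain + mixing @ eog
corrected = mlr_correct(raw, eog)
# With an artifact-free EOG reference, the residual artifact after
# correction is small compared to the original contamination.
```

Because a real EOG also picks up cerebral activity, the regression removes some brain signal along with the artifact, which is exactly the overcorrection risk that motivates the camera-based approach.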

    Different methods to define utility functions yield different results and engage different neural processes

    Get PDF
    Although the concept of utility is fundamental to many economic theories, up to now no generally accepted method for determining a subject’s utility function is available. We investigated two methods that are used in economic sciences for describing utility functions by using response-locked event-related potentials in order to assess their neural underpinnings. For defining the certainty equivalent (CE), we used a lottery game with win probabilities of 0.5; for identifying the subjects’ utility functions directly, a standard bisection task was applied. Although the lottery tasks’ payoffs were only hypothetical, a pronounced negativity was observed resembling the error-related negativity (ERN) previously described in action monitoring research, but this occurred only for choices far away from the indifference point between money and lottery. By contrast, the bisection task failed to evoke an ERN irrespective of the responses’ correctness. Based on these findings, we reason that only decisions made in the lottery task achieved a level of subjective relevance that activates cognitive-emotional monitoring. In terms of economic sciences, our findings support the view that the bisection method is unaffected by any kind of probability valuation or other parameters related to risk and, in combination with the lottery task, can therefore be used to differentiate between payoff and probability valuation. Keywords: utility function; neuroeconomics; error-related negativity; executive functions; cognitive electrophysiology; lottery; bisection
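For a 50/50 lottery over payoffs x_low and x_high, the certainty equivalent CE satisfies u(CE) = 0.5·u(x_low) + 0.5·u(x_high). As a hedged illustration only (not the paper's procedure), the curvature parameter of an assumed power utility u(x) = x^ρ can be recovered from a reported CE by numeric bisection:

```python
def power_utility(x, rho):
    # Assumed power utility; rho < 1 corresponds to risk aversion.
    return x ** rho

def implied_rho(low, high, ce, tol=1e-9):
    """Find rho so that u(ce) equals the expected utility of a
    50/50 lottery over `low` and `high`, by plain bisection."""
    def gap(rho):
        return power_utility(ce, rho) - 0.5 * (
            power_utility(low, rho) + power_utility(high, rho))
    a, b = 0.05, 3.0  # assumed search bracket for rho
    while b - a > tol:
        mid = 0.5 * (a + b)
        if gap(a) * gap(mid) <= 0:
            b = mid
        else:
            a = mid
    return 0.5 * (a + b)

# A certainty equivalent of 50 for a 50/50 lottery over 0 and 100
# implies risk neutrality (rho = 1); a CE below 50 implies rho < 1.
rho_neutral = implied_rho(0, 100, 50)
rho_averse = implied_rho(0, 100, 40)
```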

    Different Methods to Define Utility Functions Yield Similar Results but Engage Different Neural Processes

    Get PDF
    Although the concept of utility is fundamental to many economic theories, up to now no generally accepted method for determining a subject's utility function is available. We investigated two methods that are used in economic sciences for describing utility functions by using response-locked event-related potentials in order to assess their neural underpinnings. For determining the certainty equivalent, we used a lottery game with win probabilities p = 0.5; for identifying the subjects’ utility functions directly, a standard bisection task was applied. Although the lottery tasks’ payoffs were only hypothetical, a pronounced negativity was observed resembling the error-related negativity (ERN) previously described in action monitoring research, but this occurred only for choices far away from the indifference point between money and lottery. By contrast, the bisection task failed to evoke a remarkable ERN irrespective of the responses’ correctness. Based on these findings, we reason that only decisions made in the lottery task achieved a level of subjective relevance that activates cognitive-emotional monitoring. In terms of economic sciences, our findings support the view that the bisection method is unaffected by any kind of probability valuation or other parameters related to risk and, in combination with the lottery task, can therefore be used to differentiate between payoff and probability valuation.

    Methods of artifact elimination in EEG signals

    Get PDF
    The recording of electroencephalography (EEG) signals is almost always accompanied by various kinds of artifacts that make it difficult to read and analyze the collected data. These artifacts may be noticeable in individual channels, but very often they have to be corrected across several channels simultaneously. Their origin can be varied: among the most typical are network and hardware artifacts, as well as several types of muscle artifacts originating from the tested person. In recent years, interest in EEG studies has grown; EEG signals are applied not only in outpatient and clinical settings, but also in psychological analyses and in the construction of modern human-machine interfaces. This article presents a case study of applying classification analysis to EEG artifact correction tasks.

    Enhancement of Eeg Signal

    Get PDF
    This project is concerned with the rectification of EEG recordings. The EEG signal often gets distorted by the presence of various unwanted signals, which are known as artifacts. Eye blinking is one of the major artifacts; it distorts the EEG signal by varying the electric potential over the scalp. To remove such artifacts, signal separation techniques are widely used today. There are various methods for removing the different types of artifacts present in an EEG recording, and one of these techniques is blind source separation, which is used to separate the source signals from the artifacts. This thesis demonstrates the use of the Second-Order Blind Identification with Robust Orthogonalization (SOBI-RO) algorithm to remove ocular artifacts and reconstruct the original EEG signal. Finally, the original signal and the estimated signal are compared. To illustrate the algorithm, raw EEG data were taken from a database and processed in MATLAB using the SOBI-RO algorithm. In the end, it was found that the ocular artifacts were successfully removed from the raw EEG data. The performance is evaluated using the signal-to-distortion ratio.
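SOBI separates sources by jointly diagonalizing several time-lagged covariance matrices of the recordings. As a hedged sketch, the single-lag special case (the AMUSE algorithm, a precursor of SOBI) captures the core idea; the sinusoidal sources and mixing matrix below are toy assumptions, not data from this thesis:

```python
import numpy as np

def amuse(x, lag=1):
    """Single-lag special case of second-order blind identification
    (the AMUSE algorithm). SOBI proper jointly diagonalizes several
    lags for robustness; SOBI-RO adds robust orthogonalization.

    x: (n_channels, n_samples) zero-mean mixed signals.
    Returns estimated sources (up to order, sign, and scale).
    """
    # Whiten: decorrelate channels and scale to unit variance.
    c0 = x @ x.T / x.shape[1]
    d, e = np.linalg.eigh(c0)
    z = (e @ np.diag(d ** -0.5) @ e.T) @ x
    # Eigendecompose one symmetrized time-lagged covariance.
    c1 = z[:, lag:] @ z[:, :-lag].T / (z.shape[1] - lag)
    c1 = (c1 + c1.T) / 2
    _, v = np.linalg.eigh(c1)
    return v.T @ z

# Toy demo: two sinusoids with different autocorrelations, mixed
# linearly; AMUSE recovers them up to permutation and sign.
t = np.linspace(0, 1, 1000, endpoint=False)
s = np.vstack([np.sin(2 * np.pi * 5 * t), np.sin(2 * np.pi * 23 * t)])
a = np.array([[1.0, 0.6], [0.4, 1.0]])  # assumed mixing matrix
y = amuse(a @ s)
```

Separation works here because the two sources have distinct lag-1 autocorrelations, which is precisely the second-order structure that SOBI-family methods exploit.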

    Small-World Network Analysis of Cortical Connectivity in Chronic Fatigue Syndrome using EEG

    Get PDF
    The primary aim of this thesis was to explore the relationship between quantitative electroencephalography (qEEG) and brain system dysregulation in people with Chronic Fatigue Syndrome (CFS). EEG recordings were taken from an archival dataset of 30 subjects, 15 people with CFS and 15 healthy controls (HCs), evaluated during an eyes-closed resting state condition. Exact low resolution electromagnetic tomography (eLORETA) was applied to the qEEG data to estimate cortical sources and perform functional connectivity analysis assessing the strength of time-varying signals between all pairwise cortical regions of interest. To obtain a comprehensive view of local and global processing, eLORETA lagged coherence was computed on 84 regions of interest representing 42 Brodmann areas for the left and right hemispheres of the cortex, for the delta (1-3 Hz), alpha-1 (8-10 Hz), and alpha-2 (10-12 Hz) frequency bands. Graph theory analysis of the eLORETA coherence matrices for each participant was conducted to derive the “small-worldness” index, a measure of the optimal balance between the functional integration (global) and segregation (local) properties known to be present in brain networks. The data were also associated with the cognitive impairment composite score on the DePaul Symptom Questionnaire (DSQ), a patient-reported symptom outcome measure of the frequency and severity of cognitive symptoms. Results showed that small-worldness for the delta band was significantly lower for patients with CFS compared to HCs. Small-worldness for delta, alpha-1, and alpha-2 was associated with higher cognitive composite scores on the DSQ. Finally, small-worldness in all three frequency bands correctly distinguished those with CFS from HCs with a classification rate of nearly 87 percent. These preliminary findings suggest disease processes in CFS may be functionally disruptive to small-world characteristics, especially in the delta frequency band, resulting in cognitive impairments. 
In turn, these findings may help to confirm a biological basis for cognitive symptoms, provide clinically relevant diagnostic indicators, and characterize the neurophysiological status of people with CFS.
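The small-worldness index mentioned above is commonly computed as sigma = (C/C_rand)/(L/L_rand), where C is the average clustering coefficient, L the characteristic path length, and the _rand terms come from an equivalent random network. A sketch using the standard random-graph approximations C_rand ≈ k/n and L_rand ≈ ln(n)/ln(k); the thesis's exact procedure may differ, e.g. by using surrogate random networks:

```python
import numpy as np
from collections import deque

def clustering_coefficient(adj):
    """Average local clustering of an undirected 0/1 adjacency matrix."""
    coeffs = []
    for i in range(adj.shape[0]):
        nbrs = np.flatnonzero(adj[i])
        k = nbrs.size
        if k < 2:
            coeffs.append(0.0)
            continue
        links = adj[np.ix_(nbrs, nbrs)].sum() / 2  # edges among neighbors
        coeffs.append(links / (k * (k - 1) / 2))
    return float(np.mean(coeffs))

def char_path_length(adj):
    """Mean shortest-path length via BFS (assumes a connected graph)."""
    n = adj.shape[0]
    total = 0
    for src in range(n):
        dist = {src: 0}
        queue = deque([src])
        while queue:
            u = queue.popleft()
            for v in np.flatnonzero(adj[u]):
                if v not in dist:
                    dist[v] = dist[u] + 1
                    queue.append(v)
        total += sum(dist.values())
    return total / (n * (n - 1))

def small_worldness(adj):
    """sigma = (C / C_rand) / (L / L_rand), using the random-graph
    approximations C_rand = k/n and L_rand = ln(n)/ln(k)."""
    n = adj.shape[0]
    k = adj.sum() / n  # mean degree
    c_ratio = clustering_coefficient(adj) / (k / n)
    l_ratio = char_path_length(adj) / (np.log(n) / np.log(k))
    return c_ratio / l_ratio

# Toy network: a ring lattice of 20 nodes, each linked to its 2
# nearest neighbors on both sides (high clustering, short paths).
n = 20
adj = np.zeros((n, n), dtype=int)
for i in range(n):
    for step in (1, 2):
        adj[i, (i + step) % n] = adj[(i + step) % n, i] = 1
sigma = small_worldness(adj)
```

A sigma well above 1 indicates clustering much higher than a random graph at a comparable path length, which is the "small-world" balance the thesis quantifies per frequency band.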

    Cognitive control in adults with high-functioning autism spectrum disorder: a study with event-related potentials

    Get PDF
    Introduction: Little is known about cognitive control in adults with high-functioning forms of autism spectrum disorder because previous research has focused on children and adolescents. Cognitive control is crucial for monitoring and readjusting behavior after errors in order to select contextually appropriate reactions. The congruency effect and conflict adaptation are measures of cognitive control. Post-error slowing, the error-related negativity, and the error positivity provide insight into behavioral and electrophysiological correlates of error processing. In children and adolescents with autism spectrum disorder, deficits in cognitive control and error processing have been shown by changes in post-error slowing, error-related negativity, and error positivity in the flanker task. Methods: We performed a modified Eriksen flanker task in 17 adults with high-functioning autism spectrum disorder and 17 healthy controls. As behavioral measures of cognitive control and error processing, we included reaction times and error rates to calculate congruency effects, conflict adaptation, and post-error slowing. Event-related potentials, namely the error-related negativity and error positivity, were measured to assess error-related brain activity. Results: Both groups of participants showed the expected congruency effects, demonstrated by faster and more accurate responses in congruent compared to incongruent trials. Healthy controls exhibited conflict adaptation, as they obtained performance benefits after incongruent trials, whereas patients with autism spectrum disorder did not. The expected slowing of reaction times after errors was observed in both groups of participants. 
Individuals with autism spectrum disorder demonstrated enhanced electrophysiological error processing compared to healthy controls, indicated by increased error-related negativity and error positivity difference amplitudes. Discussion: Our findings show that adults with high-functioning autism spectrum disorder do not show the expected upregulation of cognitive control in response to conflicts. This finding implies that previous experiences may have a reduced influence on current behavior in these patients, which possibly contributes to less flexible behavior. Nevertheless, we observed intact behavioral reactions after errors, indicating that adults with high-functioning autism spectrum disorder can flexibly adjust behavior in response to changed environmental demands when necessary. The enhancement of electrophysiological error processing indicates that adults with high-functioning autism spectrum disorder demonstrate an extraordinary reactivity toward errors, reflecting increased performance monitoring in this subpopulation of autism spectrum disorder patients.
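The two behavioral measures used in this study reduce to simple reaction-time contrasts. A sketch with a hypothetical mini-session (the trial data are invented for illustration, not taken from the study):

```python
import numpy as np

def congruency_effect(rts, congruent):
    """Mean RT on incongruent trials minus mean RT on congruent trials."""
    rts, congruent = np.asarray(rts, float), np.asarray(congruent, bool)
    return rts[~congruent].mean() - rts[congruent].mean()

def post_error_slowing(rts, correct):
    """Mean RT on trials following an error minus mean RT on trials
    following a correct response."""
    rts, correct = np.asarray(rts, float), np.asarray(correct, bool)
    return rts[1:][~correct[:-1]].mean() - rts[1:][correct[:-1]].mean()

# Hypothetical mini-session (reaction times in ms).
rts = [420, 480, 430, 510, 650, 560, 470]
congruent = [True, False, True, False, False, True, True]
correct = [True, True, True, True, False, True, True]
ce = congruency_effect(rts, congruent)    # positive: incongruent is slower
pes = post_error_slowing(rts, correct)    # positive: slowing after the error
```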

    A model-based objective evaluation of eye movement correction in EEG recordings

    No full text
    We present a method to quantitatively and objectively compare algorithms for the correction of eye movement artifacts in a simulated ongoing electroencephalographic signal (EEG). A realistic model of the human head is used, together with eye tracker data, to generate a data set in which potentials of ocular and cerebral origin are simulated. This approach bypasses the common problem of brain-potential-contaminated electro-oculographic signals (EOGs) when monitoring or simulating eye movements. The data are simulated for five different EEG electrode configurations combined with four different EOG electrode configurations. In order to objectively compare correction performance for six algorithms, listed in Table III, we determine the signal-to-noise ratio of the EEG before and after artifact correction. A score indicating correction performance is derived, and for each EEG configuration the optimal correction algorithm and the optimal number of EOG electrodes are determined. In general, the second-order blind identification correction algorithm, in combination with 6 EOG electrodes, performs best for all EEG configurations evaluated on the simulated data.
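With simulated data the cerebral part of the signal is known exactly, so the SNR of any raw or corrected EEG can be computed directly, and the rise in SNR scores a correction method. A sketch under assumed definitions (SNR as the ratio of cerebral power to residual power, in dB; the signals and the 95%-effective correction are toy assumptions):

```python
import numpy as np

def snr_db(cerebral, observed):
    """SNR in dB: power of the known cerebral signal over the power
    of everything else in the observed (raw or corrected) signal."""
    residual = observed - cerebral
    return 10 * np.log10(np.sum(cerebral ** 2) / np.sum(residual ** 2))

# Toy simulation: cerebral activity plus a large ocular artifact,
# then a correction assumed to remove 95% of the artifact.
rng = np.random.default_rng(2)
cerebral = rng.standard_normal(2000)
artifact = 5.0 * rng.standard_normal(2000)
raw = cerebral + artifact
corrected = raw - 0.95 * artifact
rise = snr_db(cerebral, corrected) - snr_db(cerebral, raw)
# Removing 95% of the artifact raises the SNR by 10*log10(400) dB,
# roughly 26 dB, regardless of the artifact's absolute amplitude.
```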

    Decomposition and classification of electroencephalography data

    Get PDF