    A Dry EEG-System for Scientific Research and Brain–Computer Interfaces

    Although it ranks among the oldest tools in neuroscientific research, electroencephalography (EEG) still forms the method of choice in a wide variety of clinical and research applications. In the context of brain–computer interfacing (BCI), EEG has recently become a tool to enhance human–machine interaction. EEG could be employed in a wider range of environments, especially for the use of BCI systems in a clinical context or at the homes of patients. However, the application of EEG in these contexts is impeded by the cumbersome preparation of the electrodes with conductive gel that is necessary to lower the impedance between electrodes and scalp. Dry electrodes could provide a solution to this barrier and allow for EEG applications outside the laboratory. In addition, dry electrodes may reduce the time needed for neurological exams in clinical practice. This study evaluates a prototype of a three-channel dry electrode EEG system, comparing it to state-of-the-art conventional EEG electrodes. Two experimental paradigms were used: first, event-related potentials (ERP) were investigated with a variant of the oddball paradigm. Second, features of the frequency domain were compared by a paradigm inducing occipital alpha. Furthermore, both paradigms were used to evaluate BCI classification accuracies of both EEG systems. Amplitude and temporal structure of ERPs as well as features in the frequency domain did not differ significantly between the EEG systems. BCI classification accuracies were equally high in both systems when the frequency domain was considered. With respect to the oddball classification accuracy, there were slight differences between the wet and dry electrode systems. We conclude that the tested dry electrodes were capable of detecting EEG signals with good quality and that these signals can be used for research or BCI applications. Easy-to-handle electrodes may help to foster the use of EEG among a wider range of potential users.
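    The occipital-alpha paradigm above compares frequency-domain features between the two electrode systems. A minimal sketch of how such a feature could be extracted is shown below, assuming a NumPy/SciPy stack; the function name, the Welch estimator, and the 8–12 Hz band are illustrative choices, not the authors' actual analysis code.

```python
import numpy as np
from scipy.signal import welch

def alpha_band_power(signal, fs, band=(8.0, 12.0)):
    """Estimate alpha-band power of a single EEG channel via Welch's PSD.

    signal: 1-D array of samples; fs: sampling rate in Hz.
    This is a simplified illustration of a frequency-domain BCI feature,
    not the study's actual pipeline.
    """
    # 2-second segments give 0.5 Hz frequency resolution
    freqs, psd = welch(signal, fs=fs, nperseg=int(2 * fs))
    mask = (freqs >= band[0]) & (freqs <= band[1])
    # integrate the PSD over the alpha band (uniform frequency spacing)
    return psd[mask].sum() * (freqs[1] - freqs[0])
```

    Comparing this feature across eyes-open and eyes-closed segments (or across electrode systems, as in the study) then reduces to comparing scalar band-power values.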

    The PREP Pipeline: Standardized preprocessing for large-scale EEG analysis

    The technology to collect brain imaging and physiological measures has become portable and ubiquitous, opening the possibility of large-scale analysis of real-world human imaging. By its nature, such data is large and complex, making automated processing essential. This paper shows how lack of attention to the very early stages of an EEG preprocessing pipeline can reduce the signal-to-noise ratio and introduce unwanted artifacts into the data, particularly for computations done in single precision. We demonstrate that ordinary average referencing improves the signal-to-noise ratio, but that noisy channels can contaminate the results. We also show that identification of noisy channels depends on the reference and examine the complex interaction of filtering, noisy channel identification, and referencing. We introduce a multi-stage robust referencing scheme to deal with the noisy channel-reference interaction. We propose a standardized early-stage EEG processing pipeline (PREP) and discuss the application of the pipeline to more than 600 EEG datasets. The pipeline includes an automatically generated report for each dataset processed. Users can download the PREP pipeline as a freely available MATLAB library from http://eegstudy.org/prepcode/.
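    The core of the multi-stage robust referencing idea — estimate the average reference only from channels not currently flagged as noisy, then re-detect noisy channels against that reference and iterate — can be sketched as follows. This is a simplified Python illustration under assumed criteria (a robust z-score on per-channel standard deviation), not the actual PREP implementation, which is a MATLAB library and uses additional detection criteria such as RANSAC-based prediction and interpolates bad channels.

```python
import numpy as np

def robust_average_reference(data, z_thresh=5.0, max_iter=4):
    """Iteratively estimate an average reference that excludes noisy channels.

    data: (n_channels, n_samples) EEG array.
    Returns the re-referenced data and the indices of channels flagged
    as noisy. Sketch only; PREP's real criteria are more elaborate.
    """
    good = np.ones(data.shape[0], dtype=bool)
    referenced = data.copy()
    for _ in range(max_iter):
        ref = referenced[good].mean(axis=0)   # reference from good channels only
        referenced = data - ref               # re-reference all channels
        sd = referenced.std(axis=1)           # per-channel deviation
        med = np.median(sd[good])
        mad = np.median(np.abs(sd[good] - med))
        z = 0.6745 * (sd - med) / max(mad, 1e-12)  # robust z-score
        new_good = z < z_thresh
        if np.array_equal(new_good, good):
            break                             # flagged set is stable
        good = new_good
    return referenced, np.where(~good)[0]
```

    The iteration matters because, as the abstract notes, noisy-channel identification depends on the reference: a noisy channel contaminates an ordinary average reference, which in turn distorts the statistics used to detect noisy channels.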