Can graph-cutting improve microarray gene expression reconstructions?
Microarrays produce high-resolution image data that are, unfortunately, permeated with a great deal of "noise" that must be removed before gene expression can be quantified precisely. This paper presents a technique for such a removal process. On completion of this non-trivial task, a new surface (devoid of gene spots) is subtracted from the original to render more precise gene expressions. The graph-cutting technique as implemented has the benefit that only the most appropriate pixels are replaced, and that these replacements are replicates rather than estimates. This means the influence of outliers and other artifacts is handled more appropriately than in previous methods, and the variability of the final gene expressions is considerably reduced. Experiments were carried out to test the technique against commercial and previously researched reconstruction methods. Final results show that the graph-cutting-inspired identification mechanism has a significant positive impact on reconstruction accuracy.
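The key idea above, replacing flagged pixels with *replicates* (copied values from artifact-free pixels) rather than interpolated estimates, can be sketched over a pixel adjacency graph. This is a simplified stand-in for the paper's graph-cut identification step, not its actual algorithm: the spot mask is assumed given, and nearest-replicate search is done by breadth-first search on the 4-connected grid.

```python
from collections import deque

def replace_spot_pixels(image, spot_mask):
    """Replace each flagged (spot) pixel with a replicate: the value of
    the nearest non-flagged pixel, found by BFS over the 4-connected
    pixel grid, rather than an interpolated estimate."""
    h, w = len(image), len(image[0])
    out = [row[:] for row in image]
    for y in range(h):
        for x in range(w):
            if not spot_mask[y][x]:
                continue
            # BFS outward until an artifact-free pixel is reached
            seen = {(y, x)}
            queue = deque([(y, x)])
            while queue:
                cy, cx = queue.popleft()
                if not spot_mask[cy][cx]:
                    out[y][x] = image[cy][cx]  # copy a replicate, don't average
                    break
                for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    ny, nx = cy + dy, cx + dx
                    if 0 <= ny < h and 0 <= nx < w and (ny, nx) not in seen:
                        seen.add((ny, nx))
                        queue.append((ny, nx))
    return out
```

Subtracting the reconstructed surface `out` from `image` would then isolate the gene-spot signal, as the abstract describes.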
EEG-based Brain-Computer Interfaces (BCIs): A Survey of Recent Studies on Signal Sensing Technologies and Computational Intelligence Approaches and Their Applications.
Brain-computer interfaces (BCIs) enhance the capability of the human brain to interact with the environment. Recent advancements in technology and machine learning algorithms have increased interest in electroencephalographic (EEG)-based BCI applications. EEG-based intelligent BCI systems can facilitate continuous monitoring of fluctuations in human cognitive states under monotonous tasks, which is beneficial both for people in need of healthcare support and for researchers across domains. In this review, we survey the recent literature on EEG signal sensing technologies and computational intelligence approaches in BCI applications, filling gaps in the systematic coverage of the past five years. Specifically, we first review the current status of BCI and of signal sensing technologies for collecting reliable EEG signals. Then, we demonstrate state-of-the-art computational intelligence techniques, including fuzzy models and transfer learning in machine learning and deep learning algorithms, to detect, monitor, and maintain human cognitive states and task performance in prevalent applications. Finally, we present several innovative BCI-inspired healthcare applications and discuss future research directions in EEG-based BCI research.
Event-related EEG analysis: simple solutions of complex computations
PhD Thesis. The value of EEG as a non-invasive technique for studying the time course and frequency composition of neuronal signals is well established. However, to date there is still no gold-standard methodology for its analysis. Since the introduction of the technique, many methodologies for artefact removal and signal isolation have been developed, but their performance is often assessed against other methodologies only on simulated data with known, controlled artefacts and limited variance. Furthermore, these studies often address only a single stage in the entire analysis pipeline and do not consider the effect different preprocessing techniques might have upon the effectiveness of different signal analysis methodologies.
To address this issue, this thesis assesses four different signal analysis methodologies using real-world data from two different stimulus-evoked-potential studies and an EEG analysis pipeline that systematically applies and adjusts various preprocessing techniques before subsequent signal analysis. This semi-automated process can be broken down into two stages.
Firstly, multiple configurations of a Preprocessing Optimisation Pipeline (POP) were performed to address three main causes of artefactual noise: (1) electrical line noise, (2) non-neuronal potentials (low-frequency drifts and muscle artefacts), and (3) ocular artefacts (blinks and saccades). In the final stages of the POP, data quality was assessed for each participant, and poorly preprocessed participant datasets were excluded from further analysis based upon either a novel maximum baseline variability threshold criterion or a standard minimum epoch number threshold approach.
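The two exclusion criteria described above can be sketched as a single quality gate over epoched data. The threshold values and the baseline length here are illustrative placeholders, not the thesis's actual parameters:

```python
import numpy as np

def passes_quality_checks(epochs, baseline_samples=50,
                          max_baseline_sd=5.0, min_epochs=20):
    """epochs: array of shape (n_epochs, n_samples) for one participant.
    Returns True only if the dataset passes both exclusion criteria;
    thresholds are illustrative, not the thesis's values."""
    # Criterion 1: maximum baseline variability across epochs
    baseline_sd = epochs[:, :baseline_samples].std(axis=1).max()
    # Criterion 2: minimum number of usable epochs
    return bool(baseline_sd <= max_baseline_sd and len(epochs) >= min_epochs)
```

A participant failing either check would be dropped before the signal analysis stage.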
Lastly, the data were passed to a Signal Analysis Pipeline (SAP), which estimated the amplitude of task-specific signals of interest through one of four methodologies: (1) grand-average-informed peak detection (GA-PD), (2) individual average peak detection (IAPD), (3) independent-component-analysis-informed peak detection (ICA-PD), or (4) component-of-interest peak detection (COIPD). The effectiveness of each of the different preprocessing and signal analysis strategies was then assessed by observing the changes in task-specific outcome statistics.
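Of the four methodologies, the first can be illustrated compactly. A minimal sketch of grand-average-informed peak detection, assuming evoked responses are already averaged per subject (the window width is a hypothetical parameter, not the thesis's):

```python
import numpy as np

def ga_pd_amplitudes(subject_erps, window=10):
    """Grand-average-informed peak detection (GA-PD), sketched from the
    description above: locate the peak latency on the grand average,
    then read each subject's amplitude in a window around that latency.
    subject_erps: array of shape (n_subjects, n_samples)."""
    grand_avg = subject_erps.mean(axis=0)
    peak = int(np.argmax(grand_avg))
    lo, hi = max(0, peak - window), peak + window + 1
    # Each subject's amplitude = maximum within the GA-defined window
    return peak, subject_erps[:, lo:hi].max(axis=1)
```

Anchoring the search window on the grand average keeps the measured latency consistent across subjects, at the cost of missing idiosyncratic peak latencies, which is exactly the trade-off the individual-average variant (IAPD) addresses.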
AUTOMATED INTERPRETATION OF THE BACKGROUND EEG USING FUZZY LOGIC
A new framework is described for managing uncertainty and for dealing with artefact corruption, to introduce objectivity into the interpretation of the electroencephalogram (EEG).
Conventionally, EEG interpretation is time-consuming and subjective, and is known to show significant inter- and intra-personnel variation. A need thus exists to automate the interpretation of the EEG to provide a more consistent and efficient assessment. However, automated analysis of EEGs by computers is complicated by two major factors: the difficulty of adequately capturing, in machine form, the skills and subjective expertise of the experienced electroencephalographer, and the lack of a reliable means of dealing with the range of EEG artefacts (signal contamination). In this thesis, a new framework is described which introduces objectivity into two important outcomes of clinical evaluation of the EEG, namely the clinical factual report and the clinical 'conclusion', by capturing the subjective expertise of the electroencephalographer and dealing with the problem of artefact corruption.
The framework is separated into two stages to assist piecewise optimisation and to cater for different requirements. The first stage, 'quantitative analysis', relies on novel digital signal processing algorithms and cluster analysis techniques to reduce the data and to identify and describe background activities in the EEG. To deal with artefact corruption, an artefact removal strategy, based on new reliable techniques for artefact identification, is used to ensure that only artefact-free activities are used in the analysis. The outcome is a quantitative analysis which efficiently describes the background activity in the record and can support future clinical investigations in neurophysiology. In clinical practice, many EEG features are described by clinicians in natural-language terms, such as 'very high', 'extremely irregular', and 'somewhat abnormal'. The second stage of the framework, 'qualitative analysis', captures the subjectivity and linguistic uncertainty expressed by the clinical experts, using novel intelligent models based on fuzzy logic to provide an analysis closely comparable to the clinical interpretation made in practice. The outcome of this stage is an EEG report with qualitative descriptions to complement the quantitative analysis.
The system was evaluated using EEG records from 1 patient with Alzheimer's disease and 2 age-matched normal controls for the factual report, and 3 patients with Alzheimer's disease and 7 age-matched normal controls for the 'conclusion'. Good agreement was found between factual reports produced by the system and those produced by qualified clinicians. Further, the 'conclusion' produced by the system achieved 100% discrimination between the two subject groups. After this thorough evaluation, the system is expected to significantly aid the process of EEG interpretation and diagnosis.
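The mapping from quantitative features to the natural-language terms the abstract mentions is the classic fuzzification step. A minimal sketch with triangular membership functions; the feature name, term set, and boundary values are illustrative placeholders, not the thesis's clinically derived ones:

```python
def triangular(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def describe_alpha_amplitude(uv):
    """Map a quantitative EEG feature (alpha amplitude, microvolts) to a
    linguistic term of the kind used in clinical reports. Term boundaries
    are hypothetical, for illustration only."""
    terms = {
        "low": (0, 0, 30),
        "medium": (10, 30, 50),
        "very high": (30, 60, 90),
    }
    grades = {term: triangular(uv, *abc) for term, abc in terms.items()}
    return max(grades, key=grades.get)  # term with the highest membership
```

A full fuzzy system would of course combine such memberships through rules rather than taking a simple maximum, but the sketch shows how linguistic uncertainty ("somewhat", "very") becomes graded membership rather than a hard threshold.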
ABOT: an open-source online benchmarking tool for machine learning-based artefact detection and removal methods from neuronal signals
Brain signals are recorded using different techniques to aid an accurate understanding of brain function and to treat its disorders. Untargeted internal and external sources contaminate the acquired signals during the recording process. Often termed artefacts, these contaminations cause serious hindrances in decoding the recorded signals; hence, they must be removed to facilitate unbiased decision-making for a given investigation. Due to the complex and elusive manifestation of artefacts in neuronal signals, computational techniques serve as powerful tools for their detection and removal. Machine learning (ML) based methods have been successfully applied to this task. Due to ML's popularity, many articles are published every year, making it challenging to find, compare, and select the most appropriate method for a given experiment. To this end, this paper presents ABOT (Artefact removal Benchmarking Online Tool), an online benchmarking tool which allows users to compare existing ML-driven artefact detection and removal methods from the literature. The characteristics and related information about the existing methods have been compiled as a knowledgebase (KB) and presented through a user-friendly interface with interactive plots and tables, allowing users to search it by several criteria. Key characteristics extracted from over 120 articles have been used in the KB to help compare the specific ML models. To comply with the FAIR (Findable, Accessible, Interoperable and Reusable) principles, the source code and documentation of the toolbox have been made available via an open-access repository.
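The multi-criteria KB search described above reduces, at its core, to filtering records on field-value conditions. A minimal sketch; the record fields and example entries are hypothetical and do not reflect ABOT's actual schema:

```python
# Hypothetical miniature knowledgebase; field names and entries are
# illustrative, not ABOT's real schema or contents.
KB = [
    {"method": "ICA+SVM", "signal": "EEG",  "task": "removal",   "year": 2019},
    {"method": "CNN",     "signal": "EEG",  "task": "detection", "year": 2021},
    {"method": "LSTM",    "signal": "ECoG", "task": "removal",   "year": 2022},
]

def search_kb(records, **criteria):
    """Return records matching every given field=value criterion --
    the kind of multi-criteria search a benchmarking KB exposes."""
    return [r for r in records
            if all(r.get(k) == v for k, v in criteria.items())]
```

For example, `search_kb(KB, signal="EEG", task="removal")` narrows the comparison to EEG artefact-removal methods only.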
Brain Computer Interfaces and Emotional Involvement: Theory, Research, and Applications
This reprint is dedicated to the study of brain activity related to emotional and attentional involvement as measured by brain-computer interface (BCI) systems designed for different purposes. A BCI system can translate brain signals (e.g., electric or hemodynamic brain activity indicators) into a command to execute an action in the BCI application (e.g., a wheelchair, the cursor on a screen, a spelling device, or a game). These tools have the advantage of real-time access to the ongoing brain activity of the individual, which can provide insight into the user's emotional and attentional states by training a classification algorithm to recognize mental states. The success of BCI systems in contemporary neuroscientific research relies on the fact that they allow one to "think outside the lab". The integration of technological solutions, artificial intelligence, and cognitive science has allowed, and will continue to allow, researchers to envision more and more applications for the future. The clinical and everyday uses are described with the aim of inviting readers to open their minds and imagine potential further developments.
Recent Applications in Graph Theory
Graph theory, a rigorously investigated field of combinatorial mathematics, is adopted by a wide variety of disciplines to address a plethora of real-world applications. Advances in graph algorithms and software implementations have made graph theory accessible to a larger community of interest. Ever-increasing interest in machine learning and model deployment for network data demands a coherent selection of topics, rewarded here with a fresh, up-to-date summary of the theory and of fruitful applications to probe further. This volume is a small yet unique contribution to graph theory applications and modeling with graphs. The subjects discussed include information hiding using graphs, dynamic graph-based systems to model and control cyber-physical systems, graph reconstruction, average distance neighborhood graphs, and pure and mixed-integer linear programming formulations to cluster networks.
Brain-Computer Interface
Brain-computer interfacing (BCI), using advanced artificial-intelligence-based identification, is a rapidly growing technology that allows a silently commanding brain to manipulate devices ranging from smartphones to advanced articulated robotic arms when physical control is not possible. BCI can be viewed as a collaboration between the brain and a device via the direct passage of electrical signals from neurons to an external system. The book provides a comprehensive summary of conventional and novel methods for processing brain signals. The chapters cover a range of topics including noninvasive and invasive signal acquisition, signal processing methods, deep learning approaches, and implementation of BCI in experimental problems.