
    Investigation of GPGPU for use in processing of EEG in real-time

    The purpose of this thesis was to investigate the use of General Purpose computing on Graphics Processing Units (GPGPU) to process electroencephalogram (EEG) signals in real-time. The main body of this work required the implementation of Independent Component Analysis; two algorithms were investigated: FastICA and JADE. Both were implemented three times: first using M-file syntax to serve as a benchmark; next as native C code to measure the performance of the algorithms when running natively on a CPU; and finally as GPGPU code using the NVIDIA CUDA C language extension. In previous works, Independent Component Analysis represented the largest roadblock to achieving the real-time goal of processing 10 seconds of EEG within a 10-second window. It was found that both FastICA and JADE see speedups, with maximum measured speedups of approximately 6x for FastICA and approximately 2.5x for JADE when operating on the largest datasets. In addition, speedups of between 1x and 2x were seen when working on datasets of the expected size provided by 10 seconds of 32-channel EEG sampled at 500 Hz. However, it was also found that GPGPU solutions are not necessary for real-time performance on a modern desktop computer, as the FastICA algorithm is capable of a worst-case runtime of between approximately 1 and 2 seconds depending on configuration parameters.
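The fixed-point iteration at the heart of FastICA is compact enough to sketch. The toy NumPy version below is not the thesis code: the sources, mixing matrix, sample count, and tolerance are all invented for illustration. It whitens a synthetic two-channel mixture and extracts one independent component with the standard one-unit tanh update:

```python
import numpy as np

rng = np.random.default_rng(0)

# Two synthetic non-Gaussian sources and a linear mixture (illustrative).
n = 4000
t = np.linspace(0, 8, n)
S = np.vstack([np.sign(np.sin(3 * t)), rng.laplace(size=n)])
A = np.array([[1.0, 0.5], [0.5, 1.0]])
X = A @ S

# Whiten: zero mean, identity covariance.
X = X - X.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(np.cov(X))
Xw = (E / np.sqrt(d)) @ E.T @ X

# One-unit FastICA fixed-point iteration with g(u) = tanh(u):
# w+ = E[x g(w.x)] - E[g'(w.x)] w, renormalized each step.
w = rng.standard_normal(2)
w /= np.linalg.norm(w)
for _ in range(200):
    u = np.tanh(w @ Xw)
    w_new = (Xw * u).mean(axis=1) - (1 - u ** 2).mean() * w
    w_new /= np.linalg.norm(w_new)
    if abs(abs(w_new @ w) - 1.0) < 1e-10:  # converged up to sign flip
        w = w_new
        break
    w = w_new

estimate = w @ Xw  # recovered component; compare against S by correlation
```

The cubic convergence of this update is what makes FastICA attractive for the per-window budgets discussed above; JADE instead works by joint diagonalization of cumulant matrices.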

    Hybrid implementation of the fastICA algorithm for high-density EEG using the capabilities of the Intel architecture and CUDA programming

    High-density electroencephalographic (EEG) systems are utilized in the study of the human brain and its underlying behaviors. However, working with EEG data requires a well-cleaned signal, which is often achieved through the use of independent component analysis (ICA) methods. The calculation time of these algorithms grows with the amount of data to be processed. This article presents a hybrid implementation of the fastICA algorithm that uses parallel programming techniques (libraries and extensions of Intel processors and CUDA programming), which results in a significant acceleration of execution time on selected architectures.

    A Computational Framework to Support the Automated Analysis of Routine Electroencephalographic Data

    Epilepsy is a condition in which a patient has multiple unprovoked seizures that are not precipitated by another medical condition. It is a common neurological disorder that afflicts 1% of the population of the US and is sometimes hard to diagnose if seizures are infrequent. Routine electroencephalography (rEEG), in which the electrical potentials of the brain are recorded on the scalp of a patient, is one of the main diagnostic tools because rEEG can reveal indicators of epilepsy when patients are in a non-seizure state. Interpretation of rEEG is difficult, and studies have shown that 20-30% of patients at specialized epilepsy centers are misdiagnosed. An improved ability to interpret rEEG could decrease the misdiagnosis rate of epilepsy. The difficulty in diagnosing epilepsy from rEEG stems from the large quantity, low signal-to-noise ratio (SNR), and variability of the data. A common point of error for a clinician interpreting rEEG data is the misinterpretation of paroxysmal EEG events (PEEs): short bursts of electrical activity of high amplitude relative to the surrounding signals that have a duration of approximately 0.1 to 2 seconds. Clinical interpretation of PEEs could be improved with the development of an automated system to detect and classify PEE activity in an rEEG dataset. Systems that have attempted to automatically classify PEEs in the past have had varying degrees of success. These efforts have been hampered to a large extent by the absence of a 'gold standard' dataset that EEG researchers could use. In this work we present a distributed, web-based collaborative system for collecting and creating a gold standard dataset for the purpose of evaluating spike detection software. We hope to advance spike detection research by creating a performance standard that facilitates comparisons between the approaches of disparate research groups.
Further, this work endeavors to create a new, high-performance parallel implementation of ICA (independent component analysis), a potential preprocessing step for PEE classification. We also demonstrate tools for visualization and analysis to support the initial phases of spike detection research. These tools will first help to develop a standardized rEEG dataset of expert EEG interpreter opinion with which automated analysis can be trained and tested. Second, they will help to create a new framework for interdisciplinary research that will improve our understanding of PEEs in rEEG. These improvements could ultimately advance the nuanced art of rEEG interpretation and decrease the misdiagnosis rate that leads to patients suffering inappropriate treatment.

    Independent Component Analysis for Improved Defect Detection in Guided Wave Monitoring

    Guided wave sensors are widely used in a number of industries and have found particular application in the oil and gas industry for the inspection of pipework. Traditionally this type of sensor was used for one-off inspections, but in recent years there has been a move towards permanent installation of the sensor. This has enabled highly repeatable readings of the same section of pipe, potentially allowing improvements in defect detection and classification. This paper proposes a novel approach using independent component analysis to decompose repeat guided wave signals into constituent independent components. This separates the defect from coherent noise caused by changing environmental conditions, improving detectability. This paper demonstrates independent component analysis applied to guided wave signals from a range of industrial inspection scenarios. The analysis is performed on test data from pipe loops that have been subject to multiple temperature cycles in both undamaged and damaged states. In addition to processing data from experimental damaged conditions, simulated damage signals have been added to “undamaged” experimental data, enabling multiple different damage scenarios to be investigated. The algorithm has also been used to process guided wave signals from finite element simulations of a pipe with distributed shallow general corrosion, within which there is a patch of severe corrosion. In all these scenarios, the independent component analysis algorithm was able to extract the defect signal, rejecting coherent noise.
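As a rough illustration of the decomposition idea (not the paper's processing chain; the signal model, sizes, and waveforms below are invented for the sketch), applying ICA across a stack of repeat readings can isolate a localized defect echo from a coherent, environment-driven component:

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(1)

# Synthetic stand-in for repeat guided-wave readings: each row is one
# monitoring reading = a coherent environment-driven signal plus a
# defect echo whose amplitude grows over the monitoring period.
n_readings, n_samples = 20, 1000
t = np.arange(n_samples)
coherent = np.sin(2 * np.pi * t / 120.0)         # temperature-like trend
defect = np.exp(-((t - 600.0) ** 2) / 50.0)      # localized defect echo
amp = 1 + 0.3 * rng.standard_normal(n_readings)  # reading-to-reading drift
growth = np.linspace(0.0, 1.0, n_readings)       # defect growth over time
X = np.outer(amp, coherent) + np.outer(growth, defect)
X += 0.01 * rng.standard_normal(X.shape)

# Decompose the stack of readings into independent time courses.
ica = FastICA(n_components=2, random_state=0, max_iter=1000)
components = ica.fit_transform(X.T).T            # shape (2, n_samples)

# Identify the component matching the defect echo by correlation
# (sign and scale of ICA components are arbitrary).
corrs = [abs(np.corrcoef(c, defect)[0, 1]) for c in components]
defect_component = components[int(np.argmax(corrs))]
```

Because ICA returns components up to sign and scale, a matching step like the correlation above (or a known defect template) is needed to label the extracted component.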

    Sparse Encoding of Binocular Images for Depth Inference

    Sparse coding models have been widely used to decompose monocular images into linear combinations of small numbers of basis vectors drawn from an overcomplete set. However, little work has examined sparse coding in the context of stereopsis. In this paper, we demonstrate that sparse coding facilitates better depth inference with sparse activations than comparable feed-forward networks of the same size. This is likely due to the noise and redundancy of feed-forward activations, whereas sparse coding utilizes lateral competition to selectively encode image features within a narrow band of depths.
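A minimal sketch of sparse coding inference (generic ISTA on an invented random dictionary, not the model or data used in the paper): given an overcomplete dictionary, iterative soft-thresholding finds a code that reconstructs the input from a few atoms, with the competition between atoms entering through the D.T @ D term of the gradient:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative overcomplete dictionary: 64 unit-norm atoms, 32-dim inputs.
D = rng.standard_normal((32, 64))
D /= np.linalg.norm(D, axis=0)

# A signal built from 3 atoms; inference should recover a sparse code.
idx = [3, 17, 42]
x = D[:, idx] @ np.array([1.0, -0.8, 0.6])

# ISTA: proximal gradient descent on 0.5*||x - D a||^2 + lam*||a||_1.
lam = 0.05
L = np.linalg.norm(D, 2) ** 2        # Lipschitz constant of the gradient
a = np.zeros(64)
for _ in range(500):
    grad = D.T @ (D @ a - x)         # atoms compete via D.T @ D
    z = a - grad / L
    a = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft threshold

reconstruction = D @ a               # close to x, using few active atoms
```

This iterative inference is the key difference from a feed-forward encoder: the code is the solution of an optimization problem rather than a single matrix-nonlinearity pass.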

    Automated Remote Pulse Oximetry System (ARPOS)

    Funding: This research is funded by the School of Computer Science and by the St Leonard’s Postgraduate College Doctoral Scholarship, both at the University of St Andrews, for Pireh Pirzada’s PhD. Early work was funded by the Digital Health & Care Innovation Centre (DHI). Current methods of measuring heart rate (HR) and oxygen levels (SPO2) require physical contact, are individualised, and, for accurate oxygen levels, may also require a blood test. No-touch or non-invasive technologies are not currently commercially available for use in healthcare settings. To date, there has been no assessment of a system that measures HR and SPO2 using commercial off-the-shelf camera technology that utilises R, G, B and IR data. Moreover, no formal remote photoplethysmography studies have been done in real-life scenarios with participants at home with different demographic characteristics. This novel study addresses all these objectives by developing, optimising, and evaluating a system that measures the HR and SPO2 of 40 participants. HR and SPO2 are determined by measuring the frequencies from different wavelength band regions using FFT and radiometric measurements after pre-processing face regions of interest (forehead, lips, and cheeks) from colour, IR and depth data. Detrending, interpolating, Hamming-windowing, and normalising the signal with FastICA produced the lowest RMSE of 7.8 for HR, with an r-correlation value of 0.85, and an RMSE of 2.3 for SPO2. This novel system could be used in several critical care settings, including care homes and hospitals, to prompt clinical intervention as required.
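The HR step of such a pipeline reduces to a peak search in the spectrum of a detrended, windowed trace. The sketch below uses a synthetic signal at an assumed 30 fps (this is not the ARPOS code or its data; the pulse frequency, drift, and noise levels are invented) to show the detrend, Hamming-window, FFT, and band-limited peak-pick sequence:

```python
import numpy as np

fs = 30.0                       # assumed camera frame rate
t = np.arange(0, 20, 1 / fs)    # 20 s of samples
rng = np.random.default_rng(0)

# Synthetic stand-in for a mean colour-channel trace from a face ROI:
# a 72 BPM pulse riding on a slow illumination drift plus noise.
signal = (0.02 * np.sin(2 * np.pi * 1.2 * t)   # 1.2 Hz = 72 BPM
          + 0.5 * t / t[-1]                    # slow illumination drift
          + 0.01 * rng.standard_normal(t.size))

# Detrend (remove the linear drift), then apply a Hamming window.
coeffs = np.polyfit(t, signal, 1)
detrended = signal - np.polyval(coeffs, t)
windowed = detrended * np.hamming(detrended.size)

# FFT, then search only the plausible HR band (0.7-4 Hz = 42-240 BPM).
spectrum = np.abs(np.fft.rfft(windowed))
freqs = np.fft.rfftfreq(windowed.size, d=1 / fs)
band = (freqs >= 0.7) & (freqs <= 4.0)
hr_bpm = 60 * freqs[band][np.argmax(spectrum[band])]
```

The 20 s window gives a frequency resolution of 0.05 Hz, i.e. 3 BPM per bin, which is one reason remote-PPG systems trade window length against responsiveness.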

    Differentiable Gaussianization Layers for Inverse Problems Regularized by Deep Generative Models

    Deep generative models such as GANs and normalizing flows are powerful priors. They can regularize inverse problems to reduce ill-posedness and attain high-quality results. However, the latent vector of such deep generative models can fall out of the desired high-dimensional standard Gaussian distribution during an inversion, particularly in the presence of noise in the data or inaccurate forward models. In such a case, deep generative models are ineffective in attaining high-fidelity solutions. To address this issue, we propose to reparameterize and Gaussianize the latent vector using novel differentiable data-dependent layers wherein custom operators are defined by solving optimization problems. These proposed layers constrain an inversion to find feasible in-distribution solutions. We tested and validated our technique on three inversion tasks: compressive-sensing MRI, image deblurring, and eikonal tomography (a nonlinear PDE-constrained inverse problem), using two representative deep generative models, StyleGAN2 and Glow, and achieved state-of-the-art results. Comment: 26 pages, 15 figures, 9 tables.
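A much simpler cousin of the idea, purely for intuition (the paper's layers solve optimization problems for whitening and marginal Gaussianization, which this toy does not attempt): an n-dimensional standard Gaussian concentrates on the shell of radius sqrt(n), so a latent that has drifted out of distribution during inversion can be pulled back toward that typical set by a norm rescaling:

```python
import numpy as np

def project_to_gaussian_shell(z):
    """Rescale a latent vector so its norm matches sqrt(n), the typical
    norm of an n-dimensional standard Gaussian (toy stand-in only)."""
    n = z.size
    return z * (np.sqrt(n) / np.linalg.norm(z))

# A latent that drifted far from the typical set during inversion:
rng = np.random.default_rng(0)
z_drifted = 5.0 * rng.standard_normal(512)
z_fixed = project_to_gaussian_shell(z_drifted)
```

The projection preserves the latent's direction and only corrects its magnitude; the differentiable layers in the paper impose richer distributional constraints than this single statistic.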