
    Fault Diagnosis of Rotating Equipment Bearing Based on EEMD and Improved Sparse Representation Algorithm

    Because the vibration signals of rolling bearings operating in harsh environments are contaminated by numerous harmonic components and noise, and because the traditional sparse representation algorithm is computationally expensive and of limited accuracy, a bearing fault feature extraction method based on ensemble empirical mode decomposition (EEMD) and an improved sparse representation is proposed. First, an improved orthogonal matching pursuit (adapOMP) algorithm is used to separate the harmonic components from the signal, yielding a filtered signal. The processed signal is decomposed by EEMD, and the components with a kurtosis greater than three are reconstructed. A Hankel matrix transformation is then applied to construct the learning dictionary. A K-singular value decomposition (K-SVD) algorithm with an improved termination criterion gives the method a degree of adaptivity, and the reconstructed signal is obtained by processing the EEMD results. A comparative analysis of the three methods under strong noise shows that, although K-SVD preceded by adapOMP produces good results, its effect is weak in the low-frequency range. The method proposed in this paper can effectively extract the impact component from the signal, which should benefit the extraction of impact features from rotating machinery in complex noise environments.
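As a rough sketch of two of the steps described above, the snippet below selects impulsive components by kurtosis and builds a Hankel matrix from a signal segment; the threshold and helper names are illustrative, not the authors' implementation:

```python
import numpy as np

def kurtosis(x):
    """Sample kurtosis (non-excess): E[(x - mu)^4] / sigma^4."""
    x = np.asarray(x, dtype=float)
    mu, sigma = x.mean(), x.std()
    return np.mean((x - mu) ** 4) / sigma ** 4

def select_imfs(imfs, threshold=3.0):
    """Keep IMFs whose kurtosis exceeds the threshold (impulsive content)."""
    return [imf for imf in imfs if kurtosis(imf) > threshold]

def hankel_matrix(signal, rows):
    """Build a Hankel matrix whose rows are shifted views of the signal,
    so that H[i, j] = signal[i + j] (constant anti-diagonals)."""
    signal = np.asarray(signal, dtype=float)
    cols = len(signal) - rows + 1
    return np.array([signal[i:i + cols] for i in range(rows)])
```

A sinusoid has kurtosis near 1.5 and is discarded, while an impulsive component scores far above 3 and is kept for dictionary learning.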

    Data-driven multivariate and multiscale methods for brain computer interface

    This thesis focuses on the development of data-driven multivariate and multiscale methods for brain computer interface (BCI) systems. The electroencephalogram (EEG), the most convenient means of measuring neurophysiological activity owing to its noninvasive nature, is mainly considered. The nonlinearity and nonstationarity inherent in EEG, together with its multichannel recording nature, call for a new set of data-driven multivariate techniques to estimate features more accurately for enhanced BCI operation. A long-term goal is also to enable an alternative EEG recording strategy for long-term and portable monitoring. Empirical mode decomposition (EMD) and local mean decomposition (LMD), fully data-driven adaptive tools, are used to decompose the nonlinear and nonstationary EEG signal into a set of components that are highly localised in time and frequency. It is shown that the complex and multivariate extensions of EMD, which can exploit common oscillatory modes within multivariate (multichannel) data, can be used to accurately estimate and compare the amplitude and phase information among multiple sources, a key step in feature extraction for BCI systems. A complex extension of local mean decomposition is also introduced and its operation is illustrated on two-channel neuronal spike streams. Common spatial pattern (CSP), a standard feature extraction technique for BCI applications, is also extended to the complex domain using augmented complex statistics. Depending on the circularity or noncircularity of a complex signal, the appropriate complex CSP algorithm can be chosen to produce the best classification performance between two EEG classes. Using these complex and multivariate algorithms, two cognitive brain studies are investigated for a more natural and intuitive design of advanced BCI systems.
Firstly, a Yarbus-style auditory selective attention experiment is introduced to measure the user's attention to one sound source among a mixture of sound stimuli, aimed at improving the usefulness of hearing instruments such as hearing aids. Secondly, emotion experiments elicited by taste and taste recall are examined to determine pleasure and displeasure in response to food, for the implementation of affective computing. The separation between the two emotional responses is examined using real- and complex-valued common spatial pattern methods. Finally, we introduce a novel approach to brain monitoring based on EEG recordings from within the ear canal, embedded in a custom-made hearing-aid earplug. The new platform promises both short- and long-term continuous use for standard brain monitoring and interfacing applications.
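The common spatial pattern step above can be sketched as follows. This is the generic real-valued CSP solved by whitening and eigendecomposition, not the complex-domain extension developed in the thesis, and all function names are illustrative:

```python
import numpy as np

def csp_filters(trials_a, trials_b, n_pairs=1):
    """Real-valued CSP: find spatial filters that maximise variance for one
    class of multichannel trials while minimising it for the other.
    trials_*: lists of (channels x samples) arrays."""
    def mean_cov(trials):
        covs = []
        for x in trials:
            c = x @ x.T
            covs.append(c / np.trace(c))      # normalise per trial
        return np.mean(covs, axis=0)

    ca, cb = mean_cov(trials_a), mean_cov(trials_b)
    # Generalised eigenproblem ca w = lambda (ca + cb) w, via whitening.
    evals, evecs = np.linalg.eigh(ca + cb)
    whiten = evecs @ np.diag(evals ** -0.5) @ evecs.T
    s = whiten @ ca @ whiten.T                # symmetric
    d, u = np.linalg.eigh(s)                  # eigenvalues ascending
    w = u.T @ whiten                          # rows are spatial filters
    # Keep filters from both ends of the eigenvalue spectrum.
    idx = list(range(n_pairs)) + list(range(len(d) - n_pairs, len(d)))
    return w[idx]
```

The last returned filter maximises the variance ratio in favour of class A, the first in favour of class B; the log-variances of the filtered trials are the usual CSP features fed to a classifier.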

    Enhancing brain-computer interfacing through advanced independent component analysis techniques

    A brain-computer interface (BCI) is a direct communication system between a brain and an external device, in which messages or commands sent by an individual do not pass through the brain's normal output pathways but are instead detected through brain signals. Severe motor impairments caused by amyotrophic lateral sclerosis, head trauma, spinal injuries and other diseases may leave patients without muscle control and unable to communicate with the outside environment. Currently no effective cure or treatment has been found for these conditions, so using a BCI system to rebuild the communication pathway is a possible alternative. Among the different types of BCI, the electroencephalogram (EEG) based BCI is becoming popular owing to EEG's fine temporal resolution, ease of use, portability and low set-up cost. However, EEG's susceptibility to noise is a major obstacle to developing a robust BCI. Signal processing techniques such as coherent averaging, filtering, the FFT and AR modelling are used to reduce the noise and extract components of interest. However, these methods process the data in the observed mixture domain, where components of interest and noise are mixed. This limitation means that the extracted EEG signals may still contain noise residue or, conversely, that the removed noise may contain part of the EEG signal. Independent Component Analysis (ICA), a Blind Source Separation (BSS) technique, is able to extract relevant information from noisy signals and separate the underlying sources into independent components (ICs). The most common assumption of ICA is that the source signals are unknown and statistically independent; under this assumption, ICA is able to recover the source signals.
Since the ICA concept appeared in the fields of neural networks and signal processing in the 1980s, many ICA applications in telecommunications, biomedical data analysis, feature extraction, speech separation, time-series analysis and data mining have been reported in the literature. In this thesis several ICA techniques are proposed to address two major issues for BCI applications: reducing the recording time needed, in order to speed up signal processing, and reducing the number of recording channels while improving the final classification performance, or at least keeping it at its current level. These improvements would make BCI a more practical prospect for everyday use. The thesis first defines BCI and the diverse BCI models based on different control patterns. After the general idea of ICA is introduced, along with some modifications, several new ICA approaches are proposed. The practical work starts with preliminary analyses of the Southampton BCI pilot datasets, using basic and then advanced signal processing techniques. The proposed ICA techniques are then demonstrated on a multi-channel event related potential (ERP) based BCI. Next, the ICA algorithm is applied to a multi-channel spontaneous activity based BCI. The final ICA approach examines the possibility of using ICA with just one or a few channel recordings on an ERP based BCI. The novel ICA approaches presented in this thesis show that ICA can accurately and repeatably extract the relevant information buried within noisy signals, enhancing signal quality so that even a simple classifier can achieve good classification accuracy. In the ERP based BCI application, data processed by multichannel ICA achieve 83.9% classification accuracy with just eight averages/epochs, whereas coherent averaging reaches only 32.3%.
In the spontaneous activity based BCI, the multi-channel ICA algorithm can effectively extract discriminatory information from two types of single-trial EEG data; the classification accuracy is improved by about 25%, on average, compared to performance on the unpreprocessed data. The single-channel ICA technique on the ERP based BCI produces much better results than a low-pass filter, while an appropriate number of averages improves the signal-to-noise ratio of P300 activity, which in turn helps to achieve better classification. These advantages should lead to a reliable and practical BCI for use outside the clinical laboratory.
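As a minimal illustration of the ICA step, a bare-bones FastICA with a tanh nonlinearity and symmetric orthogonalisation is sketched below; the thesis develops its own ICA variants, so this shows only the generic principle:

```python
import numpy as np

def fastica(x, n_iter=200, tol=1e-6, seed=0):
    """Minimal FastICA (tanh nonlinearity, symmetric orthogonalisation).
    x: (channels x samples) mixed signals. Returns estimated sources."""
    x = x - x.mean(axis=1, keepdims=True)
    # Whiten via eigendecomposition of the covariance matrix.
    d, e = np.linalg.eigh(np.cov(x))
    z = (e @ np.diag(d ** -0.5) @ e.T) @ x
    n, m = z.shape
    w = np.random.default_rng(seed).standard_normal((n, n))
    for _ in range(n_iter):
        g = np.tanh(w @ z)
        g_prime = 1 - g ** 2
        w_new = (g @ z.T) / m - np.diag(g_prime.mean(axis=1)) @ w
        # Symmetric decorrelation: W <- (W W^T)^(-1/2) W, via SVD.
        u, _, vt = np.linalg.svd(w_new)
        w_new = u @ vt
        if np.max(np.abs(np.abs(np.diag(w_new @ w.T)) - 1)) < tol:
            w = w_new
            break
        w = w_new
    return w @ z
```

Recovered sources come back in arbitrary order and sign, so they are usually matched to references by absolute correlation.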

    A New Signal Processing Approach to Study Action Potential Content in Sympathetic Neural Signals

    Sympathetic nerve activity plays an essential role in the normal regulation of blood pressure in humans and in the etiology and progression of many chronic diseases. Sympathetic nerve activity associated with blood pressure regulation can be recorded directly using microneurography. A general characteristic of this signal is spontaneous burst activity of spikes (action potentials) separated by silent periods, against a background of considerable Gaussian noise. During measurement with electrodes, the raw muscle sympathetic nerve activity (MSNA) signal is amplified, band-pass filtered, rectified and integrated. The integration step removes important information about action potential content and discharge properties. The first objective of this thesis was to propose a new method for detecting action potentials in the raw MSNA signal, enabling investigation of post-ganglionic neural discharge properties. The new method designs a mother wavelet matched to an actual mean action potential template extracted from a raw MSNA signal, and applies it to the raw MSNA signal via a continuous wavelet transform (CWT) for spike detection. The performance of the proposed method was evaluated against two previous wavelet-based approaches using 1) MSNA recorded from seven healthy participants and 2) simulated MSNA. The results show that the new matched wavelet detects action potentials in the MSNA signal better than the previous wavelet-based methods, which use non-matched wavelets. The second objective was to employ the proposed action potential detection and classification technique to study the relationship between the recruitment of sympathetic action potentials (i.e., neurons) and the size of integrated sympathetic bursts in the human MSNA signal. While other neural systems (e.g., the skeletal motor system) exhibit a well-understood pattern of neural recruitment during activation, our understanding of how sympathetic neurons are coordinated during baseline and baroreceptor unloading is very limited. We demonstrate that there exists a hierarchical pattern of recruitment, with additional faster-conducting neurons of larger amplitude recruited as the sympathetic bursts become stronger. This information has important implications for how blood pressure is controlled, and for the malleability of sympathetic activation in health and disease.
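A simplified stand-in for the matched-template idea is sketched below: instead of a full matched-wavelet CWT, it slides a mean spike template over the signal and thresholds the normalised correlation; the threshold and function names are illustrative:

```python
import numpy as np

def detect_spikes(signal, template, threshold=0.7):
    """Detect action potentials by normalised cross-correlation with a
    mean spike template (a simplified stand-in for matched-wavelet CWT)."""
    t = template - template.mean()
    t = t / np.linalg.norm(t)
    n = len(t)
    scores = np.zeros(len(signal) - n + 1)
    for i in range(len(scores)):
        w = signal[i:i + n] - signal[i:i + n].mean()
        nrm = np.linalg.norm(w)
        if nrm > 0:
            scores[i] = np.dot(w, t) / nrm    # in [-1, 1]
    # Local maxima above the threshold are detected spikes.
    return np.array([i for i in range(1, len(scores) - 1)
                     if scores[i] > threshold
                     and scores[i] >= scores[i - 1]
                     and scores[i] >= scores[i + 1]])
```

A genuine matched wavelet additionally spans multiple scales, making detection robust to spike-width variation, which a single fixed template cannot capture.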

    Transient thermography for flaw detection in friction stir welding: a machine learning approach

    A systematic computational method to simulate and detect sub-surface flaws in aluminium sheets and friction stir welded sheets through non-destructive transient thermography is proposed. The method relies on feature extraction and a data-driven machine learning modelling structure. In this work, we propose the use of a multi-layer perceptron feed-forward neural network with feature extraction methods to improve the flaw-probing depth of transient thermography inspection. Furthermore, for the first time, we propose Thermographic Signal Linear Modelling (TSLM), a hyper-parameter-free feature extraction technique for transient thermography. The new feature extraction and modelling framework was tested with out-of-sample experimental transient thermography data, and the results show effective sub-surface flaw detection up to 2.3 mm deep in aluminium sheets (99.8% true positive rate, 92.1% true negative rate) and up to 2.2 mm deep in friction stir welds (97.2% true positive rate, 87.8% true negative rate).
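The abstract does not define TSLM. One plausible reading, assuming a first-order least-squares fit of each pixel's log-temperature decay against log-time (in the spirit of thermographic signal reconstruction), is sketched below; the function name and feature definition are assumptions, not the authors' formulation:

```python
import numpy as np

def tslm_features(frames, times):
    """Fit log(temperature) vs log(time) with a straight line per pixel;
    return the slope and intercept maps as features.
    frames: (n_frames x H x W) cooling sequence, times: (n_frames,)."""
    log_t = np.log(times)
    log_T = np.log(frames.reshape(len(times), -1))
    # Least-squares line fit for every pixel at once.
    A = np.vstack([log_t, np.ones_like(log_t)]).T
    coeffs, *_ = np.linalg.lstsq(A, log_T, rcond=None)
    slope = coeffs[0].reshape(frames.shape[1:])
    intercept = coeffs[1].reshape(frames.shape[1:])
    return slope, intercept
```

A defect-free region of a semi-infinite solid cools roughly as t^-1/2 in log-log space, so pixels whose fitted slope departs from the sound-material value are candidate flaw locations for the downstream classifier.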

    Construction of FASR subsystem testbed and application for solar burst trajectories and RFI study

    The construction of the Frequency Agile Solar Radiotelescope (FASR) Subsystem Testbed (FST) and observational results are described. Three antennas of the Owens Valley Solar Array (OVSA) have been upgraded with newly designed, state-of-the-art technology. The 1-9 GHz RF signal from the antenna feed is transmitted via broadband (45 MHz-9.5 GHz) optical fiber links to the control room. The RF is then downconverted to a 500 MHz single-sideband signal that can be tuned across the 1-9 GHz RF band. The data are sampled with an 8-bit, 1 GHz sampling-rate digitizer and saved to a computer hard disk. The full-resolution time-domain data thus recorded are then correlated through offline software to provide phase and amplitude spectra. An important feature of this approach is that the data can be reanalyzed multiple times with different digital signal-processing techniques (e.g., different bit-sampling, windowing, and RFI excision methods) to test the effects of different designs. As a prototype of the FASR system, FST provides the opportunity to study the design, calibration and interference-avoidance requirements of FASR. In addition, FST provides, for the first time, the ability to perform broadband spectroscopy of the Sun with high spectral and temporal and moderate spatial resolution. With this three-element interferometer, one can determine the location of simple sources with spectrograph-like time and frequency resolution. The large solar flare of 2006 December 6 was detected by the newly constructed FST, operating on three antennas of the Owens Valley Solar Array. This record-setting burst produced an especially fine set of fiber bursts--so-called intermediate-drift bursts that drift from high to low frequencies over 6-10 s.
According to a leading theory (Kuijpers 1975), the fibers are generated by packets of whistler waves propagating along a magnetic loop, which coalesce with Langmuir waves to produce escaping electromagnetic radiation in the decimeter band. With this three-element interferometer, fiber burst source locations can for the first time be determined relative to the background, although the absolute location remains unknown owing to the lack of phase calibration information. Radio data over a 500 MHz band (1.0-1.5 GHz) were used to determine the trajectories of the bursts. Since the digital data are recorded at full resolution and processed offline, a key advantage is that the data can be processed in different ways to simulate and test hardware implementations. FST data provide a unique testbed for studying methods of RFI excision. RFI is present in every one of the 500 MHz bands, and the high time and frequency resolution of FST allows it to be characterized in great detail. Time-domain kurtosis and a frequency-domain variant of the kurtosis method were explored to identify the presence of RFI and flag bad channels in simulated real time (i.e., the raw, full-resolution recorded data are played back and bad channels are flagged during play-back, just as a real-time system would do). The ability to select alternate RFI excision algorithms during play-back allows algorithms to be compared on an equal basis. Using the same data set, the two kurtosis-based (time-domain and frequency-domain) RFI excision algorithms were compared quantitatively; the results show that spectral kurtosis is more effective than the time-domain kurtosis algorithm for detecting RFI contamination, as expected from theoretical considerations.
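The spectral (frequency-domain) kurtosis approach can be sketched with the standard SK estimator, which is near 1 for Gaussian noise and deviates for RFI; the accumulation length and threshold below are illustrative, not the values used with FST:

```python
import numpy as np

def spectral_kurtosis(spectra):
    """Spectral-kurtosis estimator per frequency channel.
    spectra: (M x n_chan) array of accumulated power spectra.
    SK is approximately 1 for Gaussian noise and deviates for RFI."""
    m = spectra.shape[0]
    s1 = spectra.sum(axis=0)
    s2 = (spectra ** 2).sum(axis=0)
    return (m + 1) / (m - 1) * (m * s2 / s1 ** 2 - 1)

def flag_rfi(spectra, sigma=3.0):
    """Flag channels whose SK deviates from 1 by more than sigma times the
    theoretical standard deviation (~2/sqrt(M) for Gaussian noise)."""
    sk = spectral_kurtosis(spectra)
    return np.abs(sk - 1) > sigma * (2 / np.sqrt(spectra.shape[0]))
```

A steady narrowband carrier has nearly constant power, driving SK toward 0, while impulsive RFI drives it well above 1; either way the channel is flagged, which is what makes SK effective against diverse contamination.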