
    Towards the text compression based feature extraction in high impedance fault detection

    High impedance faults on medium voltage overhead lines with covered conductors can be identified by the presence of partial discharges. Although the topic has been studied for more than 60 years, online partial discharge detection remains a challenge, especially in environments with heavy background noise. In this paper, a new approach for partial discharge pattern recognition is presented. All results were obtained on data acquired from a real 22 kV medium voltage overhead power line with covered conductors. The proposed method is based on a text compression algorithm and serves as a signal similarity estimator, applied for the first time to partial discharge patterns. Its relevance is examined with three different variations of a classification model. The improvement gained over an already deployed model demonstrates its quality.
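
    The abstract does not spell out how compression yields a similarity score; a common compression-based measure of this kind is the Normalized Compression Distance (NCD). The sketch below is a minimal illustration using Python's zlib as a stand-in compressor; the template and window byte strings are hypothetical placeholders for quantized discharge signals, not the paper's data.

        import zlib

        def clen(data: bytes) -> int:
            # Length of the zlib-compressed byte string.
            return len(zlib.compress(data, 9))

        def ncd(x: bytes, y: bytes) -> float:
            # Normalized Compression Distance: ~0 for near-identical
            # inputs, ~1 for unrelated inputs.
            cx, cy, cxy = clen(x), clen(y), clen(x + y)
            return (cxy - min(cx, cy)) / max(cx, cy)

        # Hypothetical usage: compare a measured window against a known
        # partial-discharge template (both quantized to bytes beforehand).
        template = bytes([1, 2, 3, 2, 1] * 20)
        window = bytes([1, 2, 3, 2, 1] * 18 + [9, 9] * 5)
        print(f"NCD = {ncd(template, window):.3f}")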

    The severity of stages estimation during hemorrhage using error correcting output codes method

    As a beneficial component with critical impact, computer-aided decision making systems have infiltrated many fields, such as economics, medicine, architecture, and agriculture. Their potential to facilitate human work propels the rapid development of such systems, and the effective decisions they provide greatly reduce the expense of labor, energy, and budget. The computer-aided decision making system for traumatic injuries is one such system: it supplies suggestive opinions when dealing with injuries resulting from accidents, battle, or illness. Its functions may involve judging the type of illness, triaging the wounded according to battle injuries, deciding the severity of symptoms, and managing resources in the context of traumatic events. The proposed computer-aided decision making system aims at estimating the severity of blood volume loss. Specifically, severe hemorrhage, which accompanies many traumatic injuries, is a potentially life-threatening condition requiring immediate treatment: an ongoing, significant loss of blood volume that results in decreased blood and oxygen perfusion of vital organs. Hemorrhage and blood loss can occur at different levels, such as mild, moderate, or severe. Our proposed system will assist physicians by estimating information such as the severity of blood volume loss and hemorrhage, so that timely measures can be taken to save lives and also to reduce long-term complications as well as the costs caused by mismatched operations and treatments. The general framework of the proposed research contains three tasks, and several novel and transformative concepts are integrated into the system. The first task is preprocessing of the raw signals: adaptive filtering is adopted and customized to remove noise, and two detection algorithms (QRS complex detection and systolic/diastolic wave detection) are designed. The second task is feature extraction: the proposed system combines features from the time domain, the frequency domain, nonlinear analysis, and multi-model analysis to better represent the patterns that arise when hemorrhage happens. Third, a machine learning algorithm is designed for pattern classification: a novel version of the error correcting output codes (ECOC) method, designed and investigated for high accuracy and real-time decision making. The features and characteristics of this machine learning method are essential for the proposed computer-aided trauma decision making system. The proposed system is tested against the Lower Body Negative Pressure (LBNP) dataset, and the results indicate the accuracy and reliability of the proposed system.
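
    As a rough illustration of the classification stage, the sketch below applies the standard error-correcting output codes scheme via scikit-learn's OutputCodeClassifier; the novel ECOC variant proposed in the work is not reproduced here, and the feature matrix and severity labels are random placeholders for real extracted features.

        import numpy as np
        from sklearn.multiclass import OutputCodeClassifier
        from sklearn.svm import SVC

        rng = np.random.default_rng(0)
        X = rng.normal(size=(300, 12))    # stand-in feature vectors
        y = rng.integers(0, 3, size=300)  # stand-in labels: 0=mild, 1=moderate, 2=severe

        # Each class gets a random binary codeword; one SVC is trained per
        # code bit, and prediction picks the class whose codeword is closest
        # to the vector of bit predictions.
        ecoc = OutputCodeClassifier(SVC(kernel="rbf"), code_size=2.0, random_state=0)
        ecoc.fit(X, y)
        print(ecoc.predict(X[:5]))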

    A quantitative diagnosis method for rolling element bearing using signal complexity and morphology filtering

    This paper considers a quantitative method for assessing the fault severity of rolling element bearings by means of signal complexity and morphology filtering. The relationship between complexity and bearing fault severity is explained. Improved morphology filtering is adopted to avoid ambiguity between a severe fault and pure random noise, since both yield higher complexity values. According to the attenuation characteristics of a faulty bearing signal, an artificial immune optimization algorithm targeting the pulse index is used to obtain an optimally filtered signal. Furthermore, the complexity algorithm is revised to avoid the loss of weak impact signals. After noise and other unrelated signal components are largely removed, the complexity value is affected mostly by the bearing system itself and can therefore serve as a reliable quantitative indicator for bearing fault diagnosis. Application of the proposed approach to bearing fault signals has demonstrated that the improved morphology filtering and signal complexity can adequately evaluate bearing fault severity.
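
    A minimal sketch of the filtering idea, assuming a basic flat-structuring-element morphology filter: the paper's improved filter and its artificial immune search are not reproduced, so the structuring-element length L below is a fixed hypothetical choice where the immune optimizer would instead maximize the pulse index.

        import numpy as np
        from scipy.ndimage import grey_closing, grey_opening

        def morph_filter(x, L):
            # Average of morphological opening and closing with a flat
            # structuring element of length L (the scale the optimizer tunes).
            return 0.5 * (grey_opening(x, size=L) + grey_closing(x, size=L))

        def pulse_index(x):
            # Impulsiveness measure: peak amplitude over mean absolute value.
            return np.max(np.abs(x)) / np.mean(np.abs(x))

        # Synthetic stand-in: periodic fault impulses buried in noise.
        rng = np.random.default_rng(1)
        t = np.linspace(0, 1, 2000)
        impulses = np.sin(2 * np.pi * 200 * t) * (np.sin(2 * np.pi * 20 * t) > 0.995)
        noisy = impulses + 0.3 * rng.normal(size=t.size)

        # Subtracting the smoothed signal keeps the impulsive residual.
        filtered = noisy - morph_filter(noisy, L=15)
        print(f"pulse index: raw={pulse_index(noisy):.1f}, "
              f"filtered={pulse_index(filtered):.1f}")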

    Mixing Bandt-Pompe and Lempel-Ziv approaches: another way to analyze the complexity of continuous-states sequences

    In this paper, we propose to mix the approach underlying Bandt-Pompe permutation entropy with Lempel-Ziv complexity, to design what we call Lempel-Ziv permutation complexity. The principle consists of two steps: (i) transformation of a continuous-state series (one that is intrinsically multivariate or arises from embedding) into a sequence of permutation vectors, whose components are the positions of the components of the initial vector when rearranged; (ii) computing the Lempel-Ziv complexity of this series of 'symbols', viewed as elements of a discrete finite-size alphabet. On the one hand, the permutation entropy of Bandt-Pompe studies the entropy of such a sequence, i.e., the entropy of the patterns in a sequence (e.g., local increases or decreases). On the other hand, the Lempel-Ziv complexity of a discrete-state sequence studies the temporal organization of the symbols (i.e., the rate of compressibility of the sequence). Thus, Lempel-Ziv permutation complexity aims to take advantage of both methods. The potential of such a combined approach, pairing a permutation procedure with a complexity analysis, is evaluated on simulated data and on real data. In both cases, we compare the individual approaches and the combined approach.
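
    The two steps above translate directly into code. The sketch below is a simplified reading, assuming ordinal patterns of embedding dimension m for step (i) and a basic distinct-phrase Lempel-Ziv parsing for step (ii); the paper's exact parsing variant may differ.

        from itertools import permutations
        import numpy as np

        def ordinal_symbols(x, m):
            # Step (i): map each length-m window to the permutation that
            # sorts it, encoded as an integer symbol.
            lookup = {p: i for i, p in enumerate(permutations(range(m)))}
            return [lookup[tuple(np.argsort(x[i:i + m]))]
                    for i in range(len(x) - m + 1)]

        def lz_complexity(s):
            # Step (ii): count distinct phrases in a simple LZ parsing.
            phrases, i = set(), 0
            while i < len(s):
                j = i + 1
                while tuple(s[i:j]) in phrases and j <= len(s):
                    j += 1
                phrases.add(tuple(s[i:j]))
                i = j
            return len(phrases)

        rng = np.random.default_rng(0)
        noise = rng.normal(size=500)                     # irregular series
        wave = np.sin(np.linspace(0, 40 * np.pi, 500))   # regular series
        for name, sig in [("noise", noise), ("sine", wave)]:
            print(name, lz_complexity(ordinal_symbols(sig, m=3)))

    As expected, the regular sine series parses into far fewer distinct phrases than the noise series, which is the compressibility contrast the combined measure exploits.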

    ALIGNMENT-FREE METHODS AND THEIR APPLICATIONS

    Comparing biological sequences remains one of the most vital activities in Bioinformatics. Comparing biological sequences addresses the relatedness between species and finds similar structures that might lead to similar functions. Sequence alignment is the default method and has been used in the domain for over four decades. It has gained a lot of trust, but limitations and even failures have been reported, especially with newly generated genomes. These newly generated genomes are larger and, to some extent, suffer from errors that come mainly from the sequencing machine. These sequencing errors should be considered when submitting sequences to GenBank, yet for sequence comparison it is often hard to address or even trace the problem. Alignment-based methods can fail on such errors, and even though biologists still trust them, reports have shown failures with these methods. The poor results of alignment-based methods on erroneous sequences motivated researchers in the domain to look for alternatives. These alternative methods are alignment-free and can overcome the shortcomings of alignment-based methods. The work of this thesis is based on alignment-free methods: it conducts an in-depth study to evaluate these methods and to find the right application domains for them. One suitable setting is data subjected to manufactured errors, used to test whether the methods provide better comparison results on data with naturally severe errors. The two techniques used in this work are compression-based and motif-based (or k-mer based, or signal based). We also address the selection of the motifs used in the second technique, and how to improve the results by selecting specific motifs that enhance their quality. In addition, we apply an alignment-free method to a different domain, gene prediction, where it is used to speed up the delivery of high quality results and to predict accurate stretches of the DNA sequence that would be considered parts of genes.
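
    As a minimal sketch of the motif (k-mer) based technique mentioned above: each sequence is reduced to a k-mer frequency profile, and profiles are compared without any alignment, so a sequencing error perturbs only the few k-mers that overlap it. The value k = 4 and the cosine distance are illustrative choices, not the thesis's configuration.

        from collections import Counter
        from math import sqrt

        def kmer_profile(seq, k):
            # Count every overlapping k-mer in the sequence.
            return Counter(seq[i:i + k] for i in range(len(seq) - k + 1))

        def cosine_distance(p, q):
            # 1 - cosine similarity between two k-mer count vectors.
            keys = set(p) | set(q)
            dot = sum(p[w] * q[w] for w in keys)
            norm = (sqrt(sum(v * v for v in p.values()))
                    * sqrt(sum(v * v for v in q.values())))
            return 1.0 - dot / norm

        a = "ATGCGATACGCTTAGGCATGCGATACG"
        b = "ATGCGATACGCTAAGGCATGCGATACG"  # one substitution vs. a
        d = cosine_distance(kmer_profile(a, 4), kmer_profile(b, 4))
        print(f"k=4 cosine distance: {d:.3f}")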

    Improving the detection of mTBI via complexity analysis in resting-state magnetoencephalography

    Diagnosis of mild Traumatic Brain Injury (mTBI) is difficult due to the variability of brain lesions visible on imaging scans. A promising tool for exploring potential biomarkers of mTBI is magnetoencephalography (MEG), which has the advantage of high spatial and temporal resolution. By adopting proper analytic tools from the field of symbolic dynamics, such as Lempel-Ziv complexity, we can objectively characterize neural network alterations relative to healthy controls by enumerating the different patterns of a symbolic sequence. However, this procedure, which typically relies on a binary symbolization, oversimplifies the rich information of brain activity captured via MEG. For that reason, we adopted the neural-gas algorithm, which can transform a time series into more than two symbols by learning the brain dynamics with a small reconstruction error. The proposed analysis was applied to recordings of 30 mTBI patients and 50 normal controls in the δ frequency band. Our results demonstrated that mTBI patients could be separated from normal controls with more than 97% classification accuracy based on high-complexity regions corresponding to right frontal areas. In addition, an inverse relation between complexity and transition rate was demonstrated for both groups. These findings indicate that symbolic complexity could have significant predictive value in the development of reliable biomarkers to help with the early detection of mTBI.
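
    A minimal sketch of the neural-gas quantization step described above, assuming scalar (amplitude-level) prototypes: k prototypes learn the signal's distribution through the rank-based neural-gas update, and each sample is then replaced by the index of its nearest prototype, yielding a multi-symbol sequence. All parameter values are illustrative, not the study's settings.

        import numpy as np

        def neural_gas_codebook(x, k=4, epochs=10, eps0=0.5, lam0=2.0, seed=0):
            rng = np.random.default_rng(seed)
            w = rng.choice(x, size=k)          # init prototypes from the data
            n = len(x) * epochs
            for step, sample in enumerate(np.tile(x, epochs)):
                frac = step / n
                eps = eps0 * (0.01 / eps0) ** frac  # decaying learning rate
                lam = lam0 * (0.1 / lam0) ** frac   # shrinking neighborhood
                # Rank every prototype by distance to the sample; closer
                # ranks receive exponentially larger updates.
                ranks = np.argsort(np.argsort(np.abs(w - sample)))
                w += eps * np.exp(-ranks / lam) * (sample - w)
            return np.sort(w)

        rng = np.random.default_rng(1)
        x = np.sin(np.linspace(0, 8 * np.pi, 400)) + 0.1 * rng.normal(size=400)
        codebook = neural_gas_codebook(x)
        # Symbolize: each sample becomes the index of its nearest prototype.
        symbols = np.argmin(np.abs(x[:, None] - codebook[None, :]), axis=1)
        print("codebook:", np.round(codebook, 2), "first symbols:", symbols[:12])

    The resulting multi-symbol sequence can then be fed to a Lempel-Ziv complexity estimator in place of the usual binary symbolization.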

    Complexity Measures for Quantifying Changes in Electroencephalogram in Alzheimer's Disease

    Alzheimer’s disease (AD) is a progressive disorder that affects cognitive brain functions and starts many years before its clinical manifestations. A biomarker that provides a quantitative measure of changes in the brain due to AD in its early stages would be useful for early diagnosis, but this would involve dealing with large numbers of people, because up to 50% of dementia sufferers do not receive a formal diagnosis. Thus, there is a need for accurate, low-cost, and easy-to-use biomarkers that could detect AD in its early stages. Potentially, electroencephalogram (EEG) based biomarkers can play a vital role in the early diagnosis of AD, as they can fulfill these needs. This cross-sectional study aims to demonstrate the usefulness of EEG complexity measures in early AD diagnosis. We have focused on the three complexity methods that have shown the greatest promise in the detection of AD: Tsallis entropy (TsEn), Higuchi Fractal Dimension (HFD), and Lempel-Ziv complexity (LZC). Unlike previous approaches, in this study the complexity measures are derived from EEG frequency bands (instead of the entire EEG), as band-limited EEG activities have a significant association with AD, and this has led to enhanced performance. The results show that AD patients have significantly lower TsEn, HFD, and LZC values for specific EEG frequency bands and specific EEG channels, and that this information can be used to detect AD with a sensitivity and specificity of more than 90%.
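
    Of the three measures named above, the Higuchi Fractal Dimension is the most self-contained to sketch. The example below computes HFD on a band-limited segment, assuming a basic Butterworth band-pass and the alpha band (8-13 Hz) purely for illustration; the study's actual bands, channels, and preprocessing are not reproduced, and the input is random stand-in data rather than patient EEG.

        import numpy as np
        from scipy.signal import butter, filtfilt

        def higuchi_fd(x, kmax=8):
            # Higuchi fractal dimension: curve length L(k) is computed at
            # scales k = 1..kmax, and HFD is the slope of log L(k) vs log(1/k).
            N = len(x)
            Lk = []
            for k in range(1, kmax + 1):
                lengths = []
                for m in range(k):
                    idx = np.arange(m, N, k)
                    d = np.abs(np.diff(x[idx])).sum()
                    lengths.append(d * (N - 1) / ((len(idx) - 1) * k) / k)
                Lk.append(np.mean(lengths))
            slope, _ = np.polyfit(np.log(1.0 / np.arange(1, kmax + 1)), np.log(Lk), 1)
            return slope

        fs = 256
        t = np.arange(0, 10, 1 / fs)
        eeg = np.random.default_rng(0).normal(size=t.size)  # stand-in signal
        b, a = butter(4, [8, 13], btype="band", fs=fs)      # alpha band
        print(f"HFD (alpha band): {higuchi_fd(filtfilt(b, a, eeg)):.2f}")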

    Ensemble approach for detection of depression using EEG features

    Depression is a public health issue which severely affects one's well-being and causes negative social and economic effects for society. To raise awareness of these problems, this publication aims to determine whether long-lasting effects of depression can be detected from electroencephalographic (EEG) signals. The article contains an accuracy comparison of SVM, LDA, NB, kNN, and D3 binary classifiers trained using linear (relative band powers, APV, SASI) and non-linear (HFD, LZC, DFA) EEG features. The age- and gender-matched dataset consisted of 10 healthy subjects and 10 subjects with a depression diagnosis at some point in their lifetime. Several of the proposed feature selection and classifier combinations reached an accuracy of 90%, where all models were evaluated using 10-fold cross-validation and averaged over 100 repetitions with random sample permutations.
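
    A minimal sketch of the evaluation protocol, assuming scikit-learn implementations of four of the listed classifiers (the D3 classifier is omitted) and a single 10-fold cross-validation run rather than the 100 repetitions reported; X and y are random placeholders for the extracted EEG features and diagnosis labels.

        import numpy as np
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
        from sklearn.model_selection import StratifiedKFold, cross_val_score
        from sklearn.naive_bayes import GaussianNB
        from sklearn.neighbors import KNeighborsClassifier
        from sklearn.svm import SVC

        rng = np.random.default_rng(0)
        X = rng.normal(size=(20, 6))   # 20 subjects x 6 EEG features (stand-in)
        y = np.repeat([0, 1], 10)      # 10 healthy, 10 depressed

        cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=0)
        for name, clf in [("SVM", SVC()),
                          ("LDA", LinearDiscriminantAnalysis()),
                          ("NB", GaussianNB()),
                          ("kNN", KNeighborsClassifier(3))]:
            acc = cross_val_score(clf, X, y, cv=cv).mean()
            print(f"{name}: {acc:.2f}")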