
    An Intelligent System for Induction Motor Health Condition Monitoring

    Induction motors (IMs) are widely used in both industrial applications and household appliances. An online IM condition monitoring system is very useful for identifying an IM fault at an early stage, in order to prevent machinery malfunction, decreased productivity, and even catastrophic failures. Although a series of research efforts have been conducted over decades on IM fault diagnosis using various approaches, accurately diagnosing IM faults remains a challenging task due to complex signal transmission paths and environmental noise. The objective of this thesis is to develop a novel intelligent system for more reliable IM health condition monitoring. The developed intelligent monitor consists of two stages: feature extraction and decision making. In feature extraction, a spectrum synch technique is proposed to extract representative features from collected stator current signals for fault detection in IM systems: the local bands related to IM health conditions are synchronized to enhance fault characteristic features, and a central kurtosis method is suggested to extract representative information from the resulting spectrum and to formulate an index for fault diagnosis. In diagnostic pattern classification, an innovative selective boosting technique is proposed to effectively classify representative features into different IM health condition categories. IM health conditions can also be predicted by applying appropriate prognostic schemes. In system state forecasting, two forecasting techniques, a model-based pBoost predictor and a data-driven evolving fuzzy neural predictor, are proposed to forecast future states of the fault indices, which can be employed to further improve the accuracy of IM health condition monitoring. A novel fuzzy inference system is developed to integrate information from both the classifier and the predictor for IM health condition monitoring.
The effectiveness of the proposed techniques and the integrated monitor is verified through simulations and experimental tests corresponding to different IM states, such as IMs with broken rotor bars and with bearing outer race defects. The developed techniques, namely the selective boosting classifier, the pBoost predictor, and the evolving fuzzy neural predictor, are effective tools that can be employed in a much wider range of applications. In order to select the most reliable technique in each processing module, and so provide a more dependable assessment of IM health conditions, additional techniques are also proposed for each processing purpose: a conjugate Levenberg-Marquardt method and a Laplace particle swarm technique are proposed for model parameter training, whereas a mutated particle filter technique is developed for system state prediction. These tools could also be applied to fault diagnosis and other applications.
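The thesis does not give the exact form of the central kurtosis index, but the underlying idea of scoring a local spectral band of the stator current by its kurtosis can be illustrated with a minimal sketch. The function below is an assumption-laden stand-in (plain spectral kurtosis over a user-supplied band; the function name, band choice, and Hanning window are all illustrative, not taken from the thesis):

```python
import numpy as np

def central_kurtosis_index(current, fs, band, n_fft=4096):
    """Kurtosis-based fault index over a local band of the current spectrum.

    current : 1-D array of stator-current samples
    fs      : sampling frequency in Hz
    band    : (low, high) frequency band of interest in Hz
    """
    # Magnitude spectrum of the windowed current signal
    window = np.hanning(len(current))
    spectrum = np.abs(np.fft.rfft(current * window, n=n_fft))
    freqs = np.fft.rfftfreq(n_fft, d=1.0 / fs)

    # Restrict attention to the local band around fault-related frequencies
    mask = (freqs >= band[0]) & (freqs <= band[1])
    local = spectrum[mask]

    # Fourth standardized moment of the band: a peaky (faulty) band scores
    # high, while a flat (healthy) band stays near the noise baseline
    mu, sigma = local.mean(), local.std()
    return float(np.mean(((local - mu) / sigma) ** 4))
```

A fault such as a broken rotor bar injects sideband components near the supply frequency; the band kurtosis rises sharply when such a narrowband component appears, which is why a kurtosis-style index can serve as a fault indicator.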

    A Dynamic AdaBoost Algorithm With Adaptive Changes of Loss Function

    National Natural Science Foundation of China [61203176, 61174161]; Key Research Project of Fujian Province of China [2009H0044]; Fundamental Research Funds for the Central Universities in China, Xiamen University [2011121047201112G018, CXB2011035]; Natural Sciences and Engineering Research Council of Canada

    AdaBoost is a method to improve a given learning algorithm's classification accuracy by combining its hypotheses. Adaptivity, one of the significant advantages of AdaBoost, makes AdaBoost maximize the smallest margin so that AdaBoost has good generalization ability. However, when the samples with large negative margins are noisy or atypical, the maximized margin is actually a "hard margin." The adaptive feature makes AdaBoost sensitive to sampling fluctuations and prone to overfitting. Therefore, the traditional schemes prevent AdaBoost from overfitting by heavily damping the influences of samples with large negative margins. However, the samples with large negative margins are not always noisy or atypical; thus, the traditional schemes for preventing overfitting may not be reasonable. In order to learn a classifier with high generalization performance and prevent overfitting, it is necessary to perform statistical analysis on the margins of the training samples. Herein, Hoeffding's inequality is adopted as a statistical tool to divide training samples into reliable samples and temporarily unreliable samples. A new boosting algorithm, named DAdaBoost, is introduced to deal with reliable samples and temporarily unreliable samples separately. Since DAdaBoost adjusts its weighting scheme dynamically, the loss function of DAdaBoost is not fixed. In fact, it is a series of nonconvex functions that gradually approach the 0-1 function as the algorithm evolves. By defining a virtual classifier, the dynamically adjusted weighting scheme is well unified into the process of DAdaBoost, and the upper bound of the training error is deduced.
The experiments on both synthetic and real-world data show that DAdaBoost has many merits. Based on the experiments, we conclude that DAdaBoost can effectively prevent AdaBoost from overfitting.
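The abstract's key step, using Hoeffding's inequality to separate reliable samples from temporarily unreliable ones, can be illustrated with a simplified sketch. The function below is not the authors' exact DAdaBoost rule: it assumes unweighted per-round agreements y_i * h_t(x_i) in {-1, +1} (real AdaBoost margins weight each round by its coefficient) and a fixed confidence level delta; the function name and threshold form are illustrative assumptions:

```python
import numpy as np

def split_by_hoeffding(per_round_agreement, delta=0.05):
    """Split samples into reliable / temporarily unreliable sets via Hoeffding.

    per_round_agreement : (n_samples, T) array with entries y_i * h_t(x_i)
                          in {-1, +1} (agreement of weak learner t on sample i)
    delta               : confidence parameter

    For T bounded observations in [-1, 1], Hoeffding's inequality gives
    P(mean <= E[mean] - eps) <= exp(-T * eps**2 / 2), so setting the bound
    to delta yields eps = sqrt(2 * ln(1/delta) / T). A sample whose
    empirical margin falls below -eps is unlikely to have a non-negative
    true margin and is flagged as temporarily unreliable.
    """
    A = np.asarray(per_round_agreement, dtype=float)
    T = A.shape[1]
    margins = A.mean(axis=1)                          # empirical margin per sample
    eps = np.sqrt(2.0 * np.log(1.0 / delta) / T)      # Hoeffding deviation bound
    reliable = margins >= -eps
    return reliable, margins, eps
```

A boosting loop built on this split would then keep the usual exponential reweighting for reliable samples while damping or deferring the temporarily unreliable ones, which is the behavior the abstract attributes to DAdaBoost's dynamically adjusted weighting scheme.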