
    Automated Source Extraction for the Next Generation of Neutral Hydrogen Surveys

    This thesis is a first step towards developing the necessary tools to automatically extract and parameterize sources from future HI surveys with ASKAP, WSRT/Apertif, and the SKA. The current approach to large-scale HI surveys, that is, automated source finding followed by manual classification and parametrization, is no longer feasible in light of the data volumes expected for future surveys. We use data from EBHIS to develop and test a completely automated source extraction pipeline for extragalactic HI surveys. We apply a 2D-1D wavelet de-noising technique to HI data and show that it is well adapted to the typical shapes of sources encountered in HI surveys. This technique makes it possible to reliably extract sources even from data containing defects commonly encountered in single-dish HI surveys. Automating the task of false-positive rejection requires reliable parameters for all source candidates generated by the source-finding step. For this purpose, we develop a reliable, automated parametrization pipeline that combines time-tested algorithms with new approaches to baseline estimation, spectral filtering, and mask optimization. The accuracy of the algorithms is tested through extensive simulations. By comparison with the uncertainty estimates from HIPASS, we show that our automated pipeline achieves equal or better accuracy than manual parametrization. We implement source classification using artificial neural networks, with the automatically determined parameters of the source candidates as inputs. The viability of this approach is verified on a training data set comprising parameters measured from simulated sources and false positives extracted from real EBHIS data. Since the number of true positives from real data is small compared to the number of false positives, we explore various methods of training artificial neural networks on imbalanced data sets.
We show that the artificial neural networks trained in this way do not achieve sufficient completeness and reliability when applied to the source candidates detected in the extragalactic EBHIS survey. We therefore use the trained artificial neural networks in a semi-supervised manner to compile the first extragalactic EBHIS source catalog. The use of artificial neural networks reduces the number of source candidates requiring manual inspection by more than an order of magnitude. Comparing the results from EBHIS to HIPASS, we show that the number of sources in the compiled catalog is approximately half the number expected. The main reason for this detection inefficiency is mis-classification by the artificial neural networks. This is traced back to the limited training data set, whose coverage of the parameter space of real detections is insufficient, and to the similarity of true and false positives in the parameter space spanned by the measured parameters. We conclude that, while our automated source-finding and parametrization algorithms perform satisfactorily, source classification is the most challenging task for future HI surveys. Classification based on the measured source parameters does not provide sufficient discriminatory power, and we propose to explore methods based on machine vision, which learn features of real sources directly from the data.
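The wavelet de-noising idea used in the thesis can be illustrated in miniature. The sketch below is a 1D, single-level Haar soft-thresholding toy, not the thesis' 2D-1D pipeline; the threshold value and the test signal are invented. It shows the decompose, shrink detail coefficients, reconstruct pattern on which such techniques rest:

```python
import numpy as np

def haar_denoise(signal, threshold):
    """One-level Haar wavelet soft-threshold denoising (toy sketch)."""
    x = np.asarray(signal, dtype=float)
    assert x.size % 2 == 0, "even length required for this sketch"
    # Haar analysis: pairwise averages (approximation) and differences (detail)
    approx = (x[0::2] + x[1::2]) / np.sqrt(2)
    detail = (x[0::2] - x[1::2]) / np.sqrt(2)
    # Soft-threshold the detail coefficients, where most of the noise lives
    detail = np.sign(detail) * np.maximum(np.abs(detail) - threshold, 0.0)
    # Haar synthesis: invert the transform
    out = np.empty_like(x)
    out[0::2] = (approx + detail) / np.sqrt(2)
    out[1::2] = (approx - detail) / np.sqrt(2)
    return out

rng = np.random.default_rng(0)
clean = np.ones(64)                       # a flat "source" profile
noisy = clean + 0.1 * rng.standard_normal(64)
denoised = haar_denoise(noisy, threshold=0.2)
```

A real pipeline would use a multi-level 2D-1D decomposition over the two spatial and one spectral axis, but the shrink-and-reconstruct step is the same in spirit.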

    A Survey on Image Noises and Denoise Techniques

    Digital images are noisy due to environmental disturbances. To ensure image quality, noise reduction is a very important processing step before images are analysed or used. Data sets collected by image sensors are generally contaminated by noise. Imperfect instruments, problems with the data acquisition process, and interfering natural phenomena can all degrade the data of interest. Image denoising is a serious task for medical imaging, satellite and aerial image processing, robot vision, industrial vision systems, micro vision systems, space exploration, etc. Noise is characterized by its pattern and by its probabilistic characteristics. There is a wide variety of noise types; we focus on the most important ones and on the denoising filters that have been developed to reduce noise in corrupted images and enhance image quality.
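As a concrete illustration of one filter of the kind surveyed here, the sketch below implements a plain 3x3 median filter, the textbook remedy for salt-and-pepper (impulse) noise; the test image is invented:

```python
import numpy as np

def median_filter3(img):
    """3x3 median filter: replaces each pixel with the median of its
    neighbourhood, which rejects isolated impulse (salt-and-pepper) pixels."""
    padded = np.pad(img, 1, mode="edge")
    out = np.empty_like(img)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = np.median(padded[i:i + 3, j:j + 3])
    return out

# A flat grey patch corrupted by two impulse pixels
img = np.full((8, 8), 100.0)
img[2, 3] = 255.0   # "salt"
img[5, 6] = 0.0     # "pepper"
restored = median_filter3(img)
```

Because the corrupted pixels are isolated, the median of each neighbourhood is the background value, so both impulses are removed without blurring the rest of the patch.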

    Classification of Arrhythmia from ECG Signals using MATLAB

    An electrocardiogram (ECG) is a test performed on the heart to detect abnormalities in the cardiac cycle. Automatic classification of the ECG has evolved into an emerging tool in medical diagnosis for effective treatment. The work proposed in this paper has been implemented using MATLAB. We propose an efficient method to classify ECGs into normal and abnormal, as well as to classify the various abnormalities. In brief, after collecting and filtering the ECG signal, morphological and dynamic features are extracted from it, followed by a two-step classification method based on trait and characteristic evaluation. The ECG signals in this work are collected from the MIT-BIH, AHA, ESC and UCI databases. In addition, this paper provides a comparative study of various methods proposed via different techniques. The proposed technique allowed us to process, analyse and classify the ECG signals conveniently, with an accuracy of 97%.
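A toy version of the two-step idea (normal vs. abnormal first, then the kind of abnormality) can be sketched from RR intervals alone. The thresholds below are illustrative textbook heart-rate bounds, not the paper's trained classifier, and the classes are invented examples:

```python
import numpy as np

def rr_intervals(peak_times_s):
    """Dynamic feature: intervals between successive R-peaks, in seconds."""
    return np.diff(peak_times_s)

def classify_rhythm(rr):
    """Two-step rule-of-thumb classification:
    step 1 decides normal vs. abnormal, step 2 labels the abnormality."""
    mean_rr = float(np.mean(rr))
    if 0.6 <= mean_rr <= 1.0:          # 60-100 beats per minute
        return "normal"
    # Step 2: name the abnormality from the same feature
    return "tachycardia" if mean_rr < 0.6 else "bradycardia"
```

The paper's method uses many morphological and dynamic features and a learned classifier; this sketch only shows how a single dynamic feature flows through a two-step decision.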

    Automatic multi-resolution spatio-frequency mottle metric (sfmm) for evaluation of macrouniformity

    Evaluation of mottle is an area of ongoing research in print quality assessment. We propose an unsupervised evaluation technique and a metric that measures mottle in a hard-copy laser print. The proposed algorithm uses a scanned image to quantify the low-frequency variation, or mottle, in what is supposed to be a uniform field. 'Banding' and 'streaking' effects are explicitly ignored, and the proposed algorithm scales the test targets from flat print (good) to noisy print (bad) based on mottle only. The evaluation procedure is modeled as feature computation in different combinations of the spatial, frequency and wavelet domains. The model is largely independent of the nature of the input test target, i.e. whether it is chromatic or achromatic; the algorithm adapts accordingly and provides a mottle metric for any test target. The evaluation process uses three major modules: (1) a pre-processing stage, which includes acquisition of the test target and preparing it for processing; (2) spatio-frequency parameter estimation, where different features characterizing mottle are calculated in the spatial and frequency domains; and (3) an invalid-feature removal stage, where invalid or insignificant features (in the context of mottle) are eliminated and the dataset is ranked relatively. The algorithm was demonstrated successfully on a collection of 60 K-only printed images spread over 2 datasets printed on 3 different faulty printers and 4 different media. It was also tested on 5 color targets for the color version of the algorithm, printed using 2 different printers and 5 different media, provided by the Hewlett Packard Company.
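The core notion of scoring only the low-frequency variation in a nominally uniform field can be sketched with block averaging. This toy score is not the proposed sfmm (the block size and test patches are invented), but it shows why high-frequency content drops out of such a measure:

```python
import numpy as np

def mottle_score(img, block=8):
    """Toy mottle score: standard deviation of local block means.

    Averaging within blocks suppresses high-frequency variation, so the
    score responds mainly to low-frequency, mottle-like non-uniformity."""
    h = img.shape[0] - img.shape[0] % block
    w = img.shape[1] - img.shape[1] % block
    blocks = img[:h, :w].reshape(h // block, block, w // block, block)
    return blocks.mean(axis=(1, 3)).std()

flat = np.full((64, 64), 128.0)     # "Flat print (Good)": uniform field
mottled = flat.copy()
mottled[:, 32:] += 10.0             # low-frequency blotch across the field
```

A flat field scores zero while the blotched field scores higher, which is the ranking behaviour the metric is meant to provide.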

    A study on the application of independent component analysis to in vivo ¹H magnetic resonance spectra of childhood brain tumours for data processing

    Independent component analysis (ICA) has the potential to automatically determine the metabolite, macromolecular and lipid (MMLip) components that make up magnetic resonance (MR) spectra. However, the reliability with which this is accomplished, and the optimal ICA approach for investigating in vivo MR spectra, have not yet been determined. A wavelet shrinkage de-noising based enhancement algorithm, utilising a newly derived relationship between the real and imaginary parts of the MR spectrum, is proposed; this algorithm is more robust than conventional de-noising methods. The two approaches for applying ICA, blind source separation (BSS) and feature extraction (FE), are thoroughly examined. A feature dimension selection method, an issue which has not been adequately addressed, is proposed to set a theoretical guideline for ICA dimension reduction. Since the advantages and limitations of BSS-ICA and FE-ICA differ, combining them may compensate for their disadvantages and lead to better results. A novel ICA approach involving a hybrid of the two techniques for automated decomposition of MRS datasets is proposed. It has been demonstrated that hybrid ICA provides more realistic individual metabolite and MMLip components than BSS-ICA or FE-ICA. It can aid metabolite identification and assignment, and has the potential for extracting biologically useful features and discovering biomarkers.
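A minimal illustration of the BSS-ICA idea is to unmix two superimposed signals with a tiny fixed-point FastICA (cubic nonlinearity). This is a generic toy with invented signals, not the hybrid method proposed in the thesis:

```python
import numpy as np

def fastica_2d(X, n_iter=100, seed=0):
    """Minimal two-component FastICA: whiten, then deflationary
    fixed-point iteration with g(u) = u**3 (kurtosis-based contrast)."""
    X = X - X.mean(axis=1, keepdims=True)
    # Whiten: decorrelate the mixtures and rescale to unit variance
    d, E = np.linalg.eigh(np.cov(X))
    Z = E @ np.diag(d ** -0.5) @ E.T @ X
    rng = np.random.default_rng(seed)
    W = np.zeros((2, 2))
    for i in range(2):
        w = rng.standard_normal(2)
        w /= np.linalg.norm(w)
        for _ in range(n_iter):
            # Fixed-point update for the cubic nonlinearity on whitened data
            w = (Z * (w @ Z) ** 3).mean(axis=1) - 3 * w
            # Deflation: stay orthogonal to components already found
            w -= W[:i].T @ (W[:i] @ w)
            w /= np.linalg.norm(w)
        W[i] = w
    return W @ Z  # estimated sources, up to sign and ordering

t = np.linspace(0.0, 1.0, 2000)
s1 = np.sin(2 * np.pi * 5 * t)            # smooth "metabolite-like" component
s2 = np.sign(np.sin(2 * np.pi * 3 * t))   # blocky "baseline-like" component
S = np.vstack([s1, s2])
A = np.array([[1.0, 0.6], [0.4, 1.0]])    # unknown mixing matrix
recovered = fastica_2d(A @ S)
```

ICA recovers sources only up to sign, scale and permutation, which is one reason that assigning recovered components to metabolites (as the thesis does) needs further work beyond the separation itself.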

    Development of an acoustic emission monitoring system for crack detection during arc welding

    Condition monitoring techniques are employed to monitor the structural integrity of a structure or the performance of a process. They are used to evaluate structural integrity, including damage initiation and propagation in engineering components. Early damage detection, maintenance and repair can prevent structural failures, reduce maintenance and replacement costs, and guarantee that a structure runs securely during its service life. Acoustic emission (AE) technology is one of the condition monitoring methods widely employed in industry. AE is an attractive option for condition monitoring purposes, and the number of industrial applications in which it is used is rising. AE signals are elastic stress waves created by the fast release of energy from local sources inside materials, e.g. cracks initiating and propagating. The AE technique involves recording this phenomenon with piezoelectric sensors mounted on the surface of a structure; the signals are subsequently analysed to extract useful information about the nature of the AE source. AE has a high sensitivity to crack propagation and is able to locate the sources of AE activity. It is a passive approach: it listens to the elastic stress waves released by the material and can operate in real time to detect cracks both initiating and propagating. In this study, the use of AE technology to detect and monitor the possible occurrence of cracking during the arc welding process has been investigated. Real-time monitoring of an automated welding operation can help increase productivity and reliability while reducing cost. Monitoring of welding processes using AE technology remains a challenge, especially in the field of real-time data analysis, since a large amount of data is generated during monitoring. Moreover, many interferences can occur during the welding process, causing difficulties in the identification of the signals related to cracking events.
A significant issue in the practical use of the AE technique is the existence of independent signal sources other than those related to cracking. These spurious AE signals make it difficult to discover the signals arising from cracking activity. It is therefore essential to discriminate between signals in order to identify their source. The need for practical data analysis is related to the three main objectives of monitoring, on which this study has focused: first, the assessment of noise levels and of the characteristics of the signals from different materials and processes; second, the identification of signals arising from cracking; and third, the study of the feasibility of online monitoring using the AE features acquired in the initial study. Experimental work was carried out under controlled laboratory conditions for the acquisition of AE signals during arc welding. The AE signals have been used to assess noise levels as well as to identify the characteristics of the signals arising from different materials and processes. The features of the AE signals arising from cracking and from other possible signal sources in the welding process and environment have also been collected under laboratory conditions and analysed. In addition to the above-mentioned aspects of the study, two novel signal processing methods based on signal correlation have been developed for efficiently evaluating data acquired from AE sensors. The major contributions of this research can be summarised as follows. The study of noise levels and filtering for different arc welding processes and materials is one area where an original contribution with respect to current knowledge is identified. Another key contribution of the present study is the development of a model for achieving source discrimination, in which crack-related signals and other signals arising from the background are compared with each other.
Two methods with the potential to be used in a real-time monitoring system have been considered, based on cross-correlation and pattern recognition. The present thesis has contributed to improving the effectiveness of the AE technique for detecting the possible occurrence of cracking during arc welding.
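One signal-correlation building block for AE work of this kind is estimating the arrival-time difference of the same AE hit at two sensors from the peak of their cross-correlation, a standard first step towards source location. This is a generic sketch with an invented burst model and sampling rate, not the thesis' two methods themselves:

```python
import numpy as np

def arrival_delay(sig_a, sig_b, fs):
    """Delay (seconds) of the event in sig_a relative to sig_b, taken from
    the peak of the full cross-correlation of the two sensor signals."""
    corr = np.correlate(sig_a, sig_b, mode="full")
    lag = int(np.argmax(corr)) - (len(sig_b) - 1)
    return lag / fs

fs = 1e6                              # 1 MHz sampling, typical AE range
t = np.arange(2000) / fs
# Gaussian-windowed tone burst: a simple model of an AE hit
burst = np.exp(-((t - 5e-4) ** 2) / (2 * (3e-5) ** 2)) * np.sin(2 * np.pi * 1e5 * t)
delayed = np.roll(burst, 25)          # same hit arriving 25 samples later
```

With the delay and the wave speed in the material, the distance difference to the two sensors, and hence a source position along a weld line, can be estimated.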

    Application of Stationary Wavelet Support Vector Machines for the Prediction of Economic Recessions

    This paper examines the efficiency of various approaches for the classification and prediction of economic expansion and recession periods in the United Kingdom. Four approaches are applied. The first uses discrete choice models, namely Logit and Probit regressions, while the second is a Markov Switching Regime (MSR) model with time-varying transition probabilities. The third approach relies on Support Vector Machines (SVM), while the fourth approach, proposed in this study, is Stationary Wavelet SVM (SW-SVM) modelling. The findings show that SW-SVM and MSR present the best forecasting performance in the out-of-sample period. In addition, forecasts for the period 2012-2015 are provided using all approaches.
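The first (Logit) approach can be sketched in a few lines of plain-numpy gradient descent on the logistic log-likelihood; the recession indicator below is an invented toy series, not the paper's UK data:

```python
import numpy as np

def fit_logit(X, y, lr=0.5, n_iter=2000):
    """Logistic regression by gradient descent: a minimal stand-in for the
    paper's first (Logit) approach to recession classification."""
    w, b = np.zeros(X.shape[1]), 0.0
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-(X @ w + b)))   # predicted recession prob.
        w -= lr * X.T @ (p - y) / len(y)          # gradient of neg. log-lik.
        b -= lr * np.mean(p - y)
    return w, b

# Toy indicator: recession periods (y = 1) follow a negative term spread
spread = np.array([-2.0, -1.5, -1.0, 0.5, 1.0, 2.0]).reshape(-1, 1)
y = np.array([1.0, 1.0, 1.0, 0.0, 0.0, 0.0])
w, b = fit_logit(spread, y)
prob = 1.0 / (1.0 + np.exp(-(spread @ w + b)))
```

The SW-SVM approach differs in that the predictors are first decomposed with a stationary wavelet transform and the classifier is an SVM rather than a linear-in-log-odds model.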

    Artificial intelligence based ECG signal classification of sedentary, smokers and athletes

    The current study deals with the design of a computer-aided diagnosis procedure to classify three groups of people with different lifestyles, namely sedentary people, smokers and athletes. The ECG classification is based on statistical analysis of HRV and ECG features: the heart rate variability (HRV) parameters and statistical ECG features are used for pattern recognition by artificial intelligence classifiers. The ECG was recorded for a fixed duration using an EKG sensor. The HRV, time-domain and wavelet parameters were calculated using the NI Biomedical Startup Kit 3.0 and LabVIEW 2010. The important HRV, time-domain and wavelet features were selected by statistical non-linear classifiers (CART and BT), and these parameters were fed as input to artificial intelligence classifiers, namely an artificial neural network (ANN) and a support vector machine (SVM), which were used to classify 60 ECG signals. It was observed from the results that the multilayer perceptron (MLP) based ANN classifier gives an accuracy of 95%, the highest among the classifiers. The HRV study implies that the time-domain parameters (RMSSD and pNN50), frequency-domain parameters (HF power and LF/HF peak), the Poincare parameter (SD1) and the geometric parameters (RR triangular index and TINN) are higher in the athlete class and lower in the smoker class. Higher values of the HRV parameters indicate increased parasympathetic activity and decreased sympathetic activity of the ANS. This indicates that the athlete class has better health and a lower chance of cardiovascular disease, whereas the smoker class has a high chance of cardiovascular disease. The HRV parameters of the sedentary class were higher than those of the smoker class but lower than those of the athlete class, indicating a lower chance of cardiovascular disease in the sedentary class compared to the smoker class.
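The time-domain HRV parameters named above, RMSSD and pNN50, are simple to compute from a series of RR intervals. The two RR series below are invented stand-ins for the high-variability (athlete) and low-variability (smoker) patterns the study reports:

```python
import numpy as np

def rmssd(rr_ms):
    """Root mean square of successive RR-interval differences (ms)."""
    d = np.diff(rr_ms)
    return float(np.sqrt(np.mean(d ** 2)))

def pnn50(rr_ms):
    """Percentage of successive RR differences larger than 50 ms."""
    d = np.abs(np.diff(rr_ms))
    return 100.0 * float(np.mean(d > 50))

athlete_rr = np.array([900.0, 1000.0, 880.0, 1020.0, 890.0])  # high variability
smoker_rr = np.array([700.0, 705.0, 698.0, 702.0, 700.0])     # low variability
```

Under these definitions the high-variability series scores higher on both parameters, matching the athlete-versus-smoker ordering described in the abstract.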

    Chronology of brain tumor classification of intelligent systems based on mathematical modeling, simulation and image processing techniques

    Tumor classification using image processing techniques is becoming a powerful tool nowadays. Given the importance of this technique, the motivation of this review paper is to present the chronology of brain tumor classification using digital images and to survey the mathematical modeling and simulation of the intelligent systems involved. These intelligent systems include the artificial neural network (ANN), fuzzy logic (FL), support vector machine (SVM), and parallel support vector machine (PSVM). The chronology of brain tumor classification presents the latest literature related to the principles, types and interpretation of segmentation and classification of brain tumors via large digital datasets of magnetic resonance imaging (MRI) images. The paper classifies the modeling and simulation approaches into classical and automatic models. Around 115 papers from high-ranking, highly cited journals are reviewed. The paper comprises six parts, covering mathematical modeling, numerical simulation, image processing, and numerical results and performance; the conclusion standardizes the frame concept for a future chronological framework involving mathematical modeling and simulation. The research outcome is to differentiate tumor classification based on MRI images, modeling and simulation. Future work outlined for segmentation and classification is given in the conclusion.