
    On the detection of myocardial scar based on ECG/VCG analysis

    In this paper, we address the problem of detecting the presence of myocardial scar from standard ECG/VCG recordings, with the aim of developing a screening system for the early detection of scar at the point of care. Based on the pathophysiological implications of scarred myocardium, which results in disordered electrical conduction, we have implemented four distinct ECG signal-processing methodologies to obtain a set of features that can capture the presence of myocardial scar. Two of these methodologies are novel approaches for detecting scar presence: (a) the use of a template ECG heartbeat, derived from records without scar, coupled with wavelet coherence analysis, and (b) the utilization of the VCG. The pool of extracted features is then used to formulate an SVM classification model through supervised learning. Feature selection is also employed to remove redundant features and maximize the classifier's performance. Classification experiments using 260 records from three different databases reveal that the proposed system achieves 89.22% accuracy under 10-fold cross-validation, and an 82.07% success rate when tested on databases with different inherent characteristics, with similar levels of sensitivity (76%) and specificity (87.5%).
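    The feature-selection step described above can be sketched as ranking candidate features by a class-separability criterion. The abstract does not name the criterion, so the Fisher score below, the synthetic data and all function names are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def fisher_scores(X, y):
    """Fisher-style separability score per feature: squared difference of
    class means over the pooled within-class variance."""
    X0, X1 = X[y == 0], X[y == 1]
    num = (X0.mean(axis=0) - X1.mean(axis=0)) ** 2
    den = X0.var(axis=0) + X1.var(axis=0) + 1e-12
    return num / den

def select_top_k(X, y, k):
    """Indices of the k most class-separating features."""
    return np.argsort(fisher_scores(X, y))[::-1][:k]

# Synthetic two-class data: only features 3 and 7 carry class signal.
rng = np.random.default_rng(0)
y = rng.integers(0, 2, 200)
X = rng.normal(size=(200, 10))
X[:, 3] += 2.0 * y
X[:, 7] += 1.5 * y
top = select_top_k(X, y, 2)
```

    A selection step like this typically precedes classifier training, so the SVM only sees the retained columns.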

    Source finding, parametrization and classification for the extragalactic Effelsberg-Bonn HI Survey

    Context. Source extraction for large-scale HI surveys currently involves large amounts of manual labor. For the data volumes expected from future HI surveys with upcoming facilities, this approach is no longer feasible. Aims. We describe the implementation of a fully automated source finding, parametrization, and classification pipeline for the Effelsberg-Bonn HI Survey (EBHIS). With future radio astronomical facilities in mind, we want to explore the feasibility of a completely automated approach to source extraction for large-scale HI surveys. Methods. Source finding is implemented using wavelet denoising methods, which previous studies have shown to be a powerful tool, especially in the presence of data defects. For parametrization, we automate baseline fitting, mask optimization, and other tasks based on well-established algorithms that are currently used interactively. For the classification of candidates, we implement an artificial neural network, trained on a candidate set comprising false positives from real data and simulated sources. Using simulated data, we perform a thorough analysis of the implemented algorithms. Results. We compare the results from our simulations to the parametrization accuracy of the HI Parkes All-Sky Survey (HIPASS). Even though HIPASS is more sensitive than EBHIS in its current state, the parametrization accuracy and classification reliability match or surpass the manual approach used for HIPASS data. Comment: 13 pages, 13 figures, 1 table; accepted for publication in A&
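    The wavelet-denoising stage can be illustrated in miniature. The sketch below applies a one-dimensional Haar transform with soft thresholding to a synthetic spectral line; the wavelet family, threshold rule, decomposition depth and test signal are all assumptions for illustration, not the EBHIS pipeline's actual choices:

```python
import numpy as np

def haar_step(x):
    """One level of the orthonormal Haar transform."""
    a = (x[0::2] + x[1::2]) / np.sqrt(2)   # approximation coefficients
    d = (x[0::2] - x[1::2]) / np.sqrt(2)   # detail coefficients
    return a, d

def inv_haar_step(a, d):
    """Inverse of one Haar level."""
    x = np.empty(a.size * 2)
    x[0::2] = (a + d) / np.sqrt(2)
    x[1::2] = (a - d) / np.sqrt(2)
    return x

def denoise(x, levels=3, thresh=0.5):
    """Soft-threshold the detail coefficients at each level, then reconstruct."""
    a, details = x.astype(float), []
    for _ in range(levels):
        a, d = haar_step(a)
        details.append(np.sign(d) * np.maximum(np.abs(d) - thresh, 0.0))
    for d in reversed(details):
        a = inv_haar_step(a, d)
    return a

# Synthetic narrow line profile buried in Gaussian noise.
t = np.linspace(0, 1, 256)
clean = np.exp(-((t - 0.5) / 0.05) ** 2)
noisy = clean + np.random.default_rng(1).normal(0, 0.2, t.size)
den = denoise(noisy)
```

    Noise spreads roughly evenly across the detail coefficients while a compact line concentrates in a few, which is why thresholding suppresses noise more than signal.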

    A Review of Wavelet Based Fingerprint Image Retrieval

    A digital image is composed of pixels; brightness information and RGB triples are used to encode color. The image retrieval problem arises when searching for and retrieving images relevant to a user's request from a database. In content-based image retrieval, the input takes the form of an image: different features are extracted from it, and matching images are then retrieved from the database accordingly. Biometrics distinguish people by their physical or behavioral qualities, and fingerprints are regarded as among the most reliable traits for human identification because of their uniqueness and persistence. This paper reviews the retrieval of fingerprint images on the basis of their textural features using different wavelets. From the input fingerprint image, the center point area is first selected, and its textural features are extracted and stored in the database. When a query image arrives, its center point is likewise selected and its texture features are extracted. These features are then matched for similarity and the resulting image is displayed. DOI: 10.17762/ijritcc2321-8169.15026
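    The extract-store-match loop described above can be sketched generically. This is not any reviewed system's implementation: the single-level 2-D Haar decomposition, the subband energies used as the texture descriptor, the Euclidean nearest-neighbour match and the synthetic ridge patterns are all assumptions:

```python
import numpy as np

def haar2d(img):
    """Single-level 2-D Haar decomposition into LL, LH, HL, HH subbands."""
    a = (img[0::2] + img[1::2]) / 2.0      # vertical average
    d = (img[0::2] - img[1::2]) / 2.0      # vertical detail
    LL = (a[:, 0::2] + a[:, 1::2]) / 2.0
    LH = (a[:, 0::2] - a[:, 1::2]) / 2.0
    HL = (d[:, 0::2] + d[:, 1::2]) / 2.0
    HH = (d[:, 0::2] - d[:, 1::2]) / 2.0
    return LL, LH, HL, HH

def texture_features(img):
    """Energy of each detail subband -- a simple wavelet texture descriptor."""
    _, LH, HL, HH = haar2d(img)
    return np.array([np.mean(LH**2), np.mean(HL**2), np.mean(HH**2)])

def retrieve(query, database):
    """Index of the database image whose texture features are closest."""
    q = texture_features(query)
    dists = [np.linalg.norm(q - texture_features(img)) for img in database]
    return int(np.argmin(dists))

# Synthetic ridge patterns standing in for fingerprint texture.
rng = np.random.default_rng(2)
x = np.linspace(0, 8 * np.pi, 64)
ridges_h = np.tile(np.sin(x), (64, 1))     # horizontal ridge orientation
ridges_v = ridges_h.T                      # vertical ridge orientation
db = [ridges_h, ridges_v]
query = ridges_v + rng.normal(0, 0.1, (64, 64))
```

    In a real system the descriptor would be computed only over the detected center-point region and over several decomposition levels.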

    Application of Wavelets and Principal Component Analysis in Image Query and Mammography

    Breast cancer is currently one of the major causes of death for women in the U.S. Mammography is currently the most effective method for the detection of breast cancer, and early detection has proven to be an efficient tool to reduce the number of deaths. Mammography is the most demanding of all clinical imaging applications, as it requires high contrast, a high signal-to-noise ratio and high resolution with minimal x-radiation. According to studies [36], 10% to 30% of women having breast cancer and undergoing mammography have negative mammograms, i.e. are misdiagnosed. Furthermore, only 20%-40% of the women who undergo biopsy have cancer. Biopsies are expensive, invasive and traumatic to the patient. The high rate of false positives is partly because of the difficulties in the diagnosis process and partly due to the fear of missing a cancer. These facts motivate research aimed at enhancing the mammogram images (e.g. by enhancement of features such as clustered calcification regions, which have been found to be associated with breast cancer), at providing CAD (Computer Aided Diagnostics) tools that can alert the radiologist to potentially malignant regions in the mammograms, and at developing tools for the automated classification of mammograms into benign and malignant classes. In this paper we apply wavelet and principal component analysis, including the approximate Karhunen-Loeve transform, to mammographic images to derive feature vectors used for the classification of mammographic images from an early stage of malignancy. Another area where wavelet analysis has been found useful is image query. Image query of large databases must provide a fast and efficient search for the query image. Recently, a group of researchers developed an algorithm based on wavelet analysis that was found to provide fast and efficient search in large databases.
Their method overcomes some of the difficulties associated with previous approaches, but the search algorithm is sensitive to displacement and rotation of the query image, because wavelet analysis is not invariant under displacement and rotation. In this study we propose the integration of the Hotelling transform to improve on this sensitivity, and provide some experimental results in the context of the standard alphabetic characters.
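    The Karhunen-Loeve (principal component) projection used to derive feature vectors can be sketched in a few lines. The synthetic feature vectors, the dimensionality and the function names below are illustrative assumptions:

```python
import numpy as np

def pca_fit(X, n_components):
    """Karhunen-Loeve / PCA fit: mean plus the top eigenvectors of the
    sample covariance matrix."""
    mean = X.mean(axis=0)
    C = np.cov(X - mean, rowvar=False)
    vals, vecs = np.linalg.eigh(C)
    order = np.argsort(vals)[::-1][:n_components]
    return mean, vecs[:, order]

def pca_transform(X, mean, components):
    """Project centered data onto the retained principal directions."""
    return (X - mean) @ components

# Synthetic 5-D feature vectors whose variance lies mostly in 2 directions.
rng = np.random.default_rng(3)
latent = rng.normal(size=(300, 2)) * np.array([3.0, 1.5])
mixing = rng.normal(size=(2, 5))
X = latent @ mixing + rng.normal(0, 0.1, (300, 5))
mean, comps = pca_fit(X, 2)
Z = pca_transform(X, mean, comps)
```

    The low-dimensional scores Z would then serve as the feature vectors fed to a benign/malignant classifier.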

    Using Remote Monitoring And Machine Learning To Classify Slam Events Of Wave Piercing Catamarans

    An onboard monitoring system can measure features such as stress cycle counts and provide warnings of slamming. Considering current technology trends, there is an opportunity to incorporate machine learning methods into monitoring systems. A hull monitoring system has been developed and installed on a 111 m wave piercing catamaran (Hull 091) to remotely monitor the ship kinematics and hull structural responses. In parallel, an existing dataset from a similar vessel (Hull 061) was analysed using unsupervised and supervised learning models; these were found to be beneficial for the classification of bow entry events according to key kinematic parameters. A comparison of different algorithms for the bow entry classification, including linear support vector machines, naïve Bayes and decision trees, was conducted. In addition, using empirical probability distributions, the likelihood of wet-deck slamming was estimated given a vertical bow acceleration threshold of 1 in head seas, clustering the feature space with approximate probabilities of 0.001, 0.030 and 0.25.
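    The empirical-probability estimate mentioned above amounts to an exceedance count per cluster of bow entry events. The Rayleigh-distributed synthetic accelerations, the cluster names and the threshold value below are assumptions for illustration, not the vessel data:

```python
import numpy as np

def slam_probability(accels, threshold):
    """Empirical slam likelihood: fraction of bow entry events whose peak
    vertical bow acceleration exceeds the threshold."""
    accels = np.asarray(accels, dtype=float)
    return float(np.mean(accels > threshold))

# Hypothetical peak vertical accelerations for three clusters of events.
rng = np.random.default_rng(4)
mild = rng.rayleigh(0.2, 1000)
moderate = rng.rayleigh(0.45, 1000)
severe = rng.rayleigh(0.9, 1000)
probs = [slam_probability(c, 1.0) for c in (mild, moderate, severe)]
```

    Clustering the kinematic feature space first, then computing such a fraction per cluster, yields the kind of graded probabilities the abstract reports.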

    Chandra Orion Ultradeep Project: Observations and Source Lists

    We present a description of the data reduction methods and the derived catalog of more than 1600 X-ray point sources from the exceptionally deep January 2003 Chandra X-ray Observatory observation of the Orion Nebula Cluster and the embedded populations around OMC-1. The observation was obtained with Chandra's Advanced CCD Imaging Spectrometer (ACIS) and has been nicknamed the Chandra Orion Ultradeep Project (COUP). With an 838 ks exposure made over a continuous period of 13.2 days, the COUP observation provides the most uniform and comprehensive dataset on the X-ray emission of normal stars ever obtained in the history of X-ray astronomy. Comment: 52 pages, 11 figures, 12 tables; accepted for publication in ApJS, special issue dedicated to the Chandra Orion Ultradeep Project. A version with high-quality figures can be found at http://www.astro.psu.edu/users/gkosta/COUP_Methodology.pd

    Automatic Alignment of 3D Multi-Sensor Point Clouds

    Automatic 3D point cloud alignment is a major research topic in photogrammetry, computer vision and computer graphics. In this research, two keypoint feature matching approaches are developed and proposed for the automatic alignment of 3D point clouds that have been acquired from different sensor platforms and lie in different 3D conformal coordinate systems. The first proposed approach is based on 3D keypoint feature matching. First, surface curvature information is utilized for scale-invariant 3D keypoint extraction. Adaptive non-maxima suppression (ANMS) is then applied to retain the most distinct and well-distributed set of keypoints. Afterwards, every keypoint is characterized by a scale-, rotation- and translation-invariant 3D surface descriptor, called the radial geodesic distance-slope histogram. Similar keypoint descriptors in the source and target datasets are then matched using bipartite graph matching, followed by a modified RANSAC for outlier removal. The second proposed method is based on 2D keypoint matching performed on height map images of the 3D point clouds. Height map images are generated by projecting the 3D point clouds onto a planimetric plane. Afterwards, a multi-scale wavelet 2D keypoint detector with ANMS is proposed to extract keypoints on the height maps. Then, a scale-, rotation- and translation-invariant 2D descriptor, referred to as the Gabor Log-Polar Rapid Transform descriptor, is computed for all keypoints. Finally, source and target height map keypoint correspondences are determined using bi-directional nearest-neighbour matching, together with the modified RANSAC for outlier removal. Each method is assessed on multi-sensor, urban and non-urban 3D point cloud datasets. Results show that, unlike the 3D-based method, the height map-based approach is able to align source and target datasets with differences in point density, point distribution and missing point data.
Findings also show that the 3D-based method obtained lower transformation errors and a greater number of correspondences when the source and target have similar point characteristics. The 3D-based approach attained absolute mean alignment differences in the range of 0.23 m to 2.81 m, whereas the height map approach had a range from 0.17 m to 1.21 m. These differences meet the proximity requirements of the data characteristics and the further application of fine co-registration approaches.
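    The RANSAC outlier-removal step combined with a least-squares rigid-transform fit can be sketched as follows. This is a textbook RANSAC-plus-Kabsch sketch on synthetic correspondences; the thesis's actual modification to RANSAC is not reproduced here, and the tolerances, iteration count and variable names are assumptions:

```python
import numpy as np

def kabsch(P, Q):
    """Least-squares rigid transform (R, t) mapping point set P onto Q."""
    cP, cQ = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cP).T @ (Q - cQ)
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # no reflections
    R = Vt.T @ D @ U.T
    return R, cQ - R @ cP

def ransac_rigid(P, Q, iters=200, tol=0.05, seed=0):
    """Fit a rigid transform from putative correspondences P[i] <-> Q[i],
    rejecting outlier matches RANSAC-style."""
    rng = np.random.default_rng(seed)
    best_inliers = np.zeros(len(P), dtype=bool)
    for _ in range(iters):
        idx = rng.choice(len(P), 3, replace=False)      # minimal sample
        R, t = kabsch(P[idx], Q[idx])
        err = np.linalg.norm((P @ R.T + t) - Q, axis=1)
        inliers = err < tol
        if inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    return kabsch(P[best_inliers], Q[best_inliers]), best_inliers

# Synthetic correspondences: a known rotation/translation plus 20% bad matches.
rng = np.random.default_rng(5)
P = rng.uniform(-1, 1, (100, 3))
theta = 0.4
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0, 0.0, 1.0]])
t_true = np.array([0.5, -0.2, 0.1])
Q = P @ R_true.T + t_true
Q[:20] += rng.uniform(-1, 1, (20, 3))
(R_est, t_est), inliers = ransac_rigid(P, Q)
```

    The final transform is refit on all inliers of the best minimal sample, which is the standard way to sharpen the estimate after outlier rejection.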

    Linear friction weld process monitoring of fixture cassette deformations using empirical mode decomposition

    Due to its inherent advantages, linear friction welding is a solid-state joining process of increasing importance to the aerospace, automotive, medical and power generation equipment industries. Tangential oscillations and forge stroke during the burn-off phase of the joining process introduce essential dynamic forces, which can also be detrimental to the welding process. Since burn-off is a critical phase in the manufacturing stage, process monitoring is fundamental for quality and stability control purposes. This study aims to improve workholding stability through the analysis of fixture cassette deformations. Methods and procedures for process monitoring are developed and implemented in a fail-or-pass assessment system for fixture cassette deformations during the burn-off phase. Additionally, the de-noised signals are compared to results from previous production runs. The observed deformations as a consequence of the forces acting on the fixture cassette are measured directly during the welding process. Data on the linear friction-welding machine are acquired and de-noised using empirical mode decomposition, before the burn-off phase is extracted. This approach enables a direct, objective comparison of the signal features with trends from previous successful welds. The capacity of the whole process monitoring system is validated and demonstrated through the analysis of a large number of signals obtained from welding experiments.
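    Empirical mode decomposition itself can be illustrated with a minimal sifting loop that extracts one intrinsic mode function from a synthetic two-tone signal. The sketch simplifies the standard algorithm by using linear (rather than cubic-spline) envelope interpolation, and the signal parameters and function names are assumptions:

```python
import numpy as np

def local_extrema(x):
    """Indices of interior local maxima and minima."""
    maxima = np.where((x[1:-1] > x[:-2]) & (x[1:-1] > x[2:]))[0] + 1
    minima = np.where((x[1:-1] < x[:-2]) & (x[1:-1] < x[2:]))[0] + 1
    return maxima, minima

def sift(x, n_iter=8):
    """Extract one intrinsic mode function by repeated envelope sifting.
    Simplified: linear envelope interpolation instead of cubic splines."""
    h = x.astype(float)
    idx = np.arange(h.size)
    for _ in range(n_iter):
        maxima, minima = local_extrema(h)
        if maxima.size < 2 or minima.size < 2:
            break
        upper = np.interp(idx, maxima, h[maxima])   # upper envelope
        lower = np.interp(idx, minima, h[minima])   # lower envelope
        h = h - (upper + lower) / 2.0               # subtract local mean
    return h

# Synthetic signal: a fast oscillation riding on a slow drift.
t = np.linspace(0, 1, 1024)
fast = np.sin(2 * np.pi * 40 * t)
slow = 0.8 * np.sin(2 * np.pi * 3 * t)
imf = sift(fast + slow)
residue = (fast + slow) - imf
```

    Subtracting each extracted IMF and sifting the residue again yields the full decomposition; de-noising then amounts to discarding or thresholding the modes dominated by noise.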
