    Building Gene Expression Profile Classifiers with a Simple and Efficient Rejection Option in R

    Background: The collection of gene expression profiles from DNA microarrays and their analysis with pattern recognition algorithms is a powerful technology applied to several biological problems. Common pattern recognition systems classify samples by assigning them to a set of known classes. However, in a clinical diagnostics setup, novel and unknown classes (new pathologies) may appear, and one must be able to reject those samples that do not fit the trained model. The problem of implementing a rejection option in a multi-class classifier has not been widely addressed in the statistical literature. Gene expression profiles represent a critical case study since they suffer from the curse of dimensionality, which negatively affects the reliability of both traditional rejection models and more recent approaches such as one-class classifiers. Results: This paper presents a set of empirical decision rules that can be used to implement a rejection option in a set of multi-class classifiers widely used for the analysis of gene expression profiles. In particular, we focus on the classifiers implemented in the R Language and Environment for Statistical Computing (R for short in the remainder of this paper). The main contribution of the proposed rules is their simplicity, which enables easy integration with available data analysis environments. Since tuning the parameters involved in the definition of a rejection model is often a complex and delicate task, in this paper we exploit an evolutionary strategy to automate this process. This allows the final user to maximize the rejection accuracy with minimum manual intervention. Conclusions: This paper shows how simple decision rules can ease the use of complex machine learning algorithms in real experimental setups. The proposed approach is almost completely automated and is therefore a good candidate for integration in data analysis flows in labs where the machine learning expertise required to tune traditional classifiers might not be available.
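    The flavour of such rejection rules can be illustrated with a minimal sketch: accept the winning class only when it is both confident and well separated from the runner-up, otherwise reject the sample. The `threshold` and `margin` values are hypothetical placeholders for the tuned parameters the paper obtains via its evolutionary strategy, and the sketch is in Python rather than R for brevity.

```python
import numpy as np

def classify_with_rejection(probs, labels, threshold=0.7, margin=0.2):
    """Assign each sample to a class, or reject it when the classifier's
    posterior probabilities do not support a confident decision.

    probs: (n_samples, n_classes) array of class posterior probabilities.
    labels: class names, one per column of `probs`.
    The threshold/margin rule is an illustrative stand-in for the paper's
    empirical decision rules, whose exact form is not given in the abstract.
    """
    decisions = []
    for p in probs:
        order = np.argsort(p)[::-1]
        top, second = p[order[0]], p[order[1]]
        # Reject if the winner is weak, or not clearly ahead of the runner-up.
        if top < threshold or (top - second) < margin:
            decisions.append("reject")
        else:
            decisions.append(labels[order[0]])
    return decisions
```

    A rule of this shape is attractive precisely because it sits on top of any classifier that emits per-class scores, which is what makes it easy to bolt onto existing R analysis pipelines.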

    Pattern Recognition in a High Pressure Time Projection Chamber prototype with a Micromegas readout for the 136Xe double beta decay

    The objective of this study has been to analyse the pattern recognition potential for background discrimination in a high pressure Time Projection Chamber (TPC) equipped with Micromegas detectors searching for the 136Xe neutrinoless double beta decay (ββ0ν). In addition, the commissioning of a medium size prototype equipped with pixelized Micromegas has allowed a first recognition of tracks over long drift distances (38 cm) and a study of the performance of these detectors. Results show good pattern recognition capabilities and energy resolution, proving the operability of these detectors for a ββ0ν experiment. The background rejection potential of a 100 kg TPC operating with Xe gas at 10 bar and equipped with Micromegas to study the ββ0ν has been evaluated with Monte Carlo simulations. After the application of discrimination algorithms based on pattern recognition, a rejection power of six orders of magnitude was obtained while retaining a signal efficiency of 40%.

    Studying the Imaging Characteristics of Ultra Violet Imaging Telescope (UVIT) through Numerical Simulations

    The Ultra-Violet Imaging Telescope (UVIT) is one of the five payloads aboard the Indian Space Research Organization (ISRO)'s ASTROSAT space mission. The science objectives of UVIT are broad, extending from individual hot stars and star-forming regions to active galactic nuclei. The imaging performance of UVIT depends on several factors in addition to the optics, e.g. resolution of the detectors, satellite drift and jitter, image frame acquisition rate, sky background, source intensity, etc. The use of intensified CMOS-imager based photon counting detectors in UVIT adds its own complexity to the reconstruction of the images. All these factors could lead to several systematic effects in the reconstructed images. A study has been done through numerical simulations with artificial point sources and an archival image of a galaxy from the GALEX data archive, to explore the effects of all the above mentioned parameters on the reconstructed images. In particular, the issues of angular resolution, photometric accuracy and photometric non-linearity associated with the intensified CMOS-imager based photon counting detectors have been investigated. The photon events in image frames are detected by three different centroid algorithms with energy thresholds. Our results show that in the presence of bright sources, reconstructed images from UVIT would suffer from photometric distortion in a complex way, and overlapping photon events could lead to complex patterns near the bright sources. Further, the angular resolution, photometric accuracy and distortion would depend on the values of the thresholds chosen to detect photon events. Comment: Submitted to PASP, 16 pages, 9 figures
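    To make the centroiding step concrete, here is a minimal sketch of one plausible scheme: find local maxima above an energy threshold and refine each to sub-pixel accuracy with a 3x3 centre-of-gravity estimate. The actual UVIT centroid algorithms and threshold values are not given in the abstract, so the window size and `energy_threshold` here are assumptions for illustration only.

```python
import numpy as np

def centroid_3x3(frame, energy_threshold=5.0):
    """Detect photon events in an image frame and estimate their positions
    with a 3x3 centre-of-gravity centroid (an illustrative sketch, not the
    UVIT pipeline's actual algorithm)."""
    events = []
    h, w = frame.shape
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            patch = frame[y - 1:y + 2, x - 1:x + 2]
            # A candidate event is a local maximum above the energy threshold.
            if frame[y, x] >= energy_threshold and frame[y, x] == patch.max():
                total = patch.sum()
                dy, dx = np.mgrid[-1:2, -1:2]
                cy = y + (dy * patch).sum() / total
                cx = x + (dx * patch).sum() / total
                events.append((cy, cx))
    return events
```

    A sketch like this also hints at why overlapping events distort photometry: when two photons land within one window, the single centroid and summed energy misrepresent both.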

    An Artificial Intelligence Approach to Detect Visual Field Progression in Glaucoma Based on Spatial Pattern Analysis.

    Purpose: To detect visual field (VF) progression by analyzing spatial pattern changes. Methods: We selected 12,217 eyes from 7360 patients with at least five reliable 24-2 VFs and 5 years of follow-up with an interval of at least 6 months. VFs were decomposed into 16 archetype patterns previously derived by artificial intelligence techniques. Linear regressions were applied to the 16 archetype weights of VF series over time. We defined progression as the rate of decrease of the normal archetype, or the rate of increase of any of the 15 VF defect archetypes, falling outside normal limits. The archetype method was compared with mean deviation (MD) slope, Advanced Glaucoma Intervention Study (AGIS) scoring, Collaborative Initial Glaucoma Treatment Study (CIGTS) scoring, and the permutation of pointwise linear regression (PoPLR), and was validated by a subset of VFs assessed by three glaucoma specialists. Results: In the method development cohort of 11,817 eyes, the archetype method agreed more with MD slope (kappa: 0.37) and PoPLR (0.33) than AGIS (0.12) and CIGTS (0.22). The most frequently progressed patterns included decreased normal pattern (63.7%), and increased nasal steps (16.4%), altitudinal loss (15.9%), superior-peripheral defect (12.1%), paracentral/central defects (10.5%), and near total loss (10.4%). In the clinical validation cohort of 397 eyes with 27.5% confirmed progression, the agreement (kappa) and accuracy (mean of hit rate and correct rejection rate) of the archetype method (0.51 and 0.77) significantly (P < 0.001 for all) outperformed AGIS (0.06 and 0.52), CIGTS (0.24 and 0.59), MD slope (0.21 and 0.59), and PoPLR (0.26 and 0.60). Conclusions: The archetype method can inform clinicians of VF progression patterns.
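    The core of the archetype method, as described, is to regress each archetype weight against visit time and flag progression from the slopes. A minimal sketch of that step follows; the per-archetype normal limits used here are hypothetical placeholders, since the paper derives them from normative data not reproduced in the abstract.

```python
import numpy as np

def archetype_progression(weights, times, normal_idx=0, limits=None):
    """Flag visual-field progression from a series of archetype decompositions.

    weights: (n_visits, n_archetypes) archetype weights of successive VFs.
    times:   visit times in years.
    Progression is declared when the slope of the normal archetype falls below
    its lower limit, or any defect archetype's slope exceeds its upper limit.
    The `limits` defaults are illustrative stand-ins, not the paper's values.
    """
    n_arch = weights.shape[1]
    if limits is None:
        limits = np.full(n_arch, 0.01)  # placeholder per-archetype limits
    # Slope of a first-degree least-squares fit for each archetype weight.
    slopes = np.array([np.polyfit(times, weights[:, k], 1)[0]
                       for k in range(n_arch)])
    progressed = bool(slopes[normal_idx] < -limits[normal_idx])
    for k in range(n_arch):
        if k != normal_idx and slopes[k] > limits[k]:
            progressed = True
    return progressed, slopes
```

    Reporting which archetype's slope triggered the flag is what lets the method name the pattern of progression (e.g. nasal step, altitudinal loss) rather than only a global index.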

    Analysis of ICP variants for the registration of partially overlapping time-of-flight range images

    The iterative closest point (ICP) algorithm is one of the most commonly used methods for registering partially overlapping range images. Nevertheless, this algorithm was not originally designed for this task, and many variants have been proposed in an effort to improve its proficiency. The relatively new full-field amplitude-modulated time-of-flight range imaging cameras present further complications to registration in the form of measurement errors due to mixed and scattered light. This paper investigates the effectiveness of the most common ICP variants applied to range image data acquired from full-field range imaging cameras. The original ICP algorithm combined with boundary rejection performed the same as or better than the majority of variants tested. In fact, many of these variants actually degraded the registration alignment.
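    For reference, the baseline being varied is point-to-point ICP: alternately match each source point to its nearest destination point, then solve for the rigid transform that best aligns the matches. The sketch below uses brute-force matching and the SVD (Kabsch) closed-form transform; it omits the boundary rejection and the other variants the paper compares, and is a generic illustration rather than the paper's implementation.

```python
import numpy as np

def icp(src, dst, iters=20):
    """Minimal point-to-point ICP aligning `src` (N,3) onto `dst` (M,3).

    Returns (R, t) such that src @ R.T + t approximates dst. Brute-force
    nearest neighbours; a sketch of the baseline algorithm only.
    """
    R, t = np.eye(3), np.zeros(3)
    cur = src.copy()
    for _ in range(iters):
        # Match every current point to its nearest neighbour in dst.
        d2 = ((cur[:, None, :] - dst[None, :, :]) ** 2).sum(-1)
        match = dst[d2.argmin(axis=1)]
        # Closed-form best rigid transform for these matches (Kabsch/SVD).
        mu_s, mu_d = cur.mean(0), match.mean(0)
        H = (cur - mu_s).T @ (match - mu_d)
        U, _, Vt = np.linalg.svd(H)
        Ri = Vt.T @ U.T
        if np.linalg.det(Ri) < 0:   # guard against a reflection solution
            Vt[-1] *= -1
            Ri = Vt.T @ U.T
        ti = mu_d - Ri @ mu_s
        cur = cur @ Ri.T + ti
        R, t = Ri @ R, Ri @ t + ti  # accumulate the global transform
    return R, t
```

    Variants plug into exactly two places in this loop: how matches are selected or rejected (e.g. discarding matches on mesh boundaries, which matters for partial overlap) and how matches are weighted in the transform estimate.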