
    A Comparison of pattern classification techniques for orienting chest X-rays

    The problem of orienting digital images of chest x-rays that were captured at some multiple of 90 degrees from the true orientation is a typical pattern classification problem. In this case, the solution must assign each digital image to one of four classes, where each class corresponds to one of the four possible orientations. A large number of techniques are available for developing a pattern classifier. Some of these techniques are characterized by independent variables whose values are difficult to relate back to the problem being solved. If a technique is highly sensitive to the values of these variables, the lack of a rigorous way of defining them can be a significant disadvantage for the inexperienced researcher. This thesis presents the author's experiments in solving the chest x-ray orientation problem with four different pattern classification techniques: genetic programming, an artificial neural network trained with backpropagation, a probabilistic neural network, and a simple linear classifier. In addition, the author demonstrates that an understanding of the design of a feature set may allow a programmer to develop a traditional program that does an adequate job of solving the classification problem. Comparisons of the different techniques are based not only on their success at solving the problem, but also on the time required to find an acceptable solution and the degree to which each technique is sensitive to the values of the variables that characterize it. The thesis demonstrates that all of the techniques can be used to derive very accurate chest x-ray orientation classifiers. While it is dangerous to generalize the results of these experiments to pattern classification problems in general, the author argues that the magnitude of the differences in performance between the techniques minimizes this danger. In particular, the experiments suggest that the linear classifier is so computationally inexpensive that it is always worth trying, unless there is a priori knowledge that it will fail. The experiments also suggest that genetic programming is much more computationally expensive than the linear classifier, artificial neural network, and probabilistic neural network techniques. Of the four conventional pattern classification techniques examined, the artificial neural network produced the most accurate classifiers for the x-ray orientation problem. In addition, the results of a number of trials suggest that the final accuracy of the classifier is relatively insensitive to the values of the parameters that characterize this technique, making it an appropriate choice for the inexperienced researcher. With respect to the ability of the resulting classifier to accurately orient sample x-rays that were not included in the training set, the artificial neural network performed well compared to the other techniques. Although the classifiers produced by genetic programming were significantly more expensive to construct and slightly less accurate than the best artificial neural networks, the results of genetic programming experiments can provide insights into the problem being studied that would be difficult to discern from the classifiers produced by the other techniques. For example, one of the classifiers produced by genetic programming uses only eight of the twenty feature values extracted from the sample x-ray. Not only does this reduce the cost of extracting the feature values from an unknown sample, but the classifier itself would be much more efficient to evaluate than the classifiers produced by any of the other techniques.
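    The abstract singles out the simple linear classifier as cheap enough to be almost always worth trying. As a rough illustration of what such a baseline might look like (an assumed sketch, not the thesis's actual classifier or feature set), the code below fits a least-squares linear classifier on 20-value feature vectors and maps the predicted class index to one of the four orientations; the random training data stands in for real extracted features.

    import numpy as np

    ORIENTATIONS = (0, 90, 180, 270)  # the four possible rotations, one class each

    def fit_linear_classifier(X, y, n_classes=4):
        """Least-squares fit of one-hot targets; X is (n_samples, n_features)."""
        X1 = np.hstack([X, np.ones((len(X), 1))])   # append a bias column
        T = np.eye(n_classes)[y]                    # one-hot class targets
        W, *_ = np.linalg.lstsq(X1, T, rcond=None)  # weights, shape (n_features + 1, n_classes)
        return W

    def predict(W, X):
        X1 = np.hstack([X, np.ones((len(X), 1))])
        return np.argmax(X1 @ W, axis=1)            # index into ORIENTATIONS

    # Illustrative usage with synthetic data in place of real 20-value feature vectors.
    rng = np.random.default_rng(0)
    X_train = rng.normal(size=(200, 20))
    y_train = rng.integers(0, 4, size=200)
    W = fit_linear_classifier(X_train, y_train)
    print([ORIENTATIONS[c] for c in predict(W, X_train[:5])])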

    A generic optimising feature extraction method using multiobjective genetic programming

    In this paper, we present a generic, optimising feature extraction method using multiobjective genetic programming. We re-examine the feature extraction problem and show that effective feature extraction can significantly enhance the performance of pattern recognition systems with simple classifiers. A framework is presented to evolve optimised feature extractors that transform an input pattern space into a decision space in which maximal class separability is obtained. We have applied this method to real-world datasets from the UCI Machine Learning and StatLog databases to verify our approach and compare our proposed method with other reported results. We conclude that our algorithm is able to produce classifiers of performance superior (or equivalent) to the conventional classifiers examined, suggesting that it removes the need to exhaustively evaluate a large family of conventional classifiers on any new problem.
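    To make the evaluation idea in this abstract concrete, the sketch below scores a candidate feature extractor on two objectives: class separability in the transformed decision space and the size of the evolved expression. This is an assumed, simplified illustration; the callable candidate, the Fisher-style scatter ratio, and the toy data are placeholders rather than the paper's actual framework, which evolves the extractors themselves with multiobjective genetic programming.

    import numpy as np

    def separability(Z, y):
        """Fisher-style ratio of between-class to within-class scatter (to be maximised)."""
        classes = np.unique(y)
        mu = Z.mean(axis=0)
        s_between = sum((y == c).sum() * np.sum((Z[y == c].mean(axis=0) - mu) ** 2)
                        for c in classes)
        s_within = sum(np.sum((Z[y == c] - Z[y == c].mean(axis=0)) ** 2)
                       for c in classes)
        return s_between / (s_within + 1e-12)

    def evaluate(candidate, tree_size, X, y):
        """Multiobjective fitness: (separability to maximise, expression size to minimise)."""
        Z = np.apply_along_axis(candidate, 1, X)    # transform every input pattern
        return separability(Z, y), tree_size

    # Illustrative candidate: project the raw pattern onto two hand-picked derived features.
    candidate = lambda x: np.array([x[0] - x[3], x[1] * x[2]])
    rng = np.random.default_rng(1)
    X = rng.normal(size=(100, 5))
    y = rng.integers(0, 2, size=100)
    print(evaluate(candidate, tree_size=4, X=X, y=y))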

    Implicitly Constrained Semi-Supervised Least Squares Classification

    We introduce a novel semi-supervised version of the least squares classifier. This implicitly constrained least squares (ICLS) classifier minimizes the squared loss on the labeled data among the set of parameters implied by all possible labelings of the unlabeled data. Unlike other discriminative semi-supervised methods, our approach does not introduce explicit additional assumptions into the objective function, but leverages implicit assumptions already present in the choice of the supervised least squares classifier. We show that this approach can be formulated as a quadratic programming problem and that its solution can be found using a simple gradient descent procedure. We prove that, in a certain sense, our method never leads to performance worse than the supervised classifier. Experimental results on benchmark datasets corroborate this theoretical result in the multidimensional case, also in terms of the error rate. Comment: 12 pages, 2 figures, 1 table. The Fourteenth International Symposium on Intelligent Data Analysis (2015), Saint-Etienne, France
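    The abstract describes the feasible set as the parameter vectors reachable by some labeling of the unlabeled data, optimised with a simple gradient procedure. The sketch below illustrates that idea for a binary problem with 0/1 targets: soft labels for the unlabeled points are adjusted by projected finite-difference gradient descent so that the resulting ordinary least-squares fit minimises the squared loss on the labeled data. The step size, iteration count, and finite-difference gradient are illustrative assumptions, not the authors' exact procedure.

    import numpy as np

    def ols_weights(X, y):
        """Ordinary least-squares weights with a bias term."""
        X1 = np.hstack([X, np.ones((len(X), 1))])
        return np.linalg.lstsq(X1, y, rcond=None)[0]

    def labeled_loss(u, Xl, yl, Xu):
        """Squared loss on the labeled data of the classifier fit with soft labels u."""
        w = ols_weights(np.vstack([Xl, Xu]), np.concatenate([yl, u]))
        pred = np.hstack([Xl, np.ones((len(Xl), 1))]) @ w
        return np.mean((pred - yl) ** 2)

    def icls_fit(Xl, yl, Xu, steps=200, lr=0.5, eps=1e-4):
        u = np.full(len(Xu), 0.5)                        # start from uninformative soft labels
        for _ in range(steps):
            grad = np.array([(labeled_loss(u + eps * e, Xl, yl, Xu)
                              - labeled_loss(u - eps * e, Xl, yl, Xu)) / (2 * eps)
                             for e in np.eye(len(Xu))])  # finite-difference gradient
            u = np.clip(u - lr * grad, 0.0, 1.0)         # project back onto [0, 1]
        return ols_weights(np.vstack([Xl, Xu]), np.concatenate([yl, u]))

    # Illustrative usage on synthetic data.
    rng = np.random.default_rng(2)
    Xl = rng.normal(size=(10, 2))
    yl = (Xl[:, 0] > 0).astype(float)
    Xu = rng.normal(size=(30, 2))
    print(icls_fit(Xl, yl, Xu))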