19 research outputs found

    A breast cancer diagnosis system: a combined approach using rough sets and probabilistic neural networks

    In this paper, we present a medical decision support system based on a hybrid approach utilising rough sets and a probabilistic neural network. We utilised the ability of rough sets to perform dimensionality reduction, eliminating redundant attributes from a biomedical dataset, and then used a probabilistic neural network to perform supervised classification. Our results indicate that rough sets were able to reduce the number of attributes in the dataset by 67% without sacrificing classification accuracy; the resulting classification accuracy was on the order of 93%.
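
    The abstract above does not include code; the following is a minimal sketch of the probabilistic neural network (Parzen-window) stage only, assuming numeric features already reduced by the rough set step, a Gaussian kernel, and a hypothetical smoothing parameter sigma:

        import numpy as np

        def pnn_predict(X_train, y_train, X_test, sigma=0.5):
            """Probabilistic neural network (Parzen-window) classifier sketch.

            Each test sample is assigned to the class whose training patterns
            yield the largest average Gaussian-kernel response; sigma is a
            hypothetical smoothing parameter, not a value from the paper.
            """
            classes = np.unique(y_train)
            preds = []
            for x in X_test:
                # Squared Euclidean distance from x to every training pattern
                d2 = np.sum((X_train - x) ** 2, axis=1)
                kernel = np.exp(-d2 / (2.0 * sigma ** 2))
                # Average kernel response per class (the PNN summation layer)
                scores = [kernel[y_train == c].mean() for c in classes]
                preds.append(classes[int(np.argmax(scores))])
            return np.array(preds)

        # Toy usage with synthetic data standing in for the reduced biomedical attributes
        rng = np.random.default_rng(0)
        X = np.vstack([rng.normal(0, 1, (20, 3)), rng.normal(3, 1, (20, 3))])
        y = np.array([0] * 20 + [1] * 20)
        print(pnn_predict(X, y, X[:5], sigma=0.8))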

    Attribute extraction and classification using rough sets on a lymphoma dataset

    Two new feature selection algorithms with rough sets theory

    Rough Sets Theory has opened new directions for the development of incomplete information theory. Within it, the notion of a reduct is very significant, but obtaining a reduct of a decision system is a computationally expensive process, even though it is very important in data analysis and knowledge discovery. For this reason, different variants for calculating reducts have been developed. The present work examines the utility offered by the rough set model and information theory for feature selection, and a new method is presented for computing a good reduct. This method consists of a greedy algorithm that uses heuristics to work out a good reduct in acceptable time. We also propose another method to find good reducts, which combines elements of genetic algorithms with estimation of distribution algorithms. The new methods are compared with others implemented within pattern recognition and ant colony optimization algorithms, and the results of the statistical tests are shown. IFIP International Conference on Artificial Intelligence in Theory and Practice - Knowledge Acquisition and Data Mining. Red de Universidades con Carreras en Informática (RedUNCI).
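
    As a rough illustration only (not the authors' algorithm), a greedy forward selection of attributes driven by the rough set dependency degree could be sketched as below; the decision-table representation and the stopping criterion are assumptions:

        from collections import defaultdict

        def dependency(rows, decisions, attrs):
            """Rough-set dependency degree: the fraction of objects whose
            equivalence class under attrs is consistent, i.e. all of its
            members share the same decision value."""
            groups = defaultdict(set)
            for row, d in zip(rows, decisions):
                groups[tuple(row[a] for a in attrs)].add(d)
            consistent = sum(
                1 for row in rows if len(groups[tuple(row[a] for a in attrs)]) == 1
            )
            return consistent / len(rows)

        def greedy_reduct(rows, decisions, all_attrs):
            """Greedily add the attribute that most increases the dependency
            degree until it matches that of the full attribute set."""
            target = dependency(rows, decisions, all_attrs)
            reduct, current = [], dependency(rows, decisions, [])
            while current < target:
                gains = {a: dependency(rows, decisions, reduct + [a])
                         for a in all_attrs if a not in reduct}
                best = max(gains, key=gains.get)
                reduct.append(best)
                current = gains[best]
            return reduct

        # Toy decision table: rows are attribute-value vectors, decisions kept separately
        table = [(0, 1, 0), (0, 1, 1), (1, 0, 0), (1, 1, 1)]
        labels = [0, 0, 1, 1]
        print(greedy_reduct(table, labels, [0, 1, 2]))  # -> [0]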

    The investigation of the Bayesian rough set model

    The original Rough Set model is concerned primarily with algebraic properties of approximately defined sets. The Variable Precision Rough Set (VPRS) model extends the basic rough set theory to incorporate probabilistic information. The article presents a non-parametric modification of the VPRS model called the Bayesian Rough Set (BRS) model, where the set approximations are defined by using the prior probability as a reference. Mathematical properties of BRS are investigated. It is shown that the quality of BRS models can be evaluated using a probabilistic gain function, which is suitable for the identification and elimination of redundant attributes.
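
    For orientation, in the BRS model the approximation regions are typically defined relative to the prior probability P(X), taken over the equivalence classes E induced by the condition attributes C; a sketch of that standard formulation (not quoted from the article), in LaTeX notation:

        \mathrm{POS}(X) = \bigcup \{\, E \in U/C : P(X \mid E) > P(X) \,\}
        \mathrm{NEG}(X) = \bigcup \{\, E \in U/C : P(X \mid E) < P(X) \,\}
        \mathrm{BND}(X) = \bigcup \{\, E \in U/C : P(X \mid E) = P(X) \,\}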

    Review of Traveling Salesman Problem for the genetic algorithms

    Genetic Algorithms (GAs) are an evolutionary technique that uses operators such as mutation, crossover, and selection of the fittest individuals to solve optimization problems. The Traveling Salesman Problem (TSP) asks for a closed path of minimal length through a weighted graph that visits each of its nodes exactly once. This problem appears in many real-world applications where a good solution is valuable. Many methods have been applied to find solutions to the TSP; in this study, GAs are used as an approximate method for the TSP.
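
    Purely by way of illustration (not an algorithm evaluated in the review), a minimal GA for the TSP with permutation encoding, order crossover, swap mutation, and tournament selection might be sketched as follows; the population size, mutation rate, and the toy distance matrix are all arbitrary choices:

        import random

        def tour_length(tour, dist):
            return sum(dist[tour[i]][tour[(i + 1) % len(tour)]] for i in range(len(tour)))

        def order_crossover(p1, p2):
            """Order crossover (OX): copy a slice from p1, fill the rest in p2's order."""
            n = len(p1)
            a, b = sorted(random.sample(range(n), 2))
            child = [None] * n
            child[a:b] = p1[a:b]
            fill = [c for c in p2 if c not in child]
            for i in range(n):
                if child[i] is None:
                    child[i] = fill.pop(0)
            return child

        def swap_mutation(tour, rate=0.1):
            tour = tour[:]
            if random.random() < rate:
                i, j = random.sample(range(len(tour)), 2)
                tour[i], tour[j] = tour[j], tour[i]
            return tour

        def ga_tsp(dist, pop_size=50, generations=200):
            n = len(dist)
            pop = [random.sample(range(n), n) for _ in range(pop_size)]
            for _ in range(generations):
                def select():  # binary tournament: the shorter of two random tours wins
                    a, b = random.sample(pop, 2)
                    return a if tour_length(a, dist) < tour_length(b, dist) else b
                pop = [swap_mutation(order_crossover(select(), select()))
                       for _ in range(pop_size)]
            return min(pop, key=lambda t: tour_length(t, dist))

        # Toy symmetric distance matrix for five cities
        D = [[0, 2, 9, 10, 7],
             [2, 0, 6, 4, 3],
             [9, 6, 0, 8, 5],
             [10, 4, 8, 0, 6],
             [7, 3, 5, 6, 0]]
        best = ga_tsp(D)
        print(best, tour_length(best, D))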

    Data mining an EEG dataset with an emphasis on dimensionality reduction

    The human brain is a complex system that exhibits rich spatiotemporal dynamics. Among the non-invasive techniques for probing human brain dynamics, electroencephalography (EEG) provides a direct measure of cortical activity with millisecond temporal resolution. Early attempts to analyse EEG data relied on visual inspection of EEG records. Since the introduction of EEG recordings, the volume of data generated from a study involving a single patient has increased exponentially, so automation based on pattern classification techniques has been applied with considerable success. In this study, a multi-step approach for the classification of EEG signals has been adopted. We have analysed sets of EEG time series recorded from healthy volunteers with open eyes and intracranial EEG recordings from patients with epilepsy during ictal (seizure) periods. We have applied a discrete wavelet transform to the EEG data in order to extract temporal information in the form of changes in the frequency domain over time; that is, it can extract non-stationary signals embedded in the noisy background of the human brain. Principal components analysis (PCA) and rough sets have been used to reduce the data dimensionality, and a multi-classifier scheme consisting of LVQ2.1 neural networks has been developed for the classification task. The experimental results validate the proposed methodology.
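
    As an indicative sketch of the dimensionality-reduction part of such a pipeline (not the authors' implementation: the wavelet family, decomposition level, sub-band statistics, and the nearest-neighbour classifier standing in for the LVQ2.1 ensemble are all assumptions), one could write:

        import numpy as np
        import pywt                                          # PyWavelets, for the DWT
        from sklearn.decomposition import PCA
        from sklearn.neighbors import KNeighborsClassifier   # stand-in for LVQ2.1

        def dwt_features(signal, wavelet="db4", level=4):
            """Decompose one EEG epoch and summarise each sub-band by simple statistics."""
            coeffs = pywt.wavedec(signal, wavelet, level=level)
            feats = []
            for c in coeffs:                                 # approximation + detail bands
                feats += [np.mean(np.abs(c)), np.std(c), np.sum(c ** 2)]
            return np.array(feats)

        # Synthetic stand-in for EEG epochs: 100 epochs of 256 samples, two classes
        rng = np.random.default_rng(1)
        X_raw = np.vstack([rng.normal(0, 1, (50, 256)),
                           rng.normal(0, 1, (50, 256)) + np.sin(np.linspace(0, 20 * np.pi, 256))])
        y = np.array([0] * 50 + [1] * 50)

        X_feat = np.array([dwt_features(s) for s in X_raw])  # wavelet features per epoch
        X_red = PCA(n_components=5).fit_transform(X_feat)    # dimensionality reduction
        clf = KNeighborsClassifier().fit(X_red, y)
        print(clf.score(X_red, y))                           # training accuracy of the sketch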

    Parallel Island Model for Attribute Reduction

    Parallel Genetic Algorithms with GPU Computing

    Genetic algorithms (GAs) are powerful solutions to optimization problems arising in manufacturing and logistics. They help find better solutions for complex and difficult cases that are hard to solve with exact optimization methods. Accelerating parallel GAs with GPU computing has received significant attention from both practitioners and researchers ever since the emergence of GPU-CPU heterogeneous architectures. Designing a parallel algorithm on a GPU is fundamentally different from designing one on a CPU: on a CPU architecture, data or tasks are typically distributed across tens of threads or processes, while on a GPU architecture, hundreds of thousands of threads or more run concurrently. In order to fully utilize the computing power of GPUs, the design approaches and implementation strategies of parallel GAs need to be re-examined. In this chapter, a concise overview of parallel GAs on GPUs is given from the perspective of GPU architecture. The concept of parallelism granularity is redefined, data layout is discussed in terms of how it affects kernel performance, and the thread hierarchy is examined to show how threads are organized in grids and blocks to expose sufficient parallelism to the GPU. Directions for future research are discussed, and a hybrid parallel model, based on the features of the GPU architecture, is suggested for building efficient parallel GAs for hyper-scale problems.
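
    The chapter abstract above contains no code; purely as a schematic analogy (in NumPy rather than an actual GPU kernel), the contrast between per-individual, loop-based fitness evaluation and a single data-parallel pass over a flat population layout - the pattern a GPU implementation would map onto thousands of threads - can be illustrated as:

        import numpy as np

        # Population of binary individuals stored as one contiguous 2D array
        # (rows = individuals, columns = genes): a flat layout analogous to the
        # coalesced global-memory layouts favoured on GPUs.
        rng = np.random.default_rng(0)
        pop = rng.integers(0, 2, size=(100_000, 64), dtype=np.int8)

        def fitness_looped(pop):
            # Coarse-grained view: evaluate one individual at a time
            return np.array([int(row.sum()) for row in pop])

        def fitness_vectorised(pop):
            # Data-parallel view: one pass over the whole population, the kind of
            # operation a GPU would execute with one thread per gene or individual
            return pop.sum(axis=1)

        assert np.array_equal(fitness_looped(pop[:100]), fitness_vectorised(pop[:100]))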