
    Granular computing, rough entropy and object extraction

    The problem of image object extraction in the framework of rough sets and granular computing is addressed. A measure called the "rough entropy of image" is defined based on the concept of image granules. Its maximization results in the minimization of roughness in both the object and background regions, thereby determining the threshold of partitioning. Methods for selecting an appropriate granule size and for efficiently computing rough entropy are described.
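
    The thresholding scheme summarized above can be made concrete. The following is a minimal Python sketch of rough-entropy maximization, assuming square non-overlapping granules and the common form RE = -(e/2)[R_O ln R_O + R_B ln R_B], where roughness R = 1 - |lower approximation| / |upper approximation|; the function names, the default granule size, and the exact formulation are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def rough_entropy(image, threshold, granule=4):
    """Rough entropy of a grayscale image at a given threshold (illustrative).

    A granule (non-overlapping square window) lies in the lower
    approximation of the object if every pixel exceeds the threshold,
    and in the upper approximation if any pixel does; dually for the
    background. Roughness = 1 - |lower| / |upper|.
    """
    h, w = image.shape
    obj_lo = obj_up = bg_lo = bg_up = 0
    for i in range(0, h - granule + 1, granule):
        for j in range(0, w - granule + 1, granule):
            block = image[i:i + granule, j:j + granule]
            above = bool((block > threshold).any())
            below = bool((block <= threshold).any())
            obj_up += above
            bg_up += below
            obj_lo += above and not below
            bg_lo += below and not above
    r_obj = 1.0 - obj_lo / max(obj_up, 1)
    r_bg = 1.0 - bg_lo / max(bg_up, 1)
    # -(e/2) * sum(r * ln r); each term peaks when r = 1/e
    return -(np.e / 2) * sum(r * np.log(r) for r in (r_obj, r_bg) if r > 0)

# The partitioning threshold is the maximizer over gray levels:
# best_t = max(range(256), key=lambda t: rough_entropy(img, t))
```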

    Some Results on Multigranulation Neutrosophic Rough Sets on a Single Domain

    As a generalization of single-valued neutrosophic rough sets, the concept of multigranulation neutrosophic rough sets was proposed by Bo et al., and some basic properties of the pessimistic (optimistic) multigranulation neutrosophic rough approximation operators were studied.

    Attribute Equilibrium Dominance Reduction Accelerator (DCCAEDR) Based on Distributed Coevolutionary Cloud and Its Application in Medical Records

    © 2013 IEEE. Aimed at the tremendous challenge of attribute reduction for big data mining and knowledge discovery, we propose a new attribute equilibrium dominance reduction accelerator (DCCAEDR) based on a distributed coevolutionary cloud model. First, an N-population distributed coevolutionary MapReduce framework is designed to divide the entire population into N subpopulations, which share the rewards of one another's solutions under a MapReduce cloud mechanism. Because this achieves a better adaptive balance between exploration and exploitation, the reduction performance is guaranteed to match that obtained on the whole data set. Second, a novel Nash equilibrium dominance strategy of elitists under N bounded-rationality regions is adopted to help the subpopulations attain a stable state of Nash equilibrium dominance, which further enhances the accelerator's robustness against complex noise in big data. Third, an approximation parallelism mechanism based on MapReduce is constructed to implement rule reduction by accelerating the computation of attribute equivalence classes; consequently, the entire attribute reduction set with the equilibrium dominance solution can be obtained. Extensive simulations illustrate the effectiveness and robustness of the proposed DCCAEDR accelerator for attribute reduction on big data. Furthermore, DCCAEDR is applied to attribute reduction for traditional Chinese medical records and to segmenting cortical surfaces in neonatal brain 3-D MRI records, where it shows superior results compared with representative algorithms.
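
    The rough-set core that the accelerator parallelizes, grouping records into attribute equivalence classes and measuring the positive region, is simple to state. Below is an illustrative single-machine Python sketch of that computation only (the distributed coevolutionary and Nash-equilibrium machinery is not reproduced); all identifiers are assumptions for illustration.

```python
from collections import defaultdict

def equivalence_classes(records, attributes):
    """Partition record indices into equivalence classes of the
    indiscernibility relation over the given attributes (this grouping
    is the step the paper accelerates via MapReduce)."""
    classes = defaultdict(list)
    for idx, row in enumerate(records):
        classes[tuple(row[a] for a in attributes)].append(idx)
    return list(classes.values())

def positive_region_size(records, attributes, decision):
    """Count records whose equivalence class is consistent, i.e. maps
    to a single decision value."""
    count = 0
    for cls in equivalence_classes(records, attributes):
        if len({records[i][decision] for i in cls}) == 1:
            count += len(cls)
    return count

# records: list of dicts, e.g. [{'a1': 0, 'a2': 1, 'd': 'yes'}, ...]
# A subset B of attributes preserves classification ability when
# positive_region_size(records, B, 'd') equals the value obtained with
# the full attribute set; attribute reduction searches for a minimal such B.
```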

    Rough Sets and Near Sets in Medical Imaging: A Review


    Exploiting Higher Order Uncertainty in Image Analysis

    Soft computing is a group of methodologies that work synergistically to provide flexible information-processing capability for handling real-life ambiguous situations. Its aim is to exploit the tolerance for imprecision, uncertainty, approximate reasoning, and partial truth in order to achieve tractability, robustness, and low-cost solutions. Soft computing methodologies (involving fuzzy sets, neural networks, genetic algorithms, and rough sets) have been successfully employed in various image processing tasks, including image segmentation, enhancement, and classification, both individually and in combination with other soft computing techniques. The reason for this success is that soft computing techniques provide powerful tools to describe the uncertainty naturally embedded in images, which can be exploited in various image processing tasks. The main contribution of this thesis is to present tools for handling uncertainty by means of a rough-fuzzy framework for exploiting feature-level uncertainty. The first contribution is the definition of a general framework based on the hybridization of rough and fuzzy sets, along with a new operator called the RF-product, as an effective solution to some problems in image analysis. The second and third contributions are devoted to proving the effectiveness of the proposed framework, by presenting a compression method based on vector quantization, together with its compression capabilities, and an HSV color image segmentation technique.
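
    As a baseline for what rough-fuzzy hybridization means, the classical Dubois-Prade rough approximation of a fuzzy set (per-granule min and max of memberships) can be sketched as follows; this is the standard construction, not the thesis's RF-product operator, whose definition is not given here, and the names are illustrative.

```python
def rough_fuzzy_approximations(membership, partition):
    """Dubois-Prade style rough approximation of a fuzzy set.

    membership: dict mapping element -> degree in [0, 1]
    partition:  equivalence classes (lists of elements)
    Each element inherits per-class lower/upper bounds, i.e. the min
    and max membership within its granule.
    """
    lower, upper = {}, {}
    for cls in partition:
        degrees = [membership[x] for x in cls]
        lo, hi = min(degrees), max(degrees)
        for x in cls:
            lower[x], upper[x] = lo, hi
    return lower, upper

# Two granules of pixels with fuzzy 'object' memberships:
mu = {'p1': 0.9, 'p2': 0.7, 'p3': 0.2, 'p4': 0.1}
lo, up = rough_fuzzy_approximations(mu, [['p1', 'p2'], ['p3', 'p4']])
# lo['p1'] == 0.7 and up['p1'] == 0.9: pixel p1 belongs to the object
# to a degree of at least 0.7 and at most 0.9, given the granulation.
```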

    Taming Wild High Dimensional Text Data with a Fuzzy Lash

    The bag of words (BOW) model represents a corpus as a matrix whose elements are word frequencies. However, each row of the matrix is a very high-dimensional sparse vector. Dimension reduction (DR) is a popular way to address the sparsity and high-dimensionality issues. Among the strategies for developing DR methods, Unsupervised Feature Transformation (UFT), which maps all words onto a new basis to represent the BOW, is a popular one. The recent growth of text data and its challenges imply that the DR area still needs new perspectives. Although a wide range of UFT-based methods has been developed, the fuzzy approach has not been considered for DR under this strategy. This research investigates the application of fuzzy clustering as a UFT-based DR method that collapses the BOW matrix to provide a lower-dimensional representation of documents rather than of the words in a corpus. The quantitative evaluation shows that fuzzy clustering yields performance and features superior to Principal Component Analysis (PCA) and Singular Value Decomposition (SVD), two popular UFT-based DR methods.
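
    One plausible reading of this approach is to run fuzzy c-means over the word vectors of the BOW matrix and use the resulting word-to-cluster memberships as the new basis for documents. The sketch below implements plain fuzzy c-means in NumPy under that reading; the cluster count, the fuzzifier m, and the projection step are illustrative assumptions, not the paper's exact pipeline.

```python
import numpy as np

def fuzzy_cmeans(X, c, m=2.0, iters=100, seed=0):
    """Plain fuzzy c-means: returns an (n_samples, c) membership matrix."""
    rng = np.random.default_rng(seed)
    U = rng.random((X.shape[0], c))
    U /= U.sum(axis=1, keepdims=True)
    for _ in range(iters):
        W = U ** m
        centers = (W.T @ X) / W.sum(axis=0)[:, None]
        # squared distance from every sample to every center
        d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1) + 1e-12
        U = 1.0 / d2 ** (1.0 / (m - 1.0))
        U /= U.sum(axis=1, keepdims=True)
    return U

# bow: documents x words frequency matrix (toy random data here)
bow = np.random.poisson(0.3, size=(50, 400)).astype(float)
# Cluster the words (rows of bow.T); each word gets memberships to c clusters.
U_words = fuzzy_cmeans(bow.T, c=10)
# Lower-dimensional document representation: counts projected onto word clusters.
docs_lowdim = bow @ U_words   # shape (50, 10)
```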

    Perceptual image analysis

    The problem considered in this paper is one of extracting perceptually relevant information from groups of objects based on their descriptions. Object descriptions are qualitatively represented by feature-value vectors containing probe function values, computed in a manner similar to feature extraction in pattern classification theory. The work presented here is a generalisation of a solution to extracting perceptual information from images using near set theory, which provides a framework for measuring the perceptual nearness of objects. Further, near set theory is used to define a perception-based approach to image analysis that is inspired by traditional mathematical morphology, and an application of this methodology is given by way of segmentation evaluation. The contribution of this article is the introduction of a new method of unsupervised segmentation evaluation that is based on human perception rather than on properties of ideal segmentations, as is normally the case.
    https://www.inderscience.com/info/inarticle.php?artid=3309
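
    To make the near-set vocabulary concrete, the sketch below shows probe-function descriptions and a simple tolerance-based nearness score between two sets of image patches; the epsilon threshold, the choice of probes, and the pair-counting measure are illustrative assumptions, not the paper's definitions.

```python
import numpy as np

def descriptions(objects, probes):
    """Describe each object by its vector of probe function values."""
    return np.array([[p(o) for p in probes] for o in objects])

def nearness(A, B, probes, eps=0.1):
    """Illustrative tolerance-based nearness: the fraction of cross-set
    object pairs whose descriptions differ by less than eps."""
    da, db = descriptions(A, probes), descriptions(B, probes)
    diffs = np.linalg.norm(da[:, None, :] - db[None, :, :], axis=-1)
    return float((diffs < eps).mean())

# Toy probes on grayscale patches (2-D arrays): mean intensity and contrast.
probes = [np.mean, np.std]
rng = np.random.default_rng(0)
segments = [rng.random((8, 8)) for _ in range(5)]        # machine segmentation
reference = [rng.random((8, 8)) for _ in range(5)]       # human-perceived regions
score = nearness(segments, reference, probes, eps=0.15)  # higher = more alike
```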

    Multialgebraic Systems in Information Granulation

    In different fields, the conception of a granule is applied both as a group of elements defined by internal properties and as an inseparable whole reflecting external properties. Granular computing may be interpreted in terms of abstraction, generalization, clustering, levels of abstraction, levels of detail, and so on. We have proposed to use multialgebraic systems as a mathematical tool for the synthesis and analysis of granules and granule structures. A theorem giving necessary and sufficient conditions for the existence of multialgebraic systems has been proved.

    Informational Paradigm, management of uncertainty and theoretical formalisms in the clustering framework: A review

    Fifty years have gone by since the publication of the first paper on clustering based on fuzzy set theory. In 1965, L.A. Zadeh published "Fuzzy Sets" [335]. After only one year, the first effects of this seminal paper began to emerge, with the pioneering paper on clustering by Bellman, Kalaba, and Zadeh [33], in which they proposed a prototypal clustering algorithm based on fuzzy set theory.