
    Image Segmentation using Rough Set based Fuzzy K-Means Algorithm

    Image segmentation is critical for many computer vision and information retrieval systems and has received significant attention from industry and academia over the last three decades. Despite notable advances in the area, there is no standard technique for selecting a segmentation algorithm to use in a particular application, nor is there an agreed-upon means of comparing the performance of one method with another. This paper explores the Rough-Fuzzy K-means (RFKM) algorithm, a new intelligent technique used to discover data dependencies, reduce data, classify approximate sets, and induce rules from image databases. Rough sets offer an effective approach to managing uncertainty and are also used for image segmentation, feature identification, dimensionality reduction, and pattern classification. The proposed algorithm is based on a modified K-means clustering that uses rough set theory (RST) for image segmentation and proceeds in two phases: first the candidate cluster centers are determined, and then they are reduced using rough set theory. The K-means clustering algorithm is then applied to the reduced and optimized set of cluster centers to segment the images. Existing clustering algorithms require initialization of cluster centers, whereas the proposed scheme does not require any such prior information to partition the regions. Experimental results show that the proposed method performs well and improves the segmentation results in the vague areas of the image.
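    As a rough illustration of the two-phase scheme described in this abstract, the Python sketch below seeds candidate cluster centers from intensity quantiles, merges centers that are indiscernible within a tolerance (a simplified stand-in for the rough-set reduction step), and then runs plain K-means on pixel intensities starting from the reduced centers. The function names, the quantile-based seeding, and the merge_tol threshold are illustrative assumptions, not the paper's exact procedure.

        import numpy as np

        def candidate_centers(gray_image, n_candidates=16):
            # Seed candidate cluster centers from evenly spaced intensity quantiles
            # (assumed seeding strategy; the paper does not fix one here).
            flat = gray_image.ravel().astype(float)
            qs = np.linspace(0.0, 1.0, n_candidates + 2)[1:-1]
            return np.quantile(flat, qs)

        def reduce_centers(centers, merge_tol=10.0):
            # Simplified rough-set-style reduction: centers whose intensities lie
            # within merge_tol are treated as indiscernible and merged.
            centers = np.sort(centers)
            reduced = [centers[0]]
            for c in centers[1:]:
                if c - reduced[-1] < merge_tol:
                    reduced[-1] = (reduced[-1] + c) / 2.0  # merge indiscernible pair
                else:
                    reduced.append(c)
            return np.array(reduced)

        def kmeans_segment(gray_image, centers, n_iter=20):
            # Plain K-means on pixel intensities, initialised with the reduced centers.
            flat = gray_image.ravel().astype(float)
            for _ in range(n_iter):
                labels = np.argmin(np.abs(flat[:, None] - centers[None, :]), axis=1)
                for k in range(len(centers)):
                    if np.any(labels == k):
                        centers[k] = flat[labels == k].mean()
            labels = np.argmin(np.abs(flat[:, None] - centers[None, :]), axis=1)
            return labels.reshape(gray_image.shape), centers

        # Example on a synthetic two-region image.
        img = np.r_[np.full((32, 64), 40.0), np.full((32, 64), 200.0)]
        img += np.random.normal(0, 5, img.shape)
        seg, final_centers = kmeans_segment(img, reduce_centers(candidate_centers(img)))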

    Exploring the Boundary Region of Tolerance Rough Sets for Feature Selection

    Of all of the challenges which face the effective application of computational intelligence technologies for pattern recognition, dataset dimensionality is undoubtedly one of the primary impediments. In order for pattern classifiers to be efficient, a dimensionality reduction stage is usually performed prior to classification. Much use has been made of Rough Set Theory for this purpose as it is completely data-driven and no other information is required; most other methods require some additional knowledge. However, traditional rough set-based methods in the literature are restricted to the requirement that all data must be discrete. It is therefore not possible to consider real-valued or noisy data. This is usually addressed by employing a discretisation method, which can result in information loss. This paper proposes a new approach based on the tolerance rough set model, which has the ability to deal with real-valued data whilst simultaneously retaining dataset semantics. More significantly, this paper describes the underlying mechanism for this new approach to utilise the information contained within the boundary region or region of uncertainty. The use of this information can result in the discovery of more compact feature subsets and improved classification accuracy. These results are supported by an experimental evaluation which compares the proposed approach with a number of existing feature selection techniques. Key words: feature selection, attribute reduction, rough sets, classification
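    The sketch below is a minimal, assumption-laden illustration of tolerance-rough-set feature selection on real-valued data: objects count as similar on a feature subset when every (normalised) feature differs by at most a tolerance tau, the dependency degree gamma counts objects whose tolerance class is decision-consistent, and a greedy QuickReduct-style loop adds the feature that most increases gamma. The threshold tau, the function names, and the omission of the paper's boundary-region refinement are all simplifications of this sketch, not claims about the proposed method.

        import numpy as np

        def tolerance_classes(X, feats, tau=0.1):
            # Objects are "similar" if every selected (normalised) feature differs
            # by at most tau; returns the set of similar objects for each object.
            sub = X[:, feats]
            diff = np.abs(sub[:, None, :] - sub[None, :, :])
            sim = (diff <= tau).all(axis=2)
            return [set(np.nonzero(sim[i])[0]) for i in range(X.shape[0])]

        def gamma(X, y, feats, tau=0.1):
            # Dependency degree: fraction of objects whose whole tolerance class
            # carries a single decision label (i.e. lies in the positive region).
            classes = tolerance_classes(X, feats, tau)
            pos = sum(1 for cls in classes if len({y[j] for j in cls}) == 1)
            return pos / X.shape[0]

        def quickreduct(X, y, tau=0.1):
            # Greedy forward selection: add the feature that most increases gamma.
            remaining, reduct, best = set(range(X.shape[1])), [], 0.0
            while remaining:
                scores = {f: gamma(X, y, reduct + [f], tau) for f in remaining}
                f, s = max(scores.items(), key=lambda kv: kv[1])
                if s <= best:
                    break
                reduct.append(f); remaining.discard(f); best = s
            return reduct, best

        # Tiny usage example: feature 0 fully determines the label.
        X = np.random.rand(30, 4); X[:, 0] = np.round(X[:, 0])
        y = (X[:, 0] > 0.5).astype(int)
        print(quickreduct(X, y, tau=0.15))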

    Dealing with uncertain entities in ontology alignment using rough sets

    Ontology alignment facilitates exchange of knowledge among heterogeneous data sources. Many approaches to ontology alignment use multiple similarity measures to map entities between ontologies. However, dealing with uncertain entities, for which the employed similarity measures produce conflicting results on the similarity of the mapped entities, remains a key challenge. This paper presents OARS, a rough-set based approach to ontology alignment which achieves a high degree of accuracy in situations where uncertainty arises because of the conflicting results generated by different similarity measures. OARS employs a combinational approach and considers both lexical and structural similarity measures. OARS is extensively evaluated with the benchmark ontologies of the Ontology Alignment Evaluation Initiative (OAEI) 2010, and performs best in terms of recall in comparison with a number of alignment systems while generating comparable performance in precision.
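    A minimal sketch of the general idea of combining conflicting similarity measures with a rough-set-style three-way decision is shown below: when both measures agree strongly a pair is a definite match or non-match, and when they conflict the pair falls into a boundary (uncertain) region and is deferred. The specific measures (sequence-matcher label similarity, Jaccard overlap of neighbour labels), the thresholds lo and hi, and the decision rules are illustrative assumptions rather than the actual OARS method.

        from difflib import SequenceMatcher

        def lexical_sim(a, b):
            # Simple label similarity via longest-matching-subsequence ratio.
            return SequenceMatcher(None, a.lower(), b.lower()).ratio()

        def structural_sim(neigh_a, neigh_b):
            # Jaccard overlap of the labels of neighbouring concepts.
            na, nb = set(neigh_a), set(neigh_b)
            return len(na & nb) / len(na | nb) if na | nb else 0.0

        def classify_pair(label_a, label_b, neigh_a, neigh_b, lo=0.4, hi=0.8):
            # Three-way decision: both measures high -> definite match; both low
            # -> definite non-match; otherwise the pair lands in the boundary
            # region and is left for further evidence.
            lex = lexical_sim(label_a, label_b)
            struct = structural_sim(neigh_a, neigh_b)
            if lex >= hi and struct >= hi:
                return "match"
            if lex < lo and struct < lo:
                return "non-match"
            return "uncertain"

        # The measures conflict here (labels similar, neighbours disjoint),
        # so the pair is routed to the boundary region.
        print(classify_pair("Publication", "Publications",
                            ["hasAuthor", "hasTitle"], ["writtenBy", "year"]))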