79,957 research outputs found

    High Granular Operator Spaces, and Less-Contaminated General Rough Mereologies

    Granular operator spaces and their variants have been introduced and used in theoretical investigations on the foundations of general rough sets by the present author over the last few years. In this research, higher order versions of these are presented uniformly as partial algebraic systems. They are also adapted for practical applications when the data is representable by data-table-like structures, according to a minimalist schema for avoiding contamination. Issues relating to valuations used in information systems or tables are also addressed. The concept of contamination, introduced and studied by the present author across a number of her papers, concerns the mixing up of information across semantic domains (or domains of discourse). Rough inclusion functions (\textsf{RIF}s), their variants, and numeric functions often have a direct or indirect role in contaminating algorithms. Some solutions that seek to replace or avoid them have been proposed and investigated by the present author in some of her earlier papers. Because multiple kinds of solutions to the contamination problem are of interest, granular generalizations of \textsf{RIF}s are proposed and investigated. Interesting representation results are proved, and a core algebraic strategy for generalizing the Skowron-Polkowski style of rough mereology (though for a very different purpose) is formulated. A number of examples are added to illustrate key parts of the proposal in higher order variants of granular operator spaces. Further algorithms grounded in mereological nearness, suited for decision-making in human-machine interaction contexts, are proposed by the present author. Applications of granular \textsf{RIF}s to partial/soft solutions of the inverse problem are also invented in this paper.
    Comment: Research paper: Preprint: Final version
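    For orientation, a rough inclusion function (RIF) assigns a degree to which one set is included in another. Below is a minimal sketch of the standard Łukasiewicz-style quantitative RIF that the abstract argues can contaminate algorithms; the function name and data are illustrative, not from the paper.

```python
# A minimal sketch of a standard rough inclusion function (RIF), the
# kind of numeric measure the abstract argues mixes up information
# across semantic domains. Names and data are illustrative.

def rif(a: set, b: set) -> float:
    """Lukasiewicz-style rough inclusion: degree to which a is included in b."""
    if not a:
        return 1.0  # the empty set is vacuously included in everything
    return len(a & b) / len(a)

print(rif({1, 2, 3}, {2, 3, 4}))  # 2 of the 3 elements of a lie in b
```

    Granular generalizations of such functions, as proposed in the paper, would be computed over granules rather than raw element counts.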

    Algebraic, Topological, and Mereological Foundations of Existential Granules

    In this research, new concepts of existential granules that determine themselves are invented and characterized from algebraic, topological, and mereological perspectives. Existential granules are those that determine themselves initially and interact with their environment subsequently. Examples of the concept, such as granular balls, are already used in applications of rough sets and soft computing, though they are inadequately defined, merely algorithmically established, and insufficiently theorized in earlier works by others. It is shown that they fit into multiple theoretical frameworks (axiomatic, adaptive, and others) of granular computing. The characterization is intended for algorithm development, application to classification problems, and possible mathematical foundations of generalizations of the approach. Additionally, many open problems are posed and directions provided.
    Comment: 15 pages. Accepted at IJCRS 202

    Dialectics of Counting and the Mathematics of Vagueness

    New concepts of rough natural number systems are introduced in this research paper from both formal and less formal perspectives. These are used to improve most rough set-theoretical measures in general Rough Set theory (\textsf{RST}) and to represent rough semantics. The foundations of the theory also rely upon the axiomatic approach to granularity for all types of general \textsf{RST} recently developed by the present author. The latter theory is expanded upon in this paper. It is also shown that algebraic semantics of classical \textsf{RST} can be obtained from the developed dialectical counting procedures. Fuzzy set theory is also shown to be representable in purely granule-theoretic terms in the general perspective of solving the contamination problem that pervades this research paper. All this constitutes a radically different approach to the mathematics of vague phenomena and suggests new directions for a more realistic extension of the foundations of the mathematics of vagueness from both foundational and application points of view. Algebras corresponding to a concept of \emph{rough naturals} are also studied, and variants are characterised in the penultimate section.
    Comment: This paper includes my axiomatic approach to granules. arXiv admin note: substantial text overlap with arXiv:1102.255

    A Novel Rough Set Reduct Algorithm for Medical Domain Based on Bee Colony Optimization

    Feature selection refers to the problem of selecting the relevant features which produce the most predictive outcome. In particular, the feature selection task arises in datasets containing a huge number of features. Rough set theory has been one of the most successful methods used for feature selection. However, this method is still not able to find optimal subsets. This paper proposes a new feature selection method based on rough set theory hybridized with Bee Colony Optimization (BCO) in an attempt to combat this. The proposed work is applied in the medical domain to find minimal reducts and is experimentally compared with the Quick Reduct, Entropy Based Reduct, and other hybrid rough set methods based on the Genetic Algorithm (GA), Ant Colony Optimization (ACO), and Particle Swarm Optimization (PSO).
    Comment: IEEE Publication Format, https://sites.google.com/site/journalofcomputing
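    As background to the comparison above, the classical Quick Reduct baseline greedily grows an attribute subset until it preserves the dependency degree of the full attribute set. The sketch below is a hedged illustration on a toy decision table (attribute names and data are invented); it is the baseline, not the BCO hybrid itself.

```python
# A minimal Quick Reduct sketch. The decision table is an invented
# toy example: each row is (conditional attribute values, decision).

U = [
    ({"a": 0, "b": 0, "c": 1}, "flu"),
    ({"a": 0, "b": 1, "c": 1}, "flu"),
    ({"a": 1, "b": 0, "c": 0}, "cold"),
    ({"a": 1, "b": 1, "c": 0}, "cold"),
]

def dependency(attrs):
    """gamma(attrs, D): fraction of objects whose attrs-class is decision-pure."""
    classes = {}
    for row, d in U:
        classes.setdefault(tuple(row[a] for a in attrs), set()).add(d)
    pos = sum(1 for row, d in U
              if len(classes[tuple(row[a] for a in attrs)]) == 1)
    return pos / len(U)

def quick_reduct(all_attrs):
    """Greedily add the attribute that raises dependency most."""
    reduct, target = [], dependency(all_attrs)
    while dependency(reduct) < target:
        best = max((a for a in all_attrs if a not in reduct),
                   key=lambda a: dependency(reduct + [a]))
        reduct.append(best)
    return reduct

print(quick_reduct(["a", "b", "c"]))  # -> ['a']
```

    Swarm methods such as BCO, ACO, and PSO replace the greedy step with a population-based search, which is what lets them escape the suboptimal subsets the greedy baseline can get stuck in.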

    A Framework for Intelligent Medical Diagnosis using Rough Set with Formal Concept Analysis

    Medical diagnosis processes vary in the degree to which they attempt to deal with different complicating aspects of diagnosis, such as the relative importance of symptoms, varied symptom patterns, and the relations between diseases themselves. Based on decision theory, many mathematical models such as crisp sets, probability distributions, fuzzy sets, and intuitionistic fuzzy sets were developed in the past to deal with these complicating aspects of diagnosis. However, many such models fail to include important aspects of the expert's decisions. Pawlak therefore made an effort to process inconsistencies in data with the introduction of rough set theory. Though rough sets have major advantages over the other methods, they generate too many rules, which creates difficulties in decision-making. Therefore, it is essential to minimize the decision rules. In this paper, we use two processes, a pre-process and a post-process, to mine suitable rules and to explore the relationships among the attributes. In the pre-process we use rough set theory to mine suitable rules, whereas in the post-process we apply formal concept analysis to these rules to explore better knowledge and the most important factors affecting decision-making.
    Comment: 22 pages

    Rough Set Theory for Real Estate Appraisal: An Application to Directional District of Naples

    This paper proposes an application of Rough Set Theory (RST) to the real estate field, in order to highlight its operational potential for mass appraisal purposes. RST allows one to appraise real estate units without assuming a deterministic relationship between the characteristics that contribute to the formation of the property market price and the prices themselves. RST was applied to a real estate sample (office units located in the Directional District of Naples) and was also integrated with a functional extension, the so-called Valued Tolerance Relation (VTR), in order to improve its flexibility. A multiple regression analysis (MRA) was developed on the same real estate sample with the aim of comparing RST and MRA results. The case study is followed by a brief discussion of the basic theoretical connotations of this methodology.

    Rough sets and three-valued structures

    In recent years, many papers have been published showing relationships between rough sets and some lattice-theoretical structures. We present here some strong relations between rough sets and three-valued {\L}ukasiewicz algebras.
    Comment: 10 pages
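    The connection sketched in this abstract rests on the standard three-valued reading of a rough set: value 1 on the lower approximation, 0 outside the upper approximation, and an intermediate value on the boundary. A minimal illustration, with an invented universe and partition:

```python
# Three-valued valuation induced by a rough set. The universe,
# equivalence classes, and target set are invented examples.

U = {1, 2, 3, 4, 5, 6}
partition = [{1, 2}, {3, 4}, {5, 6}]   # equivalence classes
X = {1, 2, 3}                          # the set being approximated

lower = set().union(*(c for c in partition if c <= X))   # classes inside X
upper = set().union(*(c for c in partition if c & X))    # classes meeting X

def value(x):
    """Three-valued membership: 1 certain, 0 certainly not, 1/2 boundary."""
    if x in lower:
        return 1.0
    if x not in upper:
        return 0.0
    return 0.5

print(sorted((x, value(x)) for x in U))
```

    The algebra of such valuations under pointwise operations is what the three-valued Łukasiewicz structures capture.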

    Unuploaded experiments have no result

    The aim of this note is to attract, once again, the attention of the quantum community to the statistical analysis of data reported as violating Bell's inequality. This analysis suffers from a number of problems, the main one being that the raw data is practically unavailable. However, experiments that are not followed by open access to the raw data have to be considered as having no result. The absence of raw data generates a variety of problems in the statistical interpretation of the results of Bell-type experiments. One may hope that this note will stimulate experimenters to create an open-access database for, e.g., Bell tests. Unfortunately, the recently announced experimental loophole-free violation of a Bell inequality using entangled electron spins separated by 1.3 km was not supported by open-access data. Therefore, in accordance with our approach, "it has no result." The promise of data after publication is, of course, a step towards fair analysis of quantum experiments. Maybe this is a consequence of the appearance of this preprint, v1. But there are a few questions which would be interesting to clarify before publication (and which we discuss in this note).
    Comment: comments on the recently announced "experimental loophole-free violation of a Bell inequality using entangled electron spins separated by 1.3 km."

    Learning Fuzzy {\beta}-Certain and {\beta}-Possible Rules from Incomplete Quantitative Data by Rough Sets

    The rough-set theory proposed by Pawlak has been widely used in dealing with data classification problems. The original rough-set model is, however, quite sensitive to noisy data. Tzung therefore proposed a model, combining the variable precision rough-set model with fuzzy set theory, for producing a set of fuzzy certain and fuzzy possible rules from quantitative data with a predefined tolerance degree of uncertainty and misclassification. This paper deals with the problem of producing such a set of fuzzy certain and fuzzy possible rules from incomplete quantitative data with a predefined tolerance degree of uncertainty and misclassification. A new method, combining a rough-set model for incomplete quantitative data with fuzzy set theory, is thus proposed to solve this problem. It first transforms each quantitative value into a fuzzy set of linguistic terms using membership functions. It then calculates the fuzzy {\beta}-lower and fuzzy {\beta}-upper approximations. The certain and possible rules are then generated based on these fuzzy approximations. These rules can then be used to classify unknown objects.
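    For orientation, the crisp variable-precision ingredient behind the {\beta}-approximations works as follows: a class enters the {\beta}-lower approximation when its inclusion degree in the target concept reaches {\beta}, and the {\beta}-upper approximation when that degree exceeds 1 - {\beta}. A small sketch with invented data (the paper's fuzzy, incomplete-data version is more involved):

```python
# Variable-precision beta-approximations on a crisp toy example.
# The partition, target concept, and beta value are invented.

partition = [{1, 2}, {3, 4, 5}, {6}]   # classes of indiscernible objects
X = {1, 2, 3, 4}                       # target concept
beta = 0.8                             # required inclusion threshold

def inclusion(a, b):
    """Degree to which class a is included in concept b."""
    return len(a & b) / len(a) if a else 1.0

lower_beta = set().union(*(c for c in partition if inclusion(c, X) >= beta))
upper_beta = set().union(*(c for c in partition if inclusion(c, X) > 1 - beta))

print(sorted(lower_beta), sorted(upper_beta))  # {3,4,5} is only 2/3 inside X
```

    Lowering {\beta} toward 0.5 admits noisier classes into the lower approximation, which is exactly the tolerance for misclassification the abstract mentions.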

    Covering matroid

    In this paper, we propose a new type of matroid, namely the covering matroid, and investigate its connections with the second type of covering-based rough sets and some existing special matroids. Firstly, as an extension of partitions, coverings are more natural combinatorial objects and can sometimes be more efficient for dealing with problems in the real world. By extending partitions to coverings, we propose covering matroids and prove them to be an extension of partition matroids. Secondly, since some researchers have successfully applied partition matroids to classical rough sets, we study the relationships between covering matroids and covering-based rough sets, which are an extension of classical rough sets. Thirdly, matroid theory contains many special matroids, such as transversal matroids, partition matroids, 2-circuit matroids, and partition-circuit matroids. The relationships among several of these special matroids and covering matroids are studied.
    Comment: 15 pages
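    For context, the second type of covering-based approximations referenced above generalizes the partition case by letting blocks overlap: the upper approximation unions every covering block that meets X. A minimal sketch with invented data:

```python
# Second-type covering-based approximations on an invented covering.
# Unlike a partition, the blocks here are allowed to overlap.

cover = [{1, 2}, {2, 3}, {3, 4, 5}]    # a covering of U = {1, ..., 5}
X = {1, 2}                             # target concept

lower = set().union(*(c for c in cover if c <= X))   # blocks inside X
upper = set().union(*(c for c in cover if c & X))    # blocks meeting X

print(sorted(lower), sorted(upper))
```

    When the covering is a partition these reduce to the classical Pawlak approximations, which is why partition matroids appear as the special case.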