
    Decision-theoretic rough sets-based three-way approximations of interval-valued fuzzy sets

    In practical situations, interval-valued fuzzy sets are frequently encountered. In this paper, we first present shadowed sets for interpreting and understanding interval-valued fuzzy sets, and provide an analytic solution for computing the pair of thresholds by searching for a balance of uncertainty in the framework of shadowed sets. Second, we construct errors-based three-way approximations of interval-valued fuzzy sets and provide an alternative decision-theoretic formulation for calculating the pair of thresholds, in which interval-valued loss functions are transformed into single-valued loss functions and the required thresholds are computed by minimizing decision costs. Third, we compute errors-based three-way approximations of interval-valued fuzzy sets by using interval-valued loss functions directly. Finally, we employ several examples to illustrate how to take an action on an object with an interval-valued membership grade by using interval-valued loss functions.
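    The threshold pair referred to above is the classical decision-theoretic rough set construction. As a minimal sketch, assuming the standard Yao-style loss matrix and, purely for illustration, a midpoint reduction of interval-valued losses (the paper's actual transformation may differ), the thresholds can be computed as follows:

    ```python
    # A minimal sketch of the decision-theoretic rough set (DTRS)
    # threshold computation, with interval-valued losses collapsed to
    # single values via their midpoints.  The midpoint reduction and the
    # numeric loss values are illustrative assumptions.

    def midpoint(interval):
        """Collapse an interval-valued loss [a, b] to a single value."""
        lo, hi = interval
        return (lo + hi) / 2.0

    def dtrs_thresholds(l_pp, l_bp, l_np, l_pn, l_bn, l_nn):
        """Thresholds (alpha, beta) minimizing Bayesian decision cost.

        l_xy = loss of taking action x (P=accept, B=defer, N=reject)
        when the object is in state y (P: in the concept, N: not).
        Assumes l_pp <= l_bp < l_np and l_nn <= l_bn < l_pn.
        """
        alpha = (l_pn - l_bn) / ((l_pn - l_bn) + (l_bp - l_pp))
        beta = (l_bn - l_nn) / ((l_bn - l_nn) + (l_np - l_bp))
        return alpha, beta

    def three_way_decision(membership, alpha, beta):
        """Assign an object to one of the three regions by its grade."""
        if membership >= alpha:
            return "accept"   # positive region
        if membership <= beta:
            return "reject"   # negative region
        return "defer"        # boundary region

    # Hypothetical interval-valued losses, reduced to midpoints.
    losses = {k: midpoint(v) for k, v in {
        "pp": (0.0, 0.0), "bp": (1.0, 2.0), "np": (4.0, 6.0),
        "pn": (5.0, 7.0), "bn": (1.5, 2.5), "nn": (0.0, 0.0),
    }.items()}
    alpha, beta = dtrs_thresholds(losses["pp"], losses["bp"], losses["np"],
                                  losses["pn"], losses["bn"], losses["nn"])
    print(alpha, beta, three_way_decision(0.55, alpha, beta))  # -> defer
    ```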

    Decision-theoretic rough sets based on time-dependent loss function

    A fundamental notion in decision-theoretic rough sets is that of loss functions, which provide a powerful tool for calculating a pair of thresholds for making a decision with minimum cost. In this paper, loss functions that vary with time are of interest because they are frequently encountered in practical situations. We first present the relationship between the pair of thresholds and loss functions satisfying time-dependent uniform distributions and normal processes in light of the Bayesian decision procedure. Subsequently, with the aid of the Bayesian decision procedure, we provide the relationship between the pair of thresholds and loss functions that are time-dependent interval sets and fuzzy numbers. Finally, we employ several examples to illustrate how to calculate the thresholds for making a decision by using decision-theoretic rough sets based on time-dependent loss functions.
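    Reusing the same threshold formulas as above, a sketch of how time-dependent losses could drive time-varying thresholds; the loss shapes below (one loss growing linearly with t) are hypothetical and merely stand in for the uniform-distribution and normal-process cases studied in the paper:

    ```python
    # A minimal sketch of time-dependent DTRS thresholds: each loss is a
    # callable of t, and the threshold formulas are re-evaluated at each
    # time point.  The linear loss shape is an illustrative assumption.

    def thresholds_at(t, losses):
        """Evaluate time-dependent losses at t and return (alpha, beta)."""
        l = {k: f(t) for k, f in losses.items()}
        alpha = (l["pn"] - l["bn"]) / ((l["pn"] - l["bn"]) + (l["bp"] - l["pp"]))
        beta = (l["bn"] - l["nn"]) / ((l["bn"] - l["nn"]) + (l["np"] - l["bp"]))
        return alpha, beta

    # Hypothetical losses: the cost of a wrong acceptance grows with t,
    # everything else stays fixed.
    losses = {
        "pp": lambda t: 0.0,
        "bp": lambda t: 1.0,
        "np": lambda t: 4.0,
        "pn": lambda t: 5.0 + 0.5 * t,   # grows over time
        "bn": lambda t: 2.0,
        "nn": lambda t: 0.0,
    }

    for t in (0.0, 2.0, 4.0):
        print(t, thresholds_at(t, losses))  # alpha drifts upward with t
    ```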

    High Granular Operator Spaces, and Less-Contaminated General Rough Mereologies

    Granular operator spaces and variants have been introduced and used in theoretical investigations on the foundations of general rough sets by the present author over the last few years. In this research, higher-order versions of these are presented uniformly as partial algebraic systems. They are also adapted for practical applications when the data is representable by data-table-like structures according to a minimalist schema for avoiding contamination. Issues relating to valuations used in information systems or tables are also addressed. The concept of contamination, introduced and studied by the present author across a number of her papers, concerns the mixing up of information across semantic domains (or domains of discourse). Rough inclusion functions (\textsf{RIF}s), variants, and numeric functions often have a direct or indirect role in contaminating algorithms. Some solutions that seek to replace or avoid them have been proposed and investigated by the present author in some of her earlier papers. Because multiple kinds of solutions are of interest to the contamination problem, granular generalizations of \textsf{RIF}s are proposed and investigated. Interesting representation results are proved, and a core algebraic strategy for generalizing the Skowron-Polkowski style of rough mereology (though for a very different purpose) is formulated. A number of examples are added to illustrate key parts of the proposal in higher-order variants of granular operator spaces. Further algorithms grounded in mereological nearness, suited for decision-making in human-machine interaction contexts, are proposed by the present author. Applications of granular \textsf{RIF}s to partial/soft solutions of the inverse problem are also developed in this paper.
    Comment: Research paper: Preprint: Final version
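    For orientation, a minimal sketch of a classical rough inclusion function together with one possible granule-level analogue; the granular variant shown is an illustrative reading of the idea, not the paper's definition:

    ```python
    # A minimal sketch of a standard rough inclusion function (RIF) and a
    # hypothetical granular analogue in which inclusion is measured over
    # a fixed collection of granules rather than raw elements.

    def rif(a, b):
        """Classical RIF: nu(A, B) = |A & B| / |A|, with nu(empty, B) = 1."""
        a, b = set(a), set(b)
        return 1.0 if not a else len(a & b) / len(a)

    def granular_rif(a, b, granules):
        """Fraction of granules meeting A that lie wholly inside B."""
        a, b = set(a), set(b)
        touching = [g for g in granules if set(g) & a]
        if not touching:
            return 1.0
        return sum(1 for g in touching if set(g) <= b) / len(touching)

    granules = [{1, 2}, {3, 4}, {5, 6}]
    print(rif({1, 2, 3}, {1, 2, 5}))                     # 2/3
    print(granular_rif({1, 3}, {1, 2, 3, 4}, granules))  # 1.0
    ```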

    Dialectics of Counting and the Mathematics of Vagueness

    New concepts of rough natural number systems are introduced in this research paper from both formal and less formal perspectives. These are used to improve most rough set-theoretical measures in general Rough Set theory (\textsf{RST}) and to represent rough semantics. The foundations of the theory also rely upon the axiomatic approach to granularity for all types of general \textsf{RST} recently developed by the present author. The latter theory is expanded upon in this paper. It is also shown that algebraic semantics of classical \textsf{RST} can be obtained from the developed dialectical counting procedures. Fuzzy set theory is also shown to be representable in purely granule-theoretic terms in the general perspective of solving the contamination problem that pervades this research paper. All this constitutes a radically different approach to the mathematics of vague phenomena and suggests new directions for a more realistic extension of the foundations of mathematics of vagueness from both foundational and application points of view. Algebras corresponding to a concept of \emph{rough naturals} are also studied and variants are characterised in the penultimate section.
    Comment: This paper includes my axiomatic approach to granules. arXiv admin note: substantial text overlap with arXiv:1102.255

    Related families-based attribute reduction of dynamic covering information systems with variations of object sets

    In practice, there are many dynamic covering decision information systems, and knowledge reduction of such systems is a significant challenge for covering-based rough sets. In this paper, we first study mechanisms of constructing attribute reducts for consistent covering decision information systems when adding objects using related families, and employ examples to illustrate the construction. Then we investigate mechanisms of constructing attribute reducts for consistent covering decision information systems when deleting objects using related families, again with illustrative examples. Finally, the experimental results illustrate that the related-family-based methods are effective for performing attribute reduction of dynamic covering decision information systems when object sets vary over time.
    Comment: arXiv admin note: substantial text overlap with arXiv:1711.0732
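    To fix notation, a minimal sketch of a covering-based lower approximation and of why adding an object is a local change (only blocks containing it matter); the related-family update rules themselves are the paper's contribution and are not reproduced here:

    ```python
    # A minimal sketch of one standard covering-based lower approximation:
    # an object belongs to the lower approximation if some covering block
    # containing it lies entirely inside the target set.

    def lower_approx(universe, covering, target):
        """Objects x with some block K such that x in K and K <= target."""
        target = set(target)
        return {x for x in universe
                if any(x in k and set(k) <= target for k in covering)}

    universe = {1, 2, 3, 4, 5}
    covering = [{1, 2}, {2, 3}, {4, 5}]
    print(lower_approx(universe, covering, {1, 2, 3}))  # {1, 2, 3}

    # Adding an object only touches blocks that contain it, which is the
    # intuition behind incremental (dynamic) reduction.
    universe |= {6}
    covering.append({5, 6})
    print(lower_approx(universe, covering, {1, 2, 3, 5, 6}))
    ```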

    An axiomatic approach to the roughness measure of rough sets

    In Pawlak's rough set theory, a set is approximated by a pair of lower and upper approximations. To measure the roughness of an approximation numerically, Pawlak introduced a quantitative measure based on the ratio of the cardinalities of the lower and upper approximations. Although this roughness measure is effective, it has the drawback of not being strictly monotonic with respect to the standard ordering on partitions. Recently, some improvements have been made by taking the granularity of partitions into account. In this paper, we approach the roughness measure in an axiomatic way. After axiomatically defining roughness measures and partition measures, we provide a unified construction of roughness measures, called the strong Pawlak roughness measure, and explore the properties of this measure. We show that the improved roughness measures in the literature are special instances of our strong Pawlak roughness measure, and we introduce three more strong Pawlak roughness measures. The advantage of our axiomatic approach is that some properties of a roughness measure follow immediately as soon as the measure satisfies the relevant axiomatic definition.
    Comment: to appear in Fundamenta Informaticae
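    A minimal sketch of the Pawlak roughness measure that the axiomatic treatment generalizes, computed from the lower and upper approximations over an illustrative partition:

    ```python
    # Pawlak roughness: rho(X) = 1 - |lower(X)| / |upper(X)|.
    # The partition and target set below are illustrative.

    def approximations(partition, target):
        """Pawlak lower/upper approximations of target w.r.t. a partition."""
        target = set(target)
        lower, upper = set(), set()
        for block in partition:
            block = set(block)
            if block <= target:
                lower |= block   # block certainly inside the target
            if block & target:
                upper |= block   # block possibly inside the target
        return lower, upper

    def roughness(partition, target):
        """1 - |lower| / |upper|; 0 exactly for definable (crisp) sets."""
        lower, upper = approximations(partition, target)
        return 0.0 if not upper else 1.0 - len(lower) / len(upper)

    partition = [{1, 2}, {3, 4}, {5}]
    print(roughness(partition, {1, 2, 3}))  # lower={1,2}, upper={1,2,3,4} -> 0.5
    ```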

    Related family-based attribute reduction of covering information systems when varying attribute sets

    In practical situations, there are many dynamic covering information systems with variations of attributes, but there are few studies on related-family-based attribute reduction of dynamic covering information systems. In this paper, we first investigate updated mechanisms of constructing attribute reducts for consistent and inconsistent covering information systems when varying attribute sets by using related families. Then we employ examples to illustrate how to compute attribute reducts of dynamic covering information systems with variations of attribute sets. Finally, the experimental results illustrate that the related-family-based methods are effective for performing attribute reduction of dynamic covering information systems when attribute sets vary over time.

    Weighting Scheme for a Pairwise Multi-label Classifier Based on the Fuzzy Confusion Matrix

    In this work, we address the issue of applying a stochastic classifier and a local, fuzzy confusion matrix under the framework of multi-label classification. We propose a novel solution to the problem of correcting label-pairwise ensembles. The main step of the correction procedure is to compute classifier-specific competence and cross-competence measures, which estimate the error pattern of the underlying classifier. At the fusion phase we employ two weighting approaches based on information theory. The classifier weights promote base classifiers that are the most susceptible to correction based on the fuzzy confusion matrix. During the experimental study, the proposed approach was compared against two reference methods in terms of six different quality criteria. The conducted experiments reveal that the proposed approach eliminates one of the main drawbacks of the original FCM-based approach, namely its vulnerability to imbalanced class/label distributions. Moreover, the obtained results show that the introduced method achieves satisfactory classification quality under all considered quality criteria. Additionally, the impact of fluctuations in data set characteristics is reduced.
    Comment: arXiv admin note: substantial text overlap with arXiv:1710.0872
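    A minimal sketch of the soft (fuzzy) confusion matrix idea for a binary base classifier, with a generic entropy-based weight standing in for the paper's information-theoretic weighting schemes; both the matrix construction and the weight are illustrative assumptions:

    ```python
    import numpy as np

    # Instead of counting hard decisions, each row of the matrix
    # accumulates the classifier's predicted probabilities for instances
    # of one true class, giving a "soft" confusion matrix.

    def soft_confusion(y_true, proba):
        """proba[i] = P(class 1 | x_i); rows are indexed by true class."""
        m = np.zeros((2, 2))
        for y, p in zip(y_true, proba):
            m[y] += (1.0 - p, p)   # spread mass across predicted classes
        return m / m.sum()

    def entropy_weight(m):
        """Weight a classifier by how far its matrix is from uniform noise."""
        p = m.flatten()
        p = p[p > 0]
        h = -(p * np.log2(p)).sum()
        return 2.0 - h             # a 2x2 distribution has at most 2 bits

    y_true = np.array([0, 0, 1, 1, 1])
    proba = np.array([0.2, 0.4, 0.7, 0.9, 0.6])
    m = soft_confusion(y_true, proba)
    print(m, entropy_weight(m))
    ```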

    Dialectical Rough Sets, Parthood and Figures of Opposition-1

    In one perspective, the main theme of this research revolves around the inverse problem in the context of general rough sets, which concerns the existence of a rough basis for given approximations in a context. Granular operator spaces and variants were recently introduced by the present author as an optimal framework for antichain-based algebraic semantics of general rough sets and the inverse problem. In this framework, various sub-types of crisp and non-crisp objects are identifiable that may be missed in more restrictive formalisms. This is also because in the latter cases concepts of complementation and negation are taken for granted, while in reality they have a complicated dialectical basis. This motivates a general approach to dialectical rough sets building on previous work of the present author and figures of opposition. In this paper, dialectical rough logics are developed from a semantic perspective, a concept of dialectical predicates is formalised, connections with dialetheias and glutty negation are established, and parthood is analyzed and studied from the viewpoint of classical and dialectical figures of opposition. The methods become more geometrical and encompass parthood as a primary relation (as opposed to roughly equivalent objects) for algebraic semantics.
    Comment: 41 pages. The second part will appear soon.

    Feature selection with test cost constraint

    Feature selection is an important preprocessing step in machine learning and data mining. In real-world applications, costs, including money, time, and other resources, are required to acquire features. In some cases, there is a test cost constraint due to limited resources, and we must deliberately select an informative and cheap feature subset for classification. This paper proposes the feature selection with test cost constraint problem to address this issue. The new problem has a simple form when described as a constraint satisfaction problem (CSP). Backtracking is a general algorithm for CSPs, and it is efficient in solving the new problem on medium-sized data. As the backtracking algorithm is not scalable to large datasets, a heuristic algorithm is also developed. Experimental results show that the heuristic algorithm can find the optimal solution in most cases. We also redefine some existing feature selection problems in rough sets, especially in decision-theoretic rough sets, from the viewpoint of CSP. These new definitions provide insight into new research directions.
    Comment: 23 pages
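    A minimal backtracking sketch for the constrained selection problem: search feature subsets whose total test cost stays within a budget, keeping the subset that best discerns objects with different decisions; the discernibility score is an assumed stand-in for the paper's rough-set quality measure:

    ```python
    # Backtracking over feature subsets under a test cost budget.

    def discernibility(data, labels, subset):
        """Count object pairs with different labels told apart by subset."""
        n, count = len(data), 0
        for i in range(n):
            for j in range(i + 1, n):
                if labels[i] != labels[j] and any(
                        data[i][f] != data[j][f] for f in subset):
                    count += 1
        return count

    def backtrack(data, labels, costs, budget):
        """Return (best_subset, best_score) within the cost budget."""
        best = ([], -1)

        def rec(start, subset, spent):
            nonlocal best
            score = discernibility(data, labels, subset)
            if score > best[1]:
                best = (list(subset), score)
            for f in range(start, len(costs)):
                if spent + costs[f] <= budget:   # prune over-budget branches
                    subset.append(f)
                    rec(f + 1, subset, spent + costs[f])
                    subset.pop()

        rec(0, [], 0.0)
        return best

    # Hypothetical data: 4 objects, 3 features, per-feature test costs.
    data = [(1, 0, 2), (1, 1, 2), (0, 1, 0), (0, 0, 1)]
    labels = [0, 0, 1, 1]
    costs = [3.0, 2.0, 4.0]
    print(backtrack(data, labels, costs, budget=5.0))
    ```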