1,962 research outputs found

    Linguistic quantifiers modeled by Sugeno integrals

    Since quantifiers can summarize the properties of a class of objects without enumerating them, linguistic quantification is an important topic in the field of high-level knowledge representation and reasoning. This paper introduces a new framework for modeling quantifiers in natural languages, in which each linguistic quantifier is represented by a family of fuzzy measures and the truth value of a quantified proposition is evaluated using Sugeno's integral. This framework endows linguistic quantifiers with some elegant logical properties. We carefully compare our new model of quantification with other approaches to linguistic quantifiers. A set of criteria for linguistic quantification has been proposed in the previous literature; the relationship between these criteria and the results obtained in the present paper is clarified. Some simple applications of the Sugeno integral semantics of quantifiers are presented. © 2006 Elsevier B.V. All rights reserved.
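    As a rough illustration of the mechanism this abstract describes (a sketch, not the paper's exact formulation; the function and measure names are hypothetical), the discrete Sugeno integral of membership degrees with respect to a fuzzy measure can be computed over the nested level sets of the sorted degrees:

    ```python
    def sugeno_integral(memberships, measure):
        """Discrete Sugeno integral of membership degrees f(x_i) with
        respect to a fuzzy measure over index subsets.

        Equivalent to sup_alpha min(alpha, measure({i : f(x_i) >= alpha})),
        evaluated over the nested level sets of the sorted degrees.
        """
        order = sorted(range(len(memberships)), key=lambda i: -memberships[i])
        best, subset = 0.0, set()
        for i in order:
            subset.add(i)
            best = max(best, min(memberships[i], measure(frozenset(subset))))
        return best

    # With the proportional measure |A|/n, "3 out of 4 objects fully satisfy
    # the property" evaluates to 0.75.
    truth = sugeno_integral([1.0, 1.0, 1.0, 0.0], lambda A: len(A) / 4)
    ```

    Choosing a different family of measures for the same quantifier changes how strictly the proposition is judged, which is the degree of freedom the paper's framework exploits.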

    Using Fuzzy Linguistic Representations to Provide Explanatory Semantics for Data Warehouses

    A data warehouse integrates large amounts of extracted and summarized data from multiple sources for direct querying and analysis. While it provides decision makers with easy access to such historical and aggregate data, the real meaning of the data has been ignored. For example, "whether a total sales amount of 1,000 items indicates a good or bad sales performance" is still unclear. From the decision makers' point of view, it is the semantics, rather than the raw numbers, that conveys the meaning of the data, and this semantics is very important. In this paper, we explore the use of fuzzy technology to provide such semantics for the summarizations and aggregates developed in data warehousing systems. A three-layered data warehouse semantic model, consisting of quantitative (numerical) summarization, qualitative (categorical) summarization, and quantifier summarization, is proposed for capturing and explicating the semantics of warehoused data. Based on the model, several algebraic operators are defined. We also extend the SQL language to allow for flexible queries against such enhanced data warehouses.
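    One standard way to realize a quantifier-summarization layer like the one described here is Zadeh's relative quantification, which turns raw totals into a truth degree for a statement such as "most sales were good". The sketch below is an illustration under assumptions: the `most` thresholds and the `good` predicate are hypothetical, not taken from the paper.

    ```python
    # Hypothetical membership function for the relative quantifier "most";
    # the 0.3/0.8 thresholds are illustrative, not from the paper.
    def most(p):
        if p <= 0.3:
            return 0.0
        if p >= 0.8:
            return 1.0
        return (p - 0.3) / 0.5

    def summary_truth(values, predicate, quantifier):
        """Truth of the linguistic summary 'Q of the records satisfy P'
        (Zadeh-style relative quantification over fuzzy degrees)."""
        degrees = [predicate(v) for v in values]
        return quantifier(sum(degrees) / len(degrees))

    # Hypothetical fuzzy predicate 'good sales' over raw sales totals.
    good = lambda v: min(1.0, max(0.0, (v - 500) / 500))

    t = summary_truth([1200, 950, 1500, 400, 1100], good, most)
    ```

    The result `t` is a degree in [0, 1] that can be attached to the aggregate as its explanatory semantics, instead of leaving the decision maker with an uninterpreted number.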

    A comprehensive study of implicator-conjunctor based and noise-tolerant fuzzy rough sets: definitions, properties and robustness analysis

    © 2014 Elsevier B.V. Both rough and fuzzy set theories offer interesting tools for dealing with imperfect data: while the former allows us to work with uncertain and incomplete information, the latter provides a formal setting for vague concepts. The two theories are highly compatible, and since the late 1980s many researchers have studied their hybridization. In this paper, we critically evaluate the most relevant fuzzy rough set models proposed in the literature. To this end, we establish a formally correct and unified mathematical framework for them. Both implicator-conjunctor-based definitions and noise-tolerant models are studied. We evaluate these models on two different fronts: first, we discuss which properties of the original rough set model can be maintained, and second, we examine how robust they are against both class and attribute noise. By highlighting the benefits and drawbacks of the different fuzzy rough set models, this study is a necessary first step toward proposing and developing new models in future research.

    Lynn D’eer has been supported by the Ghent University Special Research Fund; Chris Cornelis was partially supported by the Spanish Ministry of Science and Technology under the project TIN2011-28488, the Andalusian Research Plans P11-TIC-7765 and P10-TIC-6858, and project PYR-2014-8 of the Genil Program of CEI BioTic GRANADA; and Lluis Godo has been partially supported by the Spanish MINECO project EdeTRI TIN2012-39348-C02-01. Peer Reviewed.
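    The implicator-conjunctor-based definitions the abstract refers to compute, for each element, a lower approximation as an infimum of implications and an upper approximation as a supremum of conjunctions. A minimal sketch on a two-element universe, assuming the Lukasiewicz implicator and the minimum t-norm (one common pairing among those the paper surveys):

    ```python
    # Implicator-conjunctor-based approximations on a finite universe,
    # with R a fuzzy relation and A a fuzzy set, both given as dicts.
    def lower_approx(R, A, implicator):
        """Lower approximation: inf over y of I(R(x, y), A(y))."""
        return {x: min(implicator(R[x][y], A[y]) for y in A) for x in R}

    def upper_approx(R, A, conjunctor):
        """Upper approximation: sup over y of T(R(x, y), A(y))."""
        return {x: max(conjunctor(R[x][y], A[y]) for y in A) for x in R}

    lukasiewicz = lambda a, b: min(1.0, 1.0 - a + b)  # Lukasiewicz implicator
    t_min = min                                       # minimum t-norm

    R = {0: {0: 1.0, 1: 0.5}, 1: {0: 0.5, 1: 1.0}}    # fuzzy similarity relation
    A = {0: 1.0, 1: 0.2}                              # fuzzy concept
    lo = lower_approx(R, A, lukasiewicz)
    up = upper_approx(R, A, t_min)
    ```

    The strict `min`/`max` in these definitions is exactly what makes the classical models sensitive to noise; the noise-tolerant variants studied in the paper replace them with softer aggregations.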

    Immediate consequences operator on generalized quantifiers

    The semantics of a multi-adjoint logic program is usually defined through the immediate consequences operator TP. However, defining the immediate consequences operator as the supremum of a set of values can be problematic when imprecise datasets are considered, due to the strictness of the supremum operator. Hence, exploiting the flexibility of generalized quantifiers to weaken the existential character of the supremum operator, this paper presents a generalization of the immediate consequences operator with interesting properties for solving the aforementioned problem. © 2022 The Author(s).
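    The contrast can be sketched in Python: the classical TP step takes a strict supremum over the evaluated rule bodies, while a generalized quantifier can replace it with a softer aggregation. The OWA-style weighting below is one illustrative reading of "weakening the existential feature", not the paper's actual construction:

    ```python
    def tp_sup(body_values):
        """Classical step of the immediate consequences operator:
        a strict supremum over the bodies of rules sharing a head."""
        return max(body_values)

    def tp_quantifier(body_values, weights):
        """Generalized step: an OWA-style, quantifier-guided aggregation
        that softens the supremum; weights [1, 0, ..., 0] recover tp_sup."""
        ordered = sorted(body_values, reverse=True)
        return sum(w * v for w, v in zip(weights, ordered))

    bodies = [0.2, 0.9, 0.4]   # hypothetical rule-body evaluations
    strict = tp_sup(bodies)                            # dominated by one rule
    softened = tp_quantifier(bodies, [0.5, 0.5, 0.0])  # averages the top two
    ```

    With imprecise data, a single spuriously high body value no longer fixes the result by itself, which is the robustness the generalization aims at.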

    Managing Interacting Criteria: Application to Environmental Evaluation Practices

    The need for organizations to evaluate their environmental practices has been increasing recently. This fact has led to the development of many approaches to appraise such practices. In this paper, a novel decision model to evaluate a company’s environmental practices is proposed, improving the traditional evaluation process in several respects. First, different collectives of reviewers related to the company’s activity are taken into account in the process, increasing the company’s internal efficiency and external legitimacy. Second, following the ISO 14031 standard, two general categories of environmental performance indicators, management and operational, are considered. Third, since the assumption of independence among environmental indicators is rarely verified in the environmental context, an aggregation operator that takes the relationships among such indicators into account in the evaluation results is proposed. Finally, this new model integrates quantitative and qualitative information with different scales using a multi-granular linguistic model that allows diverse evaluation scales to be adapted according to the appraisers’ knowledge.
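    The abstract does not name its aggregation operator, but a common choice for handling interacting criteria is the discrete Choquet integral, which weights coalitions of criteria through a non-additive measure. A hedged sketch with a hypothetical two-indicator measure:

    ```python
    def choquet(scores, measure):
        """Discrete Choquet integral of criteria scores w.r.t. a fuzzy measure.

        scores: list of criterion evaluations in [0, 1]
        measure: maps a frozenset of criterion indices to a weight in [0, 1]
        """
        order = sorted(range(len(scores)), key=lambda i: scores[i])
        remaining = set(range(len(scores)))
        total, prev = 0.0, 0.0
        for i in order:
            total += (scores[i] - prev) * measure(frozenset(remaining))
            prev = scores[i]
            remaining.remove(i)
        return total

    # Two redundant (substitutive) indicators: each alone already carries
    # weight 0.9, together only 1.0 -- a non-additive measure.
    def redundant(subset):
        return [0.0, 0.9, 1.0][len(subset)]

    score = choquet([0.2, 0.8], redundant)
    ```

    With the additive measure `len(A) / n` the same code reduces to the arithmetic mean, so the non-additive measure is precisely what encodes the interaction among indicators.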

    A fuzzy rule model for high level musical features on automated composition systems

    Algorithmic composition systems are now well understood. However, when they are used for specific tasks, such as creating material for a part of a piece, it is common to prefer, among all possible outputs, those exhibiting specific properties. Even though the number of valid outputs is huge, the selection is often performed manually, either using expertise in the algorithmic model, by means of sampling techniques, or sometimes even by chance. This process has traditionally been automated using machine learning techniques. However, whether these techniques can truly capture the human rationale behind the selection remains an open question. The present work discusses a possible approach that combines expert opinion with a fuzzy rule-extraction methodology to model high-level features. An early implementation able to explore the universe of outputs of a particular algorithm by means of the extracted rules is discussed. The rules search for objects similar to those having a desired, pre-identified feature. In this sense, the model can be seen as a finder of objects with specific properties. Peer Reviewed. Postprint (author's final draft).
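    A rule-based finder of this kind typically scores each generated object by how strongly it fires a fuzzy rule over the object's attributes. The sketch below is an assumption-laden illustration: the attribute names, the membership functions, and the use of the minimum t-norm are hypothetical, not rules extracted from the paper's experts.

    ```python
    def rule_match(attributes, rule):
        """Degree to which one generated object matches a fuzzy rule:
        minimum t-norm over the antecedent membership functions."""
        return min(mf(attributes[name]) for name, mf in rule.items())

    # Hypothetical rule for a 'dense texture' feature.
    rule = {
        "note_density": lambda v: min(1.0, v / 10.0),         # 'many notes'
        "mean_interval": lambda v: max(0.0, 1.0 - v / 12.0),  # 'small leaps'
    }
    d = rule_match({"note_density": 8.0, "mean_interval": 3.0}, rule)
    ```

    Ranking the algorithm's outputs by this degree and keeping the top matches is one simple way to realize the "finder of objects with specific properties" described above.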