78 research outputs found

    Analysis of Fuzzy Logic Models


    The legacy of 50 years of fuzzy sets: A discussion

    This note provides a brief overview of the main ideas and notions underlying fifty years of research in fuzzy set and possibility theory, two important settings introduced by L.A. Zadeh for representing sets with unsharp boundaries and the uncertainty induced by granules of information expressed with words. The discussion is organized around three potential understandings of the grades of membership in a fuzzy set, depending on what the fuzzy set is intended to represent: a group of elements with borderline members, a plausibility distribution, or a preference profile. It also questions the motivations for some existing generalized fuzzy sets. This note reflects the shared personal views of its authors.

    Data-informed fuzzy measures for fuzzy integration of intervals and fuzzy numbers

    The fuzzy integral (FI) with respect to a fuzzy measure (FM) is a powerful means of aggregating information. The most popular FIs are the Choquet and Sugeno integrals, and most research focuses on these two variants. The arena of the FM is much more populated, including numerically derived FMs such as the Sugeno λ-measure and decomposable measures, expert-defined FMs, and data-informed FMs. The drawback of numerically derived and expert-defined FMs is that one must know something about the relative values of the input sources. However, there are many problems where this information is unavailable, such as crowdsourcing. This paper focuses on data-informed FMs, i.e., FMs computed by an algorithm that analyzes some property of the input data itself, gleaning the importance of each input source from the data it provides. The original instantiation of a data-informed FM is the agreement FM, which assigns high confidence to combinations of sources that numerically agree with one another. This paper extends our previous work on data-informed FMs by proposing the uniqueness measure and additive measure of agreement for interval-valued evidence. We then extend data-informed FMs to fuzzy number (FN)-valued inputs. We demonstrate the proposed FMs by aggregating interval and FN evidence with the Choquet and Sugeno FIs for both synthetic and real-world data.
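
    As a concrete illustration of the machinery this abstract builds on, the sketch below computes the discrete Choquet integral of scalar inputs with respect to a Sugeno λ-measure constructed from per-source densities. This is the generic textbook construction, not the paper's proposed data-informed measures; the densities and evidence values are illustrative.

    ```python
    # A minimal sketch: discrete Choquet integral w.r.t. a Sugeno lambda-measure.
    # Densities and inputs below are made up for the example.
    import numpy as np

    def sugeno_lambda(densities, tol=1e-10):
        """Solve prod(1 + lam*g_i) = 1 + lam for the nonzero root lam > -1."""
        g = np.asarray(densities, dtype=float)
        if abs(g.sum() - 1.0) < tol:      # additive case: lam = 0
            return 0.0
        f = lambda lam: np.prod(1.0 + lam * g) - (1.0 + lam)
        # lam in (-1, 0) when densities sum > 1, lam > 0 when sum < 1
        lo, hi = (-1.0 + tol, -tol) if g.sum() > 1 else (tol, 1e6)
        for _ in range(200):              # bisection
            mid = 0.5 * (lo + hi)
            if f(lo) * f(mid) <= 0:
                hi = mid
            else:
                lo = mid
        return 0.5 * (lo + hi)

    def choquet(h, densities):
        """Discrete Choquet integral: sum of h_(i) * (g(A_i) - g(A_{i-1}))."""
        g = np.asarray(densities, dtype=float)
        lam = sugeno_lambda(g)
        order = np.argsort(h)[::-1]       # sources sorted by decreasing value
        total, g_prev, g_acc = 0.0, 0.0, 0.0
        for idx in order:
            # g(A u {x}) = g(A) + g_x + lam * g(A) * g_x
            g_acc = g_acc + g[idx] + lam * g_acc * g[idx]
            total += h[idx] * (g_acc - g_prev)
            g_prev = g_acc
        return total

    # three sources with densities 0.4, 0.3, 0.2 reporting evidence 0.9, 0.6, 0.3
    print(choquet([0.9, 0.6, 0.3], [0.4, 0.3, 0.2]))
    ```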

    Eventology versus contemporary theories of uncertainty

    The development of probability theory, together with the Bayesian approach, over the last three centuries has been driven by two factors: the variability of physical phenomena and partial ignorance about them. As it is now standard to believe [Dubois, 2007], the nature of these key factors is so varied that describing them requires special uncertainty theories, which differ from probability theory and the Bayesian credo, and which provide a better account of the various facets of uncertainty by combining probabilistic and set-valued representations of information to capture the distinction between variability and ignorance. Eventology [Vorobyev, 2007], a new direction in probability theory and philosophy, offers an original event-based approach to describing variability and ignorance, bringing an agent, together with his/her beliefs, directly into the framework of scientific research in the form of an eventological distribution of his/her own events. This allows eventology, by combining probabilistic and set-event representations of information with the philosophical concept of event as co-being [Bakhtin, 1920], to provide a unified account of various aspects of uncertainty, capturing the distinction between variability and ignorance and opening the possibility of defining imprecise probability as the probability of an imprecise event within the mathematical framework of Kolmogorov's probability theory [Kolmogorov, 1933].

    Keywords: uncertainty, probability, event, co-being, eventology, imprecise event

    Similarity between interval-valued fuzzy sets taking into account the width of the intervals and admissible orders

    In this work we study a new class of similarity measures between interval-valued fuzzy sets. The novelty of our approach lies, firstly, in the fact that we develop all the notions with respect to total orders of intervals; and secondly, in that we take into account the width of the intervals, so that the uncertainty of the output is strongly related to the uncertainty of the input. Constructing the new interval-valued similarity requires interval-valued aggregation functions and interval-valued restricted equivalence functions that take into account the width of the intervals, so we first study these functions, both in line with the two features stated above. Finally, we provide an illustrative example which applies an interval-valued similarity measure to stereo image matching, and we show that the results obtained with the proposed interval-valued similarity measures improve numerically (according to the most widely used measures in the literature) on the results obtained with interval-valued similarity measures which do not consider the width of the intervals.
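
    To make the role of interval widths concrete, here is a minimal sketch, not the paper's construction, of a similarity between interval-valued fuzzy sets in which the width of the output interval grows with the widths of the inputs. The pointwise equivalence (best and worst pointwise agreement) and the arithmetic-mean aggregation are simple illustrative choices.

    ```python
    # Memberships are pairs (lo, hi) with 0 <= lo <= hi <= 1.

    def interval_equivalence(x, y):
        """Interval of possible pointwise similarities 1 - |a - b|, a in x, b in y."""
        (x1, x2), (y1, y2) = x, y
        d_min = max(0.0, max(x1, y1) - min(x2, y2))   # closest pair of points
        d_max = max(abs(x2 - y1), abs(y2 - x1))       # farthest pair of points
        return (1.0 - d_max, 1.0 - d_min)

    def iv_similarity(A, B):
        """Mean of the pointwise equivalence intervals over the universe."""
        pairs = [interval_equivalence(x, y) for x, y in zip(A, B)]
        n = len(pairs)
        return (sum(p[0] for p in pairs) / n, sum(p[1] for p in pairs) / n)

    # two interval-valued fuzzy sets on a 3-element universe; wider input
    # intervals yield a wider (more uncertain) similarity interval
    A = [(0.2, 0.4), (0.5, 0.9), (0.7, 0.8)]
    B = [(0.3, 0.4), (0.6, 0.7), (0.1, 0.5)]
    print(iv_similarity(A, B))
    ```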

    Fitting aggregation operators to data

    Theoretical advances in modelling the aggregation of information have produced a wide range of aggregation operators, applicable to almost every practical problem. The most important classes of aggregation operators include triangular norms, uninorms, generalised means and OWA operators. With such a variety, an important practical problem has emerged: how to fit the parameters/weights of these families of aggregation operators to observed data? How to estimate quantitatively whether a given class of operators is suitable as a model in a given practical setting? Aggregation operators are rather special classes of functions, and thus they require specialised regression techniques which enforce important theoretical properties, like commutativity or associativity. My presentation will address this issue in detail, and will discuss various regression methods applicable specifically to t-norms, uninorms and generalised means. I will also demonstrate software implementing these regression techniques, which allows practitioners to paste in their data and obtain optimal parameters of the chosen family of operators.
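
    For a flavour of the kind of constrained regression involved, the sketch below fits the weights of an OWA operator to observed input/output pairs by least squares while enforcing the theoretical constraints w_i >= 0 and sum(w_i) = 1. The synthetic data, helper names and the choice of scipy's SLSQP solver are illustrative assumptions, not the software demonstrated in the talk.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    def fit_owa_weights(X, y):
        """Least-squares OWA weights: y ~ sum_i w_i * x_(i), x sorted descending."""
        Xs = -np.sort(-np.asarray(X, dtype=float), axis=1)  # row-wise descending sort
        n = Xs.shape[1]
        loss = lambda w: np.sum((Xs @ w - y) ** 2)
        res = minimize(loss, x0=np.full(n, 1.0 / n), method="SLSQP",
                       bounds=[(0.0, 1.0)] * n,             # w_i >= 0
                       constraints={"type": "eq",           # sum(w_i) = 1
                                    "fun": lambda w: w.sum() - 1.0})
        return res.x

    # synthetic data generated by a known OWA operator, then recovered
    rng = np.random.default_rng(0)
    X = rng.random((200, 3))
    true_w = np.array([0.6, 0.3, 0.1])
    y = (-np.sort(-X, axis=1)) @ true_w
    print(fit_owa_weights(X, y))   # close to [0.6, 0.3, 0.1]
    ```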

    A Dempster-Shafer theory inspired logic.

    Issues of formalising and interpreting epistemic uncertainty have always played a prominent role in Artificial Intelligence. The Dempster-Shafer (DS) theory of partial beliefs is one of the best-known formalisms for addressing partial knowledge. Similarly to the DS theory, which is a generalisation of classical probability theory, fuzzy logic provides an alternative reasoning apparatus to Boolean logic. Both theories feature prominently within the Artificial Intelligence domain, but a unified framework accounting for all the aspects of imprecise knowledge is yet to be developed. The fuzzy logic apparatus is often used for reasoning based on vague information, while beliefs are often processed with the aid of Boolean logic. The situation clearly calls for the development of a logic formalism targeted specifically at the needs of the theory of beliefs. Several frameworks exist that interpret epistemic uncertainty through an appropriately defined modal operator. There is an epistemic problem with such frameworks: while addressing uncertain information, they also allow non-constructive proofs, and in this sense the number of true statements within these frameworks is too large. In this work, it is argued that an inferential apparatus for the theory of beliefs should follow the premises of Brouwer's intuitionism. A logic refuting tertium non datur is constructed by defining a correspondence between the support functions representing beliefs in the DS theory and semantic models based on intuitionistic Kripke models with weighted nodes. Without additional constraints on the semantic models and without modal operators, the constructed logic is equivalent to minimal intuitionistic logic. A number of possible constraints are considered, resulting in additional axioms and making the proposed logic intermediate. Further analysis of the properties of the created framework shows that the approach preserves the Dempster-Shafer belief assignments and thus expresses modality through the belief assignments of the formulae within the developed logic.
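
    For reference, the support (belief) functions the correspondence is built on are the standard DS construction Bel(A) = sum of m(B) over all focal sets B contained in A. The tiny sketch below computes it; the frame and the mass assignment are made up for the example.

    ```python
    # A minimal sketch of a Dempster-Shafer belief (support) function.

    def belief(mass, A):
        """Bel(A) = sum of m(B) over all focal sets B with B subset of A."""
        return sum(m for B, m in mass.items() if B <= A)

    # illustrative mass assignment on the frame {a, b, c}; masses sum to 1,
    # with 0.3 left on the whole frame (explicit ignorance)
    mass = {
        frozenset({"a"}): 0.4,
        frozenset({"a", "b"}): 0.3,
        frozenset({"a", "b", "c"}): 0.3,
    }
    print(belief(mass, frozenset({"a", "b"})))  # 0.7: supported, not certain
    print(belief(mass, frozenset({"c"})))       # 0.0: no committed support
    ```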

    Extension of the fuzzy integral for general fuzzy set-valued information

    The fuzzy integral (FI) is an extremely flexible aggregation operator. It is used in numerous applications, such as image processing, multicriteria decision making, skeletal age-at-death estimation, and multisource (e.g., feature, algorithm, sensor, and confidence) fusion. To date, a few works have appeared on the topic of generalizing Sugeno's original real-valued integrand and fuzzy measure (FM) for the case of higher-order uncertain information (both integrand and measure). For the most part, these extensions are motivated by, and are consistent with, Zadeh's extension principle (EP). Namely, existing extensions focus on fuzzy number (FN), i.e., convex and normal fuzzy set (FS)-valued, integrands. Herein, we put forth a new definition, called the generalized FI (gFI), and an efficient algorithm for its calculation for FS-valued integrands. In addition, we compare the gFI, numerically and theoretically, with our non-EP-based FI extension called the nondirect FI (NDFI). Examples are investigated in the areas of skeletal age-at-death estimation in forensic anthropology and multisource fusion. These applications help demonstrate the need for and benefit of the proposed work. In particular, we show there is not one supreme technique. Instead, multiple extensions are of benefit in different contexts and applications.
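
    For context, the sketch below implements Sugeno's original real-valued fuzzy integral, the base case that the gFI generalises to fuzzy-set-valued integrands. The sources and the explicitly enumerated fuzzy measure are illustrative, and the measure is given on all subsets rather than derived from densities.

    ```python
    # A minimal sketch of the discrete Sugeno integral:
    # S = max_i min(h_(i), g(A_i)), with h sorted in descending order.

    def sugeno_integral(h, g):
        order = sorted(h, key=h.get, reverse=True)  # sources by decreasing value
        result, A = 0.0, frozenset()
        for x in order:
            A = A | {x}                             # A_i: the i largest sources
            result = max(result, min(h[x], g[A]))
        return result

    h = {"s1": 0.9, "s2": 0.6, "s3": 0.3}           # integrand (evidence values)
    g = {                                           # monotone measure, g(full) = 1
        frozenset(): 0.0,
        frozenset({"s1"}): 0.5, frozenset({"s2"}): 0.4, frozenset({"s3"}): 0.2,
        frozenset({"s1", "s2"}): 0.8, frozenset({"s1", "s3"}): 0.6,
        frozenset({"s2", "s3"}): 0.5,
        frozenset({"s1", "s2", "s3"}): 1.0,
    }
    print(sugeno_integral(h, g))                    # 0.6 for these inputs
    ```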