259 research outputs found

    Representation of maxitive measures: an overview

    Idempotent integration is an analogue of Lebesgue integration in which σ-maxitive measures replace σ-additive measures. In addition to reviewing and unifying several Radon–Nikodym-like theorems proven in the literature for the idempotent integral, we also prove new results of the same kind.
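As a rough illustration of the abstract's setting (not code from the paper): a possibility measure is the standard example of a maxitive measure, and the sup-min form below is one common idempotent analogue of the integral. All names and the toy data are assumptions for the sketch.

```python
# Illustrative sketch: a maxitive (possibility) measure induced by a density,
# and an idempotent (sup-min) integral with respect to it.

def possibility_measure(density, subset):
    """nu(A) = sup_{x in A} density(x): maxitive instead of additive."""
    return max((density[x] for x in subset), default=0.0)

def idempotent_integral(f, density, space):
    """sup_x min(f(x), density(x)): sup replaces the sum and min the product
    of ordinary Lebesgue-style integration."""
    return max(min(f[x], density[x]) for x in space)

space = ["a", "b", "c"]
pi = {"a": 0.2, "b": 1.0, "c": 0.7}   # a possibility density (sup = 1)
f = {"a": 0.9, "b": 0.4, "c": 0.6}

# Maxitivity: nu(A ∪ B) = max(nu(A), nu(B)), not nu(A) + nu(B)
assert possibility_measure(pi, {"a", "b"}) == max(
    possibility_measure(pi, {"a"}), possibility_measure(pi, {"b"}))
print(idempotent_integral(f, pi, space))  # 0.6
```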

    Modelling fraud detection by attack trees and Choquet integral

    Modelling an attack tree is basically a matter of associating a logical "and" and a logical "or", but in most real-world applications related to fraud management the "and/or" logic is not adequate to effectively represent the relationship between a parent node and its children, most of all when information about attributes is associated with the nodes and the main problem to solve is how to propagate attribute values up the tree through recursive aggregation operations occurring at the "and/or" nodes. OWA-based aggregations have been introduced to generalize the "and" and "or" operators, starting from the observation that in between the extremes "for all" (and) and "for any" (or), terms (quantifiers) like "several", "most", "few", "some", etc. can be introduced to represent the different weights associated with the nodes in the aggregation. The aggregation process taking place at an OWA node depends on the ordered position of the child nodes, but it does not take care of the possible interactions between the nodes. In this paper, we propose to overcome this drawback by introducing the Choquet integral, whose distinguishing feature is its ability to take into account the interaction between nodes. At first, the attack tree is valuated recursively through a bottom-up algorithm whose complexity is linear in the number of nodes and exponential at each node. Then, the algorithm is extended assuming that the attribute values in the leaves are unimodal LR fuzzy numbers, and the calculation of the Choquet integral is carried out using the alpha-cuts.
    Keywords: fraud detection; attack tree; ordered weighted averaging (OWA) operator; Choquet integral; fuzzy numbers.
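The OWA idea the abstract builds on can be sketched in a few lines: the weights attach to *ranks* rather than to sources, so the same operator interpolates between min ("for all"/and) and max ("for any"/or). The weight vectors below are illustrative, not taken from the paper.

```python
def owa(values, weights):
    """Ordered Weighted Averaging: sort the inputs in descending order, then
    take the weighted sum with position-based (not source-based) weights."""
    assert abs(sum(weights) - 1.0) < 1e-9, "OWA weights must sum to 1"
    ordered = sorted(values, reverse=True)
    return sum(w * v for w, v in zip(weights, ordered))

vals = [0.3, 0.9, 0.5]
print(owa(vals, [1.0, 0.0, 0.0]))   # all weight on the top rank: max -> 0.9
print(owa(vals, [0.0, 0.0, 1.0]))   # all weight on the bottom rank: min -> 0.3
print(owa(vals, [1/3, 1/3, 1/3]))   # uniform weights: the plain average
```

Quantifiers like "most" or "some" correspond to weight vectors concentrated near one end of the ranking; what OWA cannot express is an interaction between specific child nodes, which is the gap the Choquet integral fills.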

    Fuzzy measures and integrals in MCDA

    This chapter aims at a unified presentation of various methods of MCDA based on fuzzy measures (capacities) and fuzzy integrals, essentially the Choquet and Sugeno integrals. A first section sets the position of the problem of multicriteria decision making and describes the various possible scales of measurement (difference, ratio, and ordinal). Then a whole section is devoted to each case in detail: after introducing the necessary concepts, the methodology is described, and the problem of the practical identification of fuzzy measures is addressed. The important concept of interaction between criteria, central to this chapter, is explained in detail. It is shown how it leads to k-additive fuzzy measures. The case of bipolar scales leads to the general model based on bi-capacities, encompassing the usual models based on capacities. A general definition of interaction for bipolar scales is introduced. The case of ordinal scales leads to the use of the Sugeno integral, and its symmetrized version when one considers symmetric ordinal scales. A practical methodology for the identification of fuzzy measures in this context is given. Lastly, we give a short description of some practical applications.
    Keywords: Choquet integral; fuzzy measure; interaction; bi-capacities.
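For readers unfamiliar with the central object of this chapter, a minimal sketch of the discrete Choquet integral with respect to a capacity (the toy capacity and scores below are assumptions, not from the chapter):

```python
def choquet(values, capacity):
    """Discrete Choquet integral of values (dict: criterion -> score) w.r.t.
    a capacity (dict: frozenset of criteria -> weight), assumed monotone
    with capacity(full set) = 1."""
    criteria = sorted(values, key=values.get)      # ascending by score
    total, prev = 0.0, 0.0
    for i, c in enumerate(criteria):
        coalition = frozenset(criteria[i:])        # criteria scoring >= values[c]
        total += (values[c] - prev) * capacity[coalition]
        prev = values[c]
    return total

# A 2-criteria capacity with negative interaction (substitutive criteria):
# mu({x}) + mu({y}) > mu({x, y}), so agreeing criteria are partly redundant.
mu = {frozenset({"x"}): 0.7, frozenset({"y"}): 0.7,
      frozenset({"x", "y"}): 1.0}
print(choquet({"x": 0.4, "y": 0.8}, mu))   # 0.4*1.0 + 0.4*0.7 = 0.68
```

With an additive capacity this reduces to a weighted mean; the extra subset weights are exactly what encodes the interaction between criteria discussed in the chapter.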

    A decade of application of the Choquet and Sugeno integrals in multi-criteria decision aid

    The main advances regarding the use of the Choquet and Sugeno integrals in multi-criteria decision aid over the last decade are reviewed. They concern mainly a bipolar extension of both the Choquet integral and the Sugeno integral, interesting particular submodels, new learning techniques, a better interpretation of the models, and a better use of the Choquet integral in multi-criteria decision aid. Parallel to these theoretical works, the Choquet integral has been applied to many new fields, and several software packages and libraries dedicated to this model have been developed.
    Keywords: Choquet integral; Sugeno integral; capacity; bipolarity; preferences.
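The Sugeno integral reviewed here is the ordinal counterpart of the Choquet integral: it uses only max and min, so it is well defined on purely ordinal scales. A minimal sketch with an assumed toy capacity:

```python
def sugeno(values, capacity):
    """Discrete Sugeno integral: max over criteria of min(score, capacity of
    the coalition of criteria scoring at least that much). Only max/min are
    used, so it is meaningful on ordinal scales."""
    criteria = sorted(values, key=values.get)      # ascending by score
    return max(min(values[c], capacity[frozenset(criteria[i:])])
               for i, c in enumerate(criteria))

mu = {frozenset({"x"}): 0.3, frozenset({"y"}): 0.6,
      frozenset({"x", "y"}): 1.0}
# max(min(0.5, mu({x,y})), min(0.9, mu({x}))) = max(0.5, 0.3) = 0.5
print(sugeno({"x": 0.9, "y": 0.5}, mu))
```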

    Insights and Characterization of l1-norm Based Sparsity Learning of a Lexicographically Encoded Capacity Vector for the Choquet Integral

    This thesis aims to simultaneously minimize function error and model complexity for data fusion via the Choquet integral (CI). The CI is a generator function, i.e., it is parametric and yields a wealth of aggregation operators based on the specifics of the underlying fuzzy measure. It is often the case that we desire to learn a fusion from data, and the goal is to have the smallest possible sum of squared error between the trained model and a set of labels. However, we also desire to learn as "simple" a solution as possible. Herein, l1-norm regularization of a lexicographically encoded capacity vector relative to the CI is explored. The impact of regularization is explored in terms of what capacities and aggregation operators it induces under different common and extreme scenarios. Synthetic experiments are provided in order to illustrate the propositions and concepts put forth.
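To make "lexicographically encoded capacity vector" concrete, here is one common encoding, packing the capacity's subset values into a flat vector indexed by the binary code of each nonempty subset, together with the kind of l1-penalized objective the abstract describes. The subset ordering and the exact objective used in the thesis may differ; this is an assumed illustration.

```python
def capacity_vector(mu, sources):
    """Pack a capacity (dict: frozenset -> value) into a flat vector, one
    entry per nonempty subset, ordered by the subset's binary encoding."""
    n = len(sources)
    return [mu[frozenset(s for i, s in enumerate(sources) if code >> i & 1)]
            for code in range(1, 2 ** n)]

def regularized_loss(u, errors, lam):
    """Sum of squared errors plus an l1 penalty on the encoded capacity
    vector: the fit-versus-simplicity trade-off described above."""
    return sum(e * e for e in errors) + lam * sum(abs(v) for v in u)

mu = {frozenset({"a"}): 0.5, frozenset({"b"}): 0.5,
      frozenset({"a", "b"}): 1.0}
u = capacity_vector(mu, ["a", "b"])
print(u)                                   # [0.5, 0.5, 1.0]
print(regularized_loss(u, [0.1, -0.2], 0.5))
```

The vector has 2^n - 1 entries, which is why regularization matters: the number of free capacity parameters grows exponentially with the number of fused sources.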

    Book Reviews


    Data-informed fuzzy measures for fuzzy integration of intervals and fuzzy numbers

    The fuzzy integral (FI) with respect to a fuzzy measure (FM) is a powerful means of aggregating information. The most popular FIs are the Choquet and Sugeno, and most research focuses on these two variants. The arena of the FM is much more populated, including numerically derived FMs such as the Sugeno λ-measure and decomposable measure, expert-defined FMs, and data-informed FMs. The drawback of numerically derived and expert-defined FMs is that one must know something about the relative values of the input sources. However, there are many problems where this information is unavailable, such as crowdsourcing. This paper focuses on data-informed FMs, or those FMs that are computed by an algorithm that analyzes some property of the input data itself, gleaning the importance of each input source by the data they provide. The original instantiation of a data-informed FM is the agreement FM, which assigns high confidence to combinations of sources that numerically agree with one another. This paper extends upon our previous work in data-informed FMs by proposing the uniqueness measure and additive measure of agreement for interval-valued evidence. We then extend data-informed FMs to fuzzy number (FN)-valued inputs. We demonstrate the proposed FMs by aggregating interval and FN evidence with the Choquet and Sugeno FIs for both synthetic and real-world data.
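The Sugeno λ-measure the abstract mentions is the standard numerically derived FM: given per-source densities g_i, the unique λ > -1 satisfying 1 + λ = Π(1 + λ·g_i) makes the measure normalize to 1 on the full set. A sketch, assuming a simple bisection solver (the paper's own constructions, the uniqueness and agreement measures, are different and data-driven):

```python
def lambda_from_densities(g):
    """Solve 1 + lam = prod(1 + lam * g_i) for the unique lam != 0, lam > -1,
    by bisection. Assumes sum(g) != 1 (sum == 1 is the additive case, lam = 0)."""
    def f(lam):
        prod = 1.0
        for gi in g:
            prod *= 1.0 + lam * gi
        return prod - (1.0 + lam)
    # Root lies in (-1, 0) when sum(g) > 1, in (0, inf) when sum(g) < 1.
    lo, hi = (-1.0 + 1e-12, -1e-12) if sum(g) > 1 else (1e-12, 1e6)
    for _ in range(200):
        mid = (lo + hi) / 2
        if f(lo) * f(mid) <= 0:
            hi = mid
        else:
            lo = mid
    return (lo + hi) / 2

def lambda_measure(subset_densities, lam):
    """mu(A ∪ {i}) = mu(A) + g_i + lam * mu(A) * g_i, folded over the subset."""
    mu = 0.0
    for gi in subset_densities:
        mu = mu + gi + lam * mu * gi
    return mu

g = [0.2, 0.3, 0.4]               # densities; sum < 1, so lam > 0
lam = lambda_from_densities(g)
print(round(lambda_measure(g, lam), 6))   # ~1.0 on the full set, by construction
```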

    Computation of Choquet integral for finite sets: Notes on a ChatGPT-driven experience

    The Choquet integral, credited to Gustave Choquet in 1954, initially found its roots in decision making under uncertainty following Schmeidler's pioneering work in this field. Surprisingly, it was not until the 1990s that this integral gained recognition in the realm of multi-criteria decision aid. Nowadays, the Choquet integral boasts numerous generalizations and serves as a focal point for intensive research and development across various domains. Here we share our journey of utilizing ChatGPT as a helpful assistant to delve into the computation of the discrete Choquet integral using Mathematica. Additionally, we have demonstrated our ChatGPT experience by crafting a Beamer presentation with its assistance. The ultimate aim of this exercise is to pave the way for the application of the discrete Choquet integral in the context of N-soft sets.

    Neuro-inspired edge feature fusion using Choquet integrals

    It is known that the human visual system performs a hierarchical information process in which early vision cues (or primitives) are fused in the visual cortex to compose complex shapes and descriptors. While different aspects of the process have been extensively studied, such as lens adaptation or feature detection, other aspects, such as feature fusion, have been mostly left aside. In this work, we elaborate on the fusion of early vision primitives using generalizations of the Choquet integral and novel aggregation operators that have been extensively studied in recent years. We propose to use generalizations of the Choquet integral to sensibly fuse elementary edge cues, in an attempt to model the behaviour of neurons in the early visual cortex. Our proposal leads to a fully-framed edge detection algorithm whose performance is put to the test on state-of-the-art edge detection datasets.
    The authors gratefully acknowledge the financial support of the Spanish Ministry of Science and Technology (project PID2019-108392GB-I00, AEI/10.13039/501100011033), the Research Services of Universidad Pública de Navarra, CNPq (307781/2016-0, 301618/2019-4), FAPERGS (19/2551-0001660) and PNPD/CAPES (464880/2019-00).