
    Learning fuzzy measures for aggregation in fuzzy rule-based models

    Paper presented at the 15th International Conference on Modeling Decisions for Artificial Intelligence, MDAI 2018 (15-18 October 2018). Fuzzy measures are used to express background knowledge about the information sources. In fuzzy rule-based models, the rule confidence carries important information about the final classes and their relevance. This work proposes using fuzzy measures and integrals to combine rule confidences when making a decision. A Sugeno $\lambda$-measure and a distorted probability are used in this process. A clinical decision support system (CDSS) has been built by applying this approach to a medical dataset, and the system is then used to estimate the risk of developing diabetic retinopathy. Performance results are reported comparing the system with others in the literature. This work is supported by the URV grant 2017PFR-URV-B2-60 and by the Spanish research projects PI12/01535 and PI15/01150 (Instituto de Salud Carlos III and FEDER funds). Mr. Saleh holds a pre-doctoral grant (FI 2017) from the Catalan government and an Erasmus+ travel grant from URV. Prof. Bustince acknowledges the support of Spanish project TIN2016-77356-P.
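As a concrete illustration of the aggregation step this abstract describes, the sketch below builds a Sugeno $\lambda$-measure from per-source densities and uses it in a discrete Choquet integral to combine rule confidences. This is a minimal, self-contained Python sketch, not the authors' implementation; the densities and confidences are made-up values.

```python
def sugeno_lambda(densities):
    """Solve 1 + lam = prod_i (1 + lam * g_i) for the unique root
    lam > -1, lam != 0 (lam = 0 when the densities already sum to 1)."""
    s = sum(densities)
    if abs(s - 1.0) < 1e-12:
        return 0.0  # additive case
    def f(lam):
        p = 1.0
        for g in densities:
            p *= 1.0 + lam * g
        return p - (1.0 + lam)
    if s < 1.0:               # root is positive
        lo, hi = 1e-9, 1.0
        while f(hi) < 0.0:    # expand until the root is bracketed
            hi *= 2.0
    else:                     # root lies in (-1, 0)
        lo, hi = -1.0 + 1e-9, -1e-9
    for _ in range(200):      # plain bisection
        mid = 0.5 * (lo + hi)
        if (f(lo) < 0.0) == (f(mid) < 0.0):
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

def choquet_sugeno(confidences, densities):
    """Choquet integral of rule confidences w.r.t. the Sugeno
    lambda-measure induced by the per-source densities."""
    lam = sugeno_lambda(densities)
    order = sorted(range(len(confidences)), key=lambda i: -confidences[i])
    g_prev, total = 0.0, 0.0
    for i in order:
        # g(A u {i}) = g(A) + g_i + lam * g(A) * g_i
        g = g_prev + densities[i] + lam * g_prev * densities[i]
        total += confidences[i] * (g - g_prev)
        g_prev = g
    return total
```

When the densities sum to less than 1, the root $\lambda$ is positive (a superadditive, synergistic measure); when they sum to more than 1, it lies in (-1, 0).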

    Enabling Explainable Fusion in Deep Learning with Fuzzy Integral Neural Networks

    Information fusion is an essential part of numerous engineering systems and biological functions, e.g., human cognition. Fusion occurs at many levels, ranging from the low-level combination of signals to the high-level aggregation of heterogeneous decision-making processes. While the last decade has witnessed an explosion of research in deep learning, fusion in neural networks has not seen the same revolution. Specifically, most neural fusion approaches are ad hoc, poorly understood, distributed rather than localized, and/or offer little explainability, if any at all. Herein, we prove that the fuzzy Choquet integral (ChI), a powerful nonlinear aggregation function, can be represented as a multi-layer network, referred to hereafter as ChIMP. We also put forth an improved ChIMP (iChIMP) that enables stochastic gradient descent-based optimization in light of the exponential number of ChI inequality constraints. An additional benefit of ChIMP/iChIMP is that it enables eXplainable AI (XAI). Synthetic validation experiments are provided, and iChIMP is applied to the fusion of a set of heterogeneous deep architectures in remote sensing. We show an improvement in model accuracy, and our previously established XAI indices shed light on the quality of our data, model, and its decisions. Comment: IEEE Transactions on Fuzzy Systems
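The ChI this abstract refers to is the discrete Choquet integral. A minimal Python sketch of the aggregation it performs (independent of the network parameterization in the paper; measure values here are illustrative) is:

```python
def choquet_integral(x, mu):
    """Discrete Choquet integral of inputs x with respect to a fuzzy
    measure mu, given as a dict mapping frozensets of indices to [0, 1]."""
    order = sorted(range(len(x)), key=lambda i: x[i], reverse=True)
    total, prev, subset = 0.0, 0.0, set()
    for i in order:                  # walk inputs from largest to smallest
        subset.add(i)
        m = mu[frozenset(subset)]
        total += x[i] * (m - prev)   # weight input by its marginal measure
        prev = m
    return total
```

An additive mu recovers the weighted mean, and mu(A) = 1 for every nonempty A recovers the maximum. The monotonicity constraints mu(A) <= mu(B) for A a subset of B are the exponentially many inequalities the iChIMP optimization mentioned above has to respect.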

    Fuzzy linear assignment problem: an approach to vehicle fleet deployment

    This paper proposes and examines a new fuzzy-logic approach to vehicle fleet deployment. Fleet deployment is viewed as a fuzzy linear assignment problem: each travel request is assigned to an available service vehicle by solving a linear assignment matrix of defuzzified cost entries. Each cost entry indicates the cost value of a travel request that "fuzzily aggregates" multiple criteria in simple rules incorporating human dispatching expertise. The approach is examined via extensive simulations anchored in a representative taxi deployment scenario, and compared to the conventional case of using only distances (from the taxi position to the source point and then to the destination point of a travel request) as cost entries. A discussion in the context of related work examines the performance and practicality of the proposed approach.
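To make the assignment step concrete: once each (vehicle, request) pair has a single defuzzified cost, dispatch reduces to a standard linear assignment problem. The brute-force Python sketch and the cost matrix below are illustrative, not taken from the paper; real systems would use the Hungarian algorithm instead of enumerating permutations.

```python
from itertools import permutations

def assign(cost):
    """Brute-force linear assignment. cost[v][r] is the defuzzified cost
    of sending vehicle v to request r; returns (request per vehicle,
    total cost)."""
    n = len(cost)
    best_cost, best_perm = float("inf"), None
    for perm in permutations(range(n)):   # O(n!), fine for small fleets
        c = sum(cost[v][perm[v]] for v in range(n))
        if c < best_cost:
            best_cost, best_perm = c, perm
    return list(best_perm), best_cost
```

For example, assign([[4, 1, 3], [2, 0, 5], [3, 2, 2]]) pairs vehicle 0 with request 1, vehicle 1 with request 0, and vehicle 2 with request 2, at total cost 5.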

    Evolving Large-Scale Data Stream Analytics based on Scalable PANFIS

    Many distributed machine learning frameworks have recently been built to speed up large-scale data learning. However, most distributed machine learning used in these frameworks still relies on offline algorithms, which cannot cope with data stream problems. In fact, large-scale data are mostly generated by non-stationary data streams whose patterns evolve over time. To address this problem, we propose a novel Evolving Large-scale Data Stream Analytics framework based on a Scalable Parsimonious Network based on Fuzzy Inference System (Scalable PANFIS), where the PANFIS evolving algorithm is distributed over the worker nodes in the cloud to learn large-scale data streams. The Scalable PANFIS framework incorporates an active learning (AL) strategy and two model fusion methods. AL accelerates the distributed learning process to generate an initial evolving large-scale data stream model (initial model), whereas the two model fusion methods aggregate the initial models to generate the final model. The final model represents the updated large-scale data knowledge, which can be used to infer future data. Extensive experiments on this framework measure the accuracy and running time of four combinations of Scalable PANFIS and other built-in Spark-based algorithms. The results indicate that Scalable PANFIS with AL trains almost twice as fast as Scalable PANFIS without AL. The results also show that both the rule merging and the voting mechanisms yield similar accuracy in general among Scalable PANFIS algorithms, and that they are generally better than the Spark-based algorithms. In terms of running time, Scalable PANFIS outperforms all Spark-based algorithms when classifying numerous benchmark datasets. Comment: 20 pages, 5 figures
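Of the two model fusion methods mentioned, the voting mechanism is the simpler to sketch: each per-partition model classifies the sample, and the majority label wins. The sketch below is an illustrative stand-in, not the Scalable PANFIS implementation.

```python
from collections import Counter

def vote_fusion(models, x):
    """Fuse per-partition classifiers by majority vote: every model
    labels x, and the most common label is returned."""
    votes = [model(x) for model in models]
    return Counter(votes).most_common(1)[0][0]
```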

    Synergy Modelling and Financial Valuation: the contribution of Fuzzy Integrals

    Financial valuation methods use aggregation operators that rely on additivity (summation, Lebesgue integrals), and so they ignore the reinforcement and synergy (or redundancy) effects that may exist among the elements of an organized set such as a company's patrimony. In practice, a substantial gap is often observed between the "value of the sum of the elements" approach (the financial point of view) and the "sum of the values of the elements" approach (the accounting point of view). This paper considers the application of fuzzy measures and fuzzy integrals (Sugeno, Grabisch, Choquet) to financial valuation: integration with respect to a non-additive measure can, in theory, model positive or negative synergy in value construction. The study then empirically validates an operational implementation of this model on a sample of listed companies valued during takeover bids. Keywords: fuzzy measure; fuzzy integral; aggregation operator; synergy; financial valuation.
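The "value of the sum vs. sum of the values" gap can be reproduced with a Choquet integral over a non-additive (here superadditive) measure. In the illustrative Python sketch below, two assets are worth 50 each standalone but 120 together, and the integral is taken over hypothetical utilization degrees; all numbers are made up.

```python
def choquet_value(utilization, mu):
    """Choquet aggregate of asset utilization degrees with respect to a
    (possibly non-additive) value measure mu over frozensets of asset
    indices."""
    order = sorted(range(len(utilization)),
                   key=lambda i: utilization[i], reverse=True)
    total, prev, held = 0.0, 0.0, set()
    for i in order:
        held.add(i)                        # grow the set of held assets
        total += utilization[i] * (mu[frozenset(held)] - prev)
        prev = mu[frozenset(held)]
    return total
```

With mu({0}) = mu({1}) = 50 and mu({0, 1}) = 120, full utilization [1.0, 1.0] yields 120 (the whole exceeds the 100 that additivity would give), while partial utilization [1.0, 0.5] yields 85 versus the additive sum-of-parts 75.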
