    Advances and Applications of Dezert-Smarandache Theory (DSmT) for Information Fusion (Collected works), Vol. 2

    This second volume dedicated to Dezert-Smarandache Theory (DSmT) in information fusion introduces new quantitative fusion rules (such as PCR1-6, where PCR5 for two sources performs the most mathematically exact redistribution of conflicting masses to the non-empty sets in the fusion literature), qualitative fusion rules, and the Belief Conditioning Rule (BCR), which differs from the classical conditioning rule used by the fusion community working with the Mathematical Theory of Evidence. Other fusion rules are constructed from T-norms and T-conorms (hence using fuzzy logic and fuzzy sets in information fusion), or, more generally, from N-norms and N-conorms (hence using neutrosophic logic and neutrosophic sets in information fusion), together with an attempt to unify the fusion rules and fusion theories. The known fusion rules are extended from the power set to the hyper-power set, and comparisons between rules are made on many examples. The degrees of intersection, union, and inclusion of two sets are defined, all of which help improve the existing fusion rules as well as the credibility, plausibility, and commonality functions. The book chapters are written by Frederic Dambreville, Milan Daniel, Jean Dezert, Pascal Djiknavorian, Dominic Grenier, Xinhan Huang, Pavlina Dimitrova Konstantinova, Xinde Li, Arnaud Martin, Christophe Osswald, Andrew Schumann, Tzvetan Atanasov Semerdjiev, Florentin Smarandache, Albena Tchamova, and Min Wang.
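    The PCR5 rule mentioned in the abstract combines two mass functions conjunctively, then redistributes each pairwise conflicting product m1(X)·m2(Y) (with X ∩ Y = ∅) back to X and Y in proportion to m1(X) and m2(Y). A minimal sketch of the two-source case, assuming focal elements are represented as frozensets over a simple frame (the function name and data layout here are illustrative, not from the book):

    ```python
    from itertools import product

    def pcr5(m1, m2):
        """Combine two mass functions m1, m2 (dicts: frozenset -> mass) with PCR5.

        Non-conflicting products go to the intersection (conjunctive rule);
        each conflicting product a*b is split back to the two focal elements
        that caused it, proportionally to a and b.
        """
        out = {}
        for (X, a), (Y, b) in product(m1.items(), m2.items()):
            inter = X & Y
            if inter:
                # Conjunctive part: mass goes to the intersection.
                out[inter] = out.get(inter, 0.0) + a * b
            elif a + b > 0:
                # Conflict X ∩ Y = ∅: redistribute a*b proportionally.
                out[X] = out.get(X, 0.0) + a * a * b / (a + b)
                out[Y] = out.get(Y, 0.0) + b * a * b / (a + b)
        return out

    # Example: two sources over the frame {A, B}.
    A, B = frozenset('A'), frozenset('B')
    m1 = {A: 0.6, B: 0.4}
    m2 = {A: 0.3, B: 0.7}
    fused = pcr5(m1, m2)
    ```

    Because all conflicting mass is returned to the focal elements involved, the fused masses still sum to 1 without any global normalization step, which is the property the abstract highlights.
    
    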

    On the Methodological and Normative Foundations of Probabilism

    This dissertation is an elaboration and defense of probabilism, the view that belief comes in various degrees of strength, and that the probability calculus provides coherence norms for these degrees of belief. Probabilism faces several well-known objections. For example, critics object that probabilism's numerical representation of degrees of belief is too precise, and that its coherence norms are too demanding for real human agents to follow. While probabilists have developed several plausible responses to these objections, the compatibility among these responses is unclear. On this basis, I argue that probabilists must articulate unified methodological and normative foundations for their view, and I sketch the foundations of a probabilist modeling framework, the Comparative Confidence Framework (CCF). CCF characterizes probabilism primarily as an account of ideal degree of belief coherence. CCF provides a set of fundamentally qualitative and comparative (rather than quantitative) evaluative ideals for degree of belief coherence. By providing qualitative, comparative, evaluative coherence norms for degrees of belief, CCF avoids the aforementioned objections: that probabilism's formal representation of degrees of belief is too precise, and that its norms are too demanding. CCF is a first step in the development of unified foundations for a wider subjectivist Bayesian theory of doxastic and pragmatic rationality.

    Axiomatization of qualitative belief structure
