
    Reasoning with random sets: An agenda for the future

    In this paper, we discuss a potential agenda for future work in the theory of random sets and belief functions, touching upon a number of focal issues: the development of a fully-fledged theory of statistical reasoning with random sets, including the generalisation of logistic regression and of the classical laws of probability; the further development of the geometric approach to uncertainty, to include general random sets, a wider range of uncertainty measures and alternative geometric representations; and the application of this new theory to high-impact areas such as climate change, machine learning and statistical learning theory. (Comment: 94 pages, 17 figures)
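    As a minimal illustration of the belief-function formalism this agenda builds on (not code from the paper; the frame and masses below are made up), the sketch computes belief and plausibility from a mass assignment over a small frame.

```python
def belief_and_plausibility(mass, event):
    """Given a mass assignment m: focal sets (frozensets) -> [0, 1] summing to 1,
    return Bel(A) = sum of m(B) over B a subset of A and
    Pl(A) = sum of m(B) over B intersecting A."""
    bel = sum(w for B, w in mass.items() if B <= event)
    pl = sum(w for B, w in mass.items() if B & event)
    return bel, pl

# Hypothetical evidence over the frame {a, b, c}.
m = {frozenset('a'): 0.5, frozenset('ab'): 0.3, frozenset('abc'): 0.2}
print(belief_and_plausibility(m, frozenset('ab')))   # (0.8, 1.0)
```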

    Contribution on some construction methods for aggregation functions

    In this paper, based on [14], we present some well-established construction methods for aggregation functions, as well as some new ones.
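    One classical construction method of this kind (a generic illustration, not taken from the paper itself) builds a quasi-arithmetic mean from a continuous strictly monotone generator phi; a minimal sketch:

```python
import math

def quasi_arithmetic_mean(phi, phi_inv):
    """Construct an aggregation function from a generator phi:
    M(x_1, ..., x_n) = phi^{-1}( (1/n) * sum_i phi(x_i) )."""
    def aggregate(values):
        return phi_inv(sum(phi(v) for v in values) / len(values))
    return aggregate

# phi = log yields the geometric mean; phi = identity yields the arithmetic mean.
geometric_mean = quasi_arithmetic_mean(math.log, math.exp)
print(geometric_mean([0.2, 0.5, 0.8]))   # about 0.43
```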

    On the nature and decay of quantum relative entropy

    Historically at the core of thermodynamics and information theory, entropy's use in quantum information extends to diverse topics including high-energy physics and operator algebras. Entropy can gauge the extent to which a quantum system departs from classicality, including by measuring entanglement and coherence, and in the form of entropic uncertainty relations between incompatible measurements. The theme of this dissertation is the quantum nature of entropy, and how exposure to a noisy environment limits and degrades non-classical features. An especially useful and general form of entropy is the quantum relative entropy, of which special cases include the von Neumann and Shannon entropies, coherent and mutual information, and a broad range of resource-theoretic measures. We use mathematical results on relative entropy to connect and unify features that distinguish quantum from classical information. We present generalizations of the strong subadditivity inequality and uncertainty-like entropy inequalities to subalgebras of operators on quantum systems for which usual independence assumptions fail. We construct new measures of non-classicality that simultaneously quantify entanglement and uncertainty, leading to a new resource theory of operations under which these forms of non-classicality become interchangeable. Physically, our results deepen our understanding of how quantum entanglement relates to quantum uncertainty. We show how properties of entanglement limit the advantages of quantum superadditivity for information transmission through channels with high but detectable loss. Our method, based on the monogamy and faithfulness of the squashed entanglement, suggests a broader paradigm for bounding non-classical effects in lossy processes. We also propose an experiment to demonstrate superadditivity. Finally, we estimate decay rates in the form of modified logarithmic Sobolev inequalities for a variety of quantum channels, and in many cases we obtain the stronger, tensor-stable form known as a complete logarithmic Sobolev inequality. We compare these with our earlier results that bound the relative entropy of the outputs of a particular class of quantum channels.
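    For concreteness (this is not code from the dissertation), the quantum relative entropy referred to above is D(rho||sigma) = Tr[rho (log rho - log sigma)]; a minimal numerical sketch, assuming supp(rho) is contained in supp(sigma) so the value is finite:

```python
import numpy as np

def _safe_log(p):
    # Treat zero eigenvalues as contributing 0 (valid under the support assumption).
    return np.log(np.where(p > 0, p, 1.0))

def relative_entropy(rho, sigma):
    """D(rho||sigma) = Tr[rho log rho] - Tr[rho log sigma], in nats.
    Assumes supp(rho) lies inside supp(sigma); otherwise D is infinite."""
    pr, _ = np.linalg.eigh(rho)
    ps, Us = np.linalg.eigh(sigma)
    term1 = np.sum(pr * _safe_log(pr))                          # Tr[rho log rho]
    overlaps = np.real(np.einsum('ij,ik,kj->j', Us.conj(), rho, Us))
    term2 = np.sum(overlaps * _safe_log(ps))                    # Tr[rho log sigma]
    return term1 - term2

# Pure state vs. maximally mixed qubit: D = log 2, about 0.693 nats.
rho = np.array([[1.0, 0.0], [0.0, 0.0]])
sigma = np.eye(2) / 2
print(relative_entropy(rho, sigma))
```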

    ISIPTA'07: Proceedings of the Fifth International Symposium on Imprecise Probability: Theories and Applications


    Fuzzy Logic

    Fuzzy logic is becoming an essential method for solving problems in all domains, and it has a tremendous impact on the design of autonomous intelligent systems. The purpose of this book is to introduce hybrid algorithms, techniques, and implementations of fuzzy logic. The book consists of thirteen chapters highlighting models and principles of fuzzy logic and issues concerning its techniques and implementations. The intended readers of this book are engineers, researchers, and graduate students interested in fuzzy logic systems.
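    As a minimal, made-up illustration of the building blocks such a book covers (not an example taken from the book itself): a triangular membership function combined with the standard Zadeh operators.

```python
def tri(x, a, b, c):
    """Triangular membership function: 0 outside [a, c], peaking at 1 when x = b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Standard Zadeh operators: AND = min (a t-norm), NOT = 1 - mu.
def fuzzy_and(mu1, mu2): return min(mu1, mu2)
def fuzzy_not(mu): return 1.0 - mu

# Hypothetical fuzzy sets "warm" and "hot" over temperature in degrees Celsius.
warm = tri(23.0, 15.0, 22.0, 30.0)
hot = tri(23.0, 25.0, 35.0, 45.0)
print(fuzzy_and(warm, fuzzy_not(hot)))   # degree to which 23 degrees is "warm and not hot"
```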

    Fitting aggregation operators to data

    Theoretical advances in modelling the aggregation of information have produced a wide range of aggregation operators, applicable to almost every practical problem. The most important classes of aggregation operators include triangular norms, uninorms, generalised means and OWA operators. With such a variety, an important practical problem has emerged: how to fit the parameters/weights of these families of aggregation operators to observed data, and how to estimate quantitatively whether a given class of operators is suitable as a model in a given practical setting? Aggregation operators are rather special classes of functions, and thus they require specialised regression techniques which enforce important theoretical properties, like commutativity or associativity. My presentation will address this issue in detail and will discuss various regression methods applicable specifically to t-norms, uninorms and generalised means. I will also demonstrate software implementing these regression techniques, which allows practitioners to paste their data and obtain optimal parameters of the chosen family of operators.
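    As a generic illustration of this kind of constrained regression (the abstract does not specify the method or software, so the operator family, loss and solver below are assumptions): fitting OWA weights by least squares under the simplex constraints w_i >= 0 and sum_i w_i = 1.

```python
import numpy as np
from scipy.optimize import minimize

def fit_owa_weights(X, y):
    """Least-squares fit of OWA weights: OWA_w(x) = sum_i w_i * x_(i),
    with x_(1) >= ... >= x_(n), subject to w_i >= 0 and sum_i w_i = 1."""
    Xs = -np.sort(-X, axis=1)                       # sort each sample in decreasing order
    n = X.shape[1]
    loss = lambda w: np.sum((Xs @ w - y) ** 2)
    res = minimize(loss, np.full(n, 1.0 / n),
                   bounds=[(0.0, 1.0)] * n,
                   constraints=[{'type': 'eq', 'fun': lambda w: np.sum(w) - 1.0}])
    return res.x

# Synthetic check: recover known weights from noiseless observations.
rng = np.random.default_rng(0)
X = rng.random((200, 3))
y = -np.sort(-X, axis=1) @ np.array([0.6, 0.3, 0.1])
print(fit_owa_weights(X, y))   # should be close to [0.6, 0.3, 0.1]
```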

    Advances and Applications of Dezert-Smarandache Theory (DSmT) for Information Fusion (Collected works), Vol. 2

    This second volume dedicated to Dezert-Smarandache Theory (DSmT) in information fusion brings in new quantitative fusion rules (such as PCR1-6, where PCR5 for two sources performs the most mathematically exact redistribution of conflicting masses to the non-empty sets in the fusion literature), qualitative fusion rules, and the Belief Conditioning Rule (BCR), which differs from the classical conditioning rule used by the fusion community working with the Mathematical Theory of Evidence. Other fusion rules are constructed based on T-norms and T-conorms (hence using fuzzy logic and fuzzy sets in information fusion), or more generally on N-norms and N-conorms (hence using neutrosophic logic and neutrosophic sets in information fusion), together with an attempt to unify the fusion rules and fusion theories. The known fusion rules are extended from the power set to the hyper-power set, and comparisons between rules are made on many examples. One defines the degree of intersection of two sets, the degree of union of two sets, and the degree of inclusion of two sets, all of which help improve the existing fusion rules as well as the credibility, plausibility, and commonality functions. The book chapters are written by Frederic Dambreville, Milan Daniel, Jean Dezert, Pascal Djiknavorian, Dominic Grenier, Xinhan Huang, Pavlina Dimitrova Konstantinova, Xinde Li, Arnaud Martin, Christophe Osswald, Andrew Schumann, Tzvetan Atanasov Semerdjiev, Florentin Smarandache, Albena Tchamova, and Min Wang.
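    A minimal sketch of the two-source PCR5 rule highlighted above (written from the standard published formula, not taken from the book's own material; the frame and masses in the example are made up): the conjunctive combination is computed first, then each partial conflict m1(X)*m2(Y) with X and Y disjoint is redistributed back to X and Y in proportion to m1(X) and m2(Y).

```python
from itertools import product

def conjunctive(m1, m2):
    """Conjunctive rule: accumulate m1(A)*m2(B) on each intersection A & B."""
    out = {}
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        out[a & b] = out.get(a & b, 0.0) + wa * wb
    return out

def pcr5(m1, m2):
    """PCR5 for two sources on the power set (Shafer model)."""
    result = {x: w for x, w in conjunctive(m1, m2).items() if x}   # drop the empty set
    for (x, wx), (y, wy) in product(m1.items(), m2.items()):
        if x & y or wx + wy == 0.0:
            continue                                               # not a partial conflict
        result[x] = result.get(x, 0.0) + wx * wx * wy / (wx + wy)  # X's proportional share
        result[y] = result.get(y, 0.0) + wy * wy * wx / (wx + wy)  # Y's proportional share
    return result

# Two hypothetical sources over the frame {A, B}.
m1 = {frozenset('A'): 0.6, frozenset('AB'): 0.4}
m2 = {frozenset('B'): 0.7, frozenset('AB'): 0.3}
print(pcr5(m1, m2))   # masses still sum to 1; the 0.42 conflict is split between A and B
```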