
    Characterizations of discrete Sugeno integrals as polynomial functions over distributive lattices

    We give several characterizations of discrete Sugeno integrals over bounded distributive lattices as particular cases of lattice polynomial functions, that is, functions which can be represented in the language of bounded lattices using variables and constants. We also consider the subclass of term functions as well as the classes of symmetric polynomial functions and weighted minimum and maximum functions, and present their characterizations accordingly. Moreover, we discuss normal form representations of these functions.
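    The discrete Sugeno integral underlying these characterizations has a short max-min formula, so a brute-force illustration is easy to write. The sketch below is our own (the names `sugeno_integral` and the toy uniform capacity are made up for illustration) and is not the paper's characterization machinery.

```python
from itertools import combinations

def sugeno_integral(values, capacity):
    """Discrete Sugeno integral of `values` (element -> score in [0, 1])
    w.r.t. `capacity` (frozenset -> weight in [0, 1], monotone, with
    capacity[frozenset()] == 0 and capacity of the full set == 1), using
    S(f) = max over nonempty A of min(min_{i in A} f(i), capacity(A))."""
    elements = list(values)
    best = 0.0
    for r in range(1, len(elements) + 1):
        for subset in combinations(elements, r):
            a = frozenset(subset)
            best = max(best, min(min(values[i] for i in a), capacity[a]))
    return best

# Toy example: three criteria with the uniform capacity mu(A) = |A| / 3.
X = ["x1", "x2", "x3"]
capacity = {frozenset(): 0.0}
for r in range(1, 4):
    for s in combinations(X, r):
        capacity[frozenset(s)] = r / 3
print(sugeno_integral({"x1": 0.2, "x2": 0.9, "x3": 0.5}, capacity))  # 0.5
```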

    A temporal semantics for Nilpotent Minimum logic

    In [Ban97] a connection between rough sets (in particular, pre-rough algebras) and three-valued Łukasiewicz logic Ł3 is pointed out. In this paper we present a temporal-like semantics for Nilpotent Minimum logic NM ([Fod95, EG01]), in which the logic of every instant is given by Ł3: a completeness theorem is shown. This continues the work initiated in [AGM08] and [ABM09], in which the authors construct a temporal semantics for the many-valued logics of Gödel ([Göd32], [Dum59]) and Basic Logic ([Háj98]). Comment: 19 pages, 2 tables.
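    For reference, the truth functions of Nilpotent Minimum logic (Fodor's nilpotent minimum t-norm, its residuum, and the involutive negation) are easy to write down; the snippet below is our own illustration, not part of the paper. Restricted to the three truth values {0, 1/2, 1}, these connectives give the tables of Ł3, the per-instant logic used in the semantics.

```python
def nm_conj(x, y):
    """Fodor's nilpotent minimum t-norm on [0, 1]."""
    return 0.0 if x + y <= 1 else min(x, y)

def nm_impl(x, y):
    """Residuum of the nilpotent minimum t-norm."""
    return 1.0 if x <= y else max(1 - x, y)

def nm_neg(x):
    """Involutive negation x -> 1 - x."""
    return 1 - x

# Restricted to {0, 1/2, 1}, the tables coincide with three-valued Lukasiewicz logic.
for x in (0.0, 0.5, 1.0):
    for y in (0.0, 0.5, 1.0):
        print(x, y, nm_conj(x, y), nm_impl(x, y), nm_neg(x))
```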

    Representation of maxitive measures: an overview

    Idempotent integration is an analogue of Lebesgue integration where σ-maxitive measures replace σ-additive measures. In addition to reviewing and unifying several Radon-Nikodym-like theorems proven in the literature for the idempotent integral, we also prove new results of the same kind. Comment: 40 pages.
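    On a finite space, the density representation behind such Radon-Nikodym-type results can be made completely concrete: a maxitive measure is determined by a density on points, and the idempotent (Shilkret-type) integral reduces to a supremum of products. The snippet below is a toy illustration with names of our own choosing, not the paper's general framework.

```python
def maxitive_measure(A, density):
    """nu(A) = max over x in A of density[x]; nu(empty set) = 0."""
    return max((density[x] for x in A), default=0.0)

def shilkret_integral(f, density):
    """Idempotent (Shilkret-type) integral on a finite space:
    sup_t t * nu({f >= t}) = max_x f(x) * density[x]."""
    return max(f[x] * density[x] for x in density)

density = {"a": 1.0, "b": 0.4, "c": 0.7}    # density of the maxitive measure nu
f = {"a": 0.2, "b": 0.9, "c": 0.5}          # function to integrate
print(maxitive_measure({"b", "c"}, density))  # 0.7
print(shilkret_integral(f, density))          # max(0.2, 0.36, 0.35) = 0.36
```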

    How much of commonsense and legal reasoning is formalizable? A review of conceptual obstacles

    Fifty years of effort in artificial intelligence (AI) and the formalization of legal reasoning have produced both successes and failures. Considerable success in organizing and displaying evidence and its interrelationships has been accompanied by failure to achieve the original ambition of AI as applied to law: fully automated legal decision-making. The obstacles to formalizing legal reasoning have proved to be the same ones that make the formalization of commonsense reasoning so difficult, and are most evident where legal reasoning has to meld with the vast web of ordinary human knowledge of the world. Underlying many of the problems is the mismatch between the discreteness of symbol manipulation and the continuous nature of imprecise natural language, of degrees of similarity and analogy, and of probabilities.

    Typicality, graded membership, and vagueness

    This paper addresses theoretical problems arising from the vagueness of language terms, and intuitions of the vagueness of the concepts to which they refer. It is argued that the central intuitions of prototype theory are sufficient to account for both typicality phenomena and psychological intuitions about degrees of membership in vaguely defined classes. The first section explains the importance of the relation between degrees of membership and typicality (or goodness of example) in conceptual categorization. The second and third sections address arguments advanced by Osherson and Smith (1997), and Kamp and Partee (1995), that the two notions of degree of membership and typicality must relate to fundamentally different aspects of conceptual representations. A version of prototype theory, the Threshold Model, is proposed to counter these arguments, and three possible solutions to the problems of logical self-contradiction and tautology for vague categorizations are outlined. In the final section, graded membership is related to the social construction of conceptual boundaries maintained through language use.
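    As a purely generic illustration of graded membership versus all-or-none categorization (our own sketch, deliberately simplified and not the paper's Threshold Model), one can compute a degree of membership from shared prototype features and then impose a threshold to recover a crisp category.

```python
def graded_membership(item, prototype):
    """Toy degree of membership in [0, 1]: proportion of prototype features shared.
    (Typicality and membership are collapsed into one number here for brevity.)"""
    return len(item & prototype) / len(prototype)

def crisp_member(item, prototype, threshold=0.5):
    """Threshold the graded value to get an all-or-none categorization."""
    return graded_membership(item, prototype) >= threshold

bird_prototype = {"flies", "feathers", "lays_eggs", "sings"}
robin = {"flies", "feathers", "lays_eggs", "sings"}
penguin = {"feathers", "lays_eggs", "swims"}
print(graded_membership(robin, bird_prototype))    # 1.0
print(graded_membership(penguin, bird_prototype))  # 0.5
print(crisp_member(penguin, bird_prototype))       # True at threshold 0.5
```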

    A kernel-based framework for learning graded relations from data

    Driven by a large number of potential applications in areas like bioinformatics, information retrieval and social network analysis, the problem setting of inferring relations between pairs of data objects has recently been investigated quite intensively in the machine learning community. To this end, current approaches typically consider datasets containing crisp relations, so that standard classification methods can be adopted. However, relations between objects like similarities and preferences are often expressed in a graded manner in real-world applications. A general kernel-based framework for learning relations from data is introduced here. It extends existing approaches because both crisp and graded relations are considered, and it unifies existing approaches because different types of graded relations can be modeled, including symmetric and reciprocal relations. This framework establishes important links between recent developments in fuzzy set theory and machine learning. Its usefulness is demonstrated through various experiments on synthetic and real-world data. Comment: This work has been submitted to the IEEE for possible publication. Copyright may be transferred without notice, after which this version may no longer be accessible.
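    A minimal version of the general idea, a pairwise kernel over object pairs combined with kernel ridge regression on graded labels, can be sketched as follows. This is an assumption-laden illustration in the spirit of the abstract (the RBF kernel, the symmetrization, and all names are our choices), not the authors' framework.

```python
import numpy as np

def rbf(a, b, gamma=1.0):
    """Gaussian kernel between two feature vectors."""
    return np.exp(-gamma * np.sum((a - b) ** 2))

def pairwise_kernel(p, q, symmetric=True):
    """Kronecker-style kernel between object pairs p = (u, v) and q = (u2, v2).
    Symmetrizing makes the learned relation invariant under swapping u and v."""
    (u, v), (u2, v2) = p, q
    k = rbf(u, u2) * rbf(v, v2)
    if symmetric:
        k = 0.5 * (k + rbf(u, v2) * rbf(v, u2))
    return k

rng = np.random.default_rng(0)
objects = rng.normal(size=(6, 3))                       # 6 objects, 3 features each
pairs = [(objects[i], objects[j]) for i in range(6) for j in range(6)]
# Graded training labels in [0, 1]; here a synthetic similarity relation.
y = np.array([np.exp(-np.sum((u - v) ** 2)) for u, v in pairs])

K = np.array([[pairwise_kernel(p, q) for q in pairs] for p in pairs])
alpha = np.linalg.solve(K + 0.1 * np.eye(len(pairs)), y)  # kernel ridge regression

def predict(u, v):
    """Predicted degree of the relation between objects u and v."""
    return float(sum(a * pairwise_kernel((u, v), q) for a, q in zip(alpha, pairs)))

print(predict(objects[0], objects[1]))
```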

    Berezin-Toeplitz quantization for compact Kaehler manifolds. A Review of Results

    This article is a review of the Berezin-Toeplitz operator quantization and the Berezin-Toeplitz deformation quantization for compact quantizable Kaehler manifolds. The basic objects, concepts, and results are given. This concerns the correct semi-classical limit behaviour of the operator quantization, the unique Berezin-Toeplitz deformation quantization (star product), covariant and contravariant Berezin symbols, and the Berezin transform. Other related objects and constructions are also discussed. Comment: 32 pages.
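    For orientation, the operator quantization and the semi-classical estimates referred to above are usually stated in the following form (standard notation, reconstructed here rather than copied from the article):

```latex
% Level-m Berezin-Toeplitz operator of f \in C^\infty(M), acting on the space of
% global holomorphic sections of the m-th tensor power of the quantum line bundle L:
\[
  T^{(m)}_f \;=\; \Pi^{(m)} \, M_f \, \Pi^{(m)} :\;
  \Gamma_{\mathrm{hol}}(M, L^{\otimes m}) \longrightarrow \Gamma_{\mathrm{hol}}(M, L^{\otimes m}).
\]
% Correct semi-classical limit behaviour as m \to \infty:
\[
  \lim_{m\to\infty} \bigl\| T^{(m)}_f \bigr\| = |f|_\infty, \qquad
  \bigl\| m\,\mathrm{i}\,[\,T^{(m)}_f, T^{(m)}_g\,] - T^{(m)}_{\{f,g\}} \bigr\| = O(1/m), \qquad
  \bigl\| T^{(m)}_f \, T^{(m)}_g - T^{(m)}_{f g} \bigr\| = O(1/m).
\]
```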