
    The Discursive Dilemma and Probabilistic Judgement Aggregation

    Let S be a set of logically related propositions, and suppose a jury must decide the truth/falsehood of each member of S. A `judgement aggregation rule' (JAR) is a rule for combining the truth valuations on S from each juror into a collective truth valuation on S. Recent work has shown that there is no reasonable JAR which always yields a logically consistent collective truth valuation; this is referred to as the `Doctrinal Paradox' or the `Discursive Dilemma'. In this paper we will consider JARs which aggregate the subjective probability estimates of the jurors (rather than Boolean truth valuations) to produce a collective probability estimate for each proposition in S. We find that to properly aggregate these probability estimates, the JAR must also utilize information about the private information from which each juror generates her own probability estimate.
    Keywords: discursive dilemma; doctrinal paradox; judgement aggregation; statistical opinion pool; interactive epistemology; common knowledge; epistemic democracy; deliberative democracy
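    The Boolean version of the dilemma referred to above can be reproduced with the classic three-juror example. The sketch below (the juror valuations are illustrative, not taken from the paper) shows propositionwise majority voting producing an inconsistent collective valuation:

```python
# The classic three-juror profile behind the Doctrinal Paradox.
# Juror valuations are illustrative, not taken from the paper.

AGENDA = ["p", "q", "p_and_q"]

jurors = [
    {"p": True,  "q": True,  "p_and_q": True},   # accepts p, q, and p AND q
    {"p": True,  "q": False, "p_and_q": False},
    {"p": False, "q": True,  "p_and_q": False},
]

def majority(prop):
    """Propositionwise majority vote over all jurors."""
    return sum(j[prop] for j in jurors) > len(jurors) / 2

collective = {prop: majority(prop) for prop in AGENDA}
print(collective)   # {'p': True, 'q': True, 'p_and_q': False}

# Each individual valuation is consistent, but the collective one is not:
# it accepts p and q while rejecting their conjunction.
print(collective["p_and_q"] == (collective["p"] and collective["q"]))   # False
```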

    Toward a General Framework for Information Fusion

    Depending on the representation setting, different combination rules have been proposed for fusing information from distinct sources. Moreover, in each setting, different sets of axioms that combination rules should satisfy have been advocated, thus justifying the existence of alternative rules (usually motivated by situations where the behavior of other rules was found unsatisfactory). These sets of axioms are usually considered purely within their own settings, without in-depth analysis of the common properties essential to all the settings. This paper introduces core properties that, once properly instantiated, are meaningful in different representation settings ranging from logic to imprecise probabilities. The following representation settings are especially considered: classical set representation, possibility theory, and evidence theory, the latter encompassing the two others as special cases. This unified discussion of combination rules across different settings is expected to provide a fresh look at some old but basic issues in information fusion.
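    As one illustration of a setting-specific combination rule in evidence theory, here is a minimal sketch of Dempster's rule of combination; the frame and the mass assignments are illustrative assumptions, not taken from the paper:

```python
from itertools import product
from collections import defaultdict

def dempster_combine(m1, m2):
    """Combine two mass functions given as {frozenset_of_values: mass} dicts."""
    combined = defaultdict(float)
    conflict = 0.0
    for (a, ma), (b, mb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] += ma * mb
        else:
            conflict += ma * mb           # mass falling on the empty set
    if conflict >= 1.0:
        raise ValueError("totally conflicting sources; the rule is undefined")
    return {focal: mass / (1.0 - conflict) for focal, mass in combined.items()}

# Two sources giving evidence about a frame {x, y, z} (made-up numbers).
m1 = {frozenset({"x"}): 0.6, frozenset({"x", "y", "z"}): 0.4}
m2 = {frozenset({"y"}): 0.5, frozenset({"x", "y"}): 0.5}
print(dempster_combine(m1, m2))
# {frozenset({'x'}): 0.428..., frozenset({'y'}): 0.285..., frozenset({'x', 'y'}): 0.285...}
```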

    07351 Abstracts Collection -- Formal Models of Belief Change in Rational Agents

    From 26.08. to 30.08.2007, the Dagstuhl Seminar 07351 "Formal Models of Belief Change in Rational Agents" was held in the International Conference and Research Center (IBFI), Schloss Dagstuhl. During the seminar, several participants presented their current research, and ongoing work and open problems were discussed. Abstracts of the presentations given during the seminar as well as abstracts of seminar results and ideas are put together in this paper. The first section describes the seminar topics and goals in general. Links to extended abstracts or full papers are provided, if available.

    Introduction to Judgment Aggregation

    This paper introduces the symposium on judgment aggregation. The theory of judgment aggregation asks how several individuals' judgments on some logically connected propositions can be aggregated into consistent collective judgments. The aim of this introduction is to show how ideas from the familiar theory of preference aggregation can be extended to this more general case. We first translate a proof of Arrow's impossibility theorem into the new setting, so as to motivate some of the central concepts and conditions leading to analogous impossibilities, as discussed in the symposium. We then consider each of four possible escape routes explored in the symposium.
    Keywords: judgment aggregation, Arrow's theorem, escape routes
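    To make the tension concrete, the following sketch contrasts premise-based and conclusion-based majority voting on the standard three-judge profile; the profile and the procedure names follow common usage in this literature, not the symposium papers themselves:

```python
# Two aggregation procedures often discussed as responses to the paradox:
# premise-based versus conclusion-based majority voting (illustrative profile).

profile = [
    {"p": True,  "q": True },   # judge 1
    {"p": True,  "q": False},   # judge 2
    {"p": False, "q": True },   # judge 3
]

def majority(values):
    return sum(values) > len(values) / 2

# Premise-based procedure: vote on the premises p and q, then infer p AND q.
premise_based = majority([j["p"] for j in profile]) and \
                majority([j["q"] for j in profile])              # True

# Conclusion-based procedure: each judge evaluates p AND q, then vote on it.
conclusion_based = majority([j["p"] and j["q"] for j in profile])   # False

print(premise_based, conclusion_based)   # the two procedures disagree
```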

    Lack of Finite Characterizations for the Distance-based Revision

    Lehmann, Magidor, and Schlechta developed an approach to belief revision based on distances between any two valuations. Suppose we are given such a distance D. This defines an operator |D, called a distance operator, which transforms any two sets of valuations V and W into the set V |D W of all elements of W that are closest to V. This operator |D naturally defines the revision of K by A as the set of all formulas satisfied in M(K) |D M(A) (i.e. those models of A that are closest to the models of K). This constitutes a distance-based revision operator. Lehmann et al. characterized families of them using a loop condition of arbitrarily big size. An interesting question is whether this loop condition can be replaced by a finite one. Extending the results of Schlechta, we will provide elements of a negative answer. In fact, we will show that for families of distance operators, there is no "normal" characterization. Approximately speaking, a normal characterization contains only finite and universally quantified conditions. These results have an interest of their own, for they help to understand the limits of what is possible in this area. Now, we are quite confident that this work can be continued to show similar impossibility results for distance-based revision operators, which suggests that the big loop condition cannot be simplified.
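    A minimal sketch of the distance operator |D described above, assuming valuations are truth assignments and the distance D is the Hamming distance; both choices are illustrative assumptions, since the paper works with arbitrary distances:

```python
# V |D W: keep the elements of W at minimal distance from the set V.
# Valuations are modeled as dicts from atoms to booleans (illustrative choice).

def hamming(v, w):
    """Number of atoms on which two valuations disagree."""
    return sum(v[a] != w[a] for a in v)

def closest(V, W, d=hamming):
    """The elements of W whose distance to the set V is minimal."""
    dist = [min(d(v, w) for v in V) for w in W]
    best = min(dist)
    return [w for w, dw in zip(W, dist) if dw == best]

# Revision of K by A: models of A closest to models of K.
models_K = [{"a": True,  "b": True}]                              # M(K)
models_A = [{"a": False, "b": True}, {"a": False, "b": False}]    # M(A)
print(closest(models_K, models_A))   # [{'a': False, 'b': True}]
```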

    A partial taxonomy of judgment aggregation rules, and their properties

    The literature on judgment aggregation is moving from studying impossibility results regarding aggregation rules towards studying specific judgment aggregation rules. Here we give a structured list of most rules that have been proposed and studied recently in the literature, together with various properties of such rules. We first focus on the majority-preservation property, which generalizes Condorcet-consistency, and identify which of the rules satisfy it. We study the inclusion relationships that hold between the rules. Finally, we consider two forms of unanimity, monotonicity, homogeneity, and reinforcement, and we identify which of the rules satisfy these properties.
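    A minimal sketch of the majority-preservation property mentioned above: a rule is majority-preserving if it returns the propositionwise majority judgment set whenever that set is consistent. The agenda, profile, and brute-force consistency test below are illustrative assumptions, not taken from the paper:

```python
from itertools import product

ATOMS = ["p", "q"]
AGENDA = {
    "p":       lambda v: v["p"],
    "q":       lambda v: v["q"],
    "p_and_q": lambda v: v["p"] and v["q"],
}

profile = [
    {"p": True,  "q": True,  "p_and_q": True},
    {"p": True,  "q": True,  "p_and_q": True},
    {"p": False, "q": False, "p_and_q": False},
]

def majority_set(profile):
    """Propositionwise majority judgment set."""
    return {phi: sum(j[phi] for j in profile) > len(profile) / 2
            for phi in AGENDA}

def consistent(judgment):
    """True if some valuation of the atoms realizes every verdict in the set."""
    return any(all(AGENDA[phi]({a: x for a, x in zip(ATOMS, vals)}) == verdict
                   for phi, verdict in judgment.items())
               for vals in product([True, False], repeat=len(ATOMS)))

m = majority_set(profile)
print(m, consistent(m))
# The majority set here is consistent, so a majority-preserving rule must return it.
```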

    The Basic Principles of Uncertain Information Fusion. An organized review of merging rules in different representation frameworks

    We propose and advocate basic principles for the fusion of incomplete or uncertain information items, which should apply regardless of the formalism adopted for representing pieces of information coming from several sources. This formalism can be based on sets, logic, partial orders, possibility theory, belief functions or imprecise probabilities. We propose a general notion of information item representing incomplete or uncertain information about the values of an entity of interest. It is supposed to rank such values in terms of relative plausibility, and explicitly point out impossible values. Basic issues affecting the results of the fusion process, such as the relative information content and consistency of information items, as well as their mutual consistency, are discussed. For each representation setting, we present fusion rules that obey our principles, and compare them to postulates proposed in the past that are specific to the representation. In the crudest (Boolean) representation setting (using a set of possible values), we show that whether the set is understood in terms of most plausible values or in terms of non-impossible ones matters for choosing a relevant fusion rule. In particular, in the latter case our principles justify the method of maximal consistent subsets, while the former is related to the fusion of logical bases. Then we consider several formal settings for incomplete or uncertain information items, where our postulates are instantiated: plausibility orderings, qualitative and quantitative possibility distributions, belief functions and convex sets of probabilities. The aim of this paper is to provide a unified picture of fusion rules across various uncertainty representation settings.
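    A minimal sketch of the maximal-consistent-subsets method in the set-based setting mentioned above: each source reports the values it considers non-impossible, and fusion keeps, for every maximal group of mutually consistent sources, the intersection of that group. The source sets below are illustrative assumptions:

```python
from functools import reduce
from itertools import combinations

sources = [
    {"a", "b"},   # source 1
    {"b", "c"},   # source 2
    {"d"},        # source 3, in conflict with the other two
]

def common(group):
    """Intersection of a group of source sets."""
    return reduce(set.intersection, group)

def maximal_consistent_subsets(sources):
    """Inclusion-maximal groups of sources whose intersection is non-empty."""
    consistent = [g for r in range(len(sources), 0, -1)
                  for g in combinations(sources, r) if common(g)]
    # keep only groups not strictly contained in a larger consistent group
    return [g for g in consistent
            if not any(len(h) > len(g) and all(s in h for s in g)
                       for h in consistent)]

groups = maximal_consistent_subsets(sources)
fused = set().union(*(common(g) for g in groups))
print(groups)   # [({'a', 'b'}, {'b', 'c'}), ({'d'},)]
print(fused)    # {'b', 'd'}
```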