Finitely additive extensions of distribution functions and moment sequences: The coherent lower prevision approach
We study the information that a distribution function provides about the finitely additive probability measure inducing it. First, we show that in general infinitely many finitely additive probabilities are associated with the same distribution function. Second, we investigate the relationship between a distribution function and a given sequence of its moments. We provide formulae for the sets of distribution functions, and of finitely additive probabilities, associated with a given moment sequence, and determine under which conditions the moments determine the distribution function uniquely. We show that all these problems can be addressed efficiently using the theory of coherent lower previsions.
The Moment Problem for Finitely Additive Probabilities
We study the moment problem for finitely additive probabilities and show that the information provided by the moments is equivalent to that given by the associated lower and upper distribution functions.
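For background, in the classical (countably additive) setting Hausdorff's theorem characterises the moment sequences on [0, 1] as exactly the completely monotone sequences. The sketch below (an illustration of that classical criterion, not code from either paper) verifies it numerically for the moments m_n = 1/(n+1) of the uniform distribution:

```python
from fractions import Fraction

def forward_diff(seq):
    return [b - a for a, b in zip(seq, seq[1:])]

# Moments of the uniform distribution on [0, 1]: m_n = 1/(n+1).
moments = [Fraction(1, n + 1) for n in range(8)]

# Hausdorff's criterion: (m_n) is a moment sequence on [0, 1] iff it is
# completely monotone, i.e. (-1)^k (delta^k m)_n >= 0 for all k and n.
seq = moments
for k in range(len(moments)):
    assert all((-1) ** k * x >= 0 for x in seq), f"fails at order {k}"
    seq = forward_diff(seq)
```

Exact rational arithmetic (`Fraction`) avoids any floating-point doubt about the sign checks.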
Modelling practical certainty and its link with classical propositional logic
We model practical certainty in the language of accept & reject statement-based uncertainty models. We present three different approaches, each based on a different type of assessment: we study coherent models following from (i) favourability assessments, (ii) acceptability assessments, and (iii) indifference assessments. We argue that a statement of favourability, when used with an appropriate background model, essentially boils down to stating a belief of practical certainty using acceptability assessments. We show that the corresponding models do not form an intersection structure, in contradistinction with the coherent models following from an indifference assessment. We construct embeddings of classical propositional logic into each of our models for practical certainty.
Quantum mechanics as a theory of probability
We develop and defend the thesis that the Hilbert space formalism of quantum
mechanics is a new theory of probability. The theory, like its classical
counterpart, consists of an algebra of events, and the probability measures
defined on it. The construction proceeds in the following steps: (a) Axioms for
the algebra of events are introduced following Birkhoff and von Neumann. All
axioms, except the one that expresses the uncertainty principle, are shared
with the classical event space. The only models for the set of axioms are
lattices of subspaces of inner product spaces over a field K. (b) Another axiom
due to Soler forces K to be the field of real, or complex numbers, or the
quaternions. We suggest a probabilistic reading of Soler's axiom. (c) Gleason's
theorem fully characterizes the probability measures on the algebra of events,
so that Born's rule is derived. (d) Gleason's theorem is equivalent to the
existence of a certain finite set of rays, with a particular orthogonality
graph (Wondergraph). Consequently, all aspects of quantum probability can be
derived from rational probability assignments to finite "quantum gambles". We
apply the approach to the analysis of entanglement, Bell inequalities, and the
quantum theory of macroscopic objects. We also discuss the relation of the
present approach to quantum logic, realism and truth, and the measurement
problem.
Comment: 37 pages, 3 figures. Forthcoming in a Festschrift for Jeffrey Bub, ed. W. Demopoulos and the author, Springer (Kluwer): University of Western Ontario Series in Philosophy of Science.
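A minimal numerical sketch of step (c), Born's rule in the form delivered by Gleason's theorem, p(P) = Tr(rho P) for projectors P (the state, basis, and dimension below are illustrative assumptions, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

# A pure state in C^3 (Gleason's theorem requires dimension >= 3).
psi = rng.normal(size=3) + 1j * rng.normal(size=3)
psi /= np.linalg.norm(psi)
rho = np.outer(psi, psi.conj())          # density matrix |psi><psi|

# Events: projectors onto the rays of an orthonormal basis, obtained
# from a random complex matrix via QR decomposition.
Q, _ = np.linalg.qr(rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3)))
projectors = [np.outer(Q[:, k], Q[:, k].conj()) for k in range(3)]

# Born's rule in Gleason's form: p(P) = Tr(rho P).
probs = [np.trace(rho @ P).real for P in projectors]
assert all(p >= -1e-12 for p in probs)   # nonnegativity
assert abs(sum(probs) - 1) < 1e-12       # additivity over orthogonal events
```

The two assertions are exactly the probability-measure axioms that Gleason's theorem shows force the Tr(rho P) form in dimension three and above.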
Preference reversal in quantum decision theory
We consider the psychological effect of preference reversal and show that it
finds a natural explanation in the frame of quantum decision theory. When
people choose between lotteries with non-negative payoffs, they prefer a more
certain lottery because of uncertainty aversion. But when people evaluate
lottery prices, e.g. for selling to others the right to play them, they do this
more rationally, being less subject to behavioral biases. This difference can
be explained by the presence of the attraction factors entering the expression
of quantum probabilities. Only the existence of attraction factors can explain
why, considering two lotteries with close utility factors, a decision maker
prefers one of them when choosing, but evaluates higher the other one when
pricing. We derive a general quantitative criterion for the preference reversal
to occur that relates the utilities of the two lotteries to the attraction
factors under choosing versus pricing and test successfully its application on
experiments by Tversky et al. We also show that the planning paradox can be
treated as a kind of preference reversal.
Comment: LaTeX file, 15 pages.
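In quantum decision theory, the probability of choosing a lottery decomposes as p(L) = f(L) + q(L), with f the utility factor and q the attraction factor, the attraction factors summing to zero over the alternatives. The toy numbers below are illustrative assumptions (not taken from the paper) showing how this decomposition produces a reversal:

```python
# Utility factors (normalised utilities, summing to 1): the risky lottery
# has slightly higher utility.  Values are illustrative assumptions.
f = {"safe": 0.48, "risky": 0.52}

# Attraction factors, summing to zero: uncertainty aversion attracts the
# decision maker to the safer lottery when choosing.
q = {"safe": +0.25, "risky": -0.25}

# Choice probabilities: p(L) = f(L) + q(L).
p_choice = {L: f[L] + q[L] for L in f}

# When pricing, behaviour is closer to expected utility, so the ranking
# follows the utility factors alone.
p_price = f

assert p_choice["safe"] > p_choice["risky"]   # chosen: the safer lottery
assert p_price["risky"] > p_price["safe"]     # priced higher: the other one
```

With close utility factors, the attraction factors dominate the choice while leaving the pricing ranking intact, which is the reversal pattern described above.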
Comonotonic Book-Making with Nonadditive Probabilities
This paper shows how de Finetti's book-making principle, commonly used to justify additive subjective probabilities, can be modified to agree with some nonexpected utility models. More precisely, a new foundation of the rank-dependent models is presented that is based on a comonotonic extension of the book-making principle. The extension excludes book-making only if all gambles considered induce the same rank-ordering of the states of nature through the favourableness of their associated outcomes, and allows for nonadditive probabilities. Typical features of rank-dependence, such as hedging, ambiguity aversion, and pessimism and optimism, can be accommodated.
Keywords: book-making; comonotonic; Choquet expected utility; ambiguity aversion; ordered vector space.
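The key property behind the comonotonic restriction is that the Choquet integral with respect to a nonadditive capacity is additive across gambles that rank the states the same way, but not in general. A minimal sketch (the capacity and payoffs are illustrative assumptions, not from the paper):

```python
states = (0, 1, 2)

# A nonadditive (here convex) capacity; the specific form
# v(A) = (|A|/n)^2 is an assumption chosen for illustration.
def v(A):
    return (len(A) / len(states)) ** 2

def choquet(x):
    """Choquet integral of a gamble x: state -> payoff."""
    order = sorted(states, key=lambda s: x[s], reverse=True)  # best first
    total, prev = 0.0, 0.0
    for i, s in enumerate(order):
        level = v(order[: i + 1])
        total += x[s] * (level - prev)
        prev = level
    return total

# Comonotonic gambles (same rank-ordering of the states): the Choquet
# integral is additive, so book-making arguments go through.
X = {0: 3.0, 1: 2.0, 2: 1.0}
Y = {0: 6.0, 1: 5.0, 2: 0.0}
XY = {s: X[s] + Y[s] for s in states}
assert abs(choquet(XY) - (choquet(X) + choquet(Y))) < 1e-12

# A non-comonotonic pair: additivity fails (this convex capacity makes
# the Choquet integral strictly superadditive here).
Z = {0: 1.0, 1: 2.0, 2: 3.0}            # opposite ranking to X
XZ = {s: X[s] + Z[s] for s in states}   # the constant gamble 4
assert choquet(XZ) > choquet(X) + choquet(Z) + 1e-9
```

The failure of additivity outside comonotonic classes is why the extended principle excludes book-making only within a common rank-ordering.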
Irrelevant natural extension for choice functions
We consider coherent choice functions under the recent axiomatisation proposed by De Bock and De Cooman that guarantees a representation in terms of binary preferences, and we discuss how to define conditioning in this framework. In a multivariate context, we propose a notion of marginalisation, and its inverse operation called weak (cylindrical) extension. We combine this with our definition of conditioning to define a notion of irrelevance, and we obtain the irrelevant natural extension in this framework: the least informative choice function that satisfies a given irrelevance assessment.
Risk Aversion in International Relations Theory
When international relations theorists use the concept of risk aversion, they usually cite the economics conception involving concave utility functions. However, concavity is meaningful only when the goal is measurable on an interval scale. International decisions are usually not of this type, so many statements appearing in the literature are formally meaningless. Applications of prospect theory face this difficulty especially, as risk aversion and risk acceptance are at their center. This paper gives two definitions of risk attitude that do not require an interval scale. The second and more distinctive one uses the property of submodularity in place of concavity. R. D. Luce has devised a theory of choice with features of prospect theory but not requiring an interval scale, and the second definition in combination with this theory yields the traditional claim that decision makers are risk-averse for gains and risk-seeking for losses.
Keywords: risk aversion; prospect theory; international relations; joint receipts; measurement theory.
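The economics conception the paper starts from is Jensen's inequality: a concave utility u gives E[u(X)] <= u(E[X]), so the certainty equivalent lies below the lottery's mean. A minimal numerical sketch of that standard claim (the utility and lottery are illustrative, and note the computation presupposes the interval-scale payoffs the paper questions):

```python
import math

u = math.sqrt                                  # a concave utility

outcomes = [0.0, 100.0]                        # a 50/50 lottery
mean = sum(outcomes) / 2                       # 50
expected_utility = sum(u(x) for x in outcomes) / 2   # (0 + 10)/2 = 5
certainty_equivalent = expected_utility ** 2   # u^{-1}, since u = sqrt: 25

assert expected_utility < u(mean)              # 5 < sqrt(50): Jensen
assert certainty_equivalent < mean             # 25 < 50: risk aversion
```

The paper's point is that when payoffs are only ordinal, this comparison is not even well defined, which motivates the submodularity-based definition.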