
    A Quantum Cognition Analysis of the Ellsberg Paradox

    The 'expected utility hypothesis' is one of the foundations of classical approaches to economics and decision theory, and Savage's 'Sure-Thing Principle' is a fundamental element of it. It has been put forward that real-life situations exist, illustrated by the 'Allais' and 'Ellsberg' paradoxes, in which the Sure-Thing Principle is violated and the expected utility hypothesis does not hold. We have recently presented strong arguments for the presence of a double-layer structure, a 'classical logical' and a 'quantum conceptual', in human thought, and argued that the quantum conceptual mode is responsible for the above violation. In this paper we consider the Ellsberg paradox, perform an experiment with real test subjects on the situation considered by Ellsberg, and use the collected data to elaborate a model of the conceptual landscape surrounding the decision situation of the paradox. We show that it is this conceptual landscape which gives rise to a violation of the Sure-Thing Principle and leads to the paradoxical situation discovered by Ellsberg. Comment: 11 pages

    Maximizing Expected Utility for Stochastic Combinatorial Optimization Problems

    We study the stochastic versions of a broad class of combinatorial problems in which the weights of the elements in the input dataset are uncertain. The class of problems that we study includes shortest paths, minimum weight spanning trees, minimum weight matchings, and other combinatorial problems such as knapsack. We observe that the expected value is inadequate in capturing different types of risk-averse or risk-prone behaviors, and instead we consider a more general objective: to maximize the expected utility of the solution for some given utility function, rather than the expected weight (expected weight becomes a special case). Under the assumption that there is a pseudopolynomial-time algorithm for the exact version of the problem (this is true for the problems mentioned above), we obtain the following approximation results for several important classes of utility functions: (1) If the utility function u is continuous, upper-bounded by a constant and lim_{x→+∞} u(x) = 0, we show that we can obtain a polynomial-time approximation algorithm with an additive error ε for any constant ε > 0. (2) If the utility function u is a concave increasing function, we can obtain a polynomial-time approximation scheme (PTAS). (3) If the utility function u is increasing and has a bounded derivative, we can obtain a polynomial-time approximation scheme. Our results recover or generalize several prior results on stochastic shortest path, stochastic spanning tree, and stochastic knapsack.
    Our algorithm for utility maximization makes use of the separability of exponential utility and a technique to decompose a general utility function into exponential utility functions, which may be useful in other stochastic optimization problems. Comment: 31 pages; a preliminary version appeared in the Proceedings of the 52nd Annual IEEE Symposium on Foundations of Computer Science (FOCS 2011); this version contains several new results (results (2) and (3) in the abstract)
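The separability the abstract leans on is easy to check numerically: for an exponential utility u(x) = exp(-λx), the expected utility of a sum of independent element weights factors into a product of per-element expectations. A minimal sketch, where the value of λ and the uniform weight distributions are illustrative assumptions rather than anything from the paper:

```python
import math
import random

random.seed(0)

LAM = 0.5  # risk parameter (assumed value, for illustration only)

def exp_utility(x, lam=LAM):
    """Exponential utility of a total weight x."""
    return math.exp(-lam * x)

# Two independent uncertain element weights, e.g. two edges on a path.
def sample_w1():
    return random.uniform(0, 2)

def sample_w2():
    return random.uniform(1, 3)

N = 100_000

# Direct Monte Carlo estimate of E[u(w1 + w2)].
direct = sum(exp_utility(sample_w1() + sample_w2()) for _ in range(N)) / N

# Separability: E[exp(-lam*(w1 + w2))] = E[exp(-lam*w1)] * E[exp(-lam*w2)]
# because w1 and w2 are independent.
e1 = sum(exp_utility(sample_w1()) for _ in range(N)) / N
e2 = sum(exp_utility(sample_w2()) for _ in range(N)) / N

assert abs(direct - e1 * e2) < 0.01
```

This factorization is what lets per-element quantities be combined by a dynamic program; a general utility function then has to be approximated by a combination of such exponentials, as the abstract describes.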

    An Evolutionary Perspective on Goal Seeking and Escalation of Commitment

    Maximizing the probability of bypassing an aspiration level, and taking increasing risks to recover previous losses, are well-documented behavioral tendencies. They are compatible with individual utility functions that are S-shaped, as suggested in Prospect Theory (Kahneman and Tversky 1979). We explore evolutionary foundations for such preferences. Idiosyncratic innovative activity, while individually risky, enhances the fitness of society because it provides hedging against aggregate disasters that might occur if everybody pursued the same course of action. For individuals to choose the socially optimal dosage of innovative activity, their preferences should make them strive to improve upon the ongoing convention, even if it implies taking gambles that reduce their expected achievements. We show how, in a formal model, the preferences that will be selected for in the course of evolution lead to maximizing the probability of bypassing an aspiration level. Furthermore, when comparing choices with the same probability of achieving this goal, preference is indeed established by maximizing the expected utility of an S-shaped utility function, exhibiting risk loving below the aspiration level and risk aversion beyond it.
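The S-shaped value function the abstract refers to can be sketched directly. The parameter values below (α = β = 0.88, λ = 2.25) are Kahneman and Tversky's commonly cited estimates, used here purely for illustration; outcomes are measured relative to the aspiration/reference point:

```python
def kt_value(x, alpha=0.88, beta=0.88, lam=2.25):
    """Kahneman-Tversky value function: concave for gains, convex and
    steeper (loss-averse) for losses. x is relative to the reference point."""
    if x >= 0:
        return x ** alpha
    return -lam * (-x) ** beta

# Risk aversion above the reference point: a sure gain of 50 is preferred
# to a 50/50 gamble between 0 and 100.
assert kt_value(50) > 0.5 * kt_value(100) + 0.5 * kt_value(0)

# Risk loving below it: a 50/50 gamble between 0 and -100 is preferred
# to a sure loss of 50.
assert 0.5 * kt_value(-100) + 0.5 * kt_value(0) > kt_value(-50)
```

The two assertions exhibit exactly the pattern the abstract ends on: risk loving below the aspiration level and risk aversion above it.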

    Formalizing Preferences Over Runtime Distributions

    When trying to solve a computational problem, we are often faced with a choice between algorithms that are guaranteed to return the right answer but differ in their runtime distributions (e.g., SAT solvers, sorting algorithms). This paper aims to lay theoretical foundations for such choices by formalizing preferences over runtime distributions. It might seem that we should simply prefer the algorithm that minimizes expected runtime. However, such preferences would be driven by exactly how slow our algorithm is on bad inputs, whereas in practice we are typically willing to cut off occasional, sufficiently long runs before they finish. We propose a principled alternative, taking a utility-theoretic approach to characterize the scoring functions that describe preferences over algorithms. These functions depend on the way our value for solving our problem decreases with time and on the distribution from which captimes are drawn. We describe examples of realistic utility functions and show how to leverage a maximum-entropy approach for modeling underspecified captime distributions. Finally, we show how to efficiently estimate an algorithm's expected utility from runtime samples.
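A minimal sketch of the kind of estimator the abstract describes: expected utility computed from runtime samples, where the value of a solution decays with time and runs beyond a captime are cut off. The exponential-decay utility, the captime of 60, and the sample runtimes are all illustrative assumptions, not the paper's actual functions:

```python
import math

def utility(t, half_life=10.0):
    """Assumed value of a solution delivered at time t: exponentially
    decaying, halving every `half_life` time units."""
    return math.exp(-t * math.log(2) / half_life)

def expected_utility(runtimes, captime=60.0, cap_value=0.0):
    """Estimate an algorithm's expected utility from runtime samples,
    treating runs longer than `captime` as cut off with utility `cap_value`."""
    vals = [utility(t) if t <= captime else cap_value for t in runtimes]
    return sum(vals) / len(vals)

fast_but_heavy_tailed = [1, 2, 3, 500]  # one occasional very long run
steady = [12, 14, 16, 18]

# Mean runtime favors `steady` (15 vs 126.5), yet under capping the
# heavy-tailed solver has higher expected utility.
assert expected_utility(fast_but_heavy_tailed) > expected_utility(steady)
```

This illustrates the abstract's point that minimizing expected runtime and maximizing expected utility can rank the same two algorithms in opposite orders once long runs are capped.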

    Additive utility in prospect theory

    Prospect theory is currently the main descriptive theory of decision under uncertainty. It generalizes expected utility by introducing nonlinear decision weighting and loss aversion. A difficulty in the study of multiattribute utility under prospect theory is to determine when an attribute yields a gain or a loss. One possibility, adopted in the theoretical literature on multiattribute utility under prospect theory, is to assume that a decision maker determines whether the complete outcome is a gain or a loss. In this holistic evaluation, decision weighting and loss aversion are general and attribute-independent. Another possibility, more common in the empirical literature, is to assume that a decision maker has a reference point for each attribute. We give preference foundations for this attribute-specific evaluation, in which decision weighting and loss aversion depend on the attributes.

    Other-Regarding Preferences and Consequentialism

    This dissertation addresses a basic difficulty in accommodating other-regarding preferences within existing models of decision making. Decision makers with such preferences may violate the property of stochastic dominance that is shared by both expected utility and almost any model of non-expected utility. At its core, stochastic dominance requires a decision maker's behavior to conform to a basic form of consequentialism, namely, that her ranking of outcomes should be independent of the stochastic process that generates these outcomes. On the other hand, decision makers with other-regarding preferences may show a concern for procedures; that is, they may care not just about what the outcomes of others are but also about how these outcomes are generated, and therefore their ranking of outcomes may be intrinsically dependent on the outcome-generating process. We provide theoretical foundations for a new representation of other-regarding preferences that accommodates concerns for procedure and possible violations of stochastic dominance. Our axioms provide a sharp characterization of how a decision maker's ranking of outcomes depends on the procedure by expressing 'payoffs' as a weighted average of her concerns for outcomes and her concerns for procedure. The weight used in evaluating this weighted average, which we call the procedural weight, is uniquely determined and quantifies the relative importance of procedural concerns. In the special case in which procedural concerns are absent, our baseline decision model reduces to expected utility, and our most parsimonious representation is one parameter richer than that model. We use our decision model to provide an expressive theory of voting.
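The representation described above, payoff as a weighted average of outcome concerns and procedural concerns, can be sketched in a few lines. The function name and the numeric inputs are hypothetical:

```python
def procedural_payoff(outcome_utility, procedure_utility, phi):
    """Sketch of the weighted-average representation: `phi` is the
    procedural weight in [0, 1]; phi = 0 recovers the purely
    consequentialist (expected-utility) case."""
    assert 0.0 <= phi <= 1.0
    return phi * procedure_utility + (1.0 - phi) * outcome_utility

# With no procedural concern, the evaluation reduces to the outcome alone.
assert procedural_payoff(7.0, 3.0, phi=0.0) == 7.0
```

The single extra parameter phi is what makes the representation "one parameter richer" than expected utility, as the abstract notes.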

    Behavioural Economics: Classical and Modern

    In this paper, the origins and development of behavioural economics, beginning with the pioneering works of Herbert Simon (1953) and Ward Edwards (1954), are traced, described and (critically) discussed in some detail. Two kinds of behavioural economics, classical and modern, are attributed, respectively, to the two pioneers. The mathematical foundations of classical behavioural economics are identified, largely, to lie in the theory of computation and computational complexity; the corresponding mathematical basis for modern behavioural economics is, on the other hand, claimed to be a notion of subjective probability (at least at its origins in the works of Ward Edwards). The economic theories of behavior, challenging various aspects of 'orthodox' theory, were decisively influenced by these two mathematical underpinnings of the two theories.
    Keywords: Classical Behavioural Economics, Modern Behavioural Economics, Subjective Probability, Model of Computation, Computational Complexity, Subjective Expected Utility

    Rationality, uncertainty aversion and equilibrium concepts in normal and extensive form games

    This thesis contributes to a re-examination and extension of the equilibrium concept in normal and extensive form games. The equilibrium concept is a solution concept for games that is consistent with individual rationality and various assumptions about players' knowledge about the nature of their strategic interaction. The thesis argues that further consistency conditions can be imposed on a rational solution concept. By its very nature, a rational solution concept implicitly defines which strategies are non-rational. A rational player's beliefs about play by non-rational opponents should be consistent with this implicit definition of non-rational play. The thesis shows that equilibrium concepts that satisfy additional consistency requirements can be formulated in Choquet-expected utility theory, i.e. non-expected utility theory with non-additive or set-valued beliefs, together with an empirical assumption about players' attitude toward uncertainty. Chapter 1 introduces the background of this thesis. We present the conceptual problems in the foundations of game theory that motivate our approach. We then survey the decision-theoretic foundations of Choquet-expected utility theory and game-theoretic applications of Choquet-expected utility theory that are related to the present approach. Chapter 2 formulates this equilibrium concept for normal form games. This concept, called Choquet-Nash Equilibrium, is shown to be a generalization of Nash Equilibrium in normal form games. We establish an existence result for finite games, derive various properties of equilibria and establish robustness results for Nash equilibria. Chapter 3 extends the analysis to extensive games. We present the equivalent of subgame-perfect equilibrium, called perfect Choquet Equilibrium, for extensive games. Our main finding here is that perfect Choquet equilibrium does not generalize, but is qualitatively different from subgame-perfect equilibrium. 
Finally, in chapter 4 we examine the centipede game. It is shown that the plausible assumption of bounded uncertainty aversion leads to an 'interior' equilibrium of the centipede game.
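The Choquet expected utility underlying this thesis can be computed over a finite state space by sorting states by the utility of the act and weighting utility increments by the capacity of the upper-level sets. A minimal sketch; the two-state capacity below is an illustrative uncertainty-averse example, not taken from the thesis:

```python
def choquet_integral(utilities, capacity):
    """Choquet expected utility of an act over finitely many states.

    `utilities`: dict mapping state -> utility of the act in that state.
    `capacity`: function from frozenset of states to [0, 1]; monotone,
    with capacity(all states) = 1, but not necessarily additive.
    """
    # Sort states by decreasing utility, then sum each state's utility
    # times the capacity increment of the growing "at least this good" set.
    states = sorted(utilities, key=utilities.get, reverse=True)
    total, prev_cap = 0.0, 0.0
    upper = frozenset()
    for s in states:
        upper = upper | {s}
        cap = capacity(upper)
        total += utilities[s] * (cap - prev_cap)
        prev_cap = cap
    return total

# An uncertainty-averse capacity on two states: each singleton gets
# weight 0.3 (less than an additive 0.5), the full set gets 1.
def cap(event):
    if len(event) == 0:
        return 0.0
    if len(event) == 2:
        return 1.0
    return 0.3

# Act paying 10 in state 'a' and 0 in state 'b':
value = choquet_integral({'a': 10.0, 'b': 0.0}, cap)
assert abs(value - 3.0) < 1e-9  # 10 * 0.3 + 0 * (1 - 0.3)
```

Under an additive 50/50 belief the act would be worth 5; the Choquet value of 3 shows how set-valued (non-additive) beliefs depress the value of an ambiguous act, which is the attitude toward uncertainty the equilibrium concepts above build in.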

    Shackle versus Savage: non-probabilistic alternatives to subjective probability theory in the 1950s

    G.L.S. Shackle's rejection of the probability tradition stemming from Knight's definition of uncertainty was a crucial episode in the development of modern decision theory. A set of methodological statements characterizing Shackle's stance, long abandoned, especially after Savage's Foundations, has been rediscovered and lies at the basis of current non-expected utility theories, in particular of the non-additive probability approach to decision making. This paper examines the discussion between Shackle and his critics in the 1950s. Drawing on Shackle's papers housed at Cambridge University Library as well as on printed matter, we show that some critics correctly understood two aspects of Shackle's theory which are, in our view, of the utmost importance: the non-additive character of the theory and the possibility of interpreting Shackle's ascendancy functions as a specific distortion of the weighting function of the decision maker. It is argued that Shackle neither completely understood the criticisms nor appropriately developed the suggestions put forward by scholars like Kenneth Arrow, Ward Edwards and Nicholas Georgescu-Roegen. Had he succeeded in doing so, we contend, his theory might have been a more satisfactory alternative to Savage's theory than it actually was.
    Keywords: uncertainty, decision theory, non-additive measures