
    Comonotonic Book-Making with Nonadditive Probabilities

    This paper shows how de Finetti's book-making principle, commonly used to justify additive subjective probabilities, can be modified to agree with some nonexpected utility models. More precisely, a new foundation of the rank-dependent models is presented, based on a comonotonic extension of the book-making principle. The extension excludes book-making only if all gambles considered induce the same rank-ordering of the states of nature by the favorableness of their associated outcomes, and it allows for nonadditive probabilities. Typical features of rank-dependence, such as hedging, ambiguity aversion, and pessimism and optimism, can be accommodated.
    Keywords: book-making; comonotonic; Choquet expected utility; ambiguity aversion; ordered vector space
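The rank-dependent weighting the abstract describes can be sketched in a few lines. A minimal illustration, not the paper's formal construction: the weighting function w(p) = p**2 (pessimistic) and the linear utility are my own illustrative choices.

```python
# Rank-dependent evaluation of a finite gamble. The weighting function
# w(p) = p**2 (pessimistic) and linear utility are illustrative choices,
# not taken from the paper.

def rank_dependent_value(outcomes, probs, w=lambda p: p ** 2, u=lambda x: x):
    """Sort outcomes from best to worst; each outcome's decision weight is
    the increment of w over the cumulative probability of doing at least
    that well, so weights need not be additive across rank orders."""
    ranked = sorted(zip(outcomes, probs), key=lambda op: -op[0])
    value, cum = 0.0, 0.0
    for x, p in ranked:
        value += (w(cum + p) - w(cum)) * u(x)
        cum += p
    return value
```

With w(p) = p**2, a 50-50 gamble over 100 and 0 is valued at 25 rather than its expected value of 50, reflecting the pessimism the abstract mentions; taking w to be the identity recovers expected utility.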

    Genetic Algorithm Optimization for Determining Fuzzy Measures from Fuzzy Data

    Fuzzy measures and fuzzy integrals have been used successfully in many real applications, but determining the fuzzy measures is a very difficult problem in those applications. Although several methodologies exist for solving this problem, such as genetic algorithms, gradient descent algorithms, neural networks, and particle swarm optimization, it is hard to say which is more appropriate or more feasible; each method has its advantages. Most existing work can only deal with data consisting of crisp numbers, which may give rise to limitations in practical applications. It is not reasonable to assume that all data are crisp before they have been elicited in practice; fuzzy data may arise, for example, in pharmacological, financial, and sociological applications. We therefore attempt to determine a more generalized type of fuzzy measure from fuzzy data by means of genetic algorithms and Choquet integrals. In this paper we make the first effort to define the σ-λ rules. Furthermore, we define and characterize the Choquet integrals of interval-valued functions and fuzzy-number-valued functions based on σ-λ rules. In addition, we design a special genetic algorithm to determine a type of general fuzzy measure from fuzzy data.
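The discrete Choquet integral at the heart of this approach is easy to compute once a fuzzy measure is given. A minimal sketch for crisp, nonnegative real-valued data (the example measure in the usage below is my own; the paper's interval-valued and fuzzy-number-valued extensions are not reproduced here):

```python
def choquet_integral(f, mu):
    """Discrete Choquet integral of f (dict: element -> nonnegative value)
    with respect to a fuzzy measure mu (dict: frozenset -> weight, assumed
    monotone with mu(empty set) = 0). Values are processed in ascending
    order; each increment is weighted by the measure of the set of
    elements whose value is at least that large."""
    items = sorted(f.items(), key=lambda kv: kv[1])  # ascending by value
    total, prev = 0.0, 0.0
    remaining = set(f)                               # elements not yet passed
    for x, v in items:
        total += (v - prev) * mu[frozenset(remaining)]
        prev = v
        remaining.discard(x)
    return total
```

For an additive measure this reduces to the ordinary weighted sum; a nonadditive measure lets the weight of a coalition of criteria differ from the sum of its parts, which is exactly the flexibility the abstract's learning problem must fit from data.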

    Effects of Clonal Reproduction on Evolutionary Lag and Evolutionary Rescue

    Evolutionary lag (the difference between mean and optimal phenotype in the current environment) is of keen interest in light of rapid environmental change. Many ecologically important organisms have life histories that include stage structure and both sexual and clonal reproduction, yet how stage structure and clonality interact to govern a population's rate of evolution and evolutionary lag is unknown. Effects of clonal reproduction on mean phenotype partition into two portions: one that is phenotype dependent, and another that is genotype dependent. This partitioning is governed by the association between the nonadditive genetic plus random environmental component of phenotype of clonal offspring and that of their parents. While clonality slows phenotypic evolution toward an optimum, it can dramatically increase population survival after a sudden step change in optimal phenotype. Increased adult survival slows phenotypic evolution but facilitates population survival after a step change; this positive effect can, however, be lost given survival-fecundity trade-offs. Simulations indicate that the benefits of increased clonality under environmental change depend greatly on the nature of that change: clonality increases population persistence under a step change but decreases persistence under a continuous linear change requiring de novo variation. The impact of clonality on the probability of persistence for species in a changing world is thus inexorably linked to the temporal texture of the change they experience.
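The qualitative claim that clonality slows the approach to a new optimum can be illustrated with a deliberately simplified toy model. This is my own sketch built on the standard breeder's equation, not the authors' stage-structured model, and all parameter values are illustrative:

```python
def simulate_lag(generations=50, h2=0.3, clonal_frac=0.5,
                 theta=2.0, strength=0.5):
    """Toy model of evolutionary lag after a step change that moves the
    optimal phenotype to theta (a simplification via the breeder's
    equation R = h2 * S, not the paper's stage-structured model).
    Sexual offspring shift the mean phenotype each generation; clonal
    offspring copy the parental mean, so a larger clonal fraction slows
    the response and keeps the lag larger for longer."""
    zbar, lags = 0.0, []
    for _ in range(generations):
        S = strength * (theta - zbar)          # selection differential
        zbar += (1.0 - clonal_frac) * h2 * S   # only the sexual fraction responds
        lags.append(theta - zbar)              # evolutionary lag this generation
    return lags
```

Raising `clonal_frac` in this sketch leaves a larger lag after any fixed number of generations, matching the abstract's first claim; the toy model deliberately omits the survival effects that make clonality beneficial after a step change.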

    Nonexpected Utility and Coherence.

    The descriptive power of expected utility has been challenged by behavioral evidence showing that people deviate systematically from the expected utility paradigm. Since the late 1970s, several alternatives to the classical expected utility paradigm have been proposed to accommodate these deviations. This thesis examines, from a theoretical and experimental point of view, the rank-dependent models, currently the most influential alternative to expected utility. The rank-dependent models propose a natural extension of expected utility by permitting nonlinear weighting of probabilities through decision weights. This thesis investigates the intuition underlying the rank-dependent models, the extension of de Finetti's coherence principle to the rank-dependent case, and the experimental elicitation of the decision weights. In addition, an alternative generalization of expected utility, the gambling effect model, is presented.

    Rank-Order Tournaments as Optimum Labor Contracts

    This paper analyzes compensation schemes which pay according to an individual's ordinal rank in an organization rather than his output level. When workers are risk neutral, it is shown that wages based upon rank induce the same efficient allocation of resources as an incentive reward scheme based on individual output levels. Under some circumstances, risk-averse workers actually prefer to be paid on the basis of rank. In addition, if workers are heterogeneous in ability, low-quality workers attempt to contaminate high-quality firms, resulting in adverse selection. However, if ability is known in advance, a competitive handicapping structure exists which allows all workers to compete efficiently in the same organization.
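The basic rank-based payment scheme can be sketched as follows. The two-worker setup, prize levels, and Gaussian output shock are my own illustrative assumptions in the spirit of the abstract, not the paper's model:

```python
import random

def tournament_pay(effort_a, effort_b, w_win=2.0, w_lose=1.0,
                   noise=1.0, rng=None):
    """Two-worker rank-order tournament sketch (prize levels and the
    Gaussian output shock are illustrative assumptions): each worker's
    output is effort plus a random shock, and wages depend only on
    ordinal rank -- who produced more -- not on the output levels
    themselves."""
    rng = rng or random.Random(0)
    q_a = effort_a + rng.gauss(0.0, noise)
    q_b = effort_b + rng.gauss(0.0, noise)
    return (w_win, w_lose) if q_a > q_b else (w_lose, w_win)
```

The design choice the abstract emphasizes is visible in the return statement: output enters only through the comparison `q_a > q_b`, so the scheme needs to observe ranks, not cardinal output.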

    Probabilistic Reasoning in Cosmology

    Cosmology raises novel philosophical questions regarding the use of probabilities in inference. This work aims at identifying and assessing lines of arguments and problematic principles in probabilistic reasoning in cosmology. The first, second, and third papers deal with the intersection of two distinct problems: accounting for selection effects, and representing ignorance or indifference in probabilistic inferences. These two problems meet in the cosmology literature when anthropic considerations are used to predict cosmological parameters by conditionalizing the distribution of, e.g., the cosmological constant on the number of observers it allows for. However, uniform probability distributions usually appealed to in such arguments are an inadequate representation of indifference, and lead to unfounded predictions. It has been argued that this inability to represent ignorance is a fundamental flaw of any inductive framework using additive measures. In the first paper, I examine how imprecise probabilities fare as an inductive framework and avoid such unwarranted inferences. In the second paper, I detail how this framework allows us to successfully avoid the conclusions of Doomsday arguments in a way no Bayesian approach that represents credal states by single credence functions could. There are in the cosmology literature several kinds of arguments referring to self-locating uncertainty. In the multiverse framework, different pocket-universes may have different fundamental physical parameters. We don't know if we are typical observers and if we can safely assume that the physical laws we draw from our observations hold elsewhere. The third paper examines the validity of the appeal to the Sleeping Beauty problem and assesses the nature and role of typicality assumptions often endorsed to handle such questions.
A more general issue for the use of probabilities in cosmology concerns the inadequacy of Bayesian and statistical model selection criteria in the absence of well-motivated measures for different cosmological models. The model selection criteria commonly used tend to focus on optimizing the number of free parameters, but they can select physically implausible models. The fourth paper examines whether Bayesian model selection can circumvent the lack of well-motivated priors.
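The contrast between a single uniform prior and an imprecise-probability representation of ignorance can be illustrated with a credal set, i.e., a set of probability functions whose envelope gives lower and upper probabilities. This is a generic sketch of the formalism, not of the dissertation's cosmological applications:

```python
def lower_upper(credal_set, event):
    """Lower and upper probability of an event under a credal set: a list
    of probability assignments (dicts mapping outcomes to probabilities)
    over the same outcome space. The envelope [min, max] is the
    interval-valued credence that imprecise probability assigns where a
    single uniform prior would impose one precise number."""
    probs = [sum(p[o] for o in event) for p in credal_set]
    return min(probs), max(probs)
```

Under total ignorance about, say, whether a parameter takes value "a" or "b", a credal set containing both extreme assignments yields the vacuous interval [0, 1], whereas a uniform prior would commit to exactly 0.5; this is the sense in which additive measures struggle to represent ignorance rather than indifference.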