
    Mythic Perspectives in George Eliot's Fiction

    George Eliot developed a systematic sense of myth and mythmaking when she read and translated D.F. Strauss's Das Leben Jesu kritisch bearbeitet into The Life of Jesus Critically Examined (1846). Strauss carries on his monumental work of demythologizing the gospels by affirming that myth represents the truth of human feelings and aspirations and that these themselves are an expression of the Idea that divinity and humanity are to be united. This Hegelian theory of the place of primitive Christianity in the evolution of man's historical destiny depends for its elaboration on the conception of myth that Strauss derived from Karl Otfried Müller's book of 1825, Prolegomena zu einer wissenschaftlichen Mythologie, which was translated by John Leitch in 1844 as Introduction to a Scientific System of Mythology. A brilliant classical scholar, Müller argued that 'mythical images were formed by the influence of sentiments common to all mankind' and that the articulator of myth simply obeys 'the impulse which acts also upon the minds of his hearers; he is but the mouth through which all speak, the skillful interpreter who has address first to give form and expression to the thoughts of all'. What the community affirms, the mythmaker expresses. The myth that arises is the poetry of their beliefs. George Eliot reformulates Müller's sense of myth in three places in her canon. In 'Janet's Repentance', when Bill Powers is encouraged by Dempster to incite the crowd against Mr. Tryan, the narrator ironically likens him to 'the enunciator of ancient myth who makes the assemblage distinctly conscious of the common sentiment that had drawn them together'. In Silas Marner the antique Mr. Macey is presented as the mythmaker of Raveloe. He is described as an 'oracular old gentleman' in chapter 7; and the epithet 'oracular' seems especially important here because George Eliot emended her manuscript to insert the word in her text. Mr. Macey's recital of the story of the Lammeter-Osgood wedding and the story of how the Warrens became the Charity land is ritualistic. These stories are the heritage of the community: all listen attentively to them, and a certain few ask premeditated questions at designated moments in the course of the recital. 'Every one of Mr. Macey's audience had heard this story many times, but it was listened to as if it had been a favourite tune, and at certain points the puffing of pipes was momentarily suspended, that the listeners might give their whole minds to the expected words. But there was more to come; and Mr. Snell, the landlord, duly put the leading question' (Ch. 6). Mr. Macey's voice is heard as the single voice that orchestrates the belief of the many.

    Confidence, Pessimism and their Impact on Product Differentiation in a Hotelling Model with Demand Location Uncertainty

    We analyze a Hotelling location-then-price duopoly game under demand uncertainty with uniformly distributed consumers in a standard quadratic costs scenario. The novelty of our approach consists of assuming that firms' beliefs are represented by non-extreme-outcome-additive (neo-additive) capacities. We derive firms' subgame-perfect product design decisions under ambiguity. Furthermore, we investigate the influence of ambiguity and ambiguity attitude on equilibrium product differentiation and contrast our results with an environment of risky firms. We find that the degree of confidence or ambiguity plays a significant role in explaining a wide range of phenomena related to observed product design behavior.
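The neo-additive evaluation the abstract relies on has a simple closed form under the standard Choquet representation: a convex mixture of expected utility with a Hurwicz-style min/max term, where one parameter measures ambiguity (lack of confidence) and the other pessimism. A minimal sketch of that representation (the function name and numbers are illustrative, not taken from the paper):

```python
import numpy as np

def neo_ceu(payoffs, probs, delta, alpha):
    """Choquet expected utility w.r.t. a neo-additive capacity.

    delta in [0, 1]: degree of ambiguity (1 - confidence),
    alpha in [0, 1]: degree of pessimism (weight on the worst outcome).
    """
    payoffs = np.asarray(payoffs, dtype=float)
    probs = np.asarray(probs, dtype=float)
    eu = float(payoffs @ probs)                              # additive (risk) part
    hurwicz = alpha * payoffs.min() + (1 - alpha) * payoffs.max()
    return (1 - delta) * eu + delta * hurwicz

# A firm facing two equally likely demand states:
profits = [4.0, 10.0]
p = [0.5, 0.5]
print(neo_ceu(profits, p, delta=0.0, alpha=0.5))  # pure risk: 7.0
print(neo_ceu(profits, p, delta=0.4, alpha=1.0))  # ambiguity + pessimism: 5.8
```

With delta = 0 the criterion collapses to expected utility; raising alpha shifts weight toward the worst demand state, which is how confidence and pessimism enter the firms' location and price decisions.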

    Decomposition of Explained Variation in the Linear Mixed Model

    Full text link
    In the linear mixed model (LMM), the simultaneous assessment and comparison of dispersion relevance of explanatory variables associated with fixed and random effects remains an important open practical problem. Based on the restricted maximum likelihood equations in the variance components form of the LMM, we prove a proper decomposition of the sum of squares of the dependent variable into unbiased estimators of interpretable estimands of explained variation. This result leads to a natural extension of the well-known adjusted coefficient of determination to the LMM. Further, we allocate the novel unbiased estimators of explained variation to specific contributions of covariates associated with fixed and random effects within a single model fit. These parameter-wise explained variations constitute easily interpretable quantities, assessing dispersion relevance of covariates associated with both fixed and random effects on a common scale, thus allowing for a covariate ranking. For illustration, we contrast the variation explained by subjects and time in the longitudinal sleep deprivation study. By comparing the dispersion relevance of population characteristics and spatial levels, we determine literacy as a major driver of income inequality in Burkina Faso. Finally, we develop a novel relevance plot to visualize the dispersion relevance of high-dimensional genomic markers in Arabidopsis thaliana.
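The underlying decomposition idea can be illustrated at the population level for a simple random-intercept model, where the variance of the response splits into a fixed-effect part, a random-effect part, and a residual part. A toy sketch on simulated data (the model and numbers are illustrative; this is not the paper's unbiased REML-based estimator):

```python
import numpy as np

rng = np.random.default_rng(0)

# Random-intercept model: y_ij = beta * x_ij + b_i + e_ij
n_groups, n_per = 200, 20
beta, sigma_b, sigma_e = 1.5, 2.0, 1.0

x = rng.normal(size=(n_groups, n_per))              # Var(x) = 1
b = rng.normal(scale=sigma_b, size=(n_groups, 1))   # random intercepts
e = rng.normal(scale=sigma_e, size=(n_groups, n_per))
y = beta * x + b + e

# Population decomposition: Var(y) = beta^2 Var(x) + sigma_b^2 + sigma_e^2
total = beta**2 * 1.0 + sigma_b**2 + sigma_e**2     # = 7.25
shares = {
    "fixed (x)": beta**2 * 1.0 / total,
    "random (group)": sigma_b**2 / total,
    "residual": sigma_e**2 / total,
}
print(shares)               # each covariate's share of explained variation
print(round(np.var(y), 2))  # empirical Var(y), close to 7.25
```

The paper's contribution is to estimate such shares unbiasedly from a single fitted LMM, without knowing the true variance components.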

    Informativeness of Experiments for MEU - A Recursive Definition

    Blackwell's well-known theorem states the equivalence of statistical informativeness and economic valuableness. Çelen (2012) generalizes this theorem from subjective expected utility (SEU) to maxmin expected utility (MEU) preferences. We demonstrate that the definition of the value of information underlying Çelen (2012) contradicts the principle of recursively defined utility. As a consequence, Çelen's framework features dynamic inconsistency. Our main contribution consists in the definition of a value of information for MEU preferences that is compatible with recursive utility and thus respects dynamic consistency.
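The recursive evaluation can be illustrated with a toy maxmin example: first solve the continuation problem after each signal realization, then take the worst-case expectation over the prior set. A sketch for a perfect signal (payoffs and the prior set are made up; this is not the formal definition from the paper):

```python
# Payoff table u[action] = (payoff in state 1, payoff in state 2)
u = {"safe": (5.0, 5.0), "risky": (10.0, 0.0)}
priors = [p / 10 for p in range(3, 8)]  # P(state 1) ranges over {0.3, ..., 0.7}

def meu_no_info():
    # Maxmin over actions when no signal is observed
    return max(min(p * u[a][0] + (1 - p) * u[a][1] for p in priors) for a in u)

def meu_perfect_signal():
    # Recursive evaluation: best action after the state is revealed,
    # then the worst-case prior over states ex ante
    best = [max(u[a][s] for a in u) for s in (0, 1)]
    return min(p * best[0] + (1 - p) * best[1] for p in priors)

print(meu_no_info())         # maxmin value without information: 5.0
print(meu_perfect_signal())  # 6.5: the perfect signal has positive value
```

Evaluated this way, a free perfect signal can never hurt the decision maker; dynamic inconsistency arises only when the ex-ante value is defined without folding back the continuation problems, which is the issue the paper addresses.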

    Risk Assessment under Ambiguity: Precautionary Learning vs. Research Pessimism

    Agencies charged with regulating complex risks such as food safety or novel substances frequently need to take decisions on risk assessment and risk management under conditions of ambiguity, i.e. where probabilities cannot be assigned to possible outcomes of regulatory actions. What mandates should society write for such agencies? Two approaches stand out in the current discussion. One charges the agency to apply welfare economics based on expected utility theory. This approach underpins conventional cost-benefit analysis (CBA). The other requires that an ambiguity-averse decision rule - of which maxmin expected utility (MEU) is the best known - be applied in order to build in a margin of safety in accordance with the Precautionary Principle (PP). The contribution of the present paper is a relative assessment of how a CBA and a PP mandate impact on the regulatory task of risk assessment. In our parsimonious model, a decision maker can decide on the precision of a signal which provides noisy information on a payoff-relevant parameter. We find that MEU has a complex impact on information acquisition, shaped by two countervailing forces that we dub 'Precautionary Learning' and 'Research Pessimism'. We find that - contrary to intuition - a mandate of PP rather than CBA will often give rise to a less informed regulator. PP can therefore lead to a higher likelihood of regulatory mistakes, such as the approval of harmful new substances.

    The finite sample performance of semi- and nonparametric estimators for treatment effects and policy evaluation

    This paper investigates the finite sample performance of a comprehensive set of semi- and nonparametric estimators for treatment and policy evaluation. In contrast to previous simulation studies which mostly considered semiparametric approaches relying on parametric propensity score estimation, we also consider more flexible approaches based on semi- or nonparametric propensity scores, nonparametric regression, and direct covariate matching. In addition to (pair, radius, and kernel) matching, inverse probability weighting, regression, and doubly robust estimation, our studies also cover recently proposed estimators such as genetic matching, entropy balancing, and empirical likelihood estimation. We vary a range of features (sample size, selection into treatment, effect heterogeneity, and correct specification/misspecification) in our simulations and find that several nonparametric estimators by and large outperform commonly used treatment estimators using a parametric propensity score. Nonparametric regression, nonparametric doubly robust estimation, nonparametric IPW, and one-to-many covariate matching perform best.
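One of the estimators compared in the study, inverse probability weighting (IPW), is easy to sketch on simulated data with a known treatment effect. Here the true propensity score is used for clarity, whereas the study estimates it semi- or nonparametrically (the data-generating process and numbers are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 20_000

# Simulate confounded treatment assignment with a known effect of 2.0
x = rng.normal(size=n)
p_treat = 1 / (1 + np.exp(-x))          # true propensity score
d = rng.binomial(1, p_treat)            # treatment indicator
y = 2.0 * d + x + rng.normal(size=n)    # x confounds treatment and outcome

# Inverse probability weighting with the true propensity score
ate_ipw = np.mean(d * y / p_treat) - np.mean((1 - d) * y / (1 - p_treat))
print(round(ate_ipw, 2))                # close to the true effect 2.0

# Naive difference in means is biased by the confounding through x
naive = y[d == 1].mean() - y[d == 0].mean()
print(round(naive, 2))                  # noticeably above 2.0
```

Reweighting each observation by the inverse of its treatment probability recreates balance in x across the two groups, which is exactly what the confounded raw comparison lacks.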

    Ambiguity and Economic Models

    This thesis is based on a collection of essays and studies the behavioral consequences of the concept of ambiguity for a variety of economic models. After introducing the reader to the fundamentals of decision theory, I proceed by considering a Hotelling duopoly game under demand ambiguity. Firms' preferences are assumed to be of the Choquet-expected-utility type. In this framework, I derive firms' subgame-perfect product differentiation. It turns out that confidence is a differentiation force when firms are sufficiently optimistic. This finding has important consequences for the interpretation of various applications of Hotelling models under uncertainty treated by Król (2012). Subsequently, ambiguity is implemented in the context of primary prevention. In particular, I contemplate a physician-counseling model where Choquet-expected-utility-maximizing patients face ambiguity with respect to the relationship between their level of adherence to a preventive regime and the resulting probability of disease. In this framework, I examine the effect of confidence and optimism on preventive activities. It turns out that the effect of optimism on prevention is determined by two concurrent effects, which are denoted as "perceived efficacy effect" and "expected marginal utility effect". The perceived efficacy effect captures the fact that optimists and pessimists might differ in their assessment of the preventive regime's capability to reduce the underlying probability of disease. The expected marginal utility effect takes into account that a shift in the perceived disease probability might increase or decrease marginal gains or losses from additional units of prevention. In the following step, I introduce information into the previous setting. Information is modeled by means of an imprecise signal provided by the physician. Patients update their beliefs in the light of new information by using one of the three major updating rules for neo-additive capacities introduced by Eichberger et al. (2010). It turns out that Knightian uncertainty can, depending on the underlying updating rule, provide an explanation for poor patient compliance as well as excessive preventive behavior. The ensuing chapter turns to Blackwell's famous theorem and its extension to MEU preferences proposed by Çelen (2012), showing that Çelen's notion of a value of information under MEU is incompatible with dynamic consistency. Finally, I conclude with an application of ambiguity to monopoly pricing. Using the most prominent models of decision-making under ambiguity, I derive the implications of ambiguity for monopoly pricing. In the Choquet case with neo-additive capacities, I can demonstrate that pessimism reduces the resulting monopoly price, whereas confidence decreases (increases) monopoly prices if the monopolist is sufficiently optimistic (pessimistic).

    Estimation of a regression spline sample selection model

    It is often the case that an outcome of interest is observed for a restricted, non-randomly selected sample of the population. In such a situation, standard statistical analysis yields biased results. This issue can be addressed using sample selection models which are based on the estimation of two regressions: a binary selection equation, which determines whether a particular statistical unit will be available in the outcome equation, and the outcome equation itself. Classic sample selection models assume a priori that continuous regressors have a pre-specified linear or non-linear relationship to the outcome, which can lead to erroneous conclusions. In the case of a continuous response, methods in which covariate effects are modeled flexibly have been previously proposed, the most recent being based on a Bayesian Markov chain Monte Carlo approach. A frequentist counterpart which has the advantage of being computationally fast is introduced. The proposed algorithm is based on the penalized likelihood estimation framework. The construction of confidence intervals is also discussed. The empirical properties of the existing and proposed methods are studied through a simulation study. The approaches are finally illustrated by analyzing data from the RAND Health Insurance Experiment on annual health expenditures.
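The classic parametric baseline that such spline-based models relax is the Heckman-style correction: augment the outcome regression on the selected sample with the inverse Mills ratio of the selection index. A minimal sketch on simulated data, using the true selection index in place of a first-stage probit to keep it short (all names and numbers are illustrative):

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(2)
n = 100_000

w = rng.normal(size=n)               # regressor appearing only in selection
x = rng.normal(size=n)               # outcome regressor
u = rng.normal(size=n)               # selection error
e = 0.5 * u + rng.normal(scale=np.sqrt(0.75), size=n)  # corr(u, e) = 0.5

index = x + w                        # true selection index (known here)
sel = index + u > 0                  # unit observed iff latent z* > 0
y = 2.0 * x + e                      # true slope beta = 2

# Naive OLS on the selected sample is biased by the selection
X0 = np.column_stack([np.ones(sel.sum()), x[sel]])
b_naive = np.linalg.lstsq(X0, y[sel], rcond=None)[0]

# Heckman-style correction: add the inverse Mills ratio lambda(index)
lam = norm.pdf(index[sel]) / norm.cdf(index[sel])
X1 = np.column_stack([np.ones(sel.sum()), x[sel], lam])
b_corr = np.linalg.lstsq(X1, y[sel], rcond=None)[0]

print(round(b_naive[1], 2))  # biased away from the true slope 2
print(round(b_corr[1], 2))   # close to 2
```

The correction works because, conditional on selection, the outcome error has mean proportional to the inverse Mills ratio; the paper's spline-based estimator replaces the rigid linear-in-x specification with flexible covariate effects.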