
    From imprecise probability assessments to conditional probabilities with quasi additive classes of conditioning events

    In this paper, starting from a generalized coherent (i.e. avoiding uniform loss) interval-valued probability assessment on a finite family of conditional events, we construct conditional probabilities with quasi additive classes of conditioning events which are consistent with the given initial assessment. Quasi additivity assures coherence for the obtained conditional probabilities. In order to reach our goal we define a finite sequence of conditional probabilities by exploiting some theoretical results on g-coherence. In particular, we use solutions of a finite sequence of linear systems. Comment: Appears in Proceedings of the Twenty-Eighth Conference on Uncertainty in Artificial Intelligence (UAI 2012).
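The coherence checks behind this abstract reduce to the solvability of linear systems. As a minimal sketch (the events and numbers below are a toy example of ours, not from the paper), a precise assessment on unconditional events is coherent iff it is a convex combination of the constituents' indicator vectors; with four constituents and four constraints the feasibility problem happens to reduce to a square linear solve, while in general one would use a linear program, mirroring the paper's sequence of linear systems:

```python
import numpy as np

# Toy coherence check for P(A)=0.6, P(B)=0.5, P(A&B)=0.2.
# Constituents (atoms): AB, A~B, ~AB, ~A~B.  The assessment is
# coherent iff some probability vector over the atoms reproduces it.
M = np.array([
    [1.0, 1.0, 0.0, 0.0],   # A holds on AB and A~B
    [1.0, 0.0, 1.0, 0.0],   # B holds on AB and ~AB
    [1.0, 0.0, 0.0, 0.0],   # A&B holds on AB only
    [1.0, 1.0, 1.0, 1.0],   # atom probabilities sum to 1
])
b = np.array([0.6, 0.5, 0.2, 1.0])

lam = np.linalg.solve(M, b)               # atom probabilities
coherent = bool(np.all(lam >= -1e-12))    # feasible iff all nonnegative
print(lam, coherent)                      # -> [0.2 0.4 0.3 0.1] True
```

For interval-valued assessments on conditional events, as in the paper, each check becomes a genuine feasibility problem rather than a square solve.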

    Short period and long period in macroeconomics: an awkward distinction

    The aim of this paper is to show that the use and meaning of the well-known concepts of short period and long period is often unclear and may be seriously misleading when applied to macroeconomic analysis. Evidence of this confusion emerges through examination of four macroeconomics textbooks and reappraisal of the interpretative debate - which took place mainly in the 1980s and 1990s - over whether Keynes’s General Theory should be considered a short- or long-period analysis of the aggregate level of production. Having explored some possible explanations for the difficulties in defining and applying these methodological tools at a ‘macro’ level, we suggest that it would be preferable to abandon this terminology in classifying different aggregate models and simply to make explicit the given factors, independent variables and dependent variables in each model in use, exactly as Keynes did in Chapter 18 of his major work.

    Probabilistic entailment in the setting of coherence: The role of quasi conjunction and inclusion relation

    In this paper, by adopting a coherence-based probabilistic approach to default reasoning, we focus the study on the logical operation of quasi conjunction and the Goodman-Nguyen inclusion relation for conditional events. We recall that quasi conjunction is a basic notion for defining consistency of conditional knowledge bases. By deepening some results given in a previous paper we show that, given any finite family of conditional events F and any nonempty subset S of F, the family F p-entails the quasi conjunction C(S); then, given any conditional event E|H, we analyze the equivalence between p-entailment of E|H from F and p-entailment of E|H from C(S), where S is some nonempty subset of F. We also illustrate some alternative theorems related to p-consistency and p-entailment. Finally, we deepen the study of the connections between the notions of p-entailment and inclusion relation by introducing for a pair (F,E|H) the (possibly empty) class K of the subsets S of F such that C(S) implies E|H. We show that the class K satisfies many properties; in particular K is additive and has a greatest element which can be determined by applying a suitable algorithm.
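Quasi conjunction has a simple three-valued reading that can be sketched in code. The function below is our own illustration (names are ours, not the paper's) of Adams' definition used here: C(A|H, B|K) = (((A and H) or not H) and ((B and K) or not K)) | (H or K), where a conditional event is void (None) when its antecedent fails:

```python
def quasi_conjunction(a, h, b, k):
    """Truth value of the quasi conjunction C(A|H, B|K) in a world
    where the booleans a, h, b, k give the truth of A, H, B, K.
    Returns True, False, or None (void, when neither H nor K holds)."""
    if not (h or k):
        return None                      # conditioning event H or K fails
    return ((a and h) or not h) and ((b and k) or not k)

# A world where H holds together with A, while K fails: B|K is void,
# yet the quasi conjunction is defined (and true).
print(quasi_conjunction(True, True, False, False))   # -> True
```

This is why quasi conjunction can aggregate a conditional knowledge base: worlds where some antecedents fail do not automatically falsify the conjunction.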

    Algebraic aspects and coherence conditions for conjunctions among conditional events

    We deepen the study of a notion of conjunction among conditional events, introduced in previous papers in the framework of coherence. This notion of conjunction, differently from other approaches, is given in the setting of conditional random quantities. We show that some well known properties which are satisfied by conjunctions of unconditional events are also satisfied by conjunctions of conditional events. In particular we examine an additive property and a decomposition formula, by also obtaining a generalized inclusion-exclusion formula. Then, by exploiting the notion of conjunction, we introduce the set of constituents generated by n conditional events. Moreover, under logical independence, we give a necessary and sufficient condition of coherence for the prevision assessments on a family F constituted by n conditional events and all possible conjunctions among some of them. This condition of coherence has a simple geometrical characterization in terms of a suitable convex hull. Such a characterization amounts to the solvability of a linear system as in the case of unconditional events. Then, we illustrate the set of all coherent assessments on the family F by a list of linear inequalities on the components of the prevision assessment. Finally, given a coherent assessment M on F, we show that every possible value of the random vector associated with F is itself a particular coherent assessment on F.
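One of the linear inequalities characterizing coherent assessments on conjunctions can be sketched concretely. Assuming (as is known for this notion of conjunction under logical independence, stated here as an assumption rather than quoted from the paper) that the prevision of the conjunction of two conditional events with P(A|H) = x and P(B|K) = y must satisfy the same Fréchet–Hoeffding bounds as in the unconditional case:

```python
def frechet_bounds(x, y):
    """Interval of coherent prevision values for the conjunction of
    two conditional events with P(A|H)=x and P(B|K)=y, under the
    Frechet-Hoeffding bounds: max(x+y-1, 0) <= prevision <= min(x, y)."""
    return max(0.0, x + y - 1.0), min(x, y)

print(frechet_bounds(0.6, 0.5))   # -> (0.1, 0.5) approximately
```

Any prevision inside this interval extends to a coherent assessment on the family F; values outside it violate the convex-hull characterization described in the abstract.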

    Lattice QCD estimate of the ηc(2S) → J/ψγ decay rate

    We compute the hadronic matrix element relevant to the physical radiative decay ηc(2S) → J/ψγ by means of lattice QCD. We use the (maximally) twisted mass QCD action with Nf = 2 light dynamical quarks and from the computations made at four lattice spacings we were able to take the continuum limit. The value of the mass ratio m_{ηc(2S)}/m_{ηc(1S)} we obtain is consistent with the experimental value, and our prediction for the form factor is V^{ηc(2S)→J/ψγ}(0) ≡ V12(0) = 0.32(6)(2), leading to Γ(ηc(2S) → J/ψγ) = (15.7 ± 5.7) keV, which is much larger than Γ(ψ(2S) → ηcγ) and within reach of modern experiments. Comment: 19 pages, 4 figures.

    Behavioral Foundations for the Keynesian Consumption Function

    This paper has two main goals. The first is to show that behavioral rather than maximizing principles emerge from textual analysis as the microeconomic foundations for Keynes’s Consumption Theory; the second is to demonstrate that it is possible to ground a Keynesian-type aggregate Consumption function on the basis of (some of) the principles underlying contemporary behavioral models. Keywords: Keynes, Behavioral Economics, Keynesian Theory, Consumption, Hyperbolic Discounting, Mental Accounting.

    Extropy: Complementary Dual of Entropy

    This article provides a completion to theories of information based on entropy, resolving a longstanding question in its axiomatization as proposed by Shannon and pursued by Jaynes. We show that Shannon's entropy function has a complementary dual function which we call "extropy." The entropy and the extropy of a binary distribution are identical. However, the measure bifurcates into a pair of distinct measures for any quantity that is not merely an event indicator. As with entropy, the maximum extropy distribution is also the uniform distribution, and both measures are invariant with respect to permutations of their mass functions. However, they behave quite differently in their assessments of the refinement of a distribution, the axiom which concerned Shannon and Jaynes. Their duality is specified via the relationship among the entropies and extropies of coarse and fine partitions. We also analyze the extropy function for densities, showing that relative extropy constitutes a dual to the Kullback-Leibler divergence, widely recognized as the continuous entropy measure. These results are unified within the general structure of Bregman divergences. In this context they identify half the L2 metric as the extropic dual to the entropic directed distance. We describe a statistical application to the scoring of sequential forecast distributions which provoked the discovery. Comment: Published at http://dx.doi.org/10.1214/14-STS430 in Statistical Science (http://www.imstat.org/sts/) by the Institute of Mathematical Statistics (http://www.imstat.org).
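The duality described in this abstract is easy to verify numerically. A minimal sketch using the definitions H(p) = -Σ p_i log p_i and, as the article defines extropy, J(p) = -Σ (1-p_i) log(1-p_i) (the example distributions are ours):

```python
import math

def entropy(p):
    """Shannon entropy H(p) = -sum p_i log p_i (natural log)."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def extropy(p):
    """Extropy J(p) = -sum (1-p_i) log(1-p_i), the complementary dual."""
    return -sum((1 - pi) * math.log(1 - pi) for pi in p if pi < 1)

binary = [0.3, 0.7]
print(entropy(binary), extropy(binary))      # identical for a binary distribution

ternary = [0.5, 0.3, 0.2]
print(entropy(ternary), extropy(ternary))    # the measures bifurcate for n >= 3
```

For the binary case the two sums contain exactly the same terms, which is why the measures coincide there; for three or more outcomes they differ, while the uniform distribution still maximizes both.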
