
    Ambiguity Aversion and Trade

    What is the effect of ambiguity aversion on trade? Although in Bewley's model ambiguity aversion always leads to less trade, in other models this is not always true. However, we show that if the endowments are unambiguous, then more ambiguity aversion implies less trade, for a very general class of preferences. The reduction in trade caused by ambiguity aversion can be severe enough to lead to no trade at all. In an economy with MEU decision makers, we show that if the aggregate endowment is unanimously unambiguous, then every Pareto optimal allocation is also unambiguous. We also characterize the situation in which every unanimously unambiguous allocation is Pareto optimal. Finally, we show how our results can be used to explain the home-bias effect. As a result useful for our methods, we also obtain an additivity theorem for CEU and MEU decision makers that does not require comonotonicity. JEL Classification Numbers: D51, D6, D8.
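
    As a reminder (using standard notation, not necessarily the paper's), the two preference models named above evaluate an act f as follows:

        \[ V_{\mathrm{MEU}}(f) = \min_{p \in \mathcal{C}} \int_S u\bigl(f(s)\bigr)\,\mathrm{d}p(s), \qquad V_{\mathrm{CEU}}(f) = \int_S u\bigl(f(s)\bigr)\,\mathrm{d}\nu(s), \]

    where \mathcal{C} is a closed convex set of priors and the CEU integral is a Choquet integral with respect to a capacity (non-additive measure) \nu. The additivity theorem mentioned in the abstract concerns conditions under which these non-additive evaluations add up across acts without requiring the acts to be comonotonic.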

    Growth Economics and Reality

    This paper questions current empirical practice in the study of growth. We argue that much of the modern empirical growth literature is based on assumptions concerning regressors, residuals, and parameters which are implausible both from the perspective of economic theory and from that of the historical experiences of the countries under study. We argue that a number of these problems are violations of an exchangeability assumption which underlies standard growth exercises. We show that these implausible assumptions can be relaxed by allowing for uncertainty in model specification. Model uncertainty consists of two types: theory uncertainty, which relates to which growth determinants should be included in a model, and heterogeneity uncertainty, which relates to which observations in a data set comprise draws from the same statistical model. We propose ways to account for both theory and heterogeneity uncertainty. Finally, using an explicit decision-theoretic framework, we describe how one can engage in policy-relevant empirical analysis.
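
    As an illustration of the "theory uncertainty" component only, the sketch below (not the authors' procedure; the data, variable names and the BIC approximation are ours) enumerates the possible sets of growth determinants and weights each candidate model by an approximate posterior model probability.

        # Minimal sketch of handling theory uncertainty: enumerate candidate
        # regressor sets and weight them by BIC-approximated posterior model
        # probabilities. Data and names are invented for illustration.
        import itertools
        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(0)
        n = 80
        X = rng.normal(size=(n, 3))                        # candidate growth determinants
        y = 0.5 * X[:, 0] + rng.normal(scale=0.5, size=n)  # simulated growth rates
        names = ["initial_income", "schooling", "openness"]

        fits = []
        for k in range(len(names) + 1):
            for subset in itertools.combinations(range(len(names)), k):
                Xm = sm.add_constant(X[:, subset]) if subset else np.ones((n, 1))
                fits.append((subset, sm.OLS(y, Xm).fit()))

        bics = np.array([f.bic for _, f in fits])
        weights = np.exp(-0.5 * (bics - bics.min()))       # exp(-BIC/2), normalised below
        weights /= weights.sum()
        for (subset, _), w in zip(fits, weights):
            print([names[i] for i in subset], round(float(w), 3))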

    De Finetti and Markowitz mean variance approach to reinsurance and portfolio selection problems: a comparison

    Based on a critical analysis of de Finetti's paper, in which the mean-variance approach in finance was introduced early on to deal with a reinsurance problem, we offer a reading of that approach that provides an alternative to the standard portfolio-selection interpretation. We discuss analogies and differences between de Finetti's and Markowitz's geometrical approaches.
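
    For orientation, the portfolio-selection form of the mean-variance problem that the comparison revolves around can be written, in modern notation (ours, used verbatim by neither author), as

        \[ \min_{w}\; w^{\top}\Sigma w \quad \text{s.t.} \quad w^{\top}\mu = \bar{r}, \quad w^{\top}\mathbf{1} = 1, \]

    where \Sigma is the covariance matrix of returns, \mu the vector of expected returns, and \bar{r} a target expected return; varying \bar{r} traces out the mean-variance frontier. De Finetti's retention problem in reinsurance leads to a formally analogous constrained quadratic optimisation, which is what makes the geometrical comparison between the two approaches possible.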

    Urn-based models for dependent credit risks and their calibration through EM algorithm

    In this contribution we analyze two models for the joint default probability of dependent credit risks that are based on a generalisation of the Pólya urn scheme. In particular, we focus on the problems related to maximum likelihood estimation of the parameters involved, and to this end we introduce an approach based on the Expectation-Maximization algorithm. We show how to implement it in this context, and then we analyze the results obtained, comparing them with those produced by other approaches.
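
    The estimation step can be pictured with a much simpler latent-factor default model. The sketch below is not the generalised urn model of the paper; it only illustrates, on an invented two-state example, how the E and M steps alternate when a common latent state makes defaults dependent.

        # EM for a toy latent-factor default model: each period is "calm" or
        # "stressed"; conditional on the state, obligors default independently.
        # Illustrative only; not the generalised Polya urn model of the paper.
        import numpy as np
        from scipy.stats import binom

        rng = np.random.default_rng(1)
        n_obligors, n_periods = 100, 200
        true_pi, true_p = 0.2, np.array([0.02, 0.15])      # P(stressed), default probs
        states = (rng.random(n_periods) < true_pi).astype(int)
        defaults = rng.binomial(n_obligors, true_p[states])

        pi, p = 0.5, np.array([0.01, 0.10])                # initial guesses
        for _ in range(200):
            # E-step: posterior probability that each period was stressed
            lik_calm = binom.pmf(defaults, n_obligors, p[0]) * (1 - pi)
            lik_stress = binom.pmf(defaults, n_obligors, p[1]) * pi
            resp = lik_stress / (lik_calm + lik_stress)
            # M-step: update mixing weight and state-conditional default probabilities
            pi = resp.mean()
            p[1] = (resp * defaults).sum() / (resp * n_obligors).sum()
            p[0] = ((1 - resp) * defaults).sum() / ((1 - resp) * n_obligors).sum()

        print("estimated P(stressed):", round(float(pi), 3), "default probs:", p.round(3))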

    Time-series Modelling, Stationarity and Bayesian Nonparametric Methods

    In this paper we introduce two general non-parametric first-order stationary time-series models for which marginal (invariant) and transition distributions are expressed as infinite-dimensional mixtures. This feature makes them the first fully non-parametric Bayesian stationary models developed so far. As motivation, we draw on the discussion about the use of stationary models in practice and advocate the view that flexible (non-parametric) stationary models can be a source of reliable inferences and predictions. Our models fit naturally within the Bayesian inference framework thanks to a suitable representation theorem. A stationary scale-mixture model is developed as a particular case, along with a computational strategy for posterior inference and prediction. The usefulness of that model is illustrated with the analysis of Euro/USD exchange rate log-returns.
    Keywords: Stationarity, Markov processes, Dynamic mixture models, Random probability measures, Conditional random probability measures, Latent processes.
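
    For intuition about how a transition law and an invariant marginal can both be mixtures, one standard construction (stated here only as background; the paper's models are more general) starts from a mixture marginal f(x) = \int k(x \mid \theta)\,\pi(\mathrm{d}\theta) and defines the transition density

        \[ p(x_{t+1} \mid x_t) = \int k(x_{t+1} \mid \theta)\,\pi(\mathrm{d}\theta \mid x_t), \qquad \pi(\mathrm{d}\theta \mid x) \propto k(x \mid \theta)\,\pi(\mathrm{d}\theta). \]

    Invariance of f follows because \int \pi(\mathrm{d}\theta \mid x)\, f(x)\,\mathrm{d}x = \pi(\mathrm{d}\theta), so \int p(y \mid x)\, f(x)\,\mathrm{d}x = \int k(y \mid \theta)\,\pi(\mathrm{d}\theta) = f(y). Taking \pi to be a random probability measure (for example a Dirichlet process) turns the marginal into an infinite-dimensional mixture, which is the flavour of model the paper develops.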

    Coherent frequentism

    By representing the range of fair betting odds according to a pair of confidence set estimators, dual probability measures on parameter space called frequentist posteriors secure the coherence of subjective inference without any prior distribution. The closure of the set of expected losses corresponding to the dual frequentist posteriors constrains decisions without arbitrarily forcing optimization under all circumstances. This decision theory reduces to those that maximize expected utility when the pair of frequentist posteriors is induced by an exact or approximate confidence set estimator or when an automatic reduction rule is applied to the pair. In such cases, the resulting frequentist posterior is coherent in the sense that, as a probability distribution of the parameter of interest, it satisfies the axioms of the decision-theoretic and logic-theoretic systems typically cited in support of the Bayesian posterior. Unlike the p-value, the confidence level of an interval hypothesis derived from such a measure is suitable as an estimator of the indicator of hypothesis truth, since under general conditions it converges in sample-space probability to 1 if the hypothesis is true and to 0 otherwise.
    Comment: The confidence-measure theory of inference and decision is explicitly extended to vector parameters of interest. The derivation of upper and lower confidence levels from valid and nonconservative set estimators is formalized.
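
    The last claim can be made concrete with the textbook normal-mean example (ours, not the paper's): for a sample of size n from N(\mu, \sigma^2) with \sigma known, the confidence (frequentist) posterior for \mu given the observed mean \bar{x} is N(\bar{x}, \sigma^2/n), so an interval hypothesis H: \mu \in [a, b] receives confidence level

        \[ C(H) = \Phi\!\left(\frac{\sqrt{n}\,(b - \bar{x})}{\sigma}\right) - \Phi\!\left(\frac{\sqrt{n}\,(a - \bar{x})}{\sigma}\right), \]

    which converges in sample-space probability to 1 when \mu lies strictly inside [a, b] and to 0 when \mu lies outside it, unlike a p-value.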

    Probability models for information retrieval based on divergence from randomness

    This thesis devises a novel methodology based on probability theory, suitable for the construction of term-weighting models of Information Retrieval. Our term-weighting functions are created within a general framework made up of three components, each of which is built independently of the others. We obtain the term-weighting functions from the general model in a purely theoretical way, instantiating each component with different probability distribution forms. The thesis begins by investigating the nature of the statistical inference involved in Information Retrieval. We explore the estimation problem underlying the process of sampling. De Finetti’s theorem is used to show how to convert the frequentist approach into Bayesian inference, and we present and employ the derived estimation techniques in the context of Information Retrieval. We initially pay close attention to the construction of the basic sample spaces of Information Retrieval. The notion of single or multiple sampling from different populations in the context of Information Retrieval is extensively discussed and used throughout the thesis. The language modelling approach and the standard probabilistic model are studied under the same foundational view and are experimentally compared to the divergence-from-randomness approach. In revisiting the main information retrieval models in the literature, we show that even the language modelling approach can be exploited to assign term-frequency normalization to the models of divergence from randomness. Finally, we introduce a novel framework for query expansion. This framework is based on the models of divergence from randomness and can be applied to arbitrary models of IR, including divergence-based, language modelling and probabilistic models. We have carried out a very large number of experiments, and the results show that the framework generates highly effective Information Retrieval models.
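
    The general shape of a divergence-from-randomness weight, as it is usually presented (the thesis instantiates the components in several different ways), combines the three components as

        \[ w(t, d) = \bigl(-\log_2 \mathrm{Prob}_1(tfn)\bigr)\cdot\bigl(1 - \mathrm{Prob}_2(tfn)\bigr), \qquad tfn = tf \cdot \log_2\!\left(1 + c\,\frac{\mathrm{avg\_dl}}{\mathrm{dl}(d)}\right), \]

    where \mathrm{Prob}_1 is given by the chosen randomness model, \mathrm{Prob}_2 by the model of the aftereffect of sampling from the elite set, and tfn is the length-normalised term frequency with hyper-parameter c.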

    Nonparametric predictive inference with right-censored data

    This thesis considers nonparametric predictive inference for lifetime data that include right-censored observations. The assumption A(n), proposed by Hill in 1968, provides a partially specified predictive distribution for a future observation given past observations, but it does not allow right-censored data among the observations. Although Berliner and Hill in 1988 presented a related nonparametric method for dealing with right-censored data based on A(n), they replaced 'exact censoring information' (ECI) by 'partial censoring information' (PCI), enabling inference on the basis of A(n). We address whether ECI can be used via a generalization of A(n). We solve this problem by presenting a new assumption, 'right-censoring A(n)' (rc-A(n)), which generalizes A(n). The assumption rc-A(n) provides a partially specified predictive distribution for a future observation, given past observations including right-censored data, and allows the use of ECI. Based on rc-A(n), we derive nonparametric predictive inferences (NPI) for a future observation, which can also be applied to a variety of predictive problems formulated in terms of the future observation. As applications of NPI, we discuss grouped data and the comparison of two groups of lifetime data, problems that occur frequently in reliability and survival analysis.
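
    For readers unfamiliar with Hill's assumption, its usual statement is: given n observations with ordered values x_{(1)} < \dots < x_{(n)} (and writing x_{(0)} and x_{(n+1)} for the endpoints of the sample space), A(n) specifies

        \[ P\bigl(X_{n+1} \in (x_{(j)}, x_{(j+1)})\bigr) = \frac{1}{n+1}, \qquad j = 0, 1, \ldots, n, \]

    so the next observation is equally likely to fall in each of the n+1 intervals created by the data, with nothing assumed about how probability is distributed within an interval. The rc-A(n) assumption introduced in the thesis generalises this partial specification so that right-censored observations, together with their exact censoring times, can be taken into account.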