
    Discussion of: A statistical analysis of multiple temperature proxies: Are reconstructions of surface temperatures over the last 1000 years reliable?

    Discussion of "A statistical analysis of multiple temperature proxies: Are reconstructions of surface temperatures over the last 1000 years reliable?" by B.B. McShane and A.J. Wyner [arXiv:1104.4002]Comment: Published in at http://dx.doi.org/10.1214/10-AOAS409 the Annals of Applied Statistics (http://www.imstat.org/aoas/) by the Institute of Mathematical Statistics (http://www.imstat.org

    Uncertainty in climate science and climate policy

    This essay, written by a statistician and a climate scientist, describes our view of the gap that exists between current practice in mainstream climate science and the practical needs of policymakers charged with exploring possible interventions in the context of climate change. By `mainstream' we mean the type of climate science that dominates in universities and research centres, which we will term `academic' climate science, in contrast to `policy' climate science; aspects of this distinction will become clearer in what follows. In a nutshell, we do not think that academic climate science equips climate scientists to be as helpful as they might be when involved in climate policy assessment. Partly we attribute this to an over-investment in high-resolution climate simulators, and partly to a culture that is uncomfortable with the inherently subjective nature of climate uncertainty. Comment: submitted as a contribution to Conceptual Foundations of Climate Modeling, Winsberg, E. and Lloyd, E., eds., The University of Chicago Press.

    On the use of simple dynamical systems for climate predictions: A Bayesian prediction of the next glacial inception

    Over the last few decades, climate scientists have devoted much effort to the development of large numerical models of the atmosphere and the ocean. While there is no question that such models provide important and useful information on complicated aspects of atmosphere and ocean dynamics, skillful prediction also requires a phenomenological approach, particularly for very slow processes, such as glacial-interglacial cycles. Phenomenological models are often represented as low-order dynamical systems. These are tractable, and a rich source of insights about climate dynamics, but they also ignore large bodies of information on the climate system, and their parameters are generally not operationally defined. Consequently, if they are to be used to predict actual climate system behaviour, then we must take very careful account of the uncertainty introduced by their limitations. In this paper we consider the problem of the timing of the next glacial inception, about which there is on-going debate. Our model is the three-dimensional stochastic system of Saltzman and Maasch (1991), and our inference takes place within a Bayesian framework that allows both for the limitations of the model as a description of the propagation of the climate state vector, and for parametric uncertainty. Our inference takes the form of a data assimilation with unknown static parameters, which we perform with a variant on a Sequential Monte Carlo technique (`particle filter'). Provisional results indicate peak glacial conditions in 60,000 years. Comment: supersedes arXiv:0809.0632 (which was published in European Reviews). The Bayesian section has been significantly expanded. The present version has gone through scientific peer review and has been published in the European Physical Journal Special Topics. (Typo in DOI and in Table 1 (psi -> theta) corrected on 25th August 2009.)
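    A minimal sketch of the data-assimilation idea described above, assuming a toy one-dimensional stochastic model in place of the three-dimensional Saltzman and Maasch (1991) system; the augmentation of the state with the unknown static parameter and the weight-resample-propagate loop are the essence of a particle filter with static parameters. All names and the toy dynamics are illustrative, not the paper's code.

        import numpy as np

        rng = np.random.default_rng(0)

        def propagate(x, theta, dt=1.0):
            # Toy mean-reverting stochastic dynamics standing in for the
            # climate model; theta is the unknown static parameter.
            return x + dt * (-theta * x) + 0.1 * np.sqrt(dt) * rng.standard_normal(x.shape)

        def particle_filter(obs, n_particles=1000, obs_sd=0.2):
            # Augment the state: each particle carries (x, theta), so the
            # static parameter is inferred jointly with the evolving state.
            x = rng.standard_normal(n_particles)
            theta = rng.uniform(0.1, 1.0, n_particles)  # prior on theta
            for y in obs:
                x = propagate(x, theta)
                w = np.exp(-0.5 * ((y - x) / obs_sd) ** 2)  # obs likelihood
                w /= w.sum()
                idx = rng.choice(n_particles, size=n_particles, p=w)  # resample
                # (static theta suffers particle degeneracy; practical
                # variants jitter the resampled theta values)
                x, theta = x[idx], theta[idx]
            return x, theta

        # Synthetic observations generated with theta = 0.5:
        true_theta, x_t, obs = 0.5, np.array([1.0]), []
        for _ in range(50):
            x_t = propagate(x_t, true_theta)
            obs.append(x_t[0] + 0.2 * rng.standard_normal())
        x_post, theta_post = particle_filter(obs)
        print("posterior mean of theta:", theta_post.mean())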

    Ensemble averaging and mean squared error

    In fields such as climate science, it is common to compile an ensemble of different simulators for the same underlying process. It is a striking observation that the ensemble mean often outperforms at least half of the ensemble members in mean squared error (measured with respect to observations). In fact, as demonstrated in the most recent IPCC report, the ensemble mean often outperforms all or almost all of the ensemble members across a range of climate variables. This paper shows that these could be mathematical results based on convexity and averaging, but with implications for the properties of the current generation of climate simulators.
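    The convexity result alluded to can be written as a one-line identity (standard algebra, in LaTeX notation, not quoted from the paper): for an observation y and ensemble members x_1, ..., x_n with mean x-bar,

        \frac{1}{n}\sum_{i=1}^{n}(x_i - y)^2
          \;=\; (\bar{x} - y)^2 \;+\; \frac{1}{n}\sum_{i=1}^{n}(x_i - \bar{x})^2 .

    The squared error of the ensemble mean therefore never exceeds the ensemble-average squared error, and the gap is exactly the ensemble spread. Outperforming all or almost all members is a strictly stronger observation, which is where the properties of the simulators themselves come in.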

    Price change and trading volume in a speculative market

    This thesis is concerned with the daily dynamics of price change and trading volume in a speculative market. The first part examines the news-driven model of Tauchen and Pitts (1983) and develops it to the point where it is directly testable. In order to implement the test, a new method for creating a price index from futures contracts is proposed. It is found that news effects can explain some but not all of the structure of the daily price/volume relationship. An alternative explanation is presented, in which the model of Tauchen and Pitts is generalized in a non-linear fashion. In the second part of the thesis, the presence of a small amount of positive autocorrelation in daily returns is exploited through the development of a timing rule. This timing rule applies to investors who are committed to a purchase but flexible about its precise timing. The computation of the timing rule is discussed in detail. In practice it is found that the rule is unlikely to generate sufficiently large returns to be of interest to investors in a typical stock market, supporting the hypothesis of market efficiency. However, the incorporation of extra information regarding price/volume dynamics, as suggested by the analysis of Part I, might lead to a much improved rule.
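    A hedged illustration of what a rule of this kind might look like, assuming AR(1) daily log-returns with positive autocorrelation rho; this is a toy one-step-lookahead rule, not the rule computed in the thesis, and all names are illustrative.

        import numpy as np

        rng = np.random.default_rng(1)

        def simulate(n_days, rho=0.1, sigma=0.01, p0=100.0):
            # Daily log-returns with positive autocorrelation rho (AR(1)).
            r = np.zeros(n_days)
            for t in range(1, n_days):
                r[t] = rho * r[t - 1] + sigma * rng.standard_normal()
            return p0 * np.exp(np.cumsum(r)), r

        def buy_day(returns, deadline):
            # Committed buyer, flexible timing: with rho > 0 the expected
            # return today is rho * (yesterday's return), so wait while
            # yesterday was a down day (the price is expected to drift
            # lower), otherwise buy now.
            for t in range(1, deadline):
                if returns[t - 1] >= 0:
                    return t
            return deadline  # must buy by the deadline regardless

        prices, returns = simulate(250)
        t = buy_day(returns, deadline=10)
        print(f"buy on day {t} at {prices[t]:.2f}")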

    Confidence in risk assessments


    Uncertainty of flow in porous media

    The problem posed to the Study Group was, in essence, how to estimate the probability distribution of f(x) from the probability distribution of x. Here x is a large vector and f is a complicated function which can be expensive to evaluate. For Schlumberger's applications, f is a computer simulator of a hydrocarbon reservoir, and x is a description of the geology of the reservoir, which is uncertain.
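    When f is cheap enough to run many times, the textbook answer is Monte Carlo propagation; the Study Group's difficulty is precisely that the reservoir simulator is too expensive for this, which motivates cheap surrogates (emulators) fitted to a few simulator runs. A sketch of the Monte Carlo baseline, with an illustrative stand-in for f and an assumed input distribution:

        import numpy as np

        rng = np.random.default_rng(2)

        def f(x):
            # Stand-in for the expensive reservoir simulator: maps a
            # geology vector x to a scalar output such as recovered volume.
            return np.sin(x).sum() + 0.1 * (x ** 2).sum()

        # Sample geologies from their uncertainty distribution (standard
        # normal here purely for illustration) and push each through f.
        samples = rng.standard_normal((10_000, 5))
        outputs = np.array([f(x) for x in samples])
        print("mean:", outputs.mean())
        print("95% interval:", np.percentile(outputs, [2.5, 97.5]))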

    The exact form of the 'Ockham factor' in model selection

    We explore the arguments for maximizing the `evidence' as an algorithm for model selection. We show, using a new definition of model complexity which we term `flexibility', that maximizing the evidence should appeal to both Bayesian and Frequentist statisticians. This is due to flexibility's unique position in the exact decomposition of log-evidence into log-fit minus flexibility. In the Gaussian linear model, flexibility is asymptotically equal to the Bayesian Information Criterion (BIC) penalty, but we caution against using BIC in place of flexibility for model selection.
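    For orientation, the standard asymptotic link between the evidence and BIC (textbook background, in LaTeX notation, not the paper's exact decomposition): for a model M with k parameters fitted to n observations, a Laplace approximation of the evidence integral gives

        \log p(y \mid M) \;=\; \log \int p(y \mid \theta, M)\, p(\theta \mid M)\, d\theta
          \;\approx\; \log p(y \mid \hat{\theta}, M) \;-\; \frac{k}{2}\log n \;+\; O(1),

    so the BIC penalty (k/2) log n is only the leading-order term of the exact penalty on log-fit; the finite-sample discrepancy is the reason the abstract cautions against using BIC in place of flexibility.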