438 research outputs found

    Book reviews

    Peer Reviewed: http://deepblue.lib.umich.edu/bitstream/2027.42/45714/1/11336_2005_Article_BF02289702.pd

    The properties of the "standard" type Ic supernova 1994I from spectral models

    The properties of the type Ic supernova SN 1994I are re-investigated. This object is often referred to as a "standard SN Ic", although it exhibited an extremely fast light curve and unusually blue early-time spectra. In addition, the observations were affected by significant dust extinction. A series of spectral models are computed based on the explosion model CO21 (Iwamoto et al. 1994) using a Monte Carlo transport spectral synthesis code. Overall, the density structure and abundances of the explosion model reproduce the photospheric spectra well. Reddening is estimated to be E(B-V) = 0.30 mag, a lower value than previously proposed. A model of the nebular spectrum of SN 1994I points toward a slightly larger ejecta mass than that of CO21. The photospheric spectra show a large abundance of iron-group elements at early epochs, indicating that mixing within the ejecta must have been significant. We present an improved light curve model, which also requires the presence of 56Ni in the outer layers of the ejecta. Comment: 11 pages, 8 figures. Accepted for publication in MNRAS.
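    For context (an assumption not made in the abstract): under a standard Galactic reddening law with R_V ≈ 3.1, the quoted colour excess corresponds to a V-band extinction of roughly A_V = R_V × E(B-V) ≈ 3.1 × 0.30 ≈ 0.93 mag.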

    Personal probabilities of probabilities

    By definition, the subjective probability distribution of a random event is revealed by the (‘rational’) subject's choice between bets — a view expressed by F. Ramsey, B. De Finetti, L. J. Savage and traceable to E. Borel and, it can be argued, to T. Bayes. Since hypotheses are not observable events, no bet can be made, and paid off, on a hypothesis. The subjective probability distribution of hypotheses (or of a parameter, as in the current ‘Bayesian’ statistical literature) is therefore a figure of speech, an ‘as if’, justifiable in the limit. Given a long sequence of previous observations, the subjective posterior probabilities of events still to be observed are derived by using a mathematical expression that would approximate the subjective probability distribution of hypotheses, if these could be bet on. This position was taken by most, but not all, respondents to a ‘Round Robin’ initiated by J. Marschak after M. H. DeGroot's talk on Stopping Rules presented at the UCLA Interdisciplinary Colloquium on Mathematics in Behavioral Sciences. Other participants: K. Borch, H. Chernoff, R. Dorfman, W. Edwards, T. S. Ferguson, G. Graves, K. Miyasawa, P. Randolph, L. J. Savage, R. Schlaifer, R. L. Winkler. Attention is also drawn to K. Borch's article in this issue. Peer Reviewed: http://deepblue.lib.umich.edu/bitstream/2027.42/43847/1/11238_2004_Article_BF00169102.pd
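    A minimal numerical sketch of the ‘as if’ position described above (an illustration, not from the paper): the probability of the next observable event is obtained by averaging over a hypothetical distribution on the unobservable parameter, here a Beta-Bernoulli model chosen purely for convenience; the function name is ours.

    # Illustration only: posterior predictive probability of the next observable event.
    # The Beta "prior on the hypothesis" is the 'as if' device; only the predicted
    # event itself could actually be bet on and paid off.
    def posterior_predictive_success(successes, failures, a=1.0, b=1.0):
        """P(next observation is a success | data) under a Beta(a, b) prior on p."""
        # Beta-Bernoulli conjugacy: the posterior is Beta(a + successes, b + failures),
        # and the predictive probability of a success equals its mean.
        return (a + successes) / (a + b + successes + failures)

    # After observing 7 successes and 3 failures with a uniform Beta(1, 1) prior:
    print(posterior_predictive_success(7, 3))  # 0.666...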

    Towards Machine Wald

    The past century has seen a steady increase in the need to estimate and predict complex systems and to make (possibly critical) decisions with limited information. Although computers have made possible the numerical evaluation of sophisticated statistical models, these models are still designed by humans because there is currently no known recipe or algorithm for dividing the design of a statistical model into a sequence of arithmetic operations. Indeed, enabling computers to think as humans can when faced with uncertainty is challenging in several major ways: (1) finding optimal statistical models remains to be formulated as a well-posed problem when information on the system of interest is incomplete and comes in the form of a complex combination of sample data, partial knowledge of constitutive relations and a limited description of the distribution of input random variables; (2) the space of admissible scenarios, along with the space of relevant information, assumptions, and/or beliefs, tends to be infinite-dimensional, whereas calculus on a computer is necessarily discrete and finite. To this end, this paper explores the foundations of a rigorous framework for the scientific computation of optimal statistical estimators/models and reviews their connections with Decision Theory, Machine Learning, Bayesian Inference, Stochastic Optimization, Robust Optimization, Optimal Uncertainty Quantification and Information Based Complexity. Comment: 37 pages.
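    A toy sketch of what computing an optimal statistical estimator can mean in the Wald, decision-theoretic sense discussed above (an illustration under simplifying assumptions, not the paper's framework): compare two estimators of a Bernoulli parameter by their worst-case mean-squared-error risk over a finite grid of scenarios, a crude discretization of the kind the abstract notes a computer must ultimately work with.

    # Illustration only: Wald-style comparison of estimators by worst-case (minimax) risk,
    # with the continuum of scenarios replaced by a finite grid.
    from math import comb, sqrt

    n = 10                                  # sample size
    thetas = [i / 200 for i in range(201)]  # discretized space of admissible scenarios

    def mse_risk(estimator, theta):
        """Mean squared error of a count-based estimator when the true parameter is theta."""
        return sum(
            comb(n, k) * theta**k * (1 - theta) ** (n - k) * (estimator(k) - theta) ** 2
            for k in range(n + 1)
        )

    mle = lambda k: k / n                                  # maximum-likelihood estimator
    minimax = lambda k: (k + sqrt(n) / 2) / (n + sqrt(n))  # classical minimax estimator

    for name, est in [("MLE", mle), ("minimax", minimax)]:
        worst = max(mse_risk(est, th) for th in thetas)
        print(f"{name}: worst-case MSE = {worst:.4f}")  # MLE ~0.0250, minimax ~0.0144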

    Initial performance of the COSINE-100 experiment

    COSINE is a dark matter search experiment based on an array of low-background NaI(Tl) crystals located at the Yangyang underground laboratory. The assembly of COSINE-100 was completed in the summer of 2016, and the detector is currently collecting physics-quality data aimed at reproducing the DAMA/LIBRA experiment, which reported an annual modulation signal. Stable operation has been achieved and will continue for at least two years. Here, we describe the design of COSINE-100, including the shielding arrangement, the configuration of the NaI(Tl) crystal detection elements, the veto systems, and the associated operational systems, and we show the current performance of the experiment.

    From Wald to Savage: homo economicus becomes a Bayesian statistician

    Bayesian rationality is the paradigm of rational behavior in neoclassical economics. A rational agent in an economic model is one who maximizes her subjective expected utility and consistently revises her beliefs according to Bayes's rule. The paper raises the question of how, when and why this characterization of rationality came to be endorsed by mainstream economists. Though no definitive answer is provided, it is argued that the question is far from trivial and of great historiographic importance. The story begins with Abraham Wald's behaviorist approach to statistics and culminates with Leonard J. Savage's elaboration of subjective expected utility theory in his 1954 classic The Foundations of Statistics. It is the latter's acknowledged failure to achieve its planned goal, the reinterpretation of traditional inferential techniques along subjectivist and behaviorist lines, that raises the puzzle of how a failed project in statistics could turn into such a tremendous hit in economics. A couple of tentative answers are also offered, involving the role of the consistency requirement in neoclassical analysis and the impact of the postwar transformation of US business schools.