9,418 research outputs found

    The Jeffreys-Lindley Paradox and Discovery Criteria in High Energy Physics

    Full text link
    The Jeffreys-Lindley paradox displays how the use of a p-value (or number of standard deviations z) in a frequentist hypothesis test can lead to an inference that is radically different from that of a Bayesian hypothesis test in the form advocated by Harold Jeffreys in the 1930s and common today. The setting is the test of a well-specified null hypothesis (such as the Standard Model of elementary particle physics, possibly with "nuisance parameters") versus a composite alternative (such as the Standard Model plus a new force of nature of unknown strength). The p-value, as well as the ratio of the likelihood under the null hypothesis to the maximized likelihood under the alternative, can strongly disfavor the null hypothesis, while the Bayesian posterior probability for the null hypothesis can be arbitrarily large. The academic statistics literature contains many impassioned comments on this paradox, yet there is no consensus either on its relevance to scientific communication or on its correct resolution. The paradox is quite relevant to frontier research in high energy physics. This paper is an attempt to explain the situation to both physicists and statisticians, in the hope that further progress can be made.
    Comment: v4: Continued editing for clarity. Figure added. v5: Minor fixes to biblio. Same as published version except for minor copy-edits, Synthese (2014). v6: fix typos, and restore garbled sentence at beginning of Sec 4 to v
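    The setup described above can be made concrete with a small numerical sketch, not taken from the paper itself: a Gaussian measurement of a mean mu with known resolution, a point null mu = 0 tested against an alternative that places a broad Gaussian prior on mu, and an observed mean held at a fixed z = 3 while the sample size grows. The prior width tau, the sample-size grid, and the equal prior odds below are illustrative assumptions, but they suffice to show the p-value staying near 0.003 while the Bayes factor and posterior probability swing toward the null.

# A minimal illustration of the Jeffreys-Lindley paradox (illustrative parameters,
# not code from the paper). H0: mu = 0 versus H1: mu != 0 with prior mu ~ N(0, tau^2).
from math import sqrt, exp, pi, erfc

def gauss_pdf(x, mean, sd):
    """Density of a normal distribution with the given mean and standard deviation."""
    return exp(-0.5 * ((x - mean) / sd) ** 2) / (sd * sqrt(2.0 * pi))

sigma = 1.0   # per-observation resolution, assumed known
tau = 1.0     # prior standard deviation of mu under H1 (illustrative choice)
z = 3.0       # observed significance, held fixed as the sample size grows

for n in (10, 1_000, 100_000, 10_000_000):
    s = sigma / sqrt(n)              # standard error of the sample mean
    xbar = z * s                     # observed mean sitting exactly at z standard errors
    p_value = erfc(z / sqrt(2.0))    # two-sided p-value, ~0.0027 for z = 3
    # Marginal likelihood of xbar: N(0, s) under H0, N(0, sqrt(s^2 + tau^2)) under H1.
    bf_01 = gauss_pdf(xbar, 0.0, s) / gauss_pdf(xbar, 0.0, sqrt(s * s + tau * tau))
    post_h0 = bf_01 / (1.0 + bf_01)  # posterior P(H0 | data) with prior odds 1:1
    print(f"n={n:>10,}  p-value={p_value:.4f}  B01={bf_01:8.2f}  P(H0|data)={post_h0:.3f}")

    With these choices the p-value reports the same 3-sigma result at every n, while the Bayes factor B01 grows roughly like sqrt(n), which is the divergence between the two modes of inference that the abstract describes.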

    Negatively Biased Relevant Subsets Induced by the Most-Powerful One-Sided Upper Confidence Limits for a Bounded Physical Parameter

    Full text link
    Suppose an observable x is the measured value (negative or non-negative) of a true mean mu (physically non-negative) in an experiment with a Gaussian resolution function with known fixed rms deviation s. The most powerful one-sided upper confidence limit at 95% C.L. is UL = x+1.64s, which I refer to as the "original diagonal line". Perceived problems in HEP with small or non-physical upper limits for x<0 historically led, for example, to substitution of max(0,x) for x, and eventually to abandonment in the Particle Data Group's Review of Particle Physics of this diagonal line relationship between UL and x. Recently Cowan, Cranmer, Gross, and Vitells (CCGV) have advocated a concept of "power constraint" that, when applied to this problem, yields variants of the diagonal line, including UL = max(-1,x)+1.64s. Thus it is timely to consider again what is problematic about the original diagonal line, and whether or not modifications cure these defects. In a 2002 Comment, statistician Leon Jay Gleser pointed to the literature on recognizable and relevant subsets. For upper limits given by the original diagonal line, the sample space for x has recognizable relevant subsets in which the quoted 95% C.L. is known to be negatively biased (anti-conservative) by a finite amount for all values of mu. This issue is at the heart of a dispute between Jerzy Neyman and Sir Ronald Fisher over fifty years ago, the crux of which is the relevance of pre-data coverage probabilities when making post-data inferences. The literature describes illuminating connections to Bayesian statistics as well. Methods such as that advocated by CCGV have 100% unconditional coverage for certain values of mu and hence formally evade the traditional criteria for negatively biased relevant subsets; I argue that concerns remain. Comparison with frequentist intervals advocated by Feldman and Cousins also sheds light on the issues.
    Comment: 22 pages, 7 figures
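    The recipes compared in this abstract reduce to one-line formulas when x is quoted in units of the known resolution s (so s = 1 below). The sketch that follows is not code from the paper; it simply evaluates the original diagonal line, the max(0,x) substitution, and a CCGV-style power-constrained variant, then uses a Monte Carlo check to contrast unconditional 95% coverage with coverage conditioned on the recognizable subset {x < 0}. The choice of that subset and the grid of mu values are illustrative.

# Upper-limit recipes mentioned in the abstract, with x in units of s (so s = 1):
#   original diagonal line:  UL = x + 1.64
#   clip x at zero:          UL = max(0, x) + 1.64
#   power-constrained style: UL = max(-1, x) + 1.64
import random

def upper_limits(x):
    return {
        "diagonal line":     x + 1.64,
        "clip x at zero":    max(0.0, x) + 1.64,
        "power-constrained": max(-1.0, x) + 1.64,
    }

def coverage(mu, n_trials=200_000, subset=lambda x: True):
    """Fraction of trials with mu <= UL, optionally restricted to a subset of x values."""
    hits = {name: 0 for name in upper_limits(0.0)}
    kept = 0
    for _ in range(n_trials):
        x = random.gauss(mu, 1.0)    # Gaussian measurement of the non-negative true mean mu
        if not subset(x):
            continue
        kept += 1
        for name, ul in upper_limits(x).items():
            hits[name] += (mu <= ul)
    return {name: (h / kept if kept else float("nan")) for name, h in hits.items()}

random.seed(1)
for mu in (0.0, 0.5, 1.0, 3.0):
    print(f"mu = {mu}")
    print("  unconditional:", {k: round(v, 3) for k, v in coverage(mu).items()})
    print("  given x < 0:  ", {k: round(v, 3) for k, v in coverage(mu, subset=lambda x: x < 0.0).items()})

    In this toy comparison the diagonal line shows exactly 95% unconditional coverage for every mu but falls short of 95% on the subset x < 0, while the power-constrained variant reaches 100% unconditional coverage for small mu, the behaviour the abstract says formally evades the traditional relevant-subset criteria.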

    Molecular gastronomy : basis for a new culinary movement or modern day alchemy?

    Get PDF
    Purpose - To explore the phenomenon of molecular gastronomy by conducting empirical research focusing on renowned chefs. Design/methodology/approach - Literature review summarising past culinary innovations, then focusing on the origins and evolution of molecular gastronomy, followed by 18 phenomenological interviews with a snowball sample of world-class chefs from across Europe. Findings - There is far greater confusion about what molecular gastronomy might be than is implied in previous studies. The term has come to be wrongly used to describe a possible culinary movement, mainly as a result of media influence. Leading chefs, whose new restaurant concepts have become associated with it, reject the term. With only 20 years of history, molecular gastronomy is still a comparatively new phenomenon; this initial research presents a clear picture of its evolution so far and of the increasing confusion the use of the term has created. It is still far too early to decide whether these developments herald a new gastronomic movement. Although molecular gastronomy itself may not provide a foundation for a genuine and lasting development of cuisine, it is generating fascination with the fundamental science and techniques of cuisine and with showy culinary alchemy. As with Nouvelle Cuisine, poor-quality copycat chefs could bring into disrepute the reputation and practices of those at the vanguard of culinary and restaurant innovation. Originality/value - First widespread primary study, across five countries, into recognised exceptional chefs' understanding of molecular gastronomy. It clarifies that molecular gastronomy was never intended to be the foundation of a culinary movement and identifies four key elements for the development of lasting cuisine movements and trends.

    PhysStat-LHC Conference Summary

    Get PDF
    This timely conference in the PhyStat series brought together physicists and statisticians for talks and discussions with an emphasis on techniques for use at the Large Hadron Collider experiments. By building on the work of previous generations of experiments, and by developing common tools for comparing and combining results, we can be optimistic about our readiness for statistical analysis of the first LHC data.

    Comments on methods for setting confidence limits

    Get PDF