
    Measurement of statistical evidence on an absolute scale following thermodynamic principles

    Statistical analysis is used throughout biomedical research and elsewhere to assess strength of evidence. We have previously argued that typical outcome statistics (including p-values and maximum likelihood ratios) have poor measure-theoretic properties: they can erroneously indicate decreasing evidence as data supporting a hypothesis accumulate; and they are not amenable to calibration, which is necessary for meaningful comparison of evidence across different study designs, data types, and levels of analysis. We have also previously proposed that thermodynamic theory, which allowed for the first time derivation of an absolute measurement scale for temperature (T), could be used to derive an absolute scale for evidence (E). Here we present a novel thermodynamically based framework in which measurement of E on an absolute scale, for which "one degree" always means the same thing, becomes possible for the first time. The new framework invites us to think about statistical analyses in terms of the flow of (evidential) information, placing this work in the context of a growing literature on connections among physics, information theory, and statistics. Comment: Final version of manuscript as published in Theory in Biosciences (2013).
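
    One way to make the measure-theoretic complaint concrete (a minimal sketch with hypothetical data, not taken from the paper): for the comparison "coin is biased towards heads (p > 0.5)" versus "coin is fair (p = 0.5)", the maximum likelihood ratio can decrease even while the observed proportion of heads remains above one half, i.e. while the data keep leaning towards the bias hypothesis.

```python
# Illustrative sketch (hypothetical data, not from the paper).
# For H1: p > 0.5 (biased towards heads) vs H0: p = 0.5, the maximum
# likelihood ratio is the max over p > 0.5 of p^k (1-p)^(n-k) / 0.5^n,
# attained at p_hat = max(k/n, 0.5).

def max_lr(k: int, n: int) -> float:
    """Maximum likelihood ratio for 'biased towards heads' vs 'fair coin'."""
    p_hat = max(k / n, 0.5)
    return p_hat**k * (1 - p_hat)**(n - k) / 0.5**n

# The proportion of heads stays above 0.5 throughout, yet the statistic
# shrinks as observations accumulate:
for k, n in [(1, 1), (2, 3), (3, 5)]:
    print(f"{k} heads in {n} tosses: maxLR = {max_lr(k, n):.3f}")
# 1 heads in 1 tosses: maxLR = 2.000
# 2 heads in 3 tosses: maxLR = 1.185
# 3 heads in 5 tosses: maxLR = 1.106
```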

    Study of transition temperatures in superconductors Final report, 11 Mar. 1968 - 10 Mar. 1970

    Thermodynamic and electrical properties of niobium stannide and other superconductors.

    Measurement of Statistical Evidence: Picking Up Where Hacking (et al.) Left Off

    Hacking’s (1965) Law of Likelihood says, paraphrasing, that data support hypothesis H1 over hypothesis H2 whenever the likelihood ratio (LR) for H1 over H2 exceeds 1. But Hacking (1972) noted a seemingly fatal flaw in the LR itself: it cannot be interpreted as the same degree of “evidential significance” across applications. I agree with Hacking about the problem, but I don’t believe the condition is incurable. I argue here that the LR can be properly calibrated with respect to the underlying evidence, and I sketch the rudiments of a methodology for so doing.
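
    To make the Law of Likelihood concrete, here is a minimal worked example with hypothetical numbers (two simple binomial hypotheses; the specific values are not from Hacking or the paper):

```python
from math import comb

def binom_lik(p: float, k: int, n: int) -> float:
    """Binomial likelihood of k heads in n tosses given heads-probability p."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

# H1: p = 0.7 vs H2: p = 0.5, having observed 7 heads in 10 tosses.
lr = binom_lik(0.7, 7, 10) / binom_lik(0.5, 7, 10)
print(f"LR = {lr:.2f}")  # LR = 2.28 > 1: the data support H1 over H2
```

    Hacking's worry, in these terms, is that an LR of 2.28 need not correspond to the same degree of evidential significance in every application; the calibration methodology sketched in the paper is aimed at exactly that gap.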

    Point Contact Spectroscopy of Nb3Sn Crystals: Evidence of a CDW Gap Related to the Martensitic Transition

    Two single crystals of Nb3Sn, presenting the martensitic anomaly at different temperatures and with different shapes as observed in specific heat measurements, were used to study structural features in the electronic density of states with point contact spectroscopy. At temperatures below the martensitic anomaly, we observed different spectroscopic characteristics. One sample, displaying a well-marked specific heat peak, shows a clearly defined structure in the differential conductance that evolves with temperature and may be associated with changes in the density of states due to the opening of a charge density wave gap. These features depend strongly on the crystallographic characteristics of the single crystal examined. Comment: 13 pages, 6 figures. Accepted in Solid State Communications.

    Two novel quantitative trait linkage analysis statistics based on the posterior probability of linkage: application to the COGA families

    BACKGROUND: In this paper we apply two novel quantitative trait linkage statistics based on the posterior probability of linkage (PPL) to chromosome 4 from the GAW 14 COGA dataset. Our approaches are advantageous since they use the full likelihood and full phenotypic information, do not assume normality at the population level, and do not require population/sample parameter estimates; and, like other forms of the PPL, they are specifically tailored to accumulate linkage evidence, either for or against linkage, across multiple sets of heterogeneous data. RESULTS: The first statistic uses all quantitative trait (QT) information from the pedigree (the QT-posterior probability of linkage, or QT-PPL); we applied the QT-PPL to the trait ecb21 (resting electroencephalogram). The second statistic allows simultaneous incorporation of dichotomous trait data into the QT analysis via a threshold model (QTT-PPL); we applied the QTT-PPL to combined data on ecb21 and ALDX1. We obtained a QT-PPL of 96% at GABRB1 and a QT-PPL of 18% at FABP2, while the QTT-PPL was 4% and 2% at the same two loci, respectively. By comparison, the variance-components (VC) method, as implemented in SOLAR, yielded multipoint VC LOD scores of 2.05 and 2.21 at GABRB1 and FABP2, respectively; no other VC LODs were greater than 2. CONCLUSION: The QTT-PPL was only 4% at GABRB1, which might suggest that the underlying ecb21 gene does not also cause ALDX1, although features of the data complicate interpretation of this result.
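
    For readers new to the PPL framework, the following schematic (hypothetical numbers; the actual QT-PPL integrates the full likelihood over trait-model parameters rather than taking a single Bayes ratio as given) shows the Bayes-rule form by which a prior probability of linkage and an integrated likelihood ratio yield a posterior probability:

```python
def posterior_prob(prior: float, bayes_ratio: float) -> float:
    """Bayes-rule update: posterior = prior*BR / (prior*BR + (1 - prior))."""
    return prior * bayes_ratio / (prior * bayes_ratio + (1 - prior))

# Hypothetical numbers: a small prior probability of linkage combined with
# a data-driven integrated likelihood ratio (Bayes ratio) of 25 at a marker.
print(f"PPL = {posterior_prob(prior=0.02, bayes_ratio=25):.0%}")  # PPL = 34%
```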

    KELVIN: A Software Package for Rigorous Measurement of Statistical Evidence in Human Genetics

    This paper describes the software package KELVIN, which supports the PPL (posterior probability of linkage) framework for the measurement of statistical evidence in human (or, more generally, diploid) genetic studies. In terms of scope, KELVIN supports two-point (trait-marker or marker-marker) and multipoint linkage analysis, based on either sex-averaged or sex-specific genetic maps, with an option to allow for imprinting; trait-marker linkage disequilibrium (LD), or association analysis, in case-control data, trio data, and/or multiplex family data, with options for joint linkage and trait-marker LD or conditional LD given linkage; dichotomous trait, quantitative trait, and quantitative trait threshold models; and certain types of gene-gene interactions and covariate effects. Features and data (pedigree) structures can be freely mixed and matched within analyses. The statistical framework is specifically tailored to accumulate evidence in a mathematically rigorous way across multiple data sets or data subsets while allowing for multiple sources of heterogeneity, and KELVIN itself utilizes sophisticated software engineering to provide a powerful and robust platform for studying the genetics of complex disorders.
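
    A conceptual sketch of the accumulation step described above (hypothetical values, and not KELVIN's actual interface): the posterior from one data set or subset serves as the prior for the next, so the evidence can move either towards or away from linkage as data sets are added.

```python
def update(prior: float, bayes_ratio: float) -> float:
    """One Bayes-rule update of the probability of linkage."""
    return prior * bayes_ratio / (prior * bayes_ratio + (1 - prior))

ppl = 0.02                    # hypothetical starting prior of linkage
for br in [3.0, 0.4, 8.0]:    # hypothetical per-data-set Bayes ratios
    ppl = update(ppl, br)     # evidence accumulates for *or* against linkage
    print(f"PPL so far: {ppl:.3f}")
# PPL so far: 0.058
# PPL so far: 0.024
# PPL so far: 0.164
```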

    Study of transition temperatures in superconductors Quarterly progress report, 11 Jun. - 10 Sep. 1968

    Superconducting properties of niobium-tin alloy and large single crystal growth.

    Statistical Evidence Measured on a Properly Calibrated Scale Across Nested and Non-nested Hypothesis Comparisons

    Statistical modeling is often used to measure the strength of evidence for or against hypotheses on given data. We have previously proposed an information-dynamic framework in support of a properly calibrated measurement scale for statistical evidence, borrowing some mathematics from thermodynamics and showing how an evidential analogue of the ideal gas equation of state could be used to measure evidence for a one-sided binomial hypothesis comparison (coin is fair versus coin is biased towards heads). Here we take three important steps forward in generalizing the framework beyond this simple example. We (1) extend the scope of application to other forms of hypothesis comparison in the binomial setting; (2) show that doing so requires only the original ideal gas equation plus one simple extension, which has the form of the van der Waals equation; and (3) begin to develop the principles required to resolve a key constant, which enables us to calibrate the measurement scale across applications, and which we find to be related to the familiar statistical concept of degrees of freedom. This paper thus moves our information-dynamic theory substantially closer to the goal of producing a practical, properly calibrated measure of statistical evidence for use in general applications.
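
    For reference, the two equations of state invoked above, in their standard thermodynamic forms (the paper reinterprets the quantities evidentially; the forms themselves are textbook physics):

```latex
% Ideal gas equation of state:
PV = nRT
% Van der Waals extension, with substance-specific constants a and b:
\left(P + \frac{a\,n^2}{V^2}\right)\left(V - n b\right) = nRT
```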