
    Keynes among the statisticians

    This paper considers J. M. Keynes as a statistician and philosopher of statistics and the reaction of English statisticians to his critique of their work. It follows the development of Keynes's thinking through the two versions of his fellowship dissertation The Principles of Probability (1907/8) to his book A Treatise on Probability (1921). It places Keynes's ideas in the context of contemporary English and Continental statistical thought. Of the statisticians considered, special attention is paid to the reactions of four: Edgeworth, Bowley, Jeffreys and R. A. Fisher.

    The comparative academic standing of athletes and non-athletes of the class of 1940 at Dartmouth College

    Thesis (M.A.)--Boston University, 1947. This item was digitized by the Internet Archive.

    Professor A.L. Bowley’s theory of the representative method

    Arthur L. Bowley (1869-1957) first advocated the use of surveys--the "representative method"--in 1906 and started to conduct surveys of economic and social conditions in 1912. Bowley's 1926 memorandum for the International Statistical Institute on the "Measurement of the precision attained in sampling" was the first large-scale theoretical treatment of sample surveys as he conducted them. This paper examines Bowley's arguments in the context of the statistical inference theory of the time. The great influence on Bowley's conception of statistical inference was F. Y. Edgeworth, but by 1926 R. A. Fisher was on the scene, attacking Bayesian methods and promoting a replacement of his own. Bowley defended his Bayesian method against Fisher and against Jerzy Neyman when the latter put forward his concept of a confidence interval and applied it to the representative method. Keywords: history of statistics, sampling theory, Bayesian inference
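    To make the idea of "precision attained in sampling" concrete, the following is a minimal sketch in the spirit of the textbook treatment, not Bowley's or Neyman's own derivations, of an interval estimate for a population mean under simple random sampling without replacement; the population values, sample size and 95% critical value are invented for illustration.

        import math
        import random

        # Hypothetical finite population and sample size, purely illustrative.
        random.seed(1)
        population = [random.gauss(50, 10) for _ in range(10_000)]
        n = 400
        sample = random.sample(population, n)

        # Sample mean and its estimated standard error, with the
        # finite population correction used in survey sampling.
        mean = sum(sample) / n
        var = sum((x - mean) ** 2 for x in sample) / (n - 1)
        fpc = 1 - n / len(population)
        se = math.sqrt(fpc * var / n)

        # Approximate 95% interval of the kind Neyman called a confidence interval.
        z = 1.96
        print(f"estimate {mean:.2f}, 95% interval ({mean - z * se:.2f}, {mean + z * se:.2f})")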

    How likelihood and identification went Bayesian

    This paper considers how the concepts of likelihood and identification became part of Bayesian theory. This makes a nice study in the development of concepts in statistical theory. Likelihood slipped in easily but there was a protracted debate about how identification should be treated. Initially there was no agreement on whether identification involved the prior, the likelihood or the posterior.
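    As a small illustration of why identification cannot be settled by the likelihood alone, here is a sketch (with invented numbers, not taken from the paper) of a model in which the data depend only on the sum of two parameters: the likelihood is flat along theta1 - theta2, yet a proper prior still yields a proper posterior, so the split between the parameters comes from the prior rather than the data.

        import numpy as np

        # y ~ N(theta1 + theta2, 1): the likelihood involves only the sum,
        # so theta1 and theta2 are not separately identified.
        rng = np.random.default_rng(0)
        y = rng.normal(loc=3.0, scale=1.0, size=50)   # true theta1 + theta2 = 3

        grid = np.linspace(-5.0, 5.0, 201)
        t1, t2 = np.meshgrid(grid, grid, indexing="ij")

        log_lik = -0.5 * ((y[:, None, None] - (t1 + t2)) ** 2).sum(axis=0)
        log_prior = -0.5 * (t1 ** 2 + t2 ** 2) / 4.0   # independent N(0, 2^2) priors
        log_post = log_lik + log_prior
        post = np.exp(log_post - log_post.max())
        post /= post.sum()

        # The data pin down theta1 + theta2; the prior alone fixes theta1 - theta2 near 0.
        print("posterior mean of theta1 + theta2:", (post * (t1 + t2)).sum())
        print("posterior mean of theta1 - theta2:", (post * (t1 - t2)).sum())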

    Escaping from American intelligence: culture, ethnocentrism and the Anglosphere

    The United States and its closest allies now spend over $100 billion a year on intelligence. Ten years after 9/11, the intelligence machine is certainly bigger - but not necessarily better. American intelligence continues to privilege old-fashioned strategic analysis for policy-makers and exhibits a technocratic approach to asymmetric security threats, epitomized by the accelerated use of drone strikes and data-mining. Distinguished commentators have focused on the panacea of top-down reform, while politicians and practitioners have created entire new agencies. However, these prescriptions for change remain conceptually limited because of underlying Anglo-Saxon presumptions about what intelligence is. Although intelligence is a global business, when we talk about intelligence we tend to use a vocabulary that is narrowly derived from the experiences of America and its English-speaking nebula. This article deploys the notion of strategic culture to explain why this is. It then explores the cases of China and South Africa to suggest how we might begin to rethink our intelligence communities and their tasks. It argues that the road to success is about individuals, attitudes and cultures rather than organizations. Future improvement will depend on our ability to recognize the changing nature of the security environment and to practice the art of ‘intelligence among the people’. While the United States remains the world’s most significant military power, its strategic culture is unsuited to this new terrain and arguably other countries do these things rather better.

    The origins of fixed X regression

    In 1922 R. A. Fisher introduced the fixed X regression model, synthesising the regression theory of Pearson and Yule with the least squares theory of Gauss. The innovation was based on Fisher's realisation that the distribution associated with the regression coefficient was unaffected by the distribution of X. Subsequently Fisher interpreted the fixed X assumption in terms of his notion of ancillarity. This paper considers these developments against the background of early twentieth century statistical theory.
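    The point that only the realised values of X matter can be seen in a short simulation sketch (not Fisher's own argument): hold one design fixed, drawn deliberately from a non-normal distribution, redraw the errors many times, and compare the spread of the least squares slope with the Gauss formula sigma^2 / sum((x - xbar)^2). The sample size, error variance and coefficients below are invented for illustration.

        import numpy as np

        rng = np.random.default_rng(42)
        n, sigma, beta = 30, 2.0, 1.5

        # X is drawn once from a skewed (exponential) distribution and then held fixed,
        # as in the fixed X regression model.
        x = rng.exponential(scale=3.0, size=n)
        sxx = ((x - x.mean()) ** 2).sum()

        # Repeatedly redraw only the normal errors and refit the slope by least squares.
        slopes = []
        for _ in range(20_000):
            y = 0.5 + beta * x + rng.normal(0.0, sigma, size=n)
            slopes.append(((x - x.mean()) * (y - y.mean())).sum() / sxx)

        # The conditional spread of the slope matches the Gauss least squares formula,
        # regardless of the distribution X was drawn from.
        print("simulated sd of slope:", np.std(slopes))
        print("Gauss formula sd     :", sigma / np.sqrt(sxx))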

    Geographic variation of Bewick wrens in the eastern United States
