
    The Widest Cleft in Statistics - How and Why Fisher Opposed Neyman and Pearson

    The paper investigates the “widest cleft”, as Savage put it, among frequentists in the foundation of modern statistics: that opposing R.A. Fisher to Jerzy Neyman and Egon Pearson. Apart from a deep personal confrontation throughout their lives, these scientists could not agree on methodology, definitions, concepts or tools. Their premises and their conclusions differed widely, and the two groups they inspired opposed each other ferociously in all arenas of scientific debate. As the abyss widened, economists, with rare exceptions, remained innocent of this confrontation. The introduction of probability into economics in fact occurred after these ravaging battles had begun, even if they were not as public as they became in the 1950s. In any case, when Haavelmo, in the 1940s, suggested a reinterpretation of economics in terms of probability concepts, he chose sides and inscribed his concepts in the Neyman-Pearson tradition. But the majority of the profession indifferently used tools developed by each of the opposing groups of statisticians, and many puzzled economists chose to ignore the debate. Economics consequently became one of the experimental fields for “hybridization”, a synthesis between Fisherian and Neyman-Pearsonian precepts, defined as a number of practical procedures for statistical testing and inference that were developed in spite of the original authors, as an eventual convergence between what they considered to be radically irreconcilable.

    The Geometry of Crashes - A Measure of the Dynamics of Stock Market Crises

    This paper investigates the dynamics of stocks in the S&P500 index over the last 30 years. Using a stochastic geometry technique, we investigate the evolution of the market space and define a new measure for that purpose, which is a robust index of the dynamics of the market structure and provides information on the intensity and the sectoral impact of the crises. With this measure, we analyze the effects of some extreme phenomena on the geometry of the market. Nine crashes between 1987 and 2001 are compared by looking at the way they modify the shape of the manifold that describes the S&P500 market space. These crises are identified as (a) structural, (b) general and (c) local. Keywords: financial markets; stochastic geometry; complexity; market spaces; market structures.

    The Seismography of Crashes in Financial Markets

    This paper investigates the dynamics of stocks in the S&P500 over the last 33 years, considering the population of all companies present in the index for the whole period. Using a stochastic geometry technique and defining a robust index of the dynamics of the market structure, which is able to provide information about the intensity of the crises, the paper proposes a seismographic classification of the crashes that occurred during the period. The index is used to investigate and classify the impact of the twelve crashes between July 1973 and March 2006 and to discuss the available evidence of a change of structure after the fin de siècle. Keywords: financial markets; stochastic geometry; complexity; market spaces; market structures.

    Tribes under Threat - The Collective Behavior of Firms During the Stock Market Crisis

    Due to their unpredictable behavior, stock markets are examples of complex systems. Yet, the dominant analysis of these markets assumes simple stochastic variations, eventually tainted by short-lived memory. This paper proposes an alternative strategy, based on a stochastic geometry defining a robust index of the structural dynamics of the markets and on notions of topology defining a new coefficient that identifies the structural changes occurring in the S&P500 set of stocks. The results demonstrate the consistency of the random hypothesis as applied to normal periods, but they also show its inadequacy for the analysis of periods of turbulence, for which the emergence of collective behavior of sectoral clusters of firms is measured. This behavior is identified as a meta-routine.

    Intriguing pendula: founding metaphors in the analysis of economic fluctuations

    The paper is an inquiry into the definition of the early econometric programme, namely into the discussions which Frisch and Schumpeter held in the early 1930s about the most suitable model for representing innovations, change and equilibrium in economics. The argument and its framework are briefly presented in the first section. The 1931 correspondence between the two founders of the Econometric Society is discussed in the second section. It provides a magnificent example of the importance of rhetoric in economics, of the heuristic role of constitutive metaphors in a research programme, and of the difficulties in defining the most suitable mathematical formalism for dealing with cycles and structural change. The third section presents the conclusion of the story: the bifurcation between the resulting contributions made by Frisch (Propagation problems and impulse problems in dynamic economics, pp. 171-205 in Koch, K. (ed.), Economic Essays in Honour of Gustav Cassel, London, Frank Cass, 1933) and Schumpeter (Business Cycles, New York, McGraw-Hill, 1939; and the posthumous volume, History of Economic Analysis, London, Routledge, 1954). Finally, the fourth section presents an alternative epilogue, highlighting some of the hidden implications of these verbal accounts of pendula as the founding metaphor for business cycles. The paper is based upon as yet unpublished papers that were found in Frisch's Collections (Oslo University Library and the Frisch Rommet at the Institute of Economics) and Schumpeter's Collection (Harvard University).

    As time went by: why is the long wave so long?

    Since the end of the expansionary period after the Second World War, the world economy has gone through a long period of mediocre growth, with a major recession in 2009 and another in 2020. This period is examined following As Time Goes By, the last contribution by Chris Freeman, written in cooperation with this co-author. As a long period of readjustment after the beginning of a structural crisis is imposed by the mismatch between the capabilities of the emerging techno-economic paradigm and the socio-institutional framework, my argument is that the duration of this transition is explained by the difficult process of replacing a successful institutional setting, that which supported the post-war expansion, with the new accumulation regime that is being constituted. Unlike most of the literature on long waves, which tries to uncover some mechanics of the succession of radical technological innovations, this paper addresses different questions: how does the socio-institutional adaptation proceed, and how relevant is this process for explaining the length of the downswing since the turning point of the 1970s? In order to investigate this process of readjustment, the conditions for the new rule of financial accumulation are discussed, including the forms and duration of the process of selection, reproduction and education of the elite, and changes in institutions, norms and social networks.

    The years of high econometrics: A short history of the generation that reinvented economics

    This book is an essay in biography, and its subject matter is the collective effort of that brilliant generation of economists who aspired to transform economics into a rigorous science. The powerful econometric movement took shape in the 1930s, the years of high theory - the concept that Shackle used to describe the period of the inception of the Keynesian revolution, a period that cannot be thoroughly understood unless both movements are contrasted. In a sense, both the Keynesian revolution and the econometric revolution shared the same motivation: to extend the empirical capacity of economics, broadening its analytical scope and strengthening its capacity for designing a control policy. As the story unfurls, it becomes obvious that the young econometricians with Keynesian leanings were more radically engaged in such a task than the Cambridge circle itself, and this was the profound reason for a great deal of the harsh criticism and disappointment that they faced. Furthermore, the acceptance of the epistemological primacy of a very peculiar type of simple mathematical formalism contributed to the marginalisation of some of the major theoretical alternatives developed in the first half of the century. Evidence shows that the endorsement of the urgent political agenda for action against unemployment and the dangers of war was instrumental in determining the victory of a specific mathematical drive, and that the econometric programme as it came to be conceived in these incipient years was shaped by this movement. As a consequence of its impact, econometrics became a tool for the reconstruction of neoclassical economics, which sought to be redescribed in the language of mathematical formalism and statistical inference and estimation, and it was simultaneously responsible for the decay of heterodox alternatives elsewhere. In that sense, modern economics was a tributary of that success.

    Chris Freeman forging the evolution of evolutionary economics

    Every Schumpeterian is an evolutionary economist in his or her own way. Chris Freeman, whose 1995 essay is published in this issue of ICC, favored a rare combination of the Cambridge tradition, a Marxian view of inequalities and Schumpeter’s fascination with innovation as the driving force of capitalism. The article summarizes and discusses this combination and how Freeman generated a challenging agenda for contemporary economics, namely in the context of long wave analysis, the theme for his last book.