Statistical inference on the h-index with an application to top-scientist performance
Despite the large literature on the h-index, few papers have been devoted to its statistical analysis when a probabilistic distribution is assumed for citation counts. The present contribution surveys the available inferential techniques, providing the details for proper point and set estimation of the theoretical h-index. Moreover, some issues in simultaneous inference, aimed at producing suitable comparisons between scholars, are addressed. Finally, an analysis of the citation datasets for the Nobel Laureates of the last five years and for the Fields medallists from 2002 onward is proposed.
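As a point of reference for this and the following abstract, here is a minimal sketch of the empirical h-index that these inferential procedures target; the implementation is generic and illustrative, not taken from the paper.

```python
def h_index(citations: list[int]) -> int:
    # Largest h such that at least h papers have at least h citations.
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, c in enumerate(counts, start=1):
        if c >= rank:
            h = rank
        else:
            break
    return h

print(h_index([10, 8, 5, 4, 3]))  # 4: four papers have at least 4 citations
```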
Statistical analysis of the Hirsch Index
The Hirsch index (commonly referred to as h-index) is a bibliometric
indicator which is widely recognized as effective for measuring the scientific
production of a scholar since it summarizes size and impact of the research
output. In a formal setting, the h-index is actually an empirical functional of
the distribution of the citation counts received by the scholar. Under this
approach, the asymptotic theory for the empirical h-index has been recently
exploited when the citation counts follow a continuous distribution and, in
particular, variance estimation has been considered for the Pareto-type and the
Weibull-type distribution families. However, in bibliometric applications,
citation counts display a distribution supported by the integers. Thus, we
provide general properties for the empirical h-index under the small- and
large-sample settings. In addition, we introduce consistent nonparametric variance estimation, which allows for the implementation of large-sample set estimation for the theoretical h-index.
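The abstract does not reproduce the paper's closed-form variance estimator, so the sketch below substitutes a generic nonparametric bootstrap as one familiar route to set estimation for the h-index; the sample data and helper names are invented for illustration.

```python
import random

def h_index(citations):
    # Largest h such that at least h papers have at least h citations.
    counts = sorted(citations, reverse=True)
    return max([0] + [rank for rank, c in enumerate(counts, 1) if c >= rank])

def bootstrap_ci(citations, level=0.95, reps=2000, seed=0):
    # Resample citation counts with replacement and take empirical quantiles
    # of the recomputed h-index as an approximate confidence interval.
    rng = random.Random(seed)
    n = len(citations)
    stats = sorted(
        h_index([rng.choice(citations) for _ in range(n)]) for _ in range(reps)
    )
    lo = stats[int((1 - level) / 2 * reps)]
    hi = stats[int((1 + level) / 2 * reps) - 1]
    return lo, hi

sample = [42, 30, 17, 12, 9, 9, 7, 5, 4, 2, 1, 0]
print(h_index(sample), bootstrap_ci(sample))  # point estimate and interval
```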
University rankings on scientometric and statistical foundations
Scientific performance is one criterion of university rankings, and in some cases the sole decisive one. Scientometrics was born to measure and analyse it. The authors first briefly sketch the beginnings of this relatively young field, then present some methodological examples from the work of the MTA-PE Budapest Ranking Research Group, relating primarily to the scientific performance of universities. The study presents the research group's effort to develop and use modern statistical methods with which the fundamental shortcomings of university rankings can be eliminated, or at least mitigated. The role of scientometric indicators is examined in four areas: the formation of global university leagues, domestic student applications, European Erasmus student mobility, and placement in the top 100 and top 200 of a global university ranking, where in the last case the performance of Hungarian universities is compared with German and Belgian institutions. The results show that scientometric indicators matter most for the formation of university leagues, affect applications to higher education only indirectly and with steadily decreasing importance over time, and are unrelated to Erasmus student mobility.
A Review of Theory and Practice in Scientometrics
Scientometrics is the study of the quantitative aspects of the process of science as a communication system. It is centrally, but not only, concerned with the analysis of citations in the academic literature. In recent years it has come to play a major role in the measurement and evaluation of research performance. In this review we consider: the historical development of scientometrics, sources of citation data, citation metrics and the "laws" of scientometrics, normalisation, journal impact factors and other journal metrics, visualising and mapping science, evaluation and policy, and future developments.
When Physics Became Undisciplined: An Essay on Econophysics
In the 1990s, physicists started looking beyond their disciplinary boundaries by using their methods to study various problems usually thrown up by financial economics. This dissertation deals with this extension of physics outside its disciplinary borders. It seeks to determine what sort of discipline econophysics is in relation to physics and to economics, how its emergence was made possible, and what sort of knowledge it produces. Using a variety of evidence, including bibliometric analysis, Chapter 1 explores the field's disciplinary identity as a branch of physics, even though its intellectual heart is better seen as the re-emergence of a 1960s research programme initiated in economics. Chapter 2 is historical: it identifies the key role played by the Santa Fe Institute and its pioneering complexity research in shaping the methodological horizons of econophysics. These are in turn investigated in Chapter 3, which argues that there are in fact three methodological strands: statistical econophysics, bottom-up agent-based econophysics, and top-down agent-based econophysics. Viewed from a Lakatosian perspective, they all share a conceptual hard core but articulate the protective belt in distinctly different ways. The final chapter is devoted to the way econophysicists produce and justify their knowledge. It shows that econophysics operates by proposing empirically adequate analogies between physical and other systems, in exactly the ways emphasised by Pierre Duhem. The contrast between such use of analogy in econophysics and the modelling practices of financial economics explains why econophysics remains so controversial to economists.
Tag-based Bayesian latent class models for movies: economic theory reaches out to big data science
For the past 50 years, cultural economics has developed as an independent research specialism. At its core are the creative industries and the peculiar economics associated with them, central to which is a tension arising from the notion that creative goods need to be experienced before an assessment can be made about the utility they deliver to the consumer. In this they differ from the standard private good that forms the basis of demand theory in economic textbooks, in which utility is known ex ante. Furthermore, creative goods are typically complex in composition and subject to heterogeneous and shifting consumer preferences. In response, models of linear optimization, rational addiction and Bayesian learning have been applied to better understand consumer decision-making, belief formation and revision. While valuable, these approaches do not lend themselves to forming verifiable hypotheses, for the critical reason that they bypass an essential aspect of creative products: namely, novelty. In contrast, the computer sciences, and more specifically recommender theory, embrace creative products as a study object. Being items of online transactions, users of creative products share opinions on a massive scale and in doing so generate a flow of data-driven research. Not limited by the multiple assumptions made in economic theory, data analysts deal with this type of commodity in a less constrained way, incorporating the variety of item characteristics as well as their co-use by agents. They apply statistical techniques that support big data, such as clustering, latent class analysis or singular value decomposition.
This thesis draws from both disciplines, comparing models, methods and data sets. Based upon movie consumption, the work contrasts bottom-up versus top-down approaches, individual versus collective data, and distance measures versus utility-based comparisons. Rooted in Bayesian latent class models, a synthesis is formed, supported by random utility theory and recommender algorithm methods. The Bayesian approach makes explicit the experience-good nature of creative goods by formulating users' prior uncertainty towards both movie features and preferences. The latent class method thus infers the heterogeneous aspect of preferences, while its dynamic variant, the latent Markov model, circumvents one of the main paradoxes in studying creative products: how to analyse taste dynamics when confronted with a good that is novel at each decision point. This study of preference-pattern formation for creative goods draws on individual-level data, mainly movie-user-rating and movie-user-tag triplets collected from the MovieLens recommender system and made available as open data for research by the GroupLens research team.
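As a toy illustration of the latent class machinery described above, the sketch below fits a two-class Bernoulli mixture over binary movie-tag indicators with EM; the data, tag set, and class count are invented, and the model is far simpler than the thesis's Bayesian latent Markov treatment.

```python
import math

# Each row: which of 4 hypothetical tags a user applied to a movie.
data = [
    [1, 1, 0, 0], [1, 1, 1, 0], [1, 0, 0, 0],   # "action-leaning" users
    [0, 0, 1, 1], [0, 1, 1, 1], [0, 0, 0, 1],   # "drama-leaning" users
]

K, D = 2, 4
pi = [0.5, 0.5]                                        # class weights
theta = [[0.7, 0.6, 0.3, 0.2], [0.3, 0.4, 0.6, 0.7]]   # tag probabilities

for _ in range(50):                                    # EM iterations
    # E-step: posterior responsibility of each latent class for each row.
    resp = []
    for x in data:
        liks = [
            pi[k] * math.prod(
                theta[k][d] if x[d] else 1 - theta[k][d] for d in range(D)
            )
            for k in range(K)
        ]
        z = sum(liks)
        resp.append([l / z for l in liks])
    # M-step: re-estimate class weights and per-class tag probabilities.
    for k in range(K):
        nk = sum(r[k] for r in resp)
        pi[k] = nk / len(data)
        theta[k] = [
            sum(r[k] * x[d] for r, x in zip(resp, data)) / nk for d in range(D)
        ]

print([round(p, 2) for p in pi])                       # fitted mixing weights
print([[round(t, 2) for t in row] for row in theta])   # fitted tag profiles
```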
The economics of federalism
An important development in public finance theory during recent years has been the emergence of the basic elements of a theory of fiscal federalism, based partly on the theory of public goods, partly on the theory of political process and partly on various aspects of location theory. The aim of the theory is to supply answers to basic and wide-ranging questions relating to the case for and the allocation of functions within a federal system, efficiency aspects of migration between jurisdictions, the case for different kinds of intergovernmental grants arrangements and the forms of debt and taxation arrangements appropriate to a federal structure. This volume gathers together most of the significant contributions to the theory, many of which are somewhat inaccessible. Although primarily concerned with federal constitutions, the book is relevant to the analysis of public policy under unitary constitutions which devolve decision-making autonomy to local or regional governments. It also reviews the current state of the art and thereby points out certain gaps that remain to be filled in the future.
Public Facility Location: Issues and Approaches
The papers collected in this issue were presented at the Task Force Meeting on Public Facility Location, held at IIASA in June 1980. The meeting was an important occasion for scientists with different backgrounds and nationalities to compare and discuss differences and similarities among their approaches to location problems. Unification and reconciliation of existing theories and methods was one of the leading themes of the meeting, and the papers collected here are part of the raw material to be used as a starting point towards this aim. The papers themselves provide a wide spectrum of approaches to both technical and substantive problems: for example, the way space is treated (continuously in Beckmann, in Mayhew, and in Thisse et al., discretely in all the others), the way customers are assigned to facilities (by behavioral models in Ermoliev and Leonardi, in Sheppard, and in Wilson, by normative rules in many others), and the way the objective function is defined (ranging from total cost, total profit, total expected utility for customers, accessibility, minimax distance, and maximum covering, to a multi-objective treatment of all of them, as in Revelle et al.). There is indeed room for discussion, in order to find both similarities and weaknesses in the different approaches.
A general weakness of the current state of the art of location modeling may also be recognized: its general lack of realism with respect to the political and institutional issues implied by locational decisions. This criticism, developed by Lea, may serve both as a concluding remark and as a proposal of challenging new research themes for scholars working in the field of location theory.
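Of the objective functions listed above, maximum covering is perhaps the easiest to make concrete: choose p facility sites so that as many customers as possible lie within a distance threshold. The sketch below applies a standard greedy heuristic to invented sites and demand points; it is not drawn from any of the collected papers.

```python
p, radius = 2, 5.0

# Hypothetical candidate sites and customer locations on a plane.
sites = {"A": (0, 0), "B": (6, 0), "C": (3, 6)}
customers = [(1, 1), (5, 1), (2, 5), (4, 5), (6, 2), (0, 3)]

def covered(site):
    # Indices of customers within `radius` of the given site.
    sx, sy = sites[site]
    return {
        i for i, (cx, cy) in enumerate(customers)
        if ((cx - sx) ** 2 + (cy - sy) ** 2) ** 0.5 <= radius
    }

chosen, covered_set = [], set()
for _ in range(p):
    # Greedy step: pick the site adding the most newly covered customers.
    best = max(
        (s for s in sites if s not in chosen),
        key=lambda s: len(covered(s) - covered_set),
    )
    chosen.append(best)
    covered_set |= covered(best)

print(chosen, len(covered_set))  # chosen sites and customers covered
```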
The Shareholder Value Revolution
My dissertation blends intellectual and business history to explain why American businesses chose to endorse the idea that corporations existed primarily to maximize the value of their shareholders' investments during the final decades of the twentieth century. Shareholder value maximization was not just a financial strategy that corporate leaders adopted; it was an entirely different way of understanding what a corporation was and what its place in society should be. This new way of understanding the corporation was the product of decades of research in financial economics. Financial economists argued that corporations owed nothing to society beyond the maximization of shareholder value and that managers should focus on boosting their firms' stock price above all else. In the 1980s and 1990s, activist investors, management consultants, legal theorists, and financial economists themselves used these ideas to justify hostile takeovers of businesses with low stock prices and to pressure managers to restructure their businesses around the promotion of shareholder value. Through an examination of these consultants' methods and the theory behind them, this dissertation demonstrates how ideas in favor of shareholder value maximization redefined fundamental business concepts such as profit and value. This process of redefinition transformed actions once seen as signs of corporate failure, like layoffs and divestitures, into signs that managers were willing to make the difficult choices needed to redirect corporate funds away from workers and "unproductive" investment and toward shareholders. Consultants and academics also encouraged companies' boards of directors to grant top executives compensation in the form of stock options, to encourage managers to engage in the often painful restructurings needed to maximize shareholder value. Thanks to boards' generosity with these stock options, executives were able to take home unprecedentedly large compensation during the late 1980s and the 1990s, while being incentivized to shed their obligations to workers and to maximize value for shareholders. The roots of contemporary concerns about economic inequality and a slowdown in economic growth trace back to these decisions that executives made to embrace shareholder value maximization.