101 research outputs found

    A principal component analysis of 39 scientific impact measures

    The impact of scientific publications has traditionally been expressed in terms of citation counts. However, scientific activity has moved online over the past decade. To better capture scientific impact in the digital era, a variety of new impact measures have been proposed on the basis of social network analysis and usage log data. Here we investigate how these new measures relate to each other, and how accurately and completely they express scientific impact. We performed a principal component analysis of the rankings produced by 39 existing and proposed measures of scholarly impact, calculated on the basis of both citation and usage log data. Our results indicate that the notion of scientific impact is a multi-dimensional construct that cannot be adequately measured by any single indicator, although some measures are more suitable than others. The commonly used citation Impact Factor is not positioned at the core of this construct but at its periphery, and should thus be used with caution.
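    The core technique of this study, a principal component analysis of per-measure rankings, can be sketched as follows. The data, the number of measures, and the dimensions below are illustrative placeholders, not the paper's actual dataset of 39 measures.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical example: rankings of 100 journals under 5 impact measures
    # (the study used 39 measures computed from citation and usage data).
    n_items, n_measures = 100, 5
    rankings = np.argsort(rng.random((n_items, n_measures)), axis=0).astype(float)

    # Centre each measure's rankings, then take the SVD; the squared singular
    # values give the variance captured by each principal component.
    centred = rankings - rankings.mean(axis=0)
    _, s, _ = np.linalg.svd(centred, full_matrices=False)

    # Fraction of variance explained by each principal component: if the first
    # component explained nearly everything, "impact" would be one-dimensional.
    explained = s**2 / np.sum(s**2)
    print(explained)
    ```

    A multi-dimensional construct shows up here as variance spread across several components rather than concentrated in the first one.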

    "Needless to Say My Proposal Was Turned Down": The Early Days of Commercial Citation Indexing, an "Error-making" Activity and Its Repercussions Till Today

    In today’s neoliberal audit cultures, university rankings and the quantitative evaluation of publications by JIF or of researchers by h-index are believed to be indispensable instruments for “quality assurance” in the sciences. Yet there is increasing resistance against “impactitis” and “evaluitis”. What is usually overlooked: trivial errors in Thomson Reuters’ citation indexes produce severe, non-trivial effects. Their victims are authors, institutions and journals with names beyond the ASCII code, and scholars in the humanities and social sciences. Analysing the “Joshua Lederberg Papers”, I want to show that the eventually successful ‘invention’ of science citation indexing was a product of contingent factors. To overcome severe resistance, Eugene Garfield, the “father” of citation indexing, had to foster overoptimistic attitudes and to downplay the severe problems connected with global and multidisciplinary citation indexing. The difficulties of handling different formats of references and footnotes, non-Anglo-American names, and publications in non-English languages were known to the pioneers of citation indexing. Today the huge for-profit North American media corporation Thomson Reuters owns the citation databases founded by Garfield. Thomson Reuters’ influence on funding decisions, individual careers, departments, universities, disciplines and countries is immense and ambivalent. Huge technological systems exhibit heavy inertia; this insight from technology studies applies to Thomson Reuters’ large citation indexes as well.

    A Review of Theory and Practice in Scientometrics

    Scientometrics is the study of the quantitative aspects of the process of science as a communication system. It is centrally, but not only, concerned with the analysis of citations in the academic literature. In recent years it has come to play a major role in the measurement and evaluation of research performance. In this review we consider: the historical development of scientometrics, sources of citation data, citation metrics and the “laws” of scientometrics, normalisation, journal impact factors and other journal metrics, visualising and mapping science, evaluation and policy, and future developments.

    The metric tide: report of the independent review of the role of metrics in research assessment and management

    This report presents the findings and recommendations of the Independent Review of the Role of Metrics in Research Assessment and Management. The review was chaired by Professor James Wilsdon, supported by an independent and multidisciplinary group of experts in scientometrics, research funding, research policy, publishing, university management and administration. This review has gone beyond earlier studies to take a deeper look at the potential uses and limitations of research metrics and indicators. It has explored the use of metrics across different disciplines, and assessed their potential contribution to the development of research excellence and impact. It has analysed their role in processes of research assessment, including the next cycle of the Research Excellence Framework (REF). It has considered the changing ways in which universities are using quantitative indicators in their management systems, and the growing power of league tables and rankings. And it has considered the negative or unintended effects of metrics on various aspects of research culture. The report starts by tracing the history of metrics in research management and assessment, in the UK and internationally. It looks at the applicability of metrics within different research cultures, compares the peer review system with metric-based alternatives, and considers what balance might be struck between the two. It charts the development of research management systems within institutions, and examines the effects of the growing use of quantitative indicators on different aspects of research culture, including performance management, equality, diversity, interdisciplinarity, and the ‘gaming’ of assessment systems. The review looks at how different funders are using quantitative indicators, and considers their potential role in research and innovation policy. Finally, it examines the role that metrics played in REF2014, and outlines scenarios for their contribution to future exercises.

    Australian education journals: quantitative and qualitative indicators

    This paper reports on a study which applied citation-based measures to Australian education journals. Citation data were drawn from two sources, Web of Science and Scopus, and these data were used to calculate each journal's impact factor, h-index, and diffusion factor. The rankings resulting from these analyses were compared with draft rankings assigned to the journals for Excellence in Research for Australia (ERA). Scopus emerged as the citation source most advantageous to these journals, and some consistency across the citation-based measures was found.
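    Of the measures named above, the h-index has a simple standard definition that can be sketched directly; the citation counts below are made up for illustration, not drawn from the study's data.

    ```python
    def h_index(citations):
        """h-index: the largest h such that at least h items
        have at least h citations each (Hirsch's definition)."""
        h = 0
        for i, c in enumerate(sorted(citations, reverse=True), start=1):
            if c >= i:
                h = i  # the i-th most-cited item still has >= i citations
            else:
                break
        return h

    # Hypothetical citation counts for one journal's articles:
    # three articles have at least 3 citations each, but only
    # three (not four) have at least 4, so the h-index is 3.
    print(h_index([25, 8, 5, 3, 3, 1, 0]))  # → 3
    ```

    The same routine applies whether the citation counts come from Web of Science or Scopus, which is what makes the two sources directly comparable in a study like this one.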