
    Towards a new crown indicator: Some theoretical considerations

    The crown indicator is a well-known bibliometric indicator of research performance developed by our institute. The indicator aims to normalize citation counts for differences among fields. We critically examine the theoretical basis of the normalization mechanism applied in the crown indicator, and we make a comparison with an alternative normalization mechanism. The alternative mechanism turns out to have more satisfactory properties than the mechanism applied in the crown indicator. In particular, the alternative mechanism has a so-called consistency property, which the mechanism applied in the crown indicator lacks. As a consequence of our findings, we are currently moving towards a new crown indicator, which relies on the alternative normalization mechanism.
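    The two normalization mechanisms contrasted in this abstract are commonly described in the literature as "ratio of averages" versus "average of ratios". The sketch below illustrates the distinction under that assumption; the function and variable names are illustrative, not taken from the paper.

    ```python
    # Two ways to field-normalize a set of citation counts.
    # ratio_of_averages: total citations over total field-expected citations
    #   (the style of the original crown indicator).
    # average_of_ratios: mean of each publication's own normalized score
    #   (the style of the alternative, consistent mechanism).

    def ratio_of_averages(citations, expected):
        """Sum of citations divided by sum of field-expected citations."""
        return sum(citations) / sum(expected)

    def average_of_ratios(citations, expected):
        """Mean of per-publication citation ratios."""
        return sum(c / e for c, e in zip(citations, expected)) / len(citations)

    citations = [10, 2, 0, 6]         # observed citations per publication
    expected = [5.0, 4.0, 1.0, 3.0]   # mean citations in each paper's field

    print(ratio_of_averages(citations, expected))  # 18/13 ≈ 1.385
    print(average_of_ratios(citations, expected))  # (2 + 0.5 + 0 + 2)/4 = 1.125
    ```

    The two mechanisms can rank the same set of publications differently, which is why the choice between them matters for an indicator's properties.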

    Untangling the Web of E-Research: Towards a Sociology of Online Knowledge

    e-Research is a rapidly growing research area, both in terms of publications and in terms of funding. In this article we argue that it is necessary to reconceptualize the ways in which we seek to measure and understand e-Research by developing a sociology of knowledge based on our understanding of how science has been transformed historically and shifted into online forms. Next, we report data that allow the examination of e-Research through a variety of traces in order to begin to understand how the knowledge in the realm of e-Research has been and is being constructed. These data indicate that e-Research has had a variable impact in different fields of research. We argue that only an overall account of the scale and scope of e-Research within and between different fields makes it possible to identify the organizational coherence and diffuseness of e-Research in terms of its socio-technical networks, and thus to identify the contributions of e-Research to various research fronts in the online production of knowledge.

    On tit for tat: Franceschini and Maisano versus ANVUR regarding the Italian research assessment exercise VQR 2011-2014

    The response by Benedetto, Checchi, Graziosi & Malgarini (2017) (hereafter "BCG&M"), past and current members of the Italian Agency for Evaluation of University and Research Systems (ANVUR), to Franceschini and Maisano's ("F&M") article (2017), inevitably draws us into the debate. BCG&M in fact complain "that almost all criticisms to the evaluation procedures adopted in the two Italian research assessments VQR 2004-2010 and 2011-2014 limit themselves to criticize the procedures without proposing anything new and more apt to the scope". Since we are the ones who raised most of these criticisms in the literature, we welcome this opportunity to retrace our vainly offered "constructive" recommendations, made with the hope of contributing to assessments of the Italian research system more in line with the state of the art in scientometrics. We see it as equally interesting to confront the problem of the failure of knowledge transfer from R&D (scholars) to engineering and production (ANVUR's practitioners) in the Italian VQRs. We will provide a few notes to help the reader understand the context for this failure. We hope that these, together with our more specific comments, will also assist in communicating the reasons for the level of scientometric competence expressed in BCG&M's heated response to F&M's criticism.

    A review of the literature on citation impact indicators

    Citation impact indicators nowadays play an important role in research evaluation, and consequently these indicators have received a lot of attention in the bibliometric and scientometric literature. This paper provides an in-depth review of the literature on citation impact indicators. First, an overview is given of the literature on bibliographic databases that can be used to calculate citation impact indicators (Web of Science, Scopus, and Google Scholar). Next, selected topics in the literature on citation impact indicators are reviewed in detail. The first topic is the selection of publications and citations to be included in the calculation of citation impact indicators. The second topic is the normalization of citation impact indicators, in particular normalization for field differences. Counting methods for dealing with co-authored publications are the third topic, and citation impact indicators for journals are the last topic. The paper concludes by offering some recommendations for future research.
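    The counting methods mentioned as the review's third topic are usually contrasted as full counting (each author receives full credit for a co-authored paper) versus fractional counting (credit is divided over the authors). A minimal sketch, with invented data, of how the two schemes distribute citation credit:

    ```python
    # Full vs fractional counting of citation credit for co-authored papers.
    # Under full counting every author gets the paper's full citation count;
    # under fractional counting the count is split equally over the authors.
    from collections import defaultdict

    papers = [
        {"authors": ["A", "B"], "citations": 8},
        {"authors": ["A"], "citations": 4},
        {"authors": ["B", "C", "D"], "citations": 6},
    ]

    def count_credit(papers, fractional=False):
        credit = defaultdict(float)
        for paper in papers:
            n = len(paper["authors"])
            share = paper["citations"] / n if fractional else paper["citations"]
            for author in paper["authors"]:
                credit[author] += share
        return dict(credit)

    print(count_credit(papers))                   # A=12, B=14, C=6, D=6
    print(count_credit(papers, fractional=True))  # A=8, B=6, C=2, D=2
    ```

    Note that full counting inflates totals (credit sums to more than the citations actually received), which is one reason the choice of counting method affects field comparisons.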

    A Review of Theory and Practice in Scientometrics

    Scientometrics is the study of the quantitative aspects of the process of science as a communication system. It is centrally, but not only, concerned with the analysis of citations in the academic literature. In recent years it has come to play a major role in the measurement and evaluation of research performance. In this review we consider: the historical development of scientometrics, sources of citation data, citation metrics and the "laws" of scientometrics, normalisation, journal impact factors and other journal metrics, visualising and mapping science, evaluation and policy, and future developments.
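    Among the journal metrics mentioned above, the classic two-year journal impact factor has a simple definition: citations received in year Y to items the journal published in years Y-1 and Y-2, divided by the number of citable items it published in those two years. A minimal sketch with invented numbers:

    ```python
    # Two-year journal impact factor: citations in year Y to items from
    # years Y-1 and Y-2, divided by the citable items in those two years.

    def impact_factor(cites_to_prev_two_years, citable_items_prev_two_years):
        return cites_to_prev_two_years / citable_items_prev_two_years

    # e.g. 300 citations in 2023 to 2021-2022 articles, 120 citable items
    print(impact_factor(300, 120))  # 2.5
    ```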

    Reviewing, indicating, and counting books for modern research evaluation systems

    In this chapter, we focus on the specialists who have helped to improve the conditions for book assessments in research evaluation exercises, with empirically based data and insights supporting their greater integration. Our review highlights the research carried out by four types of expert communities, referred to as the monitors, the subject classifiers, the indexers, and the indicator constructionists. Many challenges lie ahead for scholars affiliated with these communities, particularly the latter three. By acknowledging their unique, yet interrelated roles, we show where the greatest potential is for both quantitative and qualitative indicator advancements in book-inclusive evaluation systems. (Forthcoming in Glanzel, W., Moed, H.F., Schmoch, U., & Thelwall, M. (Eds.) (2018), Springer Handbook of Science and Technology Indicators. Springer.)