
Ranking of library and information science researchers: Comparison of data sources for correlating citation data, and expert judgments

By J.A. Li, M. Sanderson, P. Willett, M. Norris and C. Oppenheim

Abstract

This paper studies the correlations between peer review and citation indicators when evaluating research quality in library and information science (LIS). Forty-two LIS experts provided judgments on a 5-point scale of the quality of research published by 101 scholars; the median rankings resulting from these judgments were then correlated with h-, g- and H-index values computed using three different sources of citation data: Web of Science (WoS), Scopus and Google Scholar (GS). The two variants of the basic h-index correlated more strongly with peer judgment than did the h-index itself; citation data from Scopus correlated more strongly with the expert judgments than data from GS, which in turn correlated more strongly than data from WoS; correlations from a carefully cleaned version of the GS data differed little from those obtained using swiftly gathered GS data; the indices from the three citation databases produced broadly similar rankings of the LIS academics; GS disadvantaged researchers in bibliometrics compared with the other two citation databases, while WoS disadvantaged researchers in the more technical aspects of information retrieval; and experts from the UK and other European countries rated UK academics more highly than did experts from the USA. (C) 2010 Elsevier Ltd. All rights reserved.
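For readers unfamiliar with the indicators compared in the study, the following is a minimal illustrative sketch (not code from the paper) of the standard definitions of the h-index and g-index, computed from a researcher's per-paper citation counts. The example citation list is hypothetical.

```python
def h_index(citations):
    """h-index: the largest h such that at least h papers
    each have at least h citations."""
    cites = sorted(citations, reverse=True)
    h = 0
    for rank, c in enumerate(cites, start=1):
        if c >= rank:
            h = rank
        else:
            break
    return h

def g_index(citations):
    """g-index: the largest g such that the g most-cited papers
    together have at least g^2 citations."""
    cites = sorted(citations, reverse=True)
    total, g = 0, 0
    for rank, c in enumerate(cites, start=1):
        total += c
        if total >= rank * rank:
            g = rank
    return g

# Hypothetical citation counts for one researcher's five papers
example = [10, 8, 5, 4, 3]
print(h_index(example))  # -> 4
print(g_index(example))  # -> 5
```

Because the g-index rewards a few highly cited papers more than the h-index does, the two indicators can rank the same set of researchers differently, which is why the study correlates each of them separately against the expert judgments.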

Publisher: Elsevier Science
Year: 2010
OAI identifier: oai:eprints.whiterose.ac.uk:11244
