
    Does the arXiv lead to higher citations and reduced publisher downloads for mathematics articles?

    An analysis of 2,765 articles published in four math journals from 1997 to 2005 indicates that articles deposited in the arXiv received 35% more citations on average than non-deposited articles (an advantage of about 1.1 citations per article), and that this difference was most pronounced for highly cited articles. Open Access, Early View, and Quality Differential were examined as three non-exclusive postulates for explaining the citation advantage. There was little support for a universal Open Access explanation, and no empirical support for Early View. There was some inferential support for a Quality Differential brought about by more highly citable articles being deposited in the arXiv. In spite of their citation advantage, arXiv-deposited articles received 23% fewer downloads from the publisher's website (about 10 fewer downloads per article) in all but the most recent two years after publication. The data suggest that arXiv and the publisher's website may be fulfilling distinct functional needs of the reader. Comment: Last updated May 02, 200

    Usage Bibliometrics

    Scholarly usage data provides unique opportunities to address the known shortcomings of citation analysis. However, the collection, processing and analysis of usage data remains an area of active research. This article provides a review of the state of the art in usage-based informetrics, i.e. the use of usage data to study the scholarly process. Comment: Publisher's PDF (by permission). Publisher web site: books.infotoday.com/asist/arist44.shtm

    The metric tide: report of the independent review of the role of metrics in research assessment and management

    This report presents the findings and recommendations of the Independent Review of the Role of Metrics in Research Assessment and Management. The review was chaired by Professor James Wilsdon, supported by an independent and multidisciplinary group of experts in scientometrics, research funding, research policy, publishing, university management and administration. This review has gone beyond earlier studies to take a deeper look at the potential uses and limitations of research metrics and indicators. It has explored the use of metrics across different disciplines, and assessed their potential contribution to the development of research excellence and impact. It has analysed their role in processes of research assessment, including the next cycle of the Research Excellence Framework (REF). It has considered the changing ways in which universities are using quantitative indicators in their management systems, and the growing power of league tables and rankings. And it has considered the negative or unintended effects of metrics on various aspects of research culture. The report starts by tracing the history of metrics in research management and assessment, in the UK and internationally. It looks at the applicability of metrics within different research cultures, compares the peer review system with metric-based alternatives, and considers what balance might be struck between the two. It charts the development of research management systems within institutions, and examines the effects of the growing use of quantitative indicators on different aspects of research culture, including performance management, equality, diversity, interdisciplinarity, and the ‘gaming’ of assessment systems. The review looks at how different funders are using quantitative indicators, and considers their potential role in research and innovation policy. Finally, it examines the role that metrics played in REF2014, and outlines scenarios for their contribution to future exercises.

    The relationship between usage and citations in an open access mega-journal

    Abstract: How do the level of usage of an article, the timeframe of its usage and its subject area relate to the number of citations it accrues? This paper aims to answer this question through an observational study of usage and citation data collected about the multidisciplinary, open access mega-journal Scientific Reports. The study uses the following methods: an overlap analysis of most-read and top-cited articles; Spearman correlation tests between total citation counts over two years and usage over various timeframes; a comparison of the first months of citation for most-read and all articles; and a Wilcoxon test comparing the distribution of total citations of early-cited articles with that of all other articles. All analyses were performed using the programming language R. As Scientific Reports is a multidisciplinary journal covering all natural and clinical sciences, we also looked at the differences across subjects. We found a moderate correlation between usage in the first year and citations in the first two years since publication (Spearman correlation coefficient of 0.49, α = 0.05), and that articles with high usage in the first six months are more likely to have their first citation earlier (Wilcoxon = 1,811,500, p < 0.0001), which is also related to higher citations in the first two years (Wilcoxon = 8,071,200, p < 0.0001). As this final assertion is inferred based on the results of the other elements of this paper, it would require further analysis.
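    The test design described above can be sketched on synthetic data. The original analysis was done in R on Scientific Reports usage and citation records; the sketch below uses Python with scipy, made-up Poisson-distributed counts, and scipy's `mannwhitneyu` as the two-sample (rank-sum) form of the Wilcoxon test the abstract refers to. None of the numbers are the paper's.

```python
import numpy as np
from scipy.stats import spearmanr, mannwhitneyu

rng = np.random.default_rng(42)

# Illustrative data: first-year download counts and two-year citation
# counts for 1,000 hypothetical articles, with usage loosely driving citations.
usage = rng.poisson(lam=500, size=1000)
citations = rng.poisson(lam=usage / 200)

# Spearman rank correlation between usage and citation counts.
rho, p_value = spearmanr(usage, citations)
print(f"Spearman rho = {rho:.2f}, p = {p_value:.3g}")

# Rank-sum test comparing total citations of high-usage articles
# against the rest, mirroring the paper's early-usage comparison.
high = citations[usage > np.median(usage)]
rest = citations[usage <= np.median(usage)]
stat, p = mannwhitneyu(high, rest, alternative="greater")
print(f"Mann-Whitney U = {stat:.0f}, p = {p:.3g}")
```

    A one-tailed alternative is used because the hypothesis is directional (higher early usage leads to more citations), matching the shape of the paper's reported p < 0.0001 results.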

    Web indicators for research evaluation. Part 1: Citations and links to academic articles from the Web

    The extensive use of the web by many sectors of society has created the potential for new, wider impact indicators. This article reviews research about Google Scholar and Google Patents, both of which can be used as sources of impact indicators for academic articles. It also briefly reviews methods to extract types of links and citations from the web as a whole, although the indicators that these generate are now probably too broad and too dominated by automatically generated websites, such as library and publisher catalogues, to be useful in practice. More valuable web-based indicators can be derived from specific types of web pages that cite academic research, such as online presentations, course syllabi, and science blogs. These provide evidence that is easier to understand and use and less likely to be affected by unwanted types of automatically generated content, although they are susceptible to gaming.

    The Impact of the Activity of Industrial Engineering Researchers in Various Scientific-Citation Networks on Improving their Scientific Authority Status

    This study analyzes the link between Mendeley indexes of scientific-citation networks and Scopus, taking into account the beneficial influence of researchers' activities in social networks on scientometric indices of works indexed in databases like Google Scholar and WoS. In this basic/descriptive study, we use the Altmetrics approach to describe Iranian researchers’ activities in industrial engineering in scientific-citation networks. Researchers whose activities are recorded with an Iranian affiliation in scientific-citation networks are referred to here as Iranian researchers. The corpus of the study comprised the works of 160 Iranian researchers in the field of industrial engineering, indexed in Scopus in the period 2000-2019. To test the likely correlation between measures of social network (SN) activity and scientometric ones, simple and multiple correlation tests were carried out with Excel and SPSS software. The correlation between the number of times a document was read, the number of citations, and the measures in Mendeley, Scopus, Web of Science (WoS), and Google Scholar (GS) was very high. However, the correlation between the number of readers in Mendeley and co-authorship in Scopus was low. There was a strong correlation between the number of citations in Mendeley and that in other databases. The correlation between the authors' H-index in the Mendeley database and other databases was positive and significant, stronger in Scopus and WoS than in Google Scholar. It was concluded that researchers’ activities in social networks attract more readers, increase the number of citations, and thus increase the H-index score in databases. Therefore, researchers need to be more active in social networks to increase their H-index scores and promote their academic publications. DOR: https://dorl.net/dor/20.1001.1.20088302.2022.20.1.14.
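    The H-index compared across databases above has a simple definition: a researcher has index h if h of their papers each have at least h citations. A minimal sketch (the citation counts are made up, not data from the study):

```python
def h_index(citations):
    """Return the h-index: the largest h such that at least h papers
    each have at least h citations."""
    h = 0
    for rank, c in enumerate(sorted(citations, reverse=True), start=1):
        if c >= rank:
            h = rank  # the rank-th most-cited paper still has >= rank citations
        else:
            break
    return h

print(h_index([10, 8, 5, 4, 3]))  # → 4
print(h_index([25, 8, 5, 3, 3]))  # → 3
```

    Because the same definition is applied to citation counts from each database, an author's h-index can differ across Mendeley, Scopus, WoS and Google Scholar only through differences in the underlying citation counts, which is what the study's cross-database correlations measure.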