
    Do altmetrics correlate with the quality of papers? A large-scale empirical study based on F1000Prime data

    In this study, we address the question of whether, and to what extent, altmetrics are related to the scientific quality of papers (as measured by peer assessments). Only a few studies have previously investigated the relationship between altmetrics and assessments by peers. In the first step, we analyse the underlying dimensions of measurement for traditional metrics (citation counts) and altmetrics, using principal component analysis (PCA) and factor analysis (FA). In the second step, we test the relationship between these dimensions and the quality of papers (as measured by the post-publication peer-review system of F1000Prime assessments), using regression analysis. The results of the PCA and FA show that altmetrics operate along different dimensions: Mendeley counts are related to citation counts, whereas tweets form a separate dimension. The results of the regression analysis indicate that citation-based metrics and readership counts are significantly more strongly related to quality than tweets. This result, on the one hand, calls into question the use of Twitter counts for research evaluation purposes and, on the other hand, indicates the potential usefulness of Mendeley reader counts.
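The first analysis step described above (extracting the dimensions along which citation counts, Mendeley reader counts, and tweet counts vary) can be illustrated with a minimal PCA sketch. The data below are hypothetical and only mimic the pattern the abstract reports (readers correlated with citations, tweets largely independent); this is not the F1000Prime dataset or the authors' code.

```python
import numpy as np

# Hypothetical per-paper metrics: citation counts, Mendeley reader
# counts (correlated with citations), and tweet counts (independent).
rng = np.random.default_rng(0)
citations = rng.poisson(20, size=200).astype(float)
readers = citations * 1.5 + rng.normal(0, 5, size=200)
tweets = rng.poisson(3, size=200).astype(float)
X = np.column_stack([citations, readers, tweets])

# Standardize, then extract principal components via SVD.
Z = (X - X.mean(axis=0)) / X.std(axis=0)
U, s, Vt = np.linalg.svd(Z, full_matrices=False)
explained = s**2 / (s**2).sum()  # variance share per component
loadings = Vt                    # rows: components; columns: metrics

# With correlated citations/readers and independent tweets, we expect
# the first component to be dominated by citations and readers, with
# tweets loading mainly on a separate component.
print(explained.round(2))
print(loadings.round(2))
```

On data with this structure, the first component captures the citation/readership dimension and tweets fall on their own axis, mirroring the paper's finding that tweets form a separate dimension.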

    COVID-19 publications: Database coverage, citations, readers, tweets, news, Facebook walls, Reddit posts

    © 2020 The Authors. Published by MIT Press. This is an open access article available under a Creative Commons licence. The published version can be accessed at the following link on the publisher’s website: https://doi.org/10.1162/qss_a_00066. The COVID-19 pandemic requires a fast response from researchers to help address biological, medical and public health issues and to minimize its impact. In this rapidly evolving context, scholars, professionals and the public may need to quickly identify important new studies. In response, this paper assesses the coverage of scholarly databases and impact indicators during 21 March to 18 April 2020. The rapidly increasing volume of research is particularly accessible through Dimensions, and less so through Scopus, the Web of Science, and PubMed. Google Scholar’s results included many false matches. A few COVID-19 papers from the 21,395 in Dimensions were already highly cited, with substantial news and social media attention. For this topic, in contrast to previous studies, there seems to be a high degree of convergence between articles shared on the social web and citation counts, at least in the short term. In particular, articles that are extensively tweeted on the day they are first indexed are likely to be highly read and relatively highly cited three weeks later. Researchers needing wide-scope literature searches (rather than health-focused PubMed or medRxiv searches) should start with Dimensions (or Google Scholar) and can use tweet and Mendeley reader counts as indicators of likely importance.

    Does the public discuss other topics on climate change than researchers? A comparison of explorative networks based on author keywords and hashtags

    Twitter accounts have already been used in many scientometric studies, but the meaningfulness of the data for societal impact measurements in research evaluation has been questioned. Earlier research focused on social media counts and neglected the interactive nature of the data. We explore a new network approach based on Twitter data in which we compare author keywords to hashtags as indicators of topics. We analyze the topics of tweeted publications and compare them with the topics of all publications (tweeted and not tweeted). Our exploratory study is based on a comprehensive publication set of climate change research. We are interested in whether Twitter data are able to reveal topics of public discussion that can be separated from research-focused topics. We find that the most tweeted topics in climate change research focus on the consequences of climate change for humans. Twitter users are interested in climate change publications that forecast the effects of a changing climate on the environment and address adaptation, mitigation and management issues, rather than in the methodology of climate change research or the causes of climate change. Our results indicate that publications using scientific jargon are less likely to be tweeted than publications using more general keywords. Twitter networks seem to be able to visualize public discussions about specific topics. Comment: 31 pages, 1 table, and 7 figures
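The core comparison in this study, author keywords versus tweet hashtags as topic indicators, can be sketched in a few lines. The terms and the normalization rule below are illustrative assumptions (hashtags drop spaces and case, so keywords are normalized the same way before overlap is measured); this is not the authors' actual pipeline.

```python
# Hypothetical example: topic overlap between a publication's author
# keywords and the hashtags used in tweets about it.
author_keywords = {"climate change", "sea level rise", "coastal adaptation"}
hashtags = {"climatechange", "sealevelrise", "flooding"}

def normalize(term: str) -> str:
    # Hashtags have no spaces and are case-insensitive; apply the
    # same normalization to keywords so the vocabularies are comparable.
    return term.replace(" ", "").lower()

kw = {normalize(t) for t in author_keywords}
ht = {normalize(t) for t in hashtags}

# Jaccard similarity between the two topic vocabularies.
jaccard = len(kw & ht) / len(kw | ht)
print(sorted(kw & ht), round(jaccard, 2))
```

Repeating this per publication and linking co-occurring terms would yield the kind of keyword/hashtag networks the study compares; a low overlap signals that the public discusses different aspects of a paper than its authors emphasize.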

    Social media metrics for new research evaluation

    This chapter approaches, from both a theoretical and a practical perspective, the most important principles and conceptual frameworks that can be considered in the application of social media metrics for scientific evaluation. We propose conceptually valid uses for social media metrics in research evaluation. The chapter discusses frameworks and uses of these metrics as well as principles and recommendations for the consideration and application of current (and potentially new) metrics in research evaluation. Comment: Forthcoming in Glanzel, W., Moed, H.F., Schmoch, U., Thelwall, M. (2018). Springer Handbook of Science and Technology Indicators. Springer

    A multidimensional analysis of Aslib proceedings – using everything but the impact factor

    Purpose – The purpose of this paper is to show that the journal impact factor (IF) is not able to reflect the full impact of scholarly journals, and to provide an overview of alternative and complementary methods in journal evaluation.
    Design/methodology/approach – Aslib Proceedings (AP) is analyzed as an example with a set of indicators from five dimensions of journal evaluation (journal output, content, perception and usage, citations, and management) to accurately reflect its various strengths and weaknesses beyond the IF.
    Findings – AP has become more international in terms of authors and more diverse regarding its topics. Citation impact is generally low and, with the exception of a special issue on blogs, remains at world average. However, an evaluation of downloads and Mendeley readers reveals that the journal is an important source of information for professionals and students, and that certain topics are frequently read but not cited.
    Research limitations/implications – The study is limited to one journal.
    Practical implications – An overview of various indicators and methods is provided that can be applied in the quantitative evaluation of scholarly journals (and also to articles, authors and institutions).
    Originality/value – After a publication history of more than 60 years, this analysis takes stock of AP, highlighting strengths, weaknesses and developments over time. The case study provides an example and overview of the possibilities of multidimensional journal evaluation.

    Altmetrics and societal impact measurements: Match or mismatch? A literature review

    Can alternative metrics (altmetrics) data be used to measure societal impact? We wrote this literature overview of empirical studies in order to find an answer to this question. The overview includes two parts. The first part, “societal impact measurements”, explains possible methods and problems in measuring the societal impact of research, case studies for societal impact measurement, societal impact considerations at funding organizations, and the societal problems that should be solved by science. The second part of the review, “altmetrics”, addresses a major question in research evaluation: whether altmetrics are proper indicators for measuring the societal impact of research. In the second part we explain the data sources used for altmetrics studies and the importance of field-normalized indicators for impact measurements. This review indicates that impact measurements should be oriented towards pressing societal problems. Case studies in which the societal impact of certain pieces of research is explained seem to provide a legitimate method for measuring societal impact. In the use of altmetrics, field-specific differences should be considered by applying field normalization (in cross-field comparisons). Altmetrics data such as social media counts might mainly reflect the public interest in and discussion of scholarly works rather than their societal impact. Altmetrics (Twitter data) may be employed especially fruitfully for research evaluation purposes if they are used in the context of network approaches. Conclusions based on altmetrics data in research evaluation should be drawn with caution.
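The field normalization the review recommends for cross-field comparisons can be illustrated with a minimal sketch: each paper's count is divided by the mean count of its field, so papers from fields with very different citation (or mention) cultures become comparable. The records and field labels below are hypothetical.

```python
from collections import defaultdict

# Hypothetical records: (paper_id, field, citation_count).
papers = [
    ("p1", "biology", 40),
    ("p2", "biology", 20),
    ("p3", "mathematics", 4),
    ("p4", "mathematics", 2),
]

# Accumulate per-field sums and counts, then compute field means.
field_totals = defaultdict(lambda: [0, 0])  # field -> [sum, n]
for _, field, c in papers:
    field_totals[field][0] += c
    field_totals[field][1] += 1
field_mean = {f: s / n for f, (s, n) in field_totals.items()}

# Field-normalized score: raw count divided by the field mean.
normalized = {pid: c / field_mean[f] for pid, f, c in papers}
print(normalized)
```

Here a biology paper with 40 citations and a mathematics paper with 4 citations both score 4/3 relative to their field averages, even though their raw counts differ by a factor of ten; the same construction applies to altmetric counts.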