Measuring Social Media Activity of Scientific Literature: An Exhaustive Comparison of Scopus and Novel Altmetrics Big Data
This paper measures the social media activity of 15 broad scientific
disciplines indexed in the Scopus database, using Altmetric.com data. First,
the presence of Altmetric.com data in the Scopus database is investigated,
overall and across
disciplines. Second, the correlation between the bibliometric and altmetric
indices is examined using Spearman correlation. Third, a zero-truncated
negative binomial model is used to determine the association of various factors
with increasing or decreasing citations. Lastly, the effectiveness of altmetric
indices to identify publications with high citation impact is comprehensively
evaluated using the Area Under the Curve (AUC) of the receiver operating
characteristic. Results indicate a rapid increase in the presence of
Altmetric.com data in the Scopus database, from 10.19% in 2011 to 20.46% in
2015. The zero-truncated negative binomial model measures the extent to which
different bibliometric and altmetric factors contribute to citation counts.
Blog count appears to be the most important factor, increasing the number of
citations by 38.6% in the field of Health Professions and Nursing,
followed by Twitter count increasing the number of citations by 8% in the field
of Physics and Astronomy. Interestingly, both Blog count and Twitter count
show a positive association with citation counts across all fields. While
there was a weak positive correlation between bibliometric and altmetric
indices, the results show that altmetric indices can be a good discriminator
of highly cited publications, with an encouraging AUC of 0.725 for total
altmetric count against highly cited publications. Overall, the findings
suggest that altmetrics can help distinguish highly cited publications.

Comment: 34 pages, 3 figures, 15 tables
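The AUC evaluation in this abstract can be sketched in a few lines: treat the total altmetric count as a ranking score and measure how often a highly cited publication outranks one that is not (the Mann-Whitney formulation of AUC). The data below are illustrative only, not taken from the paper.

```python
# Minimal AUC sketch: does a total altmetric count separate
# "highly cited" publications from the rest? Toy data, for illustration.

def auc_score(labels, scores):
    """AUC via the Mann-Whitney U statistic: the probability that a
    randomly chosen positive outranks a randomly chosen negative.
    Ties count as half a win."""
    pos = [s for l, s in zip(labels, scores) if l == 1]
    neg = [s for l, s in zip(labels, scores) if l == 0]
    wins = sum(
        1.0 if p > n else 0.5 if p == n else 0.0
        for p in pos for n in neg
    )
    return wins / (len(pos) * len(neg))

# 1 = highly cited publication, 0 = not; scores = total altmetric counts
highly_cited = [1, 1, 1, 0, 0, 0, 0, 0]
altmetric_total = [40, 12, 7, 9, 3, 1, 0, 2]
print(round(auc_score(highly_cited, altmetric_total), 3))
```

An AUC of 0.5 means the altmetric count ranks highly cited papers no better than chance; the paper's reported 0.725 sits usefully above that baseline.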
Tweeting biomedicine: an analysis of tweets and citations in the biomedical literature
Data collected by social media platforms have recently been introduced as a
new source for indicators to help measure the impact of scholarly research in
ways that are complementary to traditional citation-based indicators. Data
generated from social media activities related to scholarly content can be used
to reflect broad types of impact. This paper aims to provide systematic
evidence regarding how often Twitter is used to diffuse journal articles in the
biomedical and life sciences. The analysis is based on a set of 1.4 million
documents covered by both PubMed and Web of Science (WoS) and published between
2010 and 2012. The number of tweets containing links to these documents was
analyzed to evaluate the degree to which certain journals, disciplines, and
specialties were represented on Twitter. It is shown that, with less than 10%
of PubMed articles mentioned on Twitter, uptake is generally low. The
relationship between tweets and WoS citations was examined for each document
and aggregated at the level of journals and specialties. The results show that
tweeting behavior
varies between journals and specialties and correlations between tweets and
citations are low, implying that impact metrics based on tweets are different
from those based on citations. A framework utilizing the coverage of articles
and the correlation between Twitter mentions and citations is proposed to
facilitate the evaluation of novel social media-based metrics and to shed
light on the extent to which the number of tweets is a valid metric of
research impact.

Comment: 22 pages, 4 figures, 5 tables
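The tweet/citation comparison described here is a rank correlation between per-document tweet counts and WoS citation counts. A minimal Spearman implementation (with tie-aware average ranks) on toy numbers, not the study's 1.4 million-document dataset:

```python
# Spearman rank correlation between tweet counts and citation counts.
# Illustrative data only; the study's result was a low correlation.

def ranks(values):
    """Average ranks (1-based), with tied values sharing their mean rank."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        mean_rank = (i + j) / 2 + 1
        for k in range(i, j + 1):
            r[order[k]] = mean_rank
        i = j + 1
    return r

def spearman(x, y):
    """Pearson correlation computed on the rank vectors."""
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

tweets = [0, 1, 0, 3, 12, 2, 0, 5]
citations = [2, 0, 1, 4, 6, 10, 0, 3]
print(round(spearman(tweets, citations), 3))
```

Spearman is preferred over Pearson here because both tweet and citation counts are heavily skewed, and a rank correlation is insensitive to that skew.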
Tweets as impact indicators: Examining the implications of automated bot accounts on Twitter
This brief communication presents preliminary findings on automated Twitter
accounts distributing links to scientific papers deposited on the preprint
repository arXiv. It discusses the implication of the presence of such bots
from the perspective of social media metrics (altmetrics), where mentions of
scholarly documents on Twitter have been suggested as a means of measuring
impact that is both broader and timelier than citations. We present preliminary
findings that automated Twitter accounts generate a considerable number of
tweets linking to scientific papers and that they behave differently from
common social bots, which has critical implications for the use of raw tweet
counts in research evaluation and assessment. We discuss definitions of
Twitter cyborgs and bots in scholarly communication and propose
differentiating between levels of engagement, from tweeting only bibliographic
information to discussing or commenting on the content of a paper.

Comment: 9 pages, 4 figures, 1 table
How the Scientific Community Reacts to Newly Submitted Preprints: Article Downloads, Twitter Mentions, and Citations
We analyze the online response to the preprint publication of a cohort of
4,606 scientific articles submitted to the preprint database arXiv.org between
October 2010 and May 2011. We study three forms of responses to these
preprints: downloads on the arXiv.org site, mentions on the social media site
Twitter, and early citations in the scholarly record. We perform two analyses.
First, we analyze the delay and time span of article downloads and Twitter
mentions following submission, to understand the temporal configuration of
these reactions and whether one precedes or follows the other. Second, we run
regression and correlation tests to investigate the relationship between
Twitter mentions, arXiv downloads and article citations. We find that Twitter
mentions and arXiv downloads of scholarly articles follow two distinct temporal
patterns of activity, with Twitter mentions having shorter delays and narrower
time spans than arXiv downloads. We also find that the volume of Twitter
mentions is statistically correlated with arXiv downloads and early citations
just months after the publication of a preprint, with a possible bias that
favors highly mentioned articles.

Comment: 15 pages, 7 figures, 3 tables. PLoS One, in press
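The regression step in this abstract relates early citations to Twitter mentions and arXiv downloads. A hedged sketch of that kind of analysis, ordinary least squares with two predictors solved via the normal equations (the numbers are made up and the actual study's model specification may differ):

```python
# OLS of early citations on tweet mentions and arXiv downloads.
# Toy data for illustration; not the cohort analyzed in the paper.

def ols(X, y):
    """Solve the normal equations (X^T X) beta = X^T y by Gauss-Jordan
    elimination. X is a list of predictor rows; a leading 1 is added
    for the intercept. Returns [intercept, b1, b2, ...]."""
    rows = [[1.0] + list(r) for r in X]
    k = len(rows[0])
    # Augmented system [X^T X | X^T y].
    A = [[sum(r[i] * r[j] for r in rows) for j in range(k)]
         + [sum(r[i] * yi for r, yi in zip(rows, y))]
         for i in range(k)]
    for col in range(k):
        piv = max(range(col, k), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]          # partial pivoting
        for r in range(k):
            if r != col and A[col][col] != 0:
                f = A[r][col] / A[col][col]
                A[r] = [a - f * b for a, b in zip(A[r], A[col])]
    return [A[i][k] / A[i][i] for i in range(k)]

# columns: (tweets, downloads); target: early citations
X = [(0, 120), (2, 300), (5, 800), (1, 150), (9, 1500), (3, 400)]
citations = [1, 2, 6, 1, 11, 3]
intercept, b_tweets, b_downloads = ols(X, citations)
```

In practice the coefficients on such skewed count data would come from a count model (as in the Scopus study above) or from log-transformed variables; plain OLS is shown only to make the regression step concrete.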
The pros and cons of the use of altmetrics in research assessment
© 2020 The Authors. Published by Levi Library Press. This is an open access article available under a Creative Commons licence. The published version can be accessed at http://doi.org/10.29024/sar.10

Many indicators derived from the web have been proposed to supplement citation-based
indicators in support of research assessments. These indicators, often called altmetrics, are
available commercially from Altmetric.com and Elsevier’s Plum Analytics or can be collected
directly. These organisations can also deliver altmetrics to support institutional self-evaluations. The potential advantages of altmetrics for research evaluation are that they
may reflect important non-academic impacts and may appear before citations when an
article is published, thus providing earlier impact evidence. Their disadvantages often
include susceptibility to gaming, data sparsity, and difficulties translating the evidence into
specific types of impact. Despite these limitations, altmetrics have been widely adopted by
publishers, apparently to give authors, editors and readers insights into the level of interest
in recently published articles. This article summarises evidence for and against extending
the adoption of altmetrics to research evaluations. It argues that whilst systematically gathered altmetrics are inappropriate for important formal research evaluations, they can
play a role in some other contexts. They can be informative when evaluating research units
that rarely produce journal articles, when seeking to identify evidence of novel types of
impact during institutional or other self-evaluations, and when selected by individuals or
groups to support narrative-based non-academic claims. In addition, Mendeley reader
counts are uniquely valuable as early (mainly) scholarly impact indicators to replace
citations when gaming is not possible and early impact evidence is needed.
Organisations using alternative indicators need to recruit or develop in-house
expertise to ensure that they are not misused, however.