22 research outputs found

    Could scientists use Altmetric.com scores to predict longer term citation counts?

    Get PDF
    Altmetrics from Altmetric.com are widely used by publishers and researchers to give earlier evidence of attention than citation counts. This article assesses whether Altmetric.com scores are reliable early indicators of likely future impact and whether they may also reflect non-scholarly impacts. A preliminary factor analysis suggests that the main altmetric indicator of scholarly impact is Mendeley reader counts, with weaker news, informational and social network discussion/promotion dimensions in some fields. Based on a regression analysis of Altmetric.com data from November 2015 and Scopus citation counts from October 2017 for articles in 30 narrow fields, only Mendeley reader counts are consistent predictors of future citation impact. Most other Altmetric.com scores can help predict future impact in some fields. Overall, the results confirm that early Altmetric.com scores can predict later citation counts, although less well than journal impact factors, and the optimal strategy is to consider both Altmetric.com scores and journal impact factors. Altmetric.com scores can also reflect dimensions of non-scholarly impact in some fields.
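
    The regression approach described in this abstract can be illustrated with a short sketch. The example below is a minimal, hypothetical Python/statsmodels reconstruction, not the study's code: the field names, column names, and synthetic data are assumptions, and the authors' actual models and transformations may differ.

```python
# Minimal sketch (assumed): regressing later citation counts on early altmetric
# indicators and journal impact, per field. Synthetic data stand in for the
# Altmetric.com (2015) and Scopus (2017) records used in the study.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "field": rng.choice(["Oncology", "Econometrics", "Ecology"], size=n),
    "mendeley_readers": rng.poisson(20, size=n),          # early altmetric signal
    "tweets": rng.poisson(5, size=n),
    "journal_impact_factor": rng.gamma(2.0, 1.5, size=n),
    "citations_2017": rng.poisson(8, size=n),             # later citation counts
})

for field, sub in df.groupby("field"):
    # log(1 + x) tames the heavy skew typical of citation and reader counts
    y = np.log1p(sub["citations_2017"])
    X = sm.add_constant(np.log1p(sub[["mendeley_readers", "tweets",
                                      "journal_impact_factor"]]))
    fit = sm.OLS(y, X).fit()
    print(field, round(fit.rsquared, 3), fit.params.round(3).to_dict())
```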

    What increases (social) media attention: Research impact, author prominence or title attractiveness?

    Get PDF
    Do only major scientific breakthroughs hit the news and social media, or does a 'catchy' title help to attract public attention? How strong is the connection between the importance of a scientific paper and the (social) media attention it receives? In this study we investigate these questions by analysing the relationship between the observed attention and certain characteristics of scientific papers from two major multidisciplinary journals: Nature Communications (NC) and Proceedings of the National Academy of Sciences (PNAS). We describe papers by features based on the linguistic properties of their titles and centrality measures of their authors in their co-authorship network. We identify linguistic features and collaboration patterns that might be indicators for future attention, and are characteristic of different journals, research disciplines, and media sources. Comment: Paper presented at 23rd International Conference on Science and Technology Indicators (STI 2018) in Leiden, The Netherlands
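
    As a rough illustration of the feature construction this abstract describes, the sketch below derives simple title properties and co-authorship centrality measures. It is a hypothetical Python example using networkx; the toy records and feature names are assumptions, not the paper's dataset or pipeline.

```python
# Minimal sketch (assumed): title features plus author centrality in a
# co-authorship network, the two kinds of paper descriptors mentioned above.
import networkx as nx

papers = [
    {"title": "A catchy question: does size matter?", "authors": ["A", "B"]},
    {"title": "Deep phylogenomics of rodent populations", "authors": ["B", "C", "D"]},
]

# Co-authorship network: authors are nodes, shared papers create edges.
G = nx.Graph()
for p in papers:
    for i, a in enumerate(p["authors"]):
        for b in p["authors"][i + 1:]:
            G.add_edge(a, b)

degree = nx.degree_centrality(G)
betweenness = nx.betweenness_centrality(G)

def title_features(title: str) -> dict:
    words = title.split()
    return {
        "length_words": len(words),
        "has_question_mark": "?" in title,
        "has_colon": ":" in title,
    }

for p in papers:
    feats = title_features(p["title"])
    # most central co-author as a crude proxy for author prominence
    feats["max_degree_centrality"] = max(degree.get(a, 0.0) for a in p["authors"])
    feats["max_betweenness"] = max(betweenness.get(a, 0.0) for a in p["authors"])
    print(p["title"], feats)
```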

    Analysis of the Altmetric top 100 Altmetric Attention Score Coronavirus publications

    Get PDF
    The emergence of the Covid-19 pandemic has led to the publication of many scientific papers. The goal of the present research was to analyze these papers using the Altmetric Attention Score (AAS). Statistics for 100 publications with high AAS scores were selected and exported from the Dimensions database on May 22nd 2020. The major findings were that these publications were published in 34 different journals or preprint repositories. More than one-third of the total of 657,350 social media posts were collected from the Twitter platform. The top contributing country was China, followed by the USA. The paper “The proximal origin of SARS-CoV-2” by Andersen, Kristian G., et al., 2020 had the highest AAS (33,514). These findings may help others to design studies of the AAS in Coronavirus literature and compare them with traditional citations.

    The pros and cons of the use of altmetrics in research assessment

    Get PDF
    © 2020 The Authors. Published by Levy Library Press. This is an open access article available under a Creative Commons licence. The published version can be accessed at the following link on the publisher’s website: http://doi.org/10.29024/sar.10 Many indicators derived from the web have been proposed to supplement citation-based indicators in support of research assessments. These indicators, often called altmetrics, are available commercially from Altmetric.com and Elsevier’s Plum Analytics or can be collected directly. These organisations can also deliver altmetrics to support institutional self-evaluations. The potential advantages of altmetrics for research evaluation are that they may reflect important non-academic impacts and may appear before citations when an article is published, thus providing earlier impact evidence. Their disadvantages often include susceptibility to gaming, data sparsity, and difficulties translating the evidence into specific types of impact. Despite these limitations, altmetrics have been widely adopted by publishers, apparently to give authors, editors and readers insights into the level of interest in recently published articles. This article summarises evidence for and against extending the adoption of altmetrics to research evaluations. It argues that whilst systematically gathered altmetrics are inappropriate for important formal research evaluations, they can play a role in some other contexts. They can be informative when evaluating research units that rarely produce journal articles, when seeking to identify evidence of novel types of impact during institutional or other self-evaluations, and when selected by individuals or groups to support narrative-based non-academic claims. In addition, Mendeley reader counts are uniquely valuable as early (mainly) scholarly impact indicators to replace citations when gaming is not possible and early impact evidence is needed. Organisations using alternative indicators need to recruit or develop in-house expertise to ensure that they are not misused, however.

    Influencing factors of Twitter mentions of scientific papers

    Full text link
    Purpose: This paper explores some influencing factors of Twitter mentions of scientific research. The results can help to understand the relationships between various altmetrics. Design/methodology/approach: Data on research mentions in Altmetric and a multiple linear regression analysis are used. Findings: Among the variables analyzed, the number of mainstream news mentions is the factor that most influences the number of mentions on Twitter, followed by the fact of dealing with a highly topical issue such as COVID-19. The influence is weaker in the case of expert recommendations and the consolidation of knowledge in the form of a review. The lowest influence corresponds to public policy references in reports and to citations in Wikipedia, while mentions in patent applications do not have a significant influence. Research limitations: A specific field was studied in a specific time frame. Studying other fields and/or different time periods might result in different findings. Practical implications: Governments increasingly push researchers toward activities with societal impact and this study can help understand how different factors affect social media attention. Originality/value: Understanding social media attention of research is essential when implementing societal impact indicators. Comment: 13 pages, 5 tables
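
    A compact way to picture the analysis summarised above is a multiple linear regression of Twitter mentions on the other attention sources. The sketch below is a hypothetical Python/statsmodels example; the column names and synthetic data are assumptions rather than the Altmetric dataset used in the paper.

```python
# Minimal sketch (assumed): OLS regression of tweet counts on other altmetric
# sources and article characteristics, in the spirit of the analysis above.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 1000
data = pd.DataFrame({
    "tweets": rng.poisson(10, size=n),
    "news": rng.poisson(1, size=n),                     # mainstream news mentions
    "is_covid": rng.integers(0, 2, size=n),             # highly topical issue flag
    "expert_recommendations": rng.poisson(0.3, size=n),
    "is_review": rng.integers(0, 2, size=n),
    "policy_mentions": rng.poisson(0.2, size=n),
    "wikipedia_citations": rng.poisson(0.1, size=n),
    "patent_mentions": rng.poisson(0.05, size=n),
})

model = smf.ols(
    "tweets ~ news + is_covid + expert_recommendations + is_review"
    " + policy_mentions + wikipedia_citations + patent_mentions",
    data=data,
).fit()
print(model.summary())  # coefficients indicate each factor's association with tweets
```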

    Unwrap Citation, Altmetric, and Mendeley Status of Highly Cited Articles in the Top-tier Library and Information Science Journals

    Get PDF
    Citation count is a quantitative method of measuring the impact of a research work. A higher citation count may indicate that the research work receives more attention among peers, which could mean that the research contributes value to that discipline of literature. Citation count sums the number of times that an article is referenced by other authors. Tracking citations is important; however, citation impact tells only part of the story, namely that of academic researchers who conduct and publish research works. The impact of the publication on leisure readers and non-publishing readers is ignored. Furthermore, it is difficult to set a standard impact measurement across disciplines. Research showed that articles in the hard sciences (e.g. chemistry, biology) tend to gain more citations than in soft sciences (e.g. social science, psychology) (Harzing, 2010; Nederhof, 2006). Even in the same field, articles that focus on praxis often receive fewer citations than those that focus on theories. However, articles that focus on practice are valuable, and should be a part of the academic landscape (Akers, 2017). Finally, measuring the value of a newly published article with citation count can be difficult, since citations grow gradually over the years. The emergence of electronic publications and web technology allows people to view a research output by the amount of attention it receives. Web-based tools such as F1000, PLOS, Altmetric, Plum Analytics, CiteULike, and Mendeley collect a publication’s usage data from a variety of online sources. These usage statistics, such as number of views, downloads, and mentions, disclose the popularity or influence of a publication to some degree (Zahedi, Costas, & Wouters, 2014). Mendeley readership — a feature of Mendeley Web powered by Scopus — allows researchers to monitor the impact as well as the usage of their scholarly work (Bonasio, 2014). The Altmetric Attention Score (AAS) generates a research impact score by weighting the attention that an article receives from social media, blogs, news, and other online sources. AAS presents a quick, multifaceted way to demonstrate the value of a research work that is arguably more robust than citation count (Huang, Wang, & Wu, 2018). Since works in the arts and humanities typically do not receive as many citations as other disciplines, traditional bibliometrics may not be a good indicator of research impact — AAS may be more suitable in fields where impact is better reflected by researcher and reader behaviors such as searching, reading, and sharing (Cho, 2017). As an increasing number of scholars and researchers in academic disciplines create their online research profiles on academic networks (e.g. Academia, ResearchGate, LinkedIn, Mendeley) or share their research via social media, online attention has become a valuable and non-delayed way to measure research impact (Aharony et al., 2019; Garcovich, Ausina Marquez, & Adobes Martin, 2019).

    Mendeley reader counts for US computer science conference papers and journal articles

    Get PDF
    © 2020 The Authors. Published by MIT Press. This is an open access article available under a Creative Commons licence. The published version can be accessed at the following link on the publisher’s website: https://direct.mit.edu/qss/article/1/1/347/15566/Mendeley-reader-counts-for-US-computer-science Although bibliometrics are normally applied to journal articles when used to support research evaluations, conference papers are at least as important in fast-moving computing-related fields. It is therefore important to assess the relative advantages of citations and altmetrics for computing conference papers to make an informed decision about which, if any, to use. This paper compares Scopus citations with Mendeley reader counts for conference papers and journal articles that were published between 1996 and 2018 in 11 computing fields and had at least one US author. The data showed high correlations between Scopus citation counts and Mendeley reader counts in all fields and most years, but with few Mendeley readers for older conference papers and few Scopus citations for new conference papers and journal articles. The results therefore suggest that Mendeley reader counts have a substantial advantage over citation counts for recently published conference papers due to their greater speed, but are unsuitable for older conference papers.
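
    The field-by-year comparison described above boils down to Spearman correlations between two count variables. The sketch below shows one hypothetical way to compute them in Python with pandas and scipy; the DataFrame layout and synthetic values are assumptions, not the study's data.

```python
# Minimal sketch (assumed): Spearman correlation between Scopus citations and
# Mendeley reader counts, computed separately for each field and year.
import numpy as np
import pandas as pd
from scipy.stats import spearmanr

rng = np.random.default_rng(2)
n = 2000
df = pd.DataFrame({
    "field": rng.choice(["AI", "Databases", "Networking"], size=n),
    "year": rng.integers(1996, 2019, size=n),
    "scopus_citations": rng.negative_binomial(2, 0.10, size=n),
    "mendeley_readers": rng.negative_binomial(2, 0.08, size=n),
})

rho_by_group = (
    df.groupby(["field", "year"])
      .apply(lambda g: spearmanr(g["scopus_citations"], g["mendeley_readers"])[0]
             if len(g) > 2 else np.nan)
      .rename("spearman_rho")
)
print(rho_by_group.head())
```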

    Could early tweet counts predict later citation counts? A gender study in Life Sciences and Biomedicine (2014–2016)

    Get PDF
    In this study, it was investigated whether early tweet counts could differentially benefit female and male (first, last) authors in terms of the later citation counts received. The data for this study comprised 47,961 articles in the research area of Life Sciences & Biomedicine from 2014–2016, retrieved from Web of Science’s Medline. For each article, the number of received citations per year was downloaded from WoS, while the number of received tweets per year was obtained from PlumX. Using the hurdle regression model, I compared the number of received citations by female and male (first, last) authored papers and then I investigated whether early tweet counts could predict the later citation counts received by female and male (first, last) authored papers. In the regression models, I controlled for several important factors that were investigated in previous research in relation to citation counts, gender or Altmetrics. These included journal impact (SNIP), number of authors, open access, research funding, topic of an article, international collaboration, lay summary, F1000 Score and mega journal. The findings showed that the percentage of papers with male authors in first or last authorship positions was higher than that for female authors. However, female first- and last-authored papers had a small but significant citation advantage of 4.7% and 5.5% compared to male-authored papers. The findings also showed that irrespective of whether the factors were included in regression models or not, early tweet counts had a weak positive and significant association with the later citation counts (3.3%) and the probability of a paper being cited (21.1%). Regarding gender, the findings showed that when all variables were controlled, female (first, last) authored papers had a small citation advantage of 3.7% and 4.2% in comparison to male-authored papers for the same number of tweets.
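
    The hurdle regression used in this study can be approximated by a two-part model: one component for whether a paper is cited at all and another for how often cited papers are cited. The Python sketch below is a simplified, hypothetical illustration (a faithful hurdle model would use a zero-truncated count distribution in the second part); variable names and data are assumptions.

```python
# Minimal sketch (assumed): two-part "hurdle"-style model of citation counts.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 3000
df = pd.DataFrame({
    "early_tweets": rng.poisson(2, size=n),
    "female_first_author": rng.integers(0, 2, size=n),
    "journal_snip": rng.gamma(2.0, 0.6, size=n),
    "n_authors": rng.integers(1, 15, size=n),
    "citations": rng.negative_binomial(1, 0.3, size=n),
})

X = sm.add_constant(df[["early_tweets", "female_first_author",
                        "journal_snip", "n_authors"]])

# Part 1: probability of clearing the hurdle (being cited at all)
cited = (df["citations"] > 0).astype(int)
logit_fit = sm.Logit(cited, X).fit(disp=False)

# Part 2: citation counts among cited papers; a zero-truncated negative binomial
# would be the textbook choice, but a plain NB fit on the positive subset keeps
# the sketch short.
pos = df["citations"] > 0
count_fit = sm.GLM(df.loc[pos, "citations"], X.loc[pos],
                   family=sm.families.NegativeBinomial()).fit()

print(logit_fit.params.round(3))
print(count_fit.params.round(3))
```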

    Information and Scientific Impact of Advanced Therapies in the Age of Mass Media: Altmetrics-Based Analysis of Tissue Engineering

    Get PDF
    This study was supported by CTS-115 (Tissue Engineering Research Group, University of Granada) from Junta de Andalucia, Spain, the Spanish State Research Agency through the project PID2019-105381GA-I00/AEI/10.13039/501100011033 (iScience), a postdoctoral grant (RH-0145-2020) from the Andalusia Health System, and the European Union Fondo Europeo de Desarrollo Regional para la Inversion Territorial Integrada Grant for Cadiz Province (PI-0032-2017). The authors thank Altmetric LLP (London, UK) for granting access to Altmetric Explorer for research purposes. Background: Tissue engineering (TE) constitutes a multidisciplinary field aiming to construct artificial tissues to regenerate end-stage organs. Its development has taken place since the last decade of the 20th century, entailing a clinical revolution. TE research groups have worked and shared relevant information in the mass media era. Thus, it would be interesting to study the online dimension of TE research and to compare it with traditional measures of scientific impact. Objective: The objective of this study was to evaluate the online dimension of TE documents from 2012 to 2018 using metadata obtained from the Web of Science (WoS) and Altmetric and to develop a prediction equation for the impact of TE documents from altmetric scores. Methods: We analyzed 10,112 TE documents through descriptive and statistical methods. First, the TE temporal evolution was presented for WoS and 15 online platforms (news, blogs, policy, Twitter, patents, peer review, Weibo, Facebook, Wikipedia, Google, Reddit, F1000, Q&A, video, and Mendeley Readers). The 10 most cited TE original articles were ranked according to the normalized WoS citations and the normalized Altmetric Attention Score. Second, to better comprehend the TE online framework, correlation and factor analyses were performed based on the suitable results previously obtained for the Bartlett sphericity and Kaiser–Meyer–Olkin tests. Finally, the linear regression model was applied to elucidate the relation between academics and online media and to construct a prediction equation for TE from altmetrics data. Results: TE dynamics show an upward trend in WoS citations, Twitter, Mendeley Readers, and Altmetric Scores. However, WoS and Altmetric rankings for the most cited documents clearly differ. When compared, the best correlation results were obtained for Mendeley Readers and WoS (ρ=0.71). In addition, the factor analysis identified 6 factors that could explain the previously observed differences between academic institutions and the online platforms evaluated. At this point, the mathematical model constructed is able to predict and explain more than 40% of TE WoS citations from Altmetric scores. Conclusions: Scientific information related to the construction of bioartificial tissues increasingly reaches society through different online media.
Because the focus of TE research importantly differs when the academic institutions and online platforms are compared, basic and clinical research groups, academic institutions, and health politicians should make a coordinated effort toward the design and implementation of adequate strategies for information diffusion and population health education.
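
    The prediction equation mentioned in the results can be pictured as a linear model of WoS citations on altmetric indicators, with R² reported as the share of variance explained. The sketch below is a hypothetical Python/statsmodels example; the platform columns follow the abstract, but the data are synthetic and the authors' actual model specification may differ.

```python
# Minimal sketch (assumed): predicting WoS citations from altmetric indicators
# and reporting R^2, in the spirit of the ">40% explained" result above.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(4)
n = 1500
alt = pd.DataFrame({
    "news": rng.poisson(0.5, size=n),
    "twitter": rng.poisson(8, size=n),
    "mendeley_readers": rng.poisson(25, size=n),
    "wikipedia": rng.poisson(0.1, size=n),
    "patents": rng.poisson(0.05, size=n),
})
# synthetic citations loosely tied to reader counts so the fit is non-trivial
wos_citations = rng.poisson(1 + 0.4 * alt["mendeley_readers"])

X = sm.add_constant(np.log1p(alt))
fit = sm.OLS(np.log1p(wos_citations), X).fit()
print(f"R^2 = {fit.rsquared:.2f}")
print(fit.params.round(3))
```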

    Does society show differential attention to researchers based on gender and field?

    Full text link
    While not all researchers prioritize social impact, it is undeniably a crucial aspect that adds significance to their work. The objective of this paper is to explore potential gender differences in the social attention paid to researchers and to examine their association with specific fields of study. To achieve this goal, the paper analyzes four dimensions of social influence and examines three measures of social attention to researchers. The dimensions are media influence (mentions in mainstream news), political influence (mentions in public policy reports), social media influence (mentions on Twitter), and educational influence (mentions in Wikipedia). The measures of social attention to researchers are: proportion of publications with social mentions (social attention orientation), mentions per publication (level of social attention), and mentions per mentioned publication (intensity of social attention). By analyzing the rankings of authors -- for the four dimensions with the three measures in the 22 research fields of the Web of Science database -- and by using Spearman correlation coefficients, we conclude that: 1) significant differences are observed between fields; 2) the dimensions capture different and independent aspects of social impact. Finally, we use non-parametric means comparison tests to detect gender bias in social attention. We conclude that for most fields and dimensions with enough non-zero altmetrics data, gender differences in social attention are not predominant, but are still present and vary across fields. Comment: 23 pages, 5 figures, 7 tables
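
    The two statistical steps described above, rank correlations between attention dimensions and non-parametric comparisons by gender, can be sketched as follows. This is a hypothetical Python example with scipy (Mann-Whitney U stands in for the unspecified means-comparison test); column names and data are assumptions.

```python
# Minimal sketch (assumed): Spearman correlations between social-attention
# dimensions and a non-parametric gender comparison per dimension.
import numpy as np
import pandas as pd
from scipy.stats import spearmanr, mannwhitneyu

rng = np.random.default_rng(5)
n = 800
authors = pd.DataFrame({
    "gender": rng.choice(["F", "M"], size=n),
    "news_per_pub": rng.gamma(1.0, 0.30, size=n),
    "policy_per_pub": rng.gamma(1.0, 0.10, size=n),
    "twitter_per_pub": rng.gamma(1.0, 2.00, size=n),
    "wikipedia_per_pub": rng.gamma(1.0, 0.05, size=n),
})
dims = ["news_per_pub", "policy_per_pub", "twitter_per_pub", "wikipedia_per_pub"]

# Rank correlations between dimensions (low values suggest independent aspects)
for i, a in enumerate(dims):
    for b in dims[i + 1:]:
        rho, p = spearmanr(authors[a], authors[b])
        print(f"{a} vs {b}: rho={rho:.2f} (p={p:.3f})")

# Non-parametric comparison of attention by gender, one dimension at a time
for d in dims:
    female = authors.loc[authors["gender"] == "F", d]
    male = authors.loc[authors["gender"] == "M", d]
    stat, p = mannwhitneyu(female, male, alternative="two-sided")
    print(f"{d}: Mann-Whitney U p={p:.3f}")
```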