    Tracking citations and altmetrics for research data: Challenges and opportunities

    Methods for determining research quality have long been debated, with little lasting agreement on standards, leading to the emergence of alternative metrics. Altmetrics are a useful supplement to traditional citation metrics, reflecting a variety of measurement points that give different perspectives on how a dataset is used and by whom. A positive development is the integration of a number of research datasets into the ISI Data Citation Index, making datasets searchable and linking them to published articles. Yet access to data resources and tracking the resulting altmetrics depend on specific qualities of the datasets and the systems where they are archived. Though research on altmetrics use is growing, the lack of standardization across datasets and system architectures undermines its generalizability. Without some standards, stakeholders’ adoption of altmetrics will be limited.

    Can We Count on Social Media Metrics? First Insights into the Active Scholarly Use of Social Media

    Measuring research impact is important for ranking publications in academic search engines and for research evaluation. Social media metrics, or altmetrics, measure the impact of scientific work based on social media activity. Altmetrics are complementary to traditional, citation-based metrics, e.g. allowing the assessment of new publications for which citations are not yet available. Despite the increasing importance of altmetrics, their characteristics are not well understood: Until now it has not been investigated which kinds of researchers actively use which social media services and why - important questions for scientific impact prediction. Based on a survey among 3,430 scientists, we uncover previously unknown and significant differences between social media services: We identify services which attract young and experienced researchers, respectively, and detect differences in usage motivations. Our findings have direct implications for the future design of altmetrics for scientific impact prediction.
    Comment: Accepted at the 10th ACM Conference on Web Science, Amsterdam

    Genesis of Altmetrics or Article-level Metrics for Measuring Efficacy of Scholarly Communications: Current Perspectives

    Article-level metrics (ALMs), or altmetrics, have become a new trendsetter in recent times for measuring the impact of scientific publications and their social outreach to intended audiences. Popular social networks such as Facebook, Twitter, and LinkedIn and social bookmarking services such as Mendeley and CiteULike are nowadays widely used for communicating research to larger transnational audiences. In 2012, the San Francisco Declaration on Research Assessment was signed by scientific and research communities across the world. This declaration gives preference to ALMs or altmetrics over the traditional but flawed journal impact factor (JIF)-based assessment of career scientists. The JIF does not consider impact or influence beyond citation counts, and these counts are reflected only through Thomson Reuters’ Web of Science database. Furthermore, the JIF is an indicator of the journal, not of an individual published paper. Thus, altmetrics have become alternative metrics for the performance assessment of individual scientists and their contributed scholarly publications. This paper provides a glimpse of the genesis of altmetrics in measuring the efficacy of scholarly communications and highlights available altmetric tools and social platforms linked with altmetric tools, which are widely used in deriving altmetric scores of scholarly publications. The paper thus argues that institutions and policy makers should pay more attention to altmetrics-based indicators for evaluation purposes, but cautions that proper safeguards and validations are needed before their adoption.

    Scholarly Metrics Baseline: A Survey of Faculty Knowledge, Use, and Opinion About Scholarly Metrics

    This article presents the results of a faculty survey conducted at the University of Vermont during academic year 2014-2015. The survey asked faculty about their familiarity with scholarly metrics, metric-seeking habits, help-seeking habits, and the role of metrics in their department’s tenure and promotion process. The survey also gathered faculty opinions on how well scholarly metrics reflect the importance of scholarly work and how faculty feel about administrators gathering institutional scholarly metric information. Results point to the necessity of understanding the campus landscape of faculty knowledge, opinion, and use of scholarly metrics, and the importance faculty place on them, before engaging faculty in further discussions about quantifying the impact of their scholarly work.

    COVID-19 publications: Database coverage, citations, readers, tweets, news, Facebook walls, Reddit posts

    © 2020 The Authors. Published by MIT Press. This is an open access article available under a Creative Commons licence. The published version can be accessed at the following link on the publisher’s website: https://doi.org/10.1162/qss_a_00066
    The COVID-19 pandemic requires a fast response from researchers to help address biological, medical and public health issues to minimize its impact. In this rapidly evolving context, scholars, professionals and the public may need to quickly identify important new studies. In response, this paper assesses the coverage of scholarly databases and impact indicators during 21 March to 18 April 2020. The rapidly increasing volume of research is most accessible through Dimensions, and less so through Scopus, the Web of Science, and PubMed. Google Scholar’s results included many false matches. A few COVID-19 papers from the 21,395 in Dimensions were already highly cited, with substantial news and social media attention. For this topic, in contrast to previous studies, there seems to be a high degree of convergence between articles shared in the social web and citation counts, at least in the short term. In particular, articles that are extensively tweeted on the day first indexed are likely to be highly read and relatively highly cited three weeks later. Researchers needing wide-scope literature searches (rather than health-focused PubMed or medRxiv searches) should start with Dimensions (or Google Scholar) and can use tweet and Mendeley reader counts as indicators of likely importance.
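
    One way to read the claim that day-of-indexing tweets anticipate later readership and citations is as a rank correlation between early tweet counts and citation counts a few weeks later. The short Python sketch below shows how such a check could be computed; all values are invented for illustration and are not the paper's data or analysis.

    # Hedged sketch: rank correlation between tweets on the day a paper is first
    # indexed and its citation count three weeks later. All values are invented.
    from scipy.stats import spearmanr

    day_one_tweets = [5, 0, 230, 12, 89, 1, 47, 3]   # hypothetical tweet counts at indexing
    citations_3wk  = [2, 0, 41, 3, 15, 0, 9, 1]      # hypothetical citations three weeks later

    rho, p_value = spearmanr(day_one_tweets, citations_3wk)
    print(f"Spearman rho = {rho:.2f} (p = {p_value:.3f})")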

    Exploring Features for Predicting Policy Citations

    In this study we performed an initial investigation and evaluation of altmetrics and their relationship with public policy citations of research papers. We examined methods for using altmetrics and other data to predict whether a research paper is cited in public policy, and applied receiver operating characteristic (ROC) curve analysis to various feature groups in order to evaluate their potential usefulness. Of the methods we tested, classifying based on tweet count provided the best results, achieving an area under the ROC curve of 0.91.
    Comment: 2 pages, accepted to JCDL '1
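
    The ROC evaluation described above can be illustrated with a minimal sketch: treat the tweet count as a one-dimensional score for a binary cited-in-policy label and compute the area under the ROC curve. The counts and labels below are made up for illustration and are not the study's dataset or code.

    # Minimal sketch of a single-feature ROC evaluation (tweet count as the score).
    # All tweet counts and policy-citation labels are hypothetical.
    from sklearn.metrics import roc_auc_score, roc_curve

    tweet_counts    = [0, 2, 1, 15, 40, 3, 120, 0, 8, 55]  # hypothetical per-paper tweet counts
    cited_in_policy = [0, 0, 0, 1, 1, 0, 1, 0, 0, 1]        # 1 = cited in a policy document

    # Higher tweet count is taken as a higher likelihood of policy citation.
    auc = roc_auc_score(cited_in_policy, tweet_counts)
    print(f"AUC from tweet counts alone: {auc:.2f}")

    # The full curve shows the true/false positive trade-off at each count threshold.
    fpr, tpr, thresholds = roc_curve(cited_in_policy, tweet_counts)
    for f, t, thr in zip(fpr, tpr, thresholds):
        print(f"count >= {thr:5.1f}: TPR={t:.2f}, FPR={f:.2f}")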

    Allegation of scientific misconduct increases Twitter attention

    The web-based microblogging system Twitter is a very popular altmetrics source for measuring the broader impact of science. In this case study, we demonstrate how problematic the use of Twitter data for research evaluation can be, even when the aspiration of measurement is lowered from impact to attention. We collected the Twitter data for the paper published by Yamamizu et al. (2017). An investigative committee found that the main figures in the paper are fraudulent.