
    F1000 recommendations as a new data source for research evaluation: A comparison with citations

    F1000 is a post-publication peer review service for biological and medical research. F1000 aims to recommend important publications in the biomedical literature, and from this perspective it could be an interesting tool for research evaluation. By linking the complete database of F1000 recommendations to the Web of Science bibliographic database, we are able to make a comprehensive comparison between F1000 recommendations and citations. We find that about 2% of the publications in the biomedical literature receive at least one F1000 recommendation. Recommended publications receive 1.30 recommendations on average, and over 90% of recommendations are given within half a year of a publication's appearance. There is a clear correlation between F1000 recommendations and citations, but it is relatively weak, at least weaker than the correlation between journal impact and citations. More research is needed to identify the main reasons for the differences between recommendations and citations in assessing the impact of publications.
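    The comparison described above comes down to rank correlation: do papers with more recommendations also collect more citations? A minimal sketch of that computation, using invented toy numbers (not the study's data) and a plain Spearman rank correlation:

    ```python
    def ranks(values):
        """1-based ranks, with tied values assigned their average rank."""
        order = sorted(range(len(values)), key=lambda i: values[i])
        r = [0.0] * len(values)
        i = 0
        while i < len(order):
            j = i
            while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
                j += 1
            avg = (i + j) / 2 + 1  # average of the tied positions, 1-based
            for k in range(i, j + 1):
                r[order[k]] = avg
            i = j + 1
        return r

    def spearman(x, y):
        """Spearman correlation = Pearson correlation of the rank vectors."""
        rx, ry = ranks(x), ranks(y)
        n = len(x)
        mx, my = sum(rx) / n, sum(ry) / n
        num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
        den = (sum((a - mx) ** 2 for a in rx)
               * sum((b - my) ** 2 for b in ry)) ** 0.5
        return num / den

    # Invented toy data: per-paper citation counts, recommendation counts,
    # and the impact factor of the publishing journal.
    citations = [3, 10, 8, 120, 45, 15, 200, 50]
    recommendations = [0, 1, 0, 2, 1, 1, 3, 2]
    journal_impact = [1.2, 4.5, 2.1, 30.0, 9.8, 3.3, 28.0, 11.0]

    r_rec = spearman(recommendations, citations)  # recommendations vs. citations
    r_jif = spearman(journal_impact, citations)   # journal impact vs. citations
    ```

    The study's finding corresponds to `r_rec` being positive but smaller than `r_jif` on the real data.
    
    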

    Do altmetrics correlate with citations? Extensive comparison of altmetric indicators with citations from a multidisciplinary perspective

    An extensive analysis of the presence of different altmetric indicators provided by Altmetric.com across scientific fields is presented, focusing particularly on their relationship with citations. Our results confirm that the presence and density of social media altmetric counts remain low among scientific publications: 15%-24% of publications show some altmetric activity, concentrated in the most recent publications, although this presence is increasing over time. Publications from the social sciences, humanities, and the medical and life sciences show the highest presence of altmetrics, indicating their potential value and interest for these fields. The analysis of the relationship between altmetrics and citations confirms previously reported correlations that are positive but relatively weak, supporting the idea that altmetrics do not reflect the same concept of impact as citations. Moreover, altmetric counts do not always filter highly cited publications better than journal citation scores: altmetric scores (particularly mentions in blogs) identify highly cited publications with higher precision than journal citation scores (JCS), but with lower recall. The value of altmetrics as a complementary tool to citation analysis is highlighted, although more research is needed to disentangle the potential meaning and value of altmetric indicators for research evaluation.
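    The precision/recall trade-off mentioned above is the standard one: treat "flagged by an altmetric indicator" as a prediction and "actually highly cited" as the ground truth. A small sketch with invented paper identifiers (not the study's data):

    ```python
    def precision_recall(predicted, actual):
        """Precision and recall of a predicted set against a ground-truth set."""
        tp = len(predicted & actual)  # correctly flagged papers
        precision = tp / len(predicted) if predicted else 0.0
        recall = tp / len(actual) if actual else 0.0
        return precision, recall

    # Invented example: papers flagged by blog mentions vs. the set of
    # papers that actually turned out to be highly cited.
    flagged_by_blogs = {"p1", "p2", "p3"}
    highly_cited = {"p1", "p2", "p4", "p5"}

    p, r = precision_recall(flagged_by_blogs, highly_cited)
    # Here 2 of 3 flagged papers are highly cited (high precision),
    # but only 2 of the 4 highly cited papers were flagged (lower recall),
    # mirroring the pattern the abstract reports for blog mentions vs. JCS.
    ```
    
    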

    Within-Journal Demonstrations of the Open-Access Impact Advantage: PLoS, Pipe-Dreams and Peccadillos (LETTER)

    Eysenbach's (2006) study in PLoS Biology of 1492 articles published during one 6-month period in one journal (PNAS) found that the Open Access (OA) articles were cited more than the non-OA ones. The online bibliography on the OA citation advantage (http://opcit.eprints.org/oacitation-biblio.html) records a number of prior within-journal comparisons that found exactly the same effect: freely available articles are read and cited more. Eysenbach's further finding that the OA advantage (in this particular 6-month, 3-option, 1-journal PLoS/PNAS study) is greater for articles that have paid for OA publication than for those that have merely been self-archived will require replication on much larger samples, as most of the prior evidence for the OA advantage comes from self-archived articles and is based on sample sizes four orders of magnitude larger in both the number of articles and the number of journals tested.

    The NASA Astrophysics Data System: Overview

    The NASA Astrophysics Data System Abstract Service has become a key component of astronomical research. It provides bibliographic information daily, or near daily, to a majority of astronomical researchers worldwide. We describe the history of the development of the system and its current status. We show several examples of how to use the ADS, and we show how ADS use has increased as a function of time. Currently it is still increasing exponentially, with a doubling time in the number of queries of 17 months. Using the ADS logs we make the first detailed model of how scientific journals are read as a function of time since publication. The impact of the ADS on astronomy can be calculated after making some simple assumptions. We find that the ADS increases the efficiency of astronomical research by 333 Full Time Equivalent (2000-hour) research years per year, and that the value of the early development of the ADS for astronomy, compared with waiting for mature technologies to be adopted, is 2332 FTE research years. The ADS is available at http://adswww.harvard.edu/.
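    The quoted 17-month doubling time is easy to turn into growth factors. A back-of-envelope sketch of the implied exponential growth (the doubling time is from the abstract; everything else here is just arithmetic):

    ```python
    # Query volume with a fixed doubling time grows as 2**(t / T_double).
    DOUBLING_MONTHS = 17  # figure reported in the ADS abstract

    def growth_factor(months, doubling_months=DOUBLING_MONTHS):
        """Multiplicative growth in query volume over `months`."""
        return 2 ** (months / doubling_months)

    annual = growth_factor(12)         # roughly 1.6x more queries each year
    five_year = growth_factor(5 * 12)  # compounding: over 11x in five years
    ```
    
    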

    Visualization of Publication Impact

    Measuring scholarly impact has been a topic of much interest in recent years. While many use the citation count as a primary indicator of a publication's impact, the quality and impact of those citations vary. Additionally, it is often difficult to see where a paper sits among other papers in the same research area. The questions we wished to answer through this visualization were: Is a publication cited less than other publications in its field? Is a publication cited by high- or low-impact publications? And can we visually compare the impact of publications across a result set? In this work we address these questions through a new visualization of publication impact. Our technique has been applied to the visualization of citation information in INSPIREHEP (http://www.inspirehep.net), the largest high energy physics publication repository.
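    The first question above, "is a publication cited less than its field?", amounts to placing one paper's citation count on its field's distribution. A minimal illustrative sketch with invented numbers (not INSPIREHEP's actual method):

    ```python
    def citation_percentile(count, field_counts):
        """Percentage of field papers cited strictly less than this paper."""
        below = sum(1 for c in field_counts if c < count)
        return 100.0 * below / len(field_counts)

    # Invented citation counts for papers in the same field.
    field = [0, 2, 5, 8, 12, 30, 75, 150]

    pct = citation_percentile(12, field)  # a paper with 12 citations
    ```

    A visualization like the one described can then encode this percentile positionally, so that a paper's standing within its field is visible at a glance rather than requiring a mental comparison of raw counts.
    
    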