
    The pros and cons of the use of altmetrics in research assessment

    © 2020 The Authors. Published by Levi Library Press. This is an open access article available under a Creative Commons licence. The published version can be accessed at the following link on the publisher’s website: http://doi.org/10.29024/sar.10
    Many indicators derived from the web have been proposed to supplement citation-based indicators in support of research assessments. These indicators, often called altmetrics, are available commercially from Altmetric.com and Elsevier’s Plum Analytics or can be collected directly. These organisations can also deliver altmetrics to support institutional self-evaluations. The potential advantages of altmetrics for research evaluation are that they may reflect important non-academic impacts and may appear before citations when an article is published, thus providing earlier impact evidence. Their disadvantages often include susceptibility to gaming, data sparsity, and difficulties translating the evidence into specific types of impact. Despite these limitations, altmetrics have been widely adopted by publishers, apparently to give authors, editors and readers insights into the level of interest in recently published articles. This article summarises evidence for and against extending the adoption of altmetrics to research evaluations. It argues that whilst systematically gathered altmetrics are inappropriate for important formal research evaluations, they can play a role in some other contexts. They can be informative when evaluating research units that rarely produce journal articles, when seeking to identify evidence of novel types of impact during institutional or other self-evaluations, and when selected by individuals or groups to support narrative-based non-academic claims. In addition, Mendeley reader counts are uniquely valuable as early (mainly) scholarly impact indicators to replace citations when gaming is not possible and early impact evidence is needed. Organisations using alternative indicators need to recruit or develop in-house expertise to ensure that they are not misused, however.
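    As a hedged illustration of how altmetric data of this kind can be collected directly, the sketch below looks up one article by DOI using Altmetric.com's free, rate-limited public endpoint. The endpoint URL and the response field names are assumptions based on common usage of that service, not details given in the abstract, and commercial use requires an API key.

```python
"""Minimal sketch: summarise web attention for one article by DOI.

Assumes Altmetric.com's free public endpoint (api.altmetric.com/v1/doi/<doi>);
the response field names used here are assumptions and may differ.
"""
import requests


def altmetric_summary(doi: str) -> dict:
    """Return a small attention summary for a DOI, or {} if none is recorded."""
    resp = requests.get(f"https://api.altmetric.com/v1/doi/{doi}", timeout=30)
    if resp.status_code != 200:  # a 404 simply means no attention recorded
        return {}
    data = resp.json()
    return {
        "altmetric_score": data.get("score"),
        "twitter_accounts": data.get("cited_by_tweeters_count"),
        "news_outlets": data.get("cited_by_msm_count"),
        "mendeley_readers": (data.get("readers") or {}).get("mendeley"),
    }


if __name__ == "__main__":
    # Example: the article summarised above, via the DOI in its access note.
    print(altmetric_summary("10.29024/sar.10"))
```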

    Assessing the teaching value of non-English academic books: The case of Spain

    This study examines the educational value of 15,117 Spanish-language books published by Spanish publishers in social sciences and humanities fields in the period 2002-2011, based on mentions of them extracted automatically from online course syllabi. A method was developed to collect syllabus mentions and filter out false matches. Manual checks of the 52,716 syllabus mentions found estimated an accuracy of 99.5% for filtering out false mentions and 74.7% for identifying correct mentions. A fifth of the sampled books (2,849; 19%) were mentioned at least once in online syllabi and almost all (95%) were from a third of the publishers included in the study. An in-depth analysis of the 23 books recommended most often in online syllabi showed that they are mostly single-authored humanities monographs that were originally written in Spanish. The syllabus mentions originated from 379 domains, but mostly from Spanish university websites. In conclusion, it is possible to make indicators from online syllabus mentions to assess the teaching value of Spanish-language books, although manual checks are needed if the values are used for assessing individual books.
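    The headline proportions in the abstract above follow from simple arithmetic on the reported counts. The short sketch below is purely illustrative and uses only the numbers quoted in the abstract.

```python
"""Back-of-the-envelope check of the proportions reported in the abstract.

The three counts below are taken directly from the abstract; everything else
is illustrative arithmetic.
"""
books_sampled = 15_117      # Spanish-language SSH books, 2002-2011
books_mentioned = 2_849     # books with at least one online syllabus mention
syllabus_mentions = 52_716  # syllabus mentions that were manually checked

share_mentioned = books_mentioned / books_sampled
print(f"Books mentioned in at least one syllabus: {share_mentioned:.1%}")  # ~18.8%, i.e. about a fifth
print(f"Checked mentions per sampled book (mean): {syllabus_mentions / books_sampled:.1f}")
```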

    Do prestigious Spanish scholarly book publishers have more teaching impact?

    Purpose: The purpose of this paper is to assess the educational value of prestigious and productive Spanish scholarly publishers based on mentions of their books in online scholarly syllabi.
    Design/methodology/approach: Syllabus mentions of 15,117 books from 27 publishers were searched for, manually checked and compared with Microsoft Academic (MA) citations.
    Findings: Most books published by Ariel, Síntesis, Tecnos and Cátedra have been mentioned in at least one online syllabus, indicating that their books have consistently high educational value. In contrast, few books published by the most productive publishers were mentioned in online syllabi. Prestigious publishers have both the highest educational impact based on syllabus mentions and the highest research impact based on MA citations.
    Research limitations/implications: The results might be different for other publishers. The online syllabus mentions found may be a small fraction of the syllabus mentions of the sampled books.
    Practical implications: Authors of Spanish-language social sciences and humanities books should consider general prestige when selecting a publisher if they want educational uptake for their work.
    Originality/value: This is the first study assessing book publishers based on syllabus mentions.
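    The publisher-level comparison described above amounts to aggregating book-level counts per publisher and then rank-correlating the two impact measures. The sketch below is a hypothetical illustration: the toy records and the use of a Spearman rank correlation are assumptions for demonstration, not the authors' actual pipeline.

```python
"""Hypothetical sketch: aggregate syllabus mentions and citations per publisher,
then rank-correlate the two. The records below are invented toy data."""
from collections import defaultdict

from scipy.stats import spearmanr

# (publisher, syllabus_mentions, ma_citations) for each book -- toy records
books = [
    ("Ariel", 3, 12), ("Ariel", 1, 4),
    ("Síntesis", 2, 7), ("Tecnos", 0, 2),
    ("Cátedra", 4, 9), ("OtherPress", 0, 0),
]

mentions = defaultdict(int)
citations = defaultdict(int)
for publisher, syllabus_count, citation_count in books:
    mentions[publisher] += syllabus_count
    citations[publisher] += citation_count

publishers = sorted(mentions)
rho, p_value = spearmanr([mentions[pub] for pub in publishers],
                         [citations[pub] for pub in publishers])
print(f"Spearman rho across {len(publishers)} publishers: {rho:.2f} (p = {p_value:.2f})")
```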

    Can alternative indicators overcome language biases in citation counts? A comparison of Spanish and UK research

    This is an accepted manuscript of an article published by Springer in Scientometrics on 09/09/2016, available online: https://doi.org/10.1007/s11192-016-2118-8. The accepted version of the publication may differ from the final published version.
    This study compares Spanish and UK research in eight subject fields using a range of bibliometric and social media indicators. For each field, lists of Spanish and UK journal articles published in the year 2012 and their citation counts were extracted from Scopus. The software Webometric Analyst was then used to extract a range of altmetrics for these articles, including patent citations, online presentation mentions, online course syllabus mentions, Wikipedia mentions and Mendeley reader counts, and Altmetric.com was used to extract Twitter mentions. Results show that Mendeley is the altmetric source with the highest coverage, with 80% of sampled articles having one or more Mendeley readers, followed by Twitter (34%). The coverage of the remaining sources was lower than 3%. All of the indicators checked either have too little data or increase the overall difference between Spain and the UK, and so none can be suggested as alternatives to reduce the bias against Spain in traditional citation indexes.
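    Coverage in the abstract above means the share of sampled articles with at least one event from a given source. The short sketch below illustrates that calculation on invented placeholder records; it is not the study's data or code.

```python
"""Minimal sketch of the coverage comparison described in the abstract: the
share of sampled articles with at least one event in each altmetric source.
The article records below are invented placeholders, not the study's data."""

articles = [
    {"mendeley": 25, "twitter": 3, "wikipedia": 0, "patents": 0},
    {"mendeley": 0, "twitter": 0, "wikipedia": 0, "patents": 0},
    {"mendeley": 7, "twitter": 1, "wikipedia": 1, "patents": 0},
    {"mendeley": 12, "twitter": 0, "wikipedia": 0, "patents": 1},
]

for source in ("mendeley", "twitter", "wikipedia", "patents"):
    covered = sum(1 for article in articles if article.get(source, 0) > 0)
    print(f"{source:<10} coverage: {covered / len(articles):.0%}")
```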

    Web indicators for research evaluation. Part 1: Citations and links to academic articles from the Web

    The extensive use of the web by many sectors of society has created the potential for new wider impact indicators. This article reviews research about Google Scholar and Google Patents, both of which can be used as sources of impact indicators for academic articles. It also briefly reviews methods to extract types of links and citations from the web as a whole, although the indicators that these generate are now probably too broad and too dominated by automatically generated websites, such as library and publisher catalogues, to be useful in practice. More valuable web-based indicators can be derived from specific types of web pages that cite academic research, such as online presentations, course syllabi, and science blogs. These provide evidence that is easier to understand and use and less likely to be affected by unwanted types of automatically generated content, although they are susceptible to gaming.

    Patent citation analysis with Google

    This is an accepted manuscript of an article published by Wiley-Blackwell in Journal of the Association for Information Science and Technology on 23/09/2015, available online: https://doi.org/10.1002/asi.23608. The accepted version of the publication may differ from the final published version.
    Citations from patents to scientific publications provide useful evidence about the commercial impact of academic research, but automatically searchable databases are needed to exploit this connection for large-scale patent citation evaluations. Google covers multiple different international patent office databases but does not index patent citations or allow automatic searches. In response, this article introduces a semiautomatic indirect method via Bing to extract and filter patent citations from Google to academic papers with an overall precision of 98%. The method was evaluated with 322,192 science and engineering Scopus articles from every second year for the period 1996–2012. Although manual Google Patent searches give more results, especially for articles with many patent citations, the difference is not large enough to be a major problem. Within Biomedical Engineering, Biotechnology, and Pharmacology & Pharmaceutics, 7% to 10% of Scopus articles had at least one patent citation, but other fields had far fewer, so patent citation analysis is only relevant for a minority of publications. Low but positive correlations between Google Patent citations and Scopus citations across all fields suggest that traditional citation counts cannot substitute for patent citations when evaluating research.
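    The precision figure reported above comes from manually checking a sample of the automatically retrieved matches. The sketch below illustrates that kind of estimate on invented labels; it is not the authors' code or data.

```python
"""Illustrative sketch: estimate the precision of automatically retrieved
patent-to-article citation matches from a hand-labelled sample.
The labels below are invented, not the study's data."""

# Each entry: (candidate match identifier, manually judged correct?)
checked_sample = [
    ("match-001", True), ("match-002", True), ("match-003", False),
    ("match-004", True), ("match-005", True),
]

true_positives = sum(1 for _, is_correct in checked_sample if is_correct)
precision = true_positives / len(checked_sample)
print(f"Estimated precision from {len(checked_sample)} checked matches: {precision:.0%}")
```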

    Web indicators for research evaluation. Part 3: books and non-standard outputs

    This literature review describes web indicators for the impact of books, software, datasets, videos and other non-standard academic outputs. Although journal articles dominate academic research in the health and natural sciences, other types of outputs can make equally valuable contributions to scholarship and are more common in other fields. It is not always possible to get useful citation-based impact indicators for these due to their absence from, or incomplete coverage in, traditional citation indexes. In this context, the web is particularly valuable as a potential source of impact indicators for non-standard academic outputs. The main focus in this review is on books because of the much greater amount of relevant research for them and because they are regarded as particularly valuable in the arts and humanities and in some areas of the social sciences.

    Reviewing, indicating, and counting books for modern research evaluation systems

    In this chapter, we focus on the specialists who have helped to improve the conditions for book assessments in research evaluation exercises, with empirically based data and insights supporting their greater integration. Our review highlights the research carried out by four types of expert communities, referred to as the monitors, the subject classifiers, the indexers and the indicator constructionists. Many challenges lie ahead for scholars affiliated with these communities, particularly the latter three. By acknowledging their unique, yet interrelated roles, we show where the greatest potential is for both quantitative and qualitative indicator advancements in book-inclusive evaluation systems.
    Comment: Forthcoming in Glänzel, W., Moed, H.F., Schmoch, U., Thelwall, M. (2018). Springer Handbook of Science and Technology Indicators. Springer. Some corrections made in subsection 'Publisher prestige or quality'.