
    Towards more consistent, transparent, and multipurpose national bibliographic databases for research output

    National bibliographic databases for research output collect metadata on universities’ scholarly publications, such as journal articles, monographs, and conference papers. As this research information is increasingly used in assessments, funding allocation, and other academic reward structures, the value of developing comprehensive and reliable national databases becomes ever clearer. Linda Sīle, Raf Guns and Tim Engels outline the challenges faced by those developing national bibliographic databases for research output, from the need for reliable (persistent) identifiers through to new and evolving contexts for data use.

    Indicating interdisciplinarity: A multidimensional framework to characterize Interdisciplinary Knowledge Flow (IKF)

    This study contributes to the recent discussions on indicating interdisciplinarity, i.e., going beyond mere metrics of interdisciplinarity. We propose a multidimensional and contextual framework to improve the granularity and usability of the existing methodology for quantifying interdisciplinary knowledge flow (IKF), in which scientific disciplines import knowledge from and export knowledge to other disciplines. To characterize the knowledge exchange between disciplines, we recognize three dimensions under this framework, namely broadness, intensity, and heterogeneity. We show that each dimension covers a different aspect of IKF, especially between disciplines with the largest volume of IKF, and can assist in uncovering different types of interdisciplinarity. We apply this framework in two use cases, one at the level of disciplines and one at the level of journals, to show how it can offer a more holistic and detailed viewpoint on the interdisciplinarity of scientific entities than plain citation counts. We further compare our proposed framework, as an indicating process, with established indicators and discuss how such information tools on interdisciplinarity can assist science policy practices such as performance-based research funding systems and panel-based peer review processes.
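
    The abstract names the three dimensions but not their exact operationalisation. As a rough illustration only, the sketch below computes one plausible reading of each dimension from a single discipline's citation links to other disciplines; the concrete measures (distinct partner count, total citation volume, Shannon entropy) and the toy data are assumptions, not the paper's definitions.

```python
# Illustrative sketch of the three IKF dimensions described in the abstract.
# The measures below (distinct partners, citation volume, Shannon entropy)
# are assumptions chosen for illustration, not the paper's operationalisations.
import math

def ikf_dimensions(citations_to_partners):
    """citations_to_partners: dict mapping partner discipline -> number of
    citations imported from (or exported to) that discipline."""
    volumes = [v for v in citations_to_partners.values() if v > 0]
    total = sum(volumes)
    broadness = len(volumes)   # how many disciplines take part in the exchange
    intensity = total          # overall volume of knowledge flow
    # heterogeneity: how evenly the flow is spread over partners (0 = single partner)
    heterogeneity = -sum((v / total) * math.log(v / total) for v in volumes) if total else 0.0
    return {"broadness": broadness, "intensity": intensity, "heterogeneity": heterogeneity}

# Example: a discipline importing knowledge from three others (toy numbers)
print(ikf_dimensions({"Physics": 120, "Mathematics": 40, "Computer Science": 40}))
```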

    The use of Gold Open Access in four European countries: An analysis at the level of articles

    We assess the use and potential of Gold Open Access (OA) in Finland, Flanders (Belgium), Norway, and Poland by comparing data at the level of articles from full-coverage databases in each country. Inclusion of a journal in the Directory of Open Access Journals (DOAJ) is used as the reference for determining Gold OA. Gold OA is on the rise in all four countries and across fields, but some countries, especially Norway, and some fields have a substantially larger proportion of OA publications than others, with the overall share of Gold OA ranging from 5.7% to 17.3%. Especially in the social sciences and humanities (SSH), a mixture of local and international journals can be found, many of which are not indexed in databases like Web of Science. As such, our results indicate that an overview of the state of Gold OA is best obtained by comparing DOAJ against a full-coverage database.
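
    The determination of Gold OA described above amounts to matching each article's journal against the DOAJ list. The sketch below illustrates that matching step; the record fields ("issn", "country") and the toy data are hypothetical and do not reflect the actual schema of the national databases or of DOAJ.

```python
# Minimal sketch, under assumed field names: computing the share of Gold OA by
# matching journal ISSNs from a full-coverage national database against a set
# of ISSNs listed in DOAJ.

def gold_oa_share(articles, doaj_issns):
    """articles: iterable of dicts with an 'issn' key; doaj_issns: set of DOAJ ISSNs."""
    articles = list(articles)
    if not articles:
        return 0.0
    gold = sum(1 for a in articles if a.get("issn") in doaj_issns)
    return gold / len(articles)

doaj_issns = {"2045-2322", "1932-6203"}  # toy DOAJ subset
articles = [
    {"issn": "2045-2322", "country": "NO"},
    {"issn": "0001-0782", "country": "NO"},
    {"issn": "1932-6203", "country": "PL"},
]
print(f"Gold OA share: {gold_oa_share(articles, doaj_issns):.1%}")
```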

    Identifying publications in questionable journals in the context of performance-based research funding

    In this article we discuss the five yearly screenings for publications in questionable journals that have been carried out in the context of the performance-based research funding model in Flanders, Belgium. The Flemish funding model was expanded from 2010 onwards with a comprehensive bibliographic database for research output in the social sciences and humanities. Along with an overview of the procedures followed during the screenings for articles in questionable journals submitted for inclusion in this database, we present a bibliographic analysis of the publications identified. First, we show how the yearly number of publications in questionable journals has evolved over the period 2003–2016. Second, we present a disciplinary classification of the identified journals. In the third part of the results section, three authorship characteristics are discussed: multi-authorship, the seniority (or experience level) of authors in general and of the first author in particular, and the relation of the disciplinary scope of the journal (cognitive classification) to the departmental affiliation of the authors (organizational classification). Our results regarding yearly rates of publications in questionable journals indicate that awareness of the risks of questionable journals does not lead to a turn away from open access in general. The number of publications in open access journals rises every year, while the number of publications in questionable journals decreases from 2012 onwards. We further find that both early career and more senior researchers publish in questionable journals. We show that the average proportion of senior authors contributing to publications in questionable journals is somewhat higher than that for publications in open access journals. In addition, this paper yields insight into the extent to which publications in questionable journals pose a threat to the public and political legitimacy of a performance-based research funding system of a western European region. We include concrete suggestions for those tasked with maintaining bibliographic databases and screening for publications in questionable journals.
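
    The yearly trend analysis described above boils down to counting, per year, how many screened publications fall into each journal category. A minimal sketch of that tally follows; the record fields ("year", "journal_type") and the toy data are hypothetical and do not reflect the Flemish database's actual structure.

```python
# Illustrative sketch only: tallying, per year, how many submitted publications
# appear in questionable journals versus in open access journals in general.
# Field names and category labels are assumptions for illustration.
from collections import Counter

def yearly_counts(publications, journal_type):
    return Counter(p["year"] for p in publications if p["journal_type"] == journal_type)

records = [
    {"year": 2012, "journal_type": "questionable"},
    {"year": 2013, "journal_type": "open_access"},
    {"year": 2013, "journal_type": "questionable"},
    {"year": 2014, "journal_type": "open_access"},
]
print(yearly_counts(records, "questionable"))  # Counter({2012: 1, 2013: 1})
print(yearly_counts(records, "open_access"))   # Counter({2013: 1, 2014: 1})
```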

    Measuring cognitive distance between publication portfolios

    We study the problem of determining the cognitive distance between the publication portfolios of two units. In this article we provide a systematic overview of five different methods (a benchmark Euclidean distance approach, distance between barycenters in two and in three dimensions, distance between similarity-adapted publication vectors, and weighted cosine similarity) to determine cognitive distances using publication records. We present a theoretical comparison as well as a small empirical case study. Results of this case study are not conclusive, but we have, mainly on logical grounds, a slight preference for the method based on similarity-adapted publication vectors.
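
    The abstract lists the methods compared but not their formulas. As a hedged illustration, the sketch below contrasts two of them: the benchmark Euclidean distance between normalised publication vectors, and the distance between similarity-adapted publication vectors (SAPV), where each vector is first multiplied by a similarity matrix. The toy similarity matrix and publication counts are assumptions for illustration only, not data from the paper.

```python
# Sketch of two of the distance measures mentioned above, under toy data.
import numpy as np

def normalise(v):
    v = np.asarray(v, dtype=float)
    return v / v.sum()

def euclidean_distance(u, v):
    # benchmark approach: distance between normalised publication vectors
    return float(np.linalg.norm(normalise(u) - normalise(v)))

def sapv_distance(u, v, S):
    # similarity-adapted vectors: spread each unit's shares over related categories
    return float(np.linalg.norm(normalise(u) @ S - normalise(v) @ S))

# Publication counts of two units over three categories (toy numbers)
u = [10, 5, 0]
v = [0, 5, 10]
S = np.array([[1.0, 0.6, 0.1],   # assumed similarity matrix (symmetric, 1s on diagonal)
              [0.6, 1.0, 0.2],
              [0.1, 0.2, 1.0]])
print(euclidean_distance(u, v), sapv_distance(u, v, S))
```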

    Measuring the match between evaluators and evaluees: Cognitive distances between panel members and research groups at the journal level

    When research groups are evaluated by an expert panel, it is an open question how one can determine the match between panel and research groups. In this paper, we outline two quantitative approaches that determine the cognitive distance between evaluators and evaluees, based on the journals they have published in. We use example data from four research evaluations carried out between 2009 and 2014 at the University of Antwerp. While the barycenter approach is based on a journal map, the similarity-adapted publication vector (SAPV) approach is based on the full journal similarity matrix. Both approaches determine an entity's profile based on the journals in which it has published. Subsequently, we determine the Euclidean distance between the barycenter or SAPV profiles of two entities as an indicator of the cognitive distance between them. Using a bootstrapping approach, we determine confidence intervals for these distances. As such, the present article constitutes a refinement of a previous proposal that operates on the level of Web of Science subject categories.
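
    The confidence intervals mentioned above come from resampling an entity's publications. The sketch below shows one plausible bootstrap of a journal-level profile distance; the resampling scheme, the simple Euclidean profile distance, and the toy journal lists are assumptions for illustration, not the paper's exact procedure.

```python
# Illustrative bootstrap of a distance between two entities' journal profiles.
# Each entity is represented by the journals of its publications; publications
# are resampled with replacement and the profile distance is recomputed.
import random
from collections import Counter

def profile(publication_journals, all_journals):
    counts = Counter(publication_journals)
    total = len(publication_journals)
    return [counts[j] / total for j in all_journals]

def distance(p, q):
    return sum((a - b) ** 2 for a, b in zip(p, q)) ** 0.5

def bootstrap_ci(pubs_a, pubs_b, n_iter=1000, alpha=0.05, seed=42):
    rng = random.Random(seed)
    journals = sorted(set(pubs_a) | set(pubs_b))
    dists = []
    for _ in range(n_iter):
        sample_a = [rng.choice(pubs_a) for _ in pubs_a]  # resample with replacement
        sample_b = [rng.choice(pubs_b) for _ in pubs_b]
        dists.append(distance(profile(sample_a, journals), profile(sample_b, journals)))
    dists.sort()
    lo = dists[int(alpha / 2 * n_iter)]
    hi = dists[int((1 - alpha / 2) * n_iter) - 1]
    return lo, hi

# Toy publication lists for a panel member and a research group
panel_member = ["J. Informetrics", "Scientometrics", "Scientometrics", "JASIST"]
research_group = ["Scientometrics", "JASIST", "JASIST", "Research Policy"]
print(bootstrap_ci(panel_member, research_group))
```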

    Corrigendum to “Is the expertise of evaluation panels congruent with the research interests of the research groups: a quantitative approach based on barycenters” [Journal of Informetrics 9(4) (2015) 704-721]

    In Rahman, Guns, Rousseau, and Engels (2015) we described several approaches to determine the cognitive distance between two units. One of these approaches was based on what we called barycenters in N dimensions. This note corrects that terminology and introduces the more adequate term ‘similarity-adapted publication vectors’.

    Predatory Open Access journals: A review of past screenings within the Flemish performance based research funding system (2014 – 2018)

    From 2013–2014 onwards, our group (ECOOM - UAntwerpen) has been monitoring Predatory Open Access publication patterns in Flemish (Belgian) SSH scholarship. In light of the Flemish Performance Based Research Funding System, these screening exercises are conducted to assist university review boards in deciding what is, and what is not, to be considered a peer-reviewed periodical. Each year, the results of these monitoring exercises are published as a report and presented to the Authoritative Panel. In the introductory part of this report, we present the general background against which these yearly screenings emerged. Second, we present the sources used and the methods deployed for the yearly screenings. Thereafter, we briefly present the yearly results these exercises yielded. In the third section, we present a more comprehensive analysis of the results. We conclude by reflecting on the past exercises and the findings presented in this report, and discuss some implications for colleagues and scholars manoeuvring through the contemporary journal landscape.