Applied Evaluative Informetrics: Part 1
This manuscript is a preprint version of Part 1 (General Introduction and
Synopsis) of the book Applied Evaluative Informetrics, to be published by
Springer in the summer of 2017. This book presents an introduction to the field
of applied evaluative informetrics, and is written for interested scholars and
students from all domains of science and scholarship. It sketches the field's
history, recent achievements, and its potential and limits. It explains the
notion of multi-dimensional research performance, and discusses the pros and
cons of 28 citation-, patent-, reputation- and altmetrics-based indicators. In
addition, it presents quantitative research assessment as an evaluation
science, and focuses on the role of extra-informetric factors in the
development of indicators, and on the policy context of their application. It
also discusses the way forward, both for users and for developers of
informetric tools.
Grand challenges in altmetrics: heterogeneity, data quality and dependencies
With increasing uptake among researchers, social media are finding their way into
scholarly communication and, under the umbrella term altmetrics, are starting to be utilized in
research evaluation. Fueled by technological possibilities and an increasing demand to
demonstrate impact beyond the scientific community, altmetrics have received great attention
as potential democratizers of the scientific reward system and indicators of societal impact. This
paper focuses on the current challenges for altmetrics. Heterogeneity, data quality and particular dependencies are identified as the three major issues and discussed in detail with an
emphasis on past developments in bibliometrics. The heterogeneity of altmetrics reflects the
diversity of the acts and online events, most of which take place on social media platforms. This
heterogeneity has made it difficult to establish a common definition or conceptual framework.
Data quality issues become apparent in the lack of accuracy, consistency and replicability of
various altmetrics, which is largely affected by the dynamic nature of social media events.
Furthermore, altmetrics are shaped by technical possibilities and are particularly dependent on
the availability of APIs and DOIs, strongly dependent on data providers and aggregators, and
potentially influenced by the technical affordances of underlying platforms.
Do altmetrics promote Open Access? An exploratory analysis on altmetric differences between types of access in the field of Physics
The promotion of Open Science needs new metrics that encourage openness in scientific practices and can help institutions to monitor it. In 2017, the European Commission (EC) created an Expert Group with the task of informing the Commission on the possibility of including altmetric indicators as potential metrics that could foster and monitor Open Science advancements, but it failed to show how these metrics can help to foster Open Science. The current paper analyses differences in altmetric scores between Green OA publications, Gold OA publications and non-OA publications. The goal of the paper is to empirically study whether altmetric indicators reinforce Open Access practices regardless of the type of access. We report a preliminary analysis based on two Physics journals. Our results show that Gold OA documents are best covered in Altmetric.com and receive higher mentions than documents with other types of access. This is especially troublesome in the case of Green OA, as it reflects that altmetric indicators promote a very specific type of access closely linked with the publishing industry.
Altmetrics as an Answer to the Need for Democratization of Research and Its Evaluation
In the evaluation of research, the same unequal structure present in the production of research is reproduced. Just as only a few very productive researchers account for most papers and citations received, only a few researchers are involved in the research evaluation process (as editorial board members of journals or as reviewers). To produce a high number of papers, receive many citations, and be involved in the evaluation of research papers, one needs to belong to the minority of giants with high productivity and great scientific success. Among editorial board members and reviewers, we often find the same minority of giants. In this paper, we apply an economic approach to interpret recent trends in research evaluation and derive a new interpretation of Altmetrics as a response to the need for democratization of research and its evaluation. In this context, the majority of pygmies can participate in evaluation through Altmetrics, whose use is more democratic, that is, much wider and open to all.
Theories of Informetrics and Scholarly Communication
Scientometrics has become an essential element in the practice and evaluation of science and research, including both the evaluation of individuals and national assessment exercises. Yet researchers and practitioners in this field have lacked clear theories to guide their work. As early as 1981, then-doctoral student Blaise Cronin published "The need for a theory of citing", a call to arms for the fledgling scientometric community to produce foundational theories upon which the work of the field could be based. More than three decades later, the time has come to reach out to the field again and ask how it has responded to this call.
This book compiles the foundational theories that guide informetrics and scholarly communication research. It is a much-needed compilation by leading scholars in the field, gathering together the theories that guide our understanding of authorship, citing, and impact.
The Many Publics of Science: Using Altmetrics to Identify Common Communication Channels by Scientific Field
Altmetrics have enabled new quantitative studies of science through social
media interactions. However, there are no models of science communication that
respond to the multiplicity of non-academic channels. Using the 3,653 authors
with the highest volume of altmetric mentions of their publications (2016-2020)
across the main channels (Twitter, News, Facebook, Wikipedia, Blogs, Policy
documents, and Peer reviews), we analyze where the audiences of each
discipline are located. The results reveal the generalities and specificities
of these new communication models and the differences between areas. These
findings are useful for the development of science communication policies and
strategies.