Evaluating Citation Functions in CiTO: Cognitive Issues
Abstract. Networks of citations are a key tool for referencing, disseminating and evaluating research results. The task of characterising the functional role of citations in scientific literature is very difficult, not only for software agents but for humans, too. The main problem is that the mental models of different annotators hardly ever converge to a single shared opinion. The goal of this paper is to investigate how an existing reference model for classifying citations, namely CiTO (Citation Typing Ontology), is interpreted and used by annotators of scientific literature. We present an experiment capturing the cognitive processes behind subjects' decisions in annotating papers with CiTO, and we provide initial ideas to refine future releases of CiTO.
Absolute and specific measures of research group excellence
A desirable goal of scientific management is to introduce, if it exists, a simple and reliable way to measure the scientific excellence of publicly-funded research institutions and universities, to serve as a basis for their ranking and financing. While citation-based indicators and metrics are easily accessible, they are far from being universally accepted as a way to automate or inform evaluation processes or to replace evaluations based on peer review. Here we consider absolute measurements of research excellence at an amalgamated, institutional level and specific measures of research excellence as performance per head. Using biology research institutions in the UK as a test case, we examine the correlations between peer-review-based and citation-based measures of research excellence on these two scales. We find that citation-based indicators are very highly correlated with peer-evaluated measures of group strength but are poorly correlated with measures of group quality. Thus, and almost paradoxically, our analysis indicates that citation counts could possibly form a basis for deciding how to fund research institutions, but they should not be used as a basis for ranking them in terms of quality.
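As a rough illustration of the comparison this abstract describes, the sketch below correlates a citation-based indicator with peer-review scores at both the absolute (whole-group) and per-head scales. All data values and variable names are hypothetical; this is not the authors' dataset or analysis pipeline, only a minimal sketch of the two-scale comparison.

```python
# Hypothetical sketch: correlating a citation-based indicator with
# peer-review scores at the absolute (group) and per-head scales.
# The numbers below are illustrative, not taken from the paper.
from statistics import correlation  # Pearson correlation (Python 3.10+)

# Illustrative institution-level records:
# (total citations, peer score for group strength, peer score for quality, staff headcount)
groups = [
    (12000, 85, 3.1, 140),
    (8000,  70, 3.4,  90),
    (3000,  40, 2.8,  45),
    (1500,  25, 3.0,  20),
    (600,   12, 2.5,  10),
]

citations      = [g[0] for g in groups]
peer_strength  = [g[1] for g in groups]
peer_quality   = [g[2] for g in groups]
per_head_cites = [g[0] / g[3] for g in groups]  # specific measure: citations per head

# Absolute scale: total citations vs peer-evaluated group strength.
print("citations vs strength:", correlation(citations, peer_strength))

# Specific scale: per-head citations vs peer-evaluated group quality.
print("per-head citations vs quality:", correlation(per_head_cites, peer_quality))
```

In the paper's finding, the first correlation (absolute scale) is very high while the second (per-head, quality) is weak; with made-up data like the above, the numbers carry no such meaning and serve only to show the shape of the comparison.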