Investigating the interplay between fundamentals of national research systems: performance, investments and international collaborations
We discuss, at the macro-level of nations, the contribution of research
funding and rate of international collaboration to research performance, with
important implications for the science of science policy. In particular, we
cross-correlate suitable measures of these quantities with a
scientometric-based assessment of scientific success, studying both the average
performance of nations and their temporal dynamics in the space defined by
these variables during the last decade. We find significant differences among
nations in terms of efficiency in turning (financial) input into
bibliometrically measurable output, and we confirm that growth of international
collaboration correlates positively with scientific success, with significant
benefits brought by EU integration policies. Various geo-cultural clusters of
nations naturally emerge from our analysis. We critically discuss the factors
that may determine the observed patterns
A review of the characteristics of 108 author-level bibliometric indicators
An increasing demand for bibliometric assessment of individuals has led to a
growth of new bibliometric indicators as well as new variants or combinations
of established ones. The aim of this review is to contribute objective
facts about the usefulness of bibliometric indicators of the effects of
publication activity at the individual level. This paper reviews 108 indicators
that can potentially be used to measure performance at the individual author
level, and examines the complexity of their calculations in relation to what
they are supposed to reflect and their ease of end-user application. Comment: to be published in Scientometrics, 201
The Distribution of the Asymptotic Number of Citations to Sets of Publications by a Researcher or From an Academic Department Are Consistent With a Discrete Lognormal Model
How to quantify the impact of a researcher's or an institution's body of work
is a matter of increasing importance to scientists, funding agencies, and
hiring committees. The use of bibliometric indicators, such as the h-index or
the Journal Impact Factor, has become widespread despite their known
limitations. We argue that most existing bibliometric indicators are
inconsistent, biased, and, worst of all, susceptible to manipulation. Here, we
pursue a principled approach to the development of an indicator to quantify the
scientific impact of both individual researchers and research institutions
grounded on the functional form of the distribution of the asymptotic number of
citations. We validate our approach using the publication records of 1,283
researchers from seven scientific and engineering disciplines and the chemistry
departments at the 106 U.S. research institutions classified as "very high
research activity". Our approach has three distinct advantages. First, it
accurately captures the overall scientific impact of researchers at all career
stages, as measured by asymptotic citation counts. Second, unlike other
measures, our indicator is resistant to manipulation and rewards publication
quality over quantity. Third, our approach captures the time-evolution of the
scientific impact of research institutions. Comment: 20 pages, 11 figures, 3 tables
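The discrete lognormal model the abstract describes can be illustrated with a minimal sketch: estimating the two lognormal parameters from the sample moments of log-citation counts. This is a toy illustration under stated assumptions (the +1 shift and the synthetic citation counts are ours), not the authors' actual estimation procedure, which fits a discrete lognormal more carefully.

```python
import math
import statistics

def fit_lognormal(citations):
    """Estimate (mu, sigma) of a lognormal citation model from the
    sample moments of log-citations. The +1 shift keeps zero-citation
    papers in the sample; it is a simplification of this sketch, and a
    proper discrete lognormal fit is more involved."""
    logs = [math.log(c + 1) for c in citations]
    return statistics.fmean(logs), statistics.pstdev(logs)

# Synthetic citation counts, for illustration only
mu, sigma = fit_lognormal([0, 1, 2, 3, 5, 8, 13, 21, 34, 55])
print(round(mu, 2), round(sigma, 2))
```

Under such a model, an author's or department's impact can be summarized by the fitted (mu, sigma) rather than by a single ad hoc score.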
The scientific influence of nations on global scientific and technological development
Determining how scientific achievements influence the subsequent process of
knowledge creation is a fundamental step in order to build a unified ecosystem
for studying the dynamics of innovation and competitiveness. Relying separately
on data about scientific production on one side, through bibliometric
indicators, and about technological advancements on the other side, through
patents statistics, gives only a limited insight on the key interplay between
science and technology which, as a matter of fact, move forward together within
the innovation space. In this paper, using citation data of both research
papers and patents, we quantify the direct influence of the scientific outputs
of nations on further advancements in science and on the introduction of new
technologies. Our analysis highlights the presence of geo-cultural clusters of
nations with similar innovation system features, and unveils the heterogeneous
coupled dynamics of scientific and technological advancements. This study
represents a step forward in the buildup of an inclusive framework for
knowledge creation and innovation
A Review of Theory and Practice in Scientometrics
Scientometrics is the study of the quantitative aspects of the process of science as a communication system. It is centrally, but not only, concerned with the analysis of citations in the academic literature. In recent years it has come to play a major role in the measurement and evaluation of research performance. In this review we consider: the historical development of scientometrics, sources of citation data, citation metrics and the "laws" of scientometrics, normalisation, journal impact factors and other journal metrics, visualising and mapping science, evaluation and policy, and future developments
The success-index: an alternative approach to the h-index for evaluating an individual's research output
Among the most recent bibliometric indicators for normalizing the differences among fields of science in terms of citation behaviour, Kosmulski (J Informetr 5(3):481-485, 2011) proposed the NSP (number of successful papers) index. According to the author, NSP deserves much attention for its great simplicity and immediate meaning, equivalent to those of the h-index, while it has the disadvantage of being prone to manipulation and not very efficient in terms of statistical significance. In the first part of the paper, we introduce the success-index, aimed at reducing the NSP-index's limitations, although requiring more computing effort. Next, we present a detailed analysis of the success-index from the point of view of its operational properties and a comparison with those of the h-index. Particularly interesting is the examination of the success-index's scale of measurement, which is much richer than the h-index's. This makes the success-index much more versatile for different types of analysis, e.g., (cross-field) comparisons of the scientific output of (1) individual researchers, (2) researchers with different seniority, (3) research institutions of different size, (4) scientific journals, etc
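Since the abstract positions the success-index against the h-index, a minimal sketch of the standard h-index computation may help fix ideas; the success-index itself depends on field-specific citation thresholds that the abstract does not specify, so it is not reproduced here.

```python
def h_index(citations):
    """h-index: the largest h such that the author has at least
    h papers with at least h citations each (Hirsch, 2005)."""
    h = 0
    for rank, cites in enumerate(sorted(citations, reverse=True), start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

# A researcher with five papers cited 10, 8, 5, 4 and 3 times has h = 4:
# four papers have at least 4 citations each, but not five with at least 5.
print(h_index([10, 8, 5, 4, 3]))  # 4
```

The manipulation concern raised in the abstracts is visible even in this sketch: a few extra citations to papers near the rank-h boundary can raise h by one, which threshold-based variants such as NSP inherit in different form.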
Bibliometric Evaluation vs. Informed Peer Review: Evidence from Italy
A relevant question for the organization of large-scale research assessments is whether bibliometric evaluation and informed peer review, where reviewers know where the work was published, yield similar results. A positive answer would suggest, for instance, that less costly bibliometric evaluation might - at least partly - replace informed peer review, or that bibliometric evaluation could reliably monitor research in between assessment exercises. We draw on our experience of evaluating Italian research in Economics, Business and Statistics, where almost 12,000 publications dated 2004-2010 were assessed. A random sample from the available population of journal articles shows that informed peer review and bibliometric analysis produce similar evaluations of the same set of papers. Whether because of independent convergence in assessment, or the influence of bibliometric information on the community of reviewers, the implication for the organization of these exercises is that these two approaches are substitutes