LIFE evaluation report: baseline bibliometric analysis
Issued as final report. Arizona State University
How journal rankings can suppress interdisciplinary research. A comparison between Innovation Studies and Business & Management
This study provides quantitative evidence on how the use of journal rankings
can disadvantage interdisciplinary research in research evaluations. Using
publication and citation data, it compares the degree of interdisciplinarity
and the research performance of a number of Innovation Studies units with that
of leading Business & Management schools in the UK. On the basis of various
mappings and metrics, this study shows that: (i) Innovation Studies units are
consistently more interdisciplinary in their research than Business &
Management schools; (ii) the top journals in the Association of Business
Schools' rankings span a less diverse set of disciplines than lower-ranked
journals; (iii) this results in a more favourable assessment of the performance
of Business & Management schools, which are more disciplinary-focused. This
citation-based analysis challenges the journal ranking-based assessment. In
short, the investigation illustrates how ostensibly 'excellence-based' journal
rankings exhibit a systematic bias in favour of mono-disciplinary research. The
paper concludes with a discussion of the implications of these phenomena, in
particular how the bias is likely to negatively affect the evaluation and
associated financial resourcing of interdisciplinary research organisations,
and may result in researchers becoming more compliant with disciplinary
authority over time.
Comment: 41 pages, 10 figures
The Distribution of the Asymptotic Number of Citations to Sets of Publications by a Researcher or From an Academic Department Are Consistent With a Discrete Lognormal Model
How to quantify the impact of a researcher's or an institution's body of work
is a matter of increasing importance to scientists, funding agencies, and
hiring committees. The use of bibliometric indicators, such as the h-index or
the Journal Impact Factor, has become widespread despite their known
limitations. We argue that most existing bibliometric indicators are
inconsistent, biased, and, worst of all, susceptible to manipulation. Here, we
pursue a principled approach to the development of an indicator to quantify the
scientific impact of both individual researchers and research institutions
grounded in the functional form of the distribution of the asymptotic number of
citations. We validate our approach using the publication records of 1,283
researchers from seven scientific and engineering disciplines and the chemistry
departments at the 106 U.S. research institutions classified as "very high
research activity". Our approach has three distinct advantages. First, it
accurately captures the overall scientific impact of researchers at all career
stages, as measured by asymptotic citation counts. Second, unlike other
measures, our indicator is resistant to manipulation and rewards publication
quality over quantity. Third, our approach captures the time-evolution of the
scientific impact of research institutions.
Comment: 20 pages, 11 figures, 3 tables
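The indicator hinges on fitting a lognormal functional form to a body of work's citation counts. The sketch below is an illustration of the general idea rather than the authors' estimator: it fits location and scale parameters to synthetic citation data and shows why such a fit resists padding, since appending many uncited papers drags the fitted location down instead of up.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic record: asymptotic citation counts whose logs are roughly
# normal, i.e. a discrete lognormal (invented data, not the study's).
citations = np.floor(rng.lognormal(mean=3.0, sigma=1.0, size=500)).astype(int)

def lognormal_params(cites):
    """Estimate location and scale of the underlying lognormal from the
    mean and std of log(c + 1); the +1 keeps zero-cited papers usable."""
    logs = np.log(np.asarray(cites) + 1.0)
    return logs.mean(), logs.std()

mu, sigma = lognormal_params(citations)

# Padding the record with 500 uncited papers lowers the fitted location:
# sheer quantity without quality worsens, not improves, the indicator.
padded = np.concatenate([citations, np.zeros(500, dtype=int)])
mu_padded, _ = lognormal_params(padded)
assert mu_padded < mu
```

This is the sense in which a distribution-based indicator "rewards publication quality over quantity": the location parameter tracks the typical impact of a paper, not the size of the record.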
University departments evaluation: a multivariate approach
The aim of this paper is to present a new model, based on multivariate statistical analyses, that expresses a synthetic judgement on departments' activities by taking into consideration the whole set of indicators describing them both as aggregations of researchers and as autonomous university bodies. The model, based on Principal Component Analysis and Cluster Analysis, makes it possible both to explain the determinants of departments' performance and to classify departments into homogeneous groups. The paper shows the results obtained by testing the proposed model on the departments of the University of Naples “L’Orientale”, using data extracted from the 2007 assessment report to the Ministry of University and Research.
Keywords: Evaluation, Departments, Multivariate statistics
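As a rough illustration of the PCA-plus-clustering pipeline, the sketch below groups departments with invented indicators; the indicator names, values, and the crude sign-based grouping are all assumptions for illustration, not the paper's data or method.

```python
import numpy as np

rng = np.random.default_rng(42)

# Invented indicators (rows = departments; columns might be publications,
# funding, PhD completions, teaching load): two contrasting profiles.
research_led = rng.normal(loc=[8.0, 6.0, 5.0, 2.0], scale=0.5, size=(6, 4))
teaching_led = rng.normal(loc=[2.0, 1.5, 1.0, 7.0], scale=0.5, size=(6, 4))
X = np.vstack([research_led, teaching_led])

# Principal Component Analysis: standardise, then project onto the
# directions of largest variance (the right singular vectors).
Z = (X - X.mean(axis=0)) / X.std(axis=0)
_, _, Vt = np.linalg.svd(Z, full_matrices=False)
scores = Z @ Vt[:2].T  # first two principal-component scores per department

# Crude one-dimensional grouping on the sign of the first component,
# standing in for a proper cluster analysis: it recovers the two profiles.
labels = (scores[:, 0] > 0).astype(int)
```

The principal-component scores serve the "explain the determinants" role (which indicators drive the leading components), while the grouping step serves the "classify into homogeneous groups" role.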
Software tools for conducting bibliometric analysis in science: An up-to-date review
Bibliometrics has become an essential tool for assessing and analyzing the output of scientists, cooperation between universities, the effect of state-owned science funding on national research and development performance, and educational efficiency, among other applications. Professionals and scientists therefore need a range of theoretical and practical tools to measure experimental data. This review provides an up-to-date overview of the various tools available for conducting bibliometric and scientometric analyses, including sources of data acquisition, performance analysis, and visualization tools. The included tools are divided into three categories: general bibliometric and performance analysis, science mapping analysis, and libraries; a description of each is provided. A comparative analysis of database source support, pre-processing capabilities, and analysis and visualization options is also provided in order to facilitate understanding. Although there are numerous bibliometric databases from which to obtain data for bibliometric and scientometric analysis, each has been developed for a different purpose. The number of exportable records ranges between 500 and 50,000, and the coverage of the different science fields is unequal across databases. Among the analyzed tools, Bibliometrix contains the most extensive set of techniques and is suitable for practitioners through Biblioshiny. VOSviewer has excellent visualization capabilities and can load and export information from many sources. SciMAT is the tool with the most powerful pre-processing and export capabilities. Given the variability of features, users need to decide on the desired analysis output and choose the option that best fits their aims.
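The performance-analysis features these tools automate ultimately reduce to indicator computations over publication records. As a small self-contained illustration, here is the classic h-index, a staple of such tools, computed directly:

```python
def h_index(citations):
    """Largest h such that at least h papers have h or more citations."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

assert h_index([10, 8, 5, 4, 3]) == 4  # four papers with >= 4 citations
assert h_index([]) == 0
```

In practice the value of the reviewed tools lies less in such arithmetic than in the data acquisition, pre-processing, and visualization surrounding it.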
The Impact of Industry Collaboration on Academic Research Output: A Dynamic Panel Data Analysis
The aim of this paper is to analyse the impact of university knowledge and technology transfer activities on academic research output. Specifically, we study whether researchers with collaborative links to the private sector publish less than their peers without such links, after controlling for other sources of heterogeneity. We report findings from a longitudinal dataset on researchers from two engineering departments in the UK between 1985 and 2006. Our results indicate that researchers with industrial links publish significantly more than their peers. Academic productivity, though, is higher for low levels of industry involvement than for high levels.
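The panel approach controls for unobserved researcher heterogeneity. As a simplified stand-in (a static within/fixed-effects estimator on synthetic data, not the dynamic specification or the authors' dataset), the snippet below shows how demeaning by researcher strips out fixed effects before estimating the collaboration coefficient:

```python
import numpy as np

rng = np.random.default_rng(1)

# Invented researcher-year panel: publications depend on collaboration
# intensity (true coefficient 1.5) plus a fixed researcher effect.
n_researchers, n_years = 50, 10
fixed_effect = rng.normal(0.0, 2.0, size=n_researchers)
collab = rng.uniform(0.0, 1.0, size=(n_researchers, n_years))
pubs = (3.0 + 1.5 * collab + fixed_effect[:, None]
        + rng.normal(0.0, 0.5, size=(n_researchers, n_years)))

# Within estimator: demeaning each researcher's series removes the fixed
# effect, so the pooled regression recovers the collaboration coefficient.
y = (pubs - pubs.mean(axis=1, keepdims=True)).ravel()
x = (collab - collab.mean(axis=1, keepdims=True)).ravel()
beta = (x @ y) / (x @ x)  # close to the true value of 1.5
```

A pooled regression on the raw data would confound the collaboration effect with researcher-level differences in baseline productivity, which is exactly the heterogeneity the paper controls for.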
MODELS FOR MEASURING THE RESEARCH PERFORMANCE AND MANAGEMENT OF THE PUBLIC LABS
The science sector in some European countries (e.g. Italy) is undergoing strategic restructuring due to budget cuts. The measurement and evaluation of the research performance (metrics) of its units (public research institutes) is therefore needed. General models to assess the R&D performance of a public research lab are presented here. The methodology uses discriminant analysis, and the results are two canonical discriminant functions (direct and Wilks' methods) that can provide indications about the performance of research bodies. The functions are successfully applied to 200 public research institutes belonging to the Italian National Research Council. These functions are also tools for informing decisions and actions to improve research performance, especially through more effective use of existing resources and the reduction of X-inefficiency. Some policy and management implications are discussed.
Keywords: Research performance, Performance measurement, Performance indicators, R&D evaluation, Public research lab, Discriminant analysis, X-inefficiency
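The canonical discriminant functions at the heart of the paper come from standard discriminant analysis. The sketch below applies Fisher's linear discriminant, the textbook two-group case, to invented lab indicators; the data, threshold rule, and variable names are assumptions for illustration, not the paper's fitted functions.

```python
import numpy as np

rng = np.random.default_rng(7)

# Invented lab indicators (e.g. publications per researcher, external
# funding share) for a high-performing and a low-performing group.
high = rng.normal(loc=[5.0, 0.6], scale=[1.0, 0.1], size=(30, 2))
low = rng.normal(loc=[2.0, 0.3], scale=[1.0, 0.1], size=(30, 2))

# Fisher's linear discriminant: w = Sw^{-1} (m1 - m2), with Sw the pooled
# within-group covariance; labs are scored by their projection onto w.
m1, m2 = high.mean(axis=0), low.mean(axis=0)
Sw = np.cov(high, rowvar=False) + np.cov(low, rowvar=False)
w = np.linalg.solve(Sw, m1 - m2)

threshold = w @ (m1 + m2) / 2.0         # midpoint decision rule
scores = np.vstack([high, low]) @ w
predicted_high = scores > threshold     # which labs the function flags
```

The fitted weights `w` play the role the paper assigns to its discriminant functions: a scoring rule that flags which labs fall on the high- or low-performing side and where resources might be redirected.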
Serbia Robotics Hall of Fame: the Impact of the Past
The paper presents a list of the most influential works of Serbian robotics. The list has been synthesized using document citation data from the Elsevier Scopus database, and it shows that the impact of Serbia, compared with neighboring countries, has been disproportionately strong given Serbia's economic and general scientific strength. However, the analysis also reveals that Serbia's contribution has weakened significantly during the last ten years and that Serbian robotics has been declining relative to neighboring countries.