Search for Evergreens in Science: A Functional Data Analysis
Evergreens in science are papers that display a continual rise in annual
citations without decline, at least within a sufficiently long time period.
Aiming to better understand evergreens in particular and patterns of citation
trajectory in general, this paper develops a functional data analysis method to
cluster citation trajectories of a sample of 1699 research papers published in
1980 in the American Physical Society (APS) journals. We propose a functional
Poisson regression model for individual papers' citation trajectories, and fit
the model to the observed 30-year citations of individual papers by functional
principal component analysis and maximum likelihood estimation. Based on the
estimated paper-specific coefficients, we apply the K-means clustering
algorithm to cluster papers into different groups, uncovering general types
of citation trajectories. The result demonstrates the existence of an evergreen
cluster of papers that do not exhibit any decline in annual citations over 30
years. (Comment: 40 pages, 9 figures)
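The pipeline this abstract describes (summarize each paper's citation trajectory by a few estimated coefficients, then cluster the coefficients with K-means) can be illustrated with a small, self-contained sketch. Everything below is hypothetical: the data are simulated, and a simple level-and-trend fit stands in for the paper's functional principal component scores.

```python
import random
import math

random.seed(0)

# Simulate 30-year citation trajectories for three illustrative paper types
# (shapes chosen for the sketch, not the paper's actual data):
# "evergreen" (steadily rising), "peaked" (early rise then decline), "flat".
def simulate(kind, years=30):
    traj = []
    for t in range(1, years + 1):
        if kind == "evergreen":
            lam = 0.5 * t                              # keeps rising
        elif kind == "peaked":
            lam = 8 * math.exp(-((t - 6) ** 2) / 20)   # early peak, then decay
        else:
            lam = 2.0                                  # roughly constant
        # Poisson(lam) draw by counting unit-rate exponential arrivals in [0, lam)
        count, total = 0, random.expovariate(1.0)
        while total < lam:
            count += 1
            total += random.expovariate(1.0)
        traj.append(count)
    return traj

papers = ([simulate("evergreen") for _ in range(20)]
          + [simulate("peaked") for _ in range(20)]
          + [simulate("flat") for _ in range(20)])

# Summarize each trajectory by two coefficients standing in for the
# paper-specific estimates: overall level and linear trend (least squares).
def coeffs(traj):
    n = len(traj)
    mean_t = (n + 1) / 2
    mean_c = sum(traj) / n
    slope = (sum((t + 1 - mean_t) * (c - mean_c) for t, c in enumerate(traj))
             / sum((t + 1 - mean_t) ** 2 for t in range(n)))
    return (mean_c, slope)

X = [coeffs(p) for p in papers]

# Plain K-means (k = 3) on the (level, slope) pairs.
def kmeans(points, k, iters=50):
    centers = random.sample(points, k)
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for p in points:
            i = min(range(k), key=lambda j: (p[0] - centers[j][0]) ** 2
                                            + (p[1] - centers[j][1]) ** 2)
            groups[i].append(p)
        centers = [(sum(q[0] for q in g) / len(g), sum(q[1] for q in g) / len(g))
                   if g else centers[i] for i, g in enumerate(groups)]
    return centers, groups

centers, groups = kmeans(X, 3)
# The cluster whose center has the largest positive slope plays the role of
# the "evergreen" group: citations that keep rising over the 30 years.
evergreen = max(centers, key=lambda c: c[1])
print("evergreen cluster center (level, slope):", evergreen)
```

The key design point mirrored from the abstract is that clustering happens in the low-dimensional coefficient space, not on the raw 30-dimensional count vectors, which makes the trajectory shapes comparable across papers.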
How reliable are systematic reviews in empirical software engineering?
BACKGROUND – The systematic review is becoming a more commonly employed research instrument in empirical software engineering. Before undue reliance is placed on the outcomes of such reviews, it would seem useful to consider the robustness of the approach in this particular research context.
OBJECTIVE – The aim of this study is to assess the reliability of systematic reviews as a research instrument. In particular, we wish to investigate the consistency of process and the stability of outcomes.
METHOD – We compare the results of two independent reviews undertaken with a common research question.
RESULTS – The two reviews find similar answers to the research question, although the means of arriving at those answers vary.
CONCLUSIONS – In addressing a well-bounded research question, groups of researchers with similar domain experience can arrive at the same review outcomes, even though they may do so in different ways.
This provides evidence that, in this context at least, the systematic review is a robust research method.
Comparing the writing style of real and artificial papers
Recent years have witnessed the increase of competition in science. While
promoting the quality of research in many cases, an intense competition among
scientists can also trigger unethical scientific behaviors. To increase the
total number of published papers, some authors even resort to software tools
that are able to produce grammatical, but meaningless scientific manuscripts.
Because automatically generated papers can be misunderstood as real papers, it
becomes of paramount importance to develop means to identify these scientific
frauds. In this paper, I devise a methodology to distinguish real manuscripts
from those generated with SCIGen, an automatic paper generator. Upon modeling
texts as complex networks (CN), it was possible to discriminate real from fake
papers with at least 89% accuracy. A systematic analysis of feature
relevance revealed that accessibility and betweenness were useful in
particular cases, even though their relevance depended upon the dataset. The
successful application of the methods described here shows, as a proof of
principle, that network features can be used to identify scientific gibberish
papers. In addition, the CN-based approach can be combined in a straightforward
fashion with traditional statistical language processing methods to improve
performance in identifying artificially generated papers. (Comment: To appear in Scientometrics, 2015)
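The complex-network (CN) modeling step can be sketched in a few lines: represent a text as a word co-occurrence network and extract topological statistics as a feature vector. Everything here is illustrative: the sample text, the window size, and the two features (average degree, mean local clustering coefficient) are stand-ins, since the paper itself measures richer quantities such as accessibility and betweenness on real and SCIgen-generated corpora.

```python
from collections import defaultdict
from itertools import combinations

# Build a word co-occurrence network: words appearing within a small
# sliding window of each other become linked nodes (no self-loops).
def cooccurrence_graph(text, window=3):
    words = text.lower().split()
    adj = defaultdict(set)
    for i, w in enumerate(words):
        for j in range(i + 1, min(i + window, len(words))):
            if words[j] != w:
                adj[w].add(words[j])
                adj[words[j]].add(w)
    return adj

# Feature 1: average node degree of the network.
def avg_degree(adj):
    return sum(len(nbrs) for nbrs in adj.values()) / len(adj)

# Feature 2: mean local clustering coefficient over nodes with degree >= 2
# (fraction of a node's neighbor pairs that are themselves linked).
def mean_clustering(adj):
    coeffs = []
    for v, nbrs in adj.items():
        if len(nbrs) < 2:
            continue
        links = sum(1 for a, b in combinations(nbrs, 2) if b in adj[a])
        coeffs.append(2 * links / (len(nbrs) * (len(nbrs) - 1)))
    return sum(coeffs) / len(coeffs) if coeffs else 0.0

sample = ("the network approach maps each distinct word to a node and "
          "links words that appear close together in the text")
g = cooccurrence_graph(sample)
features = (avg_degree(g), mean_clustering(g))
print("feature vector (avg degree, clustering):", features)
```

Feature vectors like this, computed for corpora of known-real and known-generated papers, can then be fed to any standard classifier; the abstract's point is that the topology of the co-occurrence network differs systematically between genuine prose and generated gibberish.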
Software tools for conducting bibliometric analysis in science: An up-to-date review
Bibliometrics has become an essential tool for assessing and analyzing the output of scientists, cooperation between
universities, the effect of state-owned science funding on national research and development performance and educational
efficiency, among other applications. Therefore, professionals and scientists need a range of theoretical and practical
tools to measure experimental data. This review aims to provide an up-to-date review of the various tools available
for conducting bibliometric and scientometric analyses, including the sources of data acquisition, performance analysis
and visualization tools. The included tools were divided into three categories: general bibliometric and performance
analysis, science mapping analysis, and libraries; a description of each is
provided. A comparative analysis of database source support, pre-processing
capabilities, and analysis and visualization options is also provided to
facilitate understanding. Although there are numerous bibliometric databases
from which to obtain data for bibliometric and scientometric analysis, they
have been developed for different purposes. The number of exportable records
ranges between 500 and 50,000, and the coverage of the different science fields
is unequal across databases. Concerning the analyzed tools, Bibliometrix
contains the most extensive set of techniques and is suitable for practitioners
through Biblioshiny. VOSviewer offers excellent visualization capabilities and
can load and export information from many sources. SciMAT is the tool with the
most powerful pre-processing and export capabilities. In view of the variability
of features, users need to decide on the desired analysis output and choose the
option that best fits their aims.
The ghosts I called I can't get rid of now: The Keynes-Tinbergen-Friedman-Phillips Critique of Keynesian Macroeconometrics
This chapter offers a fresh perspective on the much publicised dispute between those followers of Keynes who presented econometric evidence in favour of a Phillips curve trade-off, and those monetarists who presented counter econometric evidence. Contrary to common perceptions, the collapse of the Keynesian Phillips curve was a vindication of a common critique of macroeconometric practices, which was jointly authored by John Maynard Keynes, Jan Tinbergen, Milton Friedman and A.W.H. 'Bill' Phillips. This analysis is informed by the usual sources, plus two sources which had been thought to be no longer in existence (Phillips' private papers and the London School of Economics (LSE) Methodology, Measurement and Testing (M2T) Staff Seminar records), plus two essays by Keynes (1938a, 1938b) which have been overlooked in this context.
ISBN: 033373045