
    Large-Scale Analysis of the Accuracy of the Journal Classification Systems of Web of Science and Scopus

    Journal classification systems play an important role in bibliometric analyses. The two most important bibliographic databases, Web of Science and Scopus, each provide a journal classification system. However, no study has systematically investigated the accuracy of these classification systems. To examine and compare the accuracy of journal classification systems, we define two criteria on the basis of direct citation relations between journals and categories. We use Criterion I to select journals that have weak connections with their assigned categories, and we use Criterion II to identify journals that are not assigned to categories with which they have strong connections. If a journal satisfies either of the two criteria, we conclude that its assignment to categories may be questionable. Accordingly, we identify all journals with questionable classifications in Web of Science and Scopus. Furthermore, we perform a more in-depth analysis for the field of Library and Information Science to assess whether our proposed criteria are appropriate and whether they yield meaningful results. It turns out that, according to our citation-based criteria, Web of Science performs significantly better than Scopus in terms of the accuracy of its journal classification system.
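
    The abstract describes the two criteria only informally. As a rough illustration, a journal's citation traffic to each category can be compared with the categories the database assigns it to. The sketch below is a minimal, hypothetical reading of that idea; the threshold values and variable names are assumptions for illustration, not the paper's actual parameters.

```python
# Hypothetical sketch of the two citation-based criteria described above.
# citation_share: fraction of a journal's citation traffic that goes to each
# category; assigned: categories the database assigns the journal to.
# The thresholds below are illustrative assumptions, not the paper's values.

WEAK_SHARE = 0.05    # Criterion I: an assigned category with less than this share
STRONG_SHARE = 0.25  # Criterion II: a non-assigned category with more than this share

def questionable_assignment(citation_share: dict[str, float], assigned: set[str]) -> bool:
    # Criterion I: the journal is only weakly connected to an assigned category.
    weak = any(citation_share.get(cat, 0.0) < WEAK_SHARE for cat in assigned)
    # Criterion II: the journal is strongly connected to a category it is not assigned to.
    strong = any(share > STRONG_SHARE for cat, share in citation_share.items()
                 if cat not in assigned)
    return weak or strong

# Example: a journal assigned to "Information Science & Library Science" that
# mostly cites and is cited by "Computer Science" journals would be flagged.
print(questionable_assignment(
    {"Information Science & Library Science": 0.03, "Computer Science": 0.60},
    {"Information Science & Library Science"}))
```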

    Construction of a Pragmatic Base Line for Journal Classifications and Maps Based on Aggregated Journal-Journal Citation Relations

    A number of journal classification systems have been developed in bibliometrics since the launch of the Citation Indices by the Institute for Scientific Information (ISI) in the 1960s. These systems are used to normalize citation counts with respect to field-specific citation patterns. The best known system is the so-called "Web-of-Science Subject Categories" (WCs). In other systems, papers are classified by algorithmic solutions. Using the Journal Citation Reports 2014 of the Science Citation Index and the Social Science Citation Index (n of journals = 11,149), we examine options for developing a new system based on journal classifications into subject categories using aggregated journal-journal citation data. Combining routines in VOSviewer and Pajek, a tree-like classification is developed. At each level one can generate a map of science for all the journals subsumed under a category. Nine major fields are distinguished at the top level. Further decomposition of the social sciences is pursued, for the sake of example, with a focus on journals in library and information science (LIS) and science studies (STS). The new classification system improves on alternative options by avoiding the problem of randomness in each run that has made algorithmic solutions hitherto irreproducible. Limitations of the new system are discussed (e.g. the classification of multi-disciplinary journals). The system's usefulness for field-normalization in bibliometrics should be explored in future studies. Comment: accepted for publication in the Journal of Informetrics, 20 July 201
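
    The classification itself is built with VOSviewer and Pajek, but the underlying input is an aggregated journal-journal citation matrix. A minimal sketch of that aggregation step, assuming hypothetical article-level records (the journal names below are made up for illustration):

```python
from collections import Counter

# Hypothetical article-level citation records: (citing_journal, cited_journal).
# In the paper these are aggregated from the Journal Citation Reports 2014.
citations = [
    ("J. Informetrics", "Scientometrics"),
    ("J. Informetrics", "Scientometrics"),
    ("Scientometrics", "JASIST"),
    ("JASIST", "J. Informetrics"),
]

# Aggregate to a journal-journal citation matrix (stored sparsely as a Counter),
# which tools such as VOSviewer and Pajek can then cluster and map.
matrix = Counter(citations)
for (citing, cited), count in sorted(matrix.items()):
    print(f"{citing} -> {cited}: {count}")
```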

    Revisiting h measured on UK LIS and IR academics

    A brief communication appearing in this journal ranked UK LIS and (some) IR academics by their h-index using data derived from Web of Science. In this brief communication, the same academics were re-ranked using other popular citation databases. It was found that, for academics who publish more in computer science forums, h was significantly different because of highly cited papers missed by Web of Science; consequently, their ranks changed substantially. The study was widened to a broader set of UK LIS and IR academics, where results showed similar statistically significant differences. A variant of h, hmx, was introduced that allows the academics to be ranked using all citation databases together.
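
    The h-index itself is straightforward to compute from a researcher's citation counts: h is the largest number such that h of their papers each have at least h citations. The hmx variant is defined in the paper and is not reproduced here; the sketch below shows only the standard h, with made-up citation counts.

```python
def h_index(citations: list[int]) -> int:
    """Largest h such that h papers have at least h citations each."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

# The same researcher can get different h values from different databases
# simply because each database indexes (and therefore counts) different papers.
print(h_index([10, 8, 5, 4, 3]))          # -> 4
print(h_index([25, 12, 10, 8, 5, 4, 3]))  # extra highly cited papers found -> 5
```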

    Journal Maps, Interactive Overlays, and the Measurement of Interdisciplinarity on the Basis of Scopus Data (1996-2012)

    Using Scopus data, we construct a global map of science based on aggregated journal-journal citations from 1996-2012 (N of journals = 20,554). This base map enables users to overlay downloads from Scopus interactively. Using a single year (e.g., 2012), results can be compared with mappings based on the Journal Citation Reports in the Web of Science (N = 10,936). The Scopus maps are more detailed at both the local and global levels because of their greater coverage, including, for example, the arts and humanities. The base maps can be interactively overlaid with journal distributions in sets downloaded from Scopus, for example, for the purpose of portfolio analysis. Rao-Stirling diversity can be used as a measure of interdisciplinarity in the sets under study. Maps at the global and the local level, however, can be very different because of the different levels of aggregation involved. Two journals, for example, can both belong to the humanities in the global map, but participate in different specialty structures locally. The base map and interactive tools are available online (with instructions) at http://www.leydesdorff.net/scopus_ovl. Comment: accepted for publication in the Journal of the Association for Information Science and Technology (JASIST)
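
    Rao-Stirling diversity combines how a portfolio is spread across categories with how distant those categories are from each other: the sum of p_i * p_j * d_ij over pairs of distinct categories, where p_i is the proportion of the set in category i and d_ij is a distance between categories (often 1 minus the cosine similarity of their citation profiles). A minimal sketch, assuming made-up proportions and distances rather than values from the paper:

```python
from itertools import permutations

def rao_stirling(proportions: dict[str, float],
                 distance: dict[tuple[str, str], float]) -> float:
    """Rao-Stirling diversity: sum of p_i * p_j * d_ij over ordered pairs i != j."""
    return sum(proportions[i] * proportions[j] * distance[(i, j)]
               for i, j in permutations(proportions, 2))

# Hypothetical portfolio: 60% of papers in category A, 30% in B, 10% in C,
# with symmetric pairwise distances (e.g., 1 - cosine similarity of citation profiles).
p = {"A": 0.6, "B": 0.3, "C": 0.1}
d = {("A", "B"): 0.2, ("B", "A"): 0.2,
     ("A", "C"): 0.9, ("C", "A"): 0.9,
     ("B", "C"): 0.7, ("C", "B"): 0.7}
print(round(rao_stirling(p, d), 3))  # -> 0.222
```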

    Article level metrics: a look beyond the journal impact factor

    The journal Impact Factor (IF), developed by Eugene Garfield at the Institute for Scientific Information (ISI), reflects the average number of times articles from the journal published in the past two years have been cited in the Journal Citation Reports (JCR) year. The Impact Factor is calculated by dividing the number of citations that articles published in the two previous years received in the JCR year by the total number of articles published in those two years. For example, if a journal published 200 papers in 2013 and 2014, and those papers received 400 citations in 2015, then the journal's 2015 IF would be 2.0. The Impact Factor uses Thomson Reuters (ISI Web of Knowledge) citation data. The Impact Factor citation data was first derived from the Science Citation Index, a citation index created by Garfield and produced by the Institute for Scientific Information (ISI). ISI was later acquired by Thomson Reuters along with the Science Citation Index, which Thomson Reuters expanded into the Science Citation Index Expanded. That index is now housed in the Web of Science, a subscription-based scientific citation indexing service encompassing six other online databases. Today, Thomson Reuters calculates IFs using data from all of the journals indexed in the Web of Science and releases an IF listing annually in its Journal Citation Reports, which is available with a paid Web of Science subscription.
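
    The arithmetic in the example above is a simple ratio. A minimal sketch of the two-year IF calculation, using the numbers given in the abstract:

```python
def impact_factor(citations_in_jcr_year: int, items_published_prev_two_years: int) -> float:
    """Two-year journal Impact Factor: citations received in the JCR year by items
    published in the two preceding years, divided by the number of those items."""
    return citations_in_jcr_year / items_published_prev_two_years

# Example from the text: 200 papers published in 2013-2014 received
# 400 citations in 2015, so the 2015 IF is 2.0.
print(impact_factor(400, 200))  # -> 2.0
```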

    Short-Term Impact of the Chilean Journal of Agricultural Research: A Bibliometric Analysis

    Indexed in: Web of Science; SciELO. In January 2007, the Chilean Journal of Agricultural Research was indexed by the Institute for Scientific Information (ISI). This paper reviews the research that has been published since 2007, using records extracted from the Web of Science database. The published papers were mostly affiliated with researchers from Chile, and six of the ten most-contributing countries were from Latin America. The analysis by institution showed Universidad de Concepción as the most prolific, although this result is not valid: a lack of standardization in how the Instituto de Investigaciones Agropecuarias (INIA) recorded its address on each paper caused the corresponding records to be disaggregated. This was verified by manually curating each record affiliated with any of the centers belonging to INIA. The journal has a self-citation rate of 19.3%, a value that is relatively high compared with other journals in the same subject category listed in the Journal Citation Reports 2010. Finally, this work should be considered a bibliometric snapshot of the current situation of the journal that will serve as a benchmark when new evaluations are made in a few years' time. http://ref.scielo.org/ff8v7
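
    The 19.3% self-citation rate reported above is the share of citations the journal receives that come from the journal itself. A minimal sketch of that ratio, assuming made-up citing-journal counts rather than the paper's data:

```python
def self_citation_rate(cites_by_citing_journal: dict[str, int], journal: str) -> float:
    """Fraction of all citations received by `journal` that come from `journal` itself."""
    total = sum(cites_by_citing_journal.values())
    return cites_by_citing_journal.get(journal, 0) / total if total else 0.0

# Hypothetical citation counts received by the journal, broken down by citing journal.
received = {"Chil. J. Agric. Res.": 193, "Field Crops Res.": 400, "Agron. J.": 407}
print(f"{self_citation_rate(received, 'Chil. J. Agric. Res.'):.1%}")  # -> 19.3%
```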

    High-ranked social science journal articles can be identified from early citation information

    Do citations accumulate too slowly in the social sciences to be used to assess the quality of recent articles? I investigate whether this is the case using citation data for all articles in economics and political science published in 2006 and indexed in the Web of Science. I find that citations in the first two years after publication explain more than half of the variation in cumulative citations received over a longer period. Journal impact factors improve the correlation between the predicted and actual future ranks of journal articles when using citation data from 2006 alone, but the effect declines sharply thereafter. Finally, more than half of the papers in the top 20% in 2012 were already in the top 20% in the year of publication (2006).
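
    The claim that early citations "explain more than half of the variation" in later cumulative citations can be read as an R-squared (or a rank overlap) between two-year citation counts and longer-run totals. A minimal sketch of that kind of check with made-up per-article counts; the paper's actual model and data are not reproduced here.

```python
import numpy as np

# Hypothetical per-article counts: citations in the first two years vs. cumulative
# citations after six years. Real data would come from Web of Science records.
early = np.array([0, 1, 2, 2, 4, 5, 8, 12, 20, 35], dtype=float)
total = np.array([1, 3, 5, 7, 10, 14, 22, 30, 55, 90], dtype=float)

# R^2 of a simple linear relationship between early and cumulative citations.
r = np.corrcoef(early, total)[0, 1]
print(f"R^2 = {r**2:.2f}")

# Share of articles in the top 20% on cumulative citations that were already
# in the top 20% on early citations.
k = max(1, len(total) // 5)
top_total = set(np.argsort(total)[-k:])
top_early = set(np.argsort(early)[-k:])
print(f"overlap = {len(top_total & top_early) / k:.0%}")
```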