    Informetrics Education in Library and Information Science (LIS) Academic Department in South Africa

    The purpose of this paper is to explore the literature on informetrics education globally, in order to determine the relevance of informetrics education in South Africa. The paper is based on a review of the literature on informetrics education in the field of LIS worldwide. It addresses the status of informetrics education; the extent and levels at which informetrics education is offered; teaching methods for informetrics education; and the challenges associated with informetrics education. The literature reveals that 32 countries offer informetrics education within the field of LIS. Informetrics education is commonly offered to both undergraduate and postgraduate students; for undergraduates, it is generally offered as an elective course. The course content typically covers laws and theories, link analysis, resource allocation, methods and applications, and innovation and forecasting. Face-to-face lecturing is the most common teaching method. Course names vary from department to department (e.g., Informetrics, Bibliometrics, Scientometrics). Challenges associated with informetrics education include teaching capacity, student preparedness, and ICT support. The paper notes the limitations of informetrics education locally and globally, and recommends greater awareness creation, curriculum development, short courses, and attention to global trends. Theoretically, the paper adds to the body of literature within the field of LIS, offering a vivid characterization of informetrics and demonstrating the importance of its education. Practically, it provides a centre of knowledge sharing among LIS departments concerning informetrics education. With adequate attention given to informetrics education, research evaluation in various fields can attain greater quality and objectivity.

    The development of computer science research in the People's Republic of China 2000-2009: A bibliometric study

    This paper reports a bibliometric study of the development of computer science research in the People's Republic of China in the 21st century, using data from the Web of Science, Journal Citation Reports and CORE databases. Focusing on the areas of data mining, operating systems and web design, it is shown that whilst the productivity of Chinese research has risen dramatically over the period under review, its impact is still low when compared with established scientific nations such as the USA, the UK and Japan. The publication and citation data for China are compared with corresponding data for the other three BRIC nations (Brazil, Russia and India). It is shown that China dominates the BRIC nations in terms of both publications and citations, but that Indian publications often have a greater individual impact. © The Author(s) 2012

    Ranking forestry journals using the h-index

    An expert ranking of forestry journals was compared with journal impact factors and h-indices computed from the ISI Web of Science and internet-based data. Citations reported by Google Scholar appear to offer the most efficient way to rank all journals objectively, in a manner consistent with other indicators. This h-index exhibited a high correlation with the journal impact factor (r = 0.92), but is not confined to journals selected by any particular commercial provider. A ranking of 180 forestry journals is presented on the basis of this index. (Comment: 21 pages, 3 figures, 5 tables. New table added in response to reviewer comment.)
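    The h-index used to rank these journals has a simple operational definition: sort an entity's items by citation count and find the largest h such that the h-th item has at least h citations. A minimal sketch in Python (the function name and example citation counts are illustrative, not taken from the paper):

```python
def h_index(citations):
    """Return the h-index: the largest h such that at least h items
    each have at least h citations."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank  # the rank-th most-cited item still clears the bar
        else:
            break
    return h

# Five papers cited 10, 8, 5, 4 and 3 times: the 4th has 4 citations,
# the 5th only 3, so h = 4.
print(h_index([10, 8, 5, 4, 3]))  # 4
```

    The same computation applies whether the "items" are a journal's articles (as in this paper) or an individual researcher's publications.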

    Does it Matter Which Citation Tool is Used to Compare the h-index of a Group of Highly Cited Researchers?

    The h-index retrieved from citation indexes (Scopus, Google Scholar, and Web of Science) is used to measure scientific performance and research impact based on the number of publications and citations of a scientist. It is also easily available and may be used for performance measurement of scientists and for recruitment decisions. The aim of this study is to investigate the differences between the outputs of these three citation databases, namely Scopus, Google Scholar, and Web of Science, based upon the h-index of a group of highly cited researchers (Nobel Prize-winning scientists). A purposive sampling method was adopted to collect the required data. The results showed a significant difference in the h-index across the three citation indexes; the Google Scholar h-index was higher than the h-index in the other two databases. It was also concluded that there is a significant positive relationship between the h-indices based on Google Scholar and Scopus. The citation indexes of Scopus, Google Scholar, and Web of Science may be useful for evaluating the h-index of scientists, but they have some limitations as well. Cite as: Farhadi, H., Salehi, H., Yunus, M. M., Aghaei Chadegani, A., Farhadi, M., Fooladi, M., & Ale Ebrahim, N. (2013). Does it Matter Which Citation Tool is Used to Compare the h-index of a Group of Highly Cited Researchers? Australian Journal of Basic and Applied Sciences, 7(4), 198-202. doi: arXiv:1306.072

    Information Metrics (iMetrics): A Research Specialty with a Socio-Cognitive Identity?

    "Bibliometrics", "scientometrics", "informetrics", and "webometrics" can all be considered as manifestations of a single research area with similar objectives and methods, which we call "information metrics" or iMetrics. This study explores the cognitive and social distinctness of iMetrics with respect to the general information science (IS), focusing on a core of researchers, shared vocabulary and literature/knowledge base. Our analysis investigates the similarities and differences between four document sets. The document sets are drawn from three core journals for iMetrics research (Scientometrics, Journal of the American Society for Information Science and Technology, and Journal of Informetrics). We split JASIST into document sets containing iMetrics and general IS articles. The volume of publications in this representation of the specialty has increased rapidly during the last decade. A core of researchers that predominantly focus on iMetrics topics can thus be identified. This core group has developed a shared vocabulary as exhibited in high similarity of title words and one that shares a knowledge base. The research front of this field moves faster than the research front of information science in general, bringing it closer to Price's dream.Comment: Accepted for publication in Scientometric

    Zipf's law and log-normal distributions in measures of scientific output across fields and institutions: 40 years of Slovenia's research as an example

    Slovenia's Current Research Information System (SICRIS) currently hosts 86,443 publications with citation data from 8,359 researchers working across the full range of social and natural sciences from 1970 to the present. Using these data, we show that the citation distributions derived from individual publications have Zipfian properties in that they can be fitted by a power law $P(x) \sim x^{-\alpha}$, with $\alpha$ between 2.4 and 3.1 depending on the institution and field of research. Distributions of indexes that quantify the success of researchers rather than individual publications, on the other hand, cannot be associated with a power law. We find that for Egghe's g-index and Hirsch's h-index the log-normal form $P(x) \sim \exp[-a \ln x - b(\ln x)^2]$ applies best, with $a$ and $b$ depending moderately on the underlying set of researchers. In special cases, particularly for institutions with a strongly hierarchical constitution and research fields with high self-citation rates, exponential distributions can be observed as well. Both indexes yield distributions with equivalent statistical properties, which is a strong indicator of their consistency and logical connectedness. At the same time, differences in the assessment of the citation histories of individual researchers strengthen their importance for properly evaluating the quality and impact of scientific output. (Comment: 8 pages, 3 figures; accepted for publication in Journal of Informetrics. Supplementary material available at http://www.matjazperc.com/sicris/stats.html)
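    A power-law tail of the form $P(x) \sim x^{-\alpha}$ can be estimated from raw counts with the standard Hill (maximum-likelihood) estimator, $\hat{\alpha} = 1 + n \left[\sum_i \ln(x_i / x_{\min})\right]^{-1}$. A minimal sketch, assuming a continuous power law above a chosen cutoff x_min (the function name and sample values are illustrative; the abstract does not specify which fitting procedure the paper uses):

```python
import math

def hill_alpha(samples, x_min):
    """Maximum-likelihood (Hill) estimate of the exponent alpha of a
    continuous power-law tail P(x) ~ x^(-alpha), using only samples
    at or above the cutoff x_min."""
    tail = [x for x in samples if x >= x_min]
    n = len(tail)
    return 1.0 + n / sum(math.log(x / x_min) for x in tail)

# Two samples at x = e with x_min = 1: ln(e/1) = 1 each,
# so alpha = 1 + 2/2 = 2.0 (rounded to absorb floating-point noise).
print(round(hill_alpha([math.e, math.e], 1.0), 6))
```

    Citation counts are discrete, so this continuous-case formula is an approximation; dedicated tools (e.g. the `powerlaw` Python package) also estimate x_min itself rather than taking it as given.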

    A review of the literature on citation impact indicators

    Citation impact indicators nowadays play an important role in research evaluation, and consequently these indicators have received a lot of attention in the bibliometric and scientometric literature. This paper provides an in-depth review of the literature on citation impact indicators. First, an overview is given of the literature on bibliographic databases that can be used to calculate citation impact indicators (Web of Science, Scopus, and Google Scholar). Next, selected topics in the literature on citation impact indicators are reviewed in detail. The first topic is the selection of publications and citations to be included in the calculation of citation impact indicators. The second topic is the normalization of citation impact indicators, in particular normalization for field differences. Counting methods for dealing with co-authored publications are the third topic, and citation impact indicators for journals are the last topic. The paper concludes by offering some recommendations for future research
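    One widely used instance of the field normalization discussed in this review is the mean normalized citation score (MNCS): each publication's citation count is divided by the average citation count of publications in the same field (and publication year), and the resulting ratios are averaged, so that a score of 1 corresponds to the world average. A minimal sketch (function name and numbers are illustrative):

```python
def mncs(citations, field_means):
    """Mean Normalized Citation Score: average of each publication's
    citations divided by its field's (and year's) average citations.
    MNCS = 1.0 corresponds to the world average."""
    ratios = [c / m for c, m in zip(citations, field_means)]
    return sum(ratios) / len(ratios)

# Two papers: 10 cites in a field averaging 5 (ratio 2.0) and
# 3 cites in a field averaging 6 (ratio 0.5) -> (2.0 + 0.5) / 2 = 1.25
print(mncs([10, 3], [5.0, 6.0]))  # 1.25
```

    Normalizing each publication before averaging (rather than dividing aggregate totals) is what makes the indicator robust when a unit publishes across fields with very different citation densities.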