
    A categorization of arguments for counting methods for publication and citation indicators

    Most publication and citation indicators are based on datasets with multi-authored publications, and a change in counting method will therefore often change the value of an indicator. It is thus important to know why a specific counting method has been applied. I have identified arguments for counting methods in a sample of 32 bibliometric studies published in 2016 and compared the result with discussions of arguments for counting methods in three older studies. Based on the underlying logics of the arguments, I have arranged the arguments in four groups: Group 1 focuses on arguments related to what an indicator measures, Group 2 on the additivity of a counting method, Group 3 on pragmatic reasons for the choice of counting method, and Group 4 on an indicator's influence on the research community or how it is perceived by researchers. This categorization can be used to describe and discuss how bibliometric studies with publication and citation indicators argue for counting methods.
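    To make the difference concrete, the Python sketch below (using a hypothetical publication list, not data from the sampled studies) contrasts whole counting with fractional counting and shows why only the latter is additive, the property at issue in the Group 2 arguments.

```python
# Minimal sketch: whole vs. fractional counting of multi-authored publications.
# The publication list is hypothetical and only illustrates how the choice of
# counting method changes an author-level publication indicator.

from collections import defaultdict

publications = [
    ["Author A", "Author B"],              # two co-authors
    ["Author A", "Author B", "Author C"],  # three co-authors
    ["Author C"],                          # single-authored
]

whole = defaultdict(float)       # each co-author receives a full credit of 1
fractional = defaultdict(float)  # a credit of 1 is split equally among co-authors

for authors in publications:
    for author in authors:
        whole[author] += 1.0
        fractional[author] += 1.0 / len(authors)

print("Whole counting:     ", dict(whole))       # credits sum to 6 for 3 papers (not additive)
print("Fractional counting:", dict(fractional))  # credits sum to 3 for 3 papers (additive)
```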

    Information Metrics (iMetrics): A Research Specialty with a Socio-Cognitive Identity?

    "Bibliometrics", "scientometrics", "informetrics", and "webometrics" can all be considered as manifestations of a single research area with similar objectives and methods, which we call "information metrics" or iMetrics. This study explores the cognitive and social distinctness of iMetrics with respect to the general information science (IS), focusing on a core of researchers, shared vocabulary and literature/knowledge base. Our analysis investigates the similarities and differences between four document sets. The document sets are drawn from three core journals for iMetrics research (Scientometrics, Journal of the American Society for Information Science and Technology, and Journal of Informetrics). We split JASIST into document sets containing iMetrics and general IS articles. The volume of publications in this representation of the specialty has increased rapidly during the last decade. A core of researchers that predominantly focus on iMetrics topics can thus be identified. This core group has developed a shared vocabulary as exhibited in high similarity of title words and one that shares a knowledge base. The research front of this field moves faster than the research front of information science in general, bringing it closer to Price's dream.Comment: Accepted for publication in Scientometric

    Utilising content marketing metrics and social networks for academic visibility

    There are numerous assumptions about research evaluation in terms of the quality and relevance of academic contributions. Researchers are becoming increasingly acquainted with bibliometric indicators, including citation analysis, the impact factor, the h-index, webometrics and academic social networking sites. In this light, this chapter presents a review of these concepts and considers relevant theoretical underpinnings related to the content marketing of scholars. It critically evaluates previous papers on the subject of academic reputation and deliberates on individual researchers' personal branding. It also explains how metrics are currently used to rank the academic standing of journals as well as higher education institutions. In a nutshell, this chapter suggests that scholarly impact depends on a number of factors, including the accessibility of publications, peer review of academic work, and social networking among scholars.
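    As a brief illustration of one of the indicators listed above, the Python sketch below computes an h-index from a hypothetical list of per-paper citation counts, using the standard definition: the largest h such that at least h papers have h or more citations each.

```python
# Minimal sketch: computing an h-index from per-paper citation counts.
# The citation counts are hypothetical; the h-index is the largest h such
# that at least h papers each have h or more citations.

def h_index(citations):
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

print(h_index([10, 8, 5, 4, 3]))  # -> 4 (four papers with at least 4 citations)
print(h_index([25, 8, 5, 3, 3]))  # -> 3
```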

    Bias in the journal impact factor

    The ISI journal impact factor (JIF) is based on a sample that may represent half the whole-of-life citations to some journals, but a small fraction (<10%) of the citations accruing to other journals. This disproportionate sampling means that the JIF provides a misleading indication of the true impact of journals, biased in favour of journals that have a rapid rather than a prolonged impact. Many journals exhibit a consistent pattern of citation accrual from year to year, so it may be possible to adjust the JIF to provide a more reliable indication of a journal's impact. Comment: 9 pages, 8 figures; one reference corrected.
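    For reference, the standard two-year JIF is the number of citations received in a given year to items published in the two preceding years, divided by the number of citable items in those years. The Python sketch below uses hypothetical counts to show why a journal whose citations accrue slowly scores poorly on this short window even if its whole-of-life impact is comparable.

```python
# Minimal sketch of the standard two-year journal impact factor (JIF), with
# hypothetical counts: JIF(Y) = citations received in year Y to items published
# in years Y-1 and Y-2, divided by the citable items published in Y-1 and Y-2.

def two_year_jif(cites_to_prev_two_years, citable_items_prev_two_years):
    return cites_to_prev_two_years / citable_items_prev_two_years

# Journal with rapid impact: most citations arrive within the two-year window.
print(two_year_jif(cites_to_prev_two_years=400, citable_items_prev_two_years=100))  # 4.0

# Journal with prolonged impact: similar whole-of-life citations, but only a
# small fraction falls inside the two-year window, so its JIF looks much lower.
print(two_year_jif(cites_to_prev_two_years=80, citable_items_prev_two_years=100))   # 0.8
```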

    Studying Relationship between Citation and Altmetrics of Top Chemistry Researches’ Articles

    Abstract: The main objective of the present research is to examine the relationship between the number of citations and the level of altmetrics in order to test the validity of these new metrics, at least in terms of their alignment with the established citation index. The research population consists of articles by top chemistry authors profiled in the Scopus citation database in 2010; the sample comprises the articles of the 20 top authors. The research is applied in purpose and descriptive and correlational in its data collection. Altmetric data were extracted with the Webometric Analyst software, citation data were collected from Scopus, and SPSS was used to analyze the data. The findings show that the articles in question have little presence on social networks. In terms of presence and distribution, Mendeley, CiteULike, Twitter, Facebook, blogs, Google Plus and news sources had the largest numbers of articles and altmetrics, respectively. Mendeley and Twitter show the strongest relationship with citations, and articles with at least one altmetric have a higher citation average (25.14%) than those with no altmetric (7.58%). The Spearman correlation test showed a strong correlation between the number of Mendeley readers, news mentions, and citations; a weak correlation between Twitter, CiteULike, and citations; and no meaningful relationship between Facebook posts, blog posts, Google Plus, and citations.
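    The correlation step described above can be reproduced in outline with a few lines of Python; the reader and citation counts below are hypothetical, and scipy.stats.spearmanr stands in for the SPSS analysis used in the study.

```python
# Minimal sketch of the correlation analysis: Spearman rank correlation between
# one altmetric indicator and citation counts. The data are hypothetical.

from scipy.stats import spearmanr

mendeley_readers = [120, 45, 80, 10, 60, 5, 200, 30]
citations        = [ 95, 30, 70,  8, 40, 2, 150, 25]

rho, p_value = spearmanr(mendeley_readers, citations)
print(f"Spearman rho = {rho:.2f}, p = {p_value:.4f}")
```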

    The development of computer science research in the People's Republic of China 2000-2009: A bibliometric study

    This paper reports a bibliometric study of the development of computer science research in the People's Republic of China in the 21st century, using data from the Web of Science, Journal Citation Reports and CORE databases. Focusing on the areas of data mining, operating systems and web design, it is shown that whilst the productivity of Chinese research has risen dramatically over the period under review, its impact is still low when compared with established scientific nations such as the USA, the UK and Japan. The publication and citation data for China are compared with corresponding data for the other three BRIC nations (Brazil, Russia and India). It is shown that China dominates the BRIC nations in terms of both publications and citations, but that Indian publications often have a greater individual impact. © The Author(s) 2012