4 research outputs found

    Pattern of use and characteristic features of web citations in Scholarly journals

    The present study examines the reliability of web citations in scholarly journals of Library and Information Science (LIS) and Communication and Media Studies (CMS). The journals were selected based on their high impact factor, covering articles published between 2008 and 2017. A total of 555,428 references were extracted, of which 102,718 contained web citations. The findings indicate that CMS journal articles contained more URLs than LIS journal articles. An examination of the characteristic features of the cited URLs showed that .html files and the organizational (.org) top-level domain were the most frequently cited in both disciplines. Moreover, URLs with a path depth of 2 and a character length of 41-50 were frequently cited in both disciplines. The display and redirect URLs associated with these characteristic features are also determined in this study.
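    As an illustration of the URL characteristics examined above (file type, top-level domain, path depth, and character length), the following Python sketch shows one way such features could be derived from a cited URL. It is a minimal example for clarity only; the function name, the sample URL, and the exact feature definitions are assumptions, not the study's actual extraction procedure.

        # Minimal sketch: deriving the URL features discussed in the abstract above.
        # The feature definitions here (e.g. path depth = number of non-empty path
        # segments) are assumptions for illustration, not the study's own method.
        from urllib.parse import urlparse

        def url_features(url: str) -> dict:
            parsed = urlparse(url)
            last_segment = parsed.path.rsplit("/", 1)[-1]
            return {
                # File type of the cited resource, e.g. "html"
                "extension": last_segment.rsplit(".", 1)[-1].lower() if "." in last_segment else "",
                # Top-level domain, e.g. "org" for organizational sites
                "tld": parsed.hostname.rsplit(".", 1)[-1] if parsed.hostname else "",
                # Path depth: number of non-empty segments in the URL path
                "path_depth": len([seg for seg in parsed.path.split("/") if seg]),
                # Total character length of the URL string
                "length": len(url),
            }

        # Hypothetical example URL
        print(url_features("https://www.example.org/reports/2017/annual-review.html"))
        # {'extension': 'html', 'tld': 'org', 'path_depth': 3, 'length': 55}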

    Citation Autobiography: An Investigation of ISI Database Coverage in Determining Author Citedness

    This article presents a case study investigating the coverage completeness of the Institute for Scientific Information's (ISI) citation data for specific authors, based on an analysis of the author's lifetime citation record, compiled over nearly fifteen years of searching the literature through the ISI database and various Web search engines. It was found that (with self-citations disregarded) the ISI captured 28.8 percent of the total citations, 42.2 percent of print citations, 20.3 percent of citations from outside the United States, and 2.3 percent of non-English citations. The definition and classification of Web citations are discussed. It is suggested that librarians and faculty should not rely solely on ISI author citation counts, especially when demonstration of international impact is important.

    Evaluating the online impact of reporting guidelines for randomised trial reports and protocols: a cross-sectional web-based data analysis of CONSORT and SPIRIT initiatives

    Reporting guidelines are tools to help improve the transparency, completeness, and clarity of published articles in health research. Specifically, the CONSORT (Consolidated Standards of Reporting Trials) and SPIRIT (Standard Protocol Items: Recommendations for Interventional Trials) statements provide evidence-based guidance on what to include in randomised trial articles and protocols to guarantee the efficacy of interventions. These guidelines are subsequently described and discussed in journal articles and used to produce checklists. Determining the online impact (i.e., the number and type of links received) of these articles can provide insights into the dissemination of reporting guidelines in broader environments (the web-at-large) than merely the scientific publications that cite them. To address the technical limitations of link analysis, the Debug-Validate-Access-Find (DVAF) method is designed and implemented here to measure different facets of the guidelines' online impact. A total of 65 articles related to 38 reporting guidelines are taken as a baseline, providing 240,128 URL citations, which are then refined, analysed, and categorised using the DVAF method. A total of 15,582 links to journal articles related to the CONSORT and SPIRIT initiatives were identified. CONSORT 2010 and SPIRIT 2013 were the reporting guidelines that received the most links (URL citations) from other online objects (5,328 and 2,190, respectively). Overall, the online impact obtained is scattered (URL citations are received by different article URL IDs, mainly from link-based DOIs), narrow (a limited number of linking domain names; half of the articles are linked from fewer than 29 domain names), concentrated (links come from just a few academic publishers, around 60% from publishers), non-reputed (84% of links come from dubious websites and fake domain names), and highly decayed (89% of linking domain names were not accessible at the time of the analysis). In light of these results, it is concluded that the online impact of these guidelines could be improved, and a set of recommendations is proposed to this end. Open Access funding provided thanks to the CRUE-CSIC agreement with Springer Nature.
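    To make two of the facets above concrete (how narrow the impact is, measured by distinct linking domain names per article, and how decayed it is, measured by linking domains that no longer respond), the following Python sketch shows a simple way such measures could be computed. This is an illustrative, assumption-laden example, not the authors' DVAF implementation; all names and the sample data are invented for demonstration.

        # Illustrative sketch only: grouping URL citations by linking domain and
        # estimating link decay. This is not the DVAF method itself.
        from collections import defaultdict
        from urllib.parse import urlparse
        import urllib.request

        def linking_domains(url_citations):
            """Map each target article ID to the set of domain names linking to it."""
            per_article = defaultdict(set)
            for source_url, target_article_id in url_citations:
                domain = urlparse(source_url).hostname or ""
                per_article[target_article_id].add(domain)
            return per_article

        def share_decayed(domains, timeout=5):
            """Fraction of domain names that no longer answer an HTTP request."""
            dead = 0
            for domain in domains:
                try:
                    urllib.request.urlopen(f"http://{domain}", timeout=timeout)
                except Exception:
                    dead += 1
            return dead / len(domains) if domains else 0.0

        # Hypothetical sample: two URL citations pointing to one guideline article.
        citations = [
            ("https://blog.example.net/post/123", "CONSORT-2010"),
            ("https://www.example.org/methods", "CONSORT-2010"),
        ]
        print({article: len(domains) for article, domains in linking_domains(citations).items()})
        # {'CONSORT-2010': 2}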

    Evaluation of scientific activity in information science based on bibliometric and altmetric indicators

    This research analyses scientific activity in Information Science (IS), grounded in the epistemological and historical context of the discipline, to identify trends in the use of information on formal and informal publishing platforms. Based on the implementation of bibliometric and alternative indicators, it seeks to establish how the integration of altmetric indicators into scientific evaluation makes it possible to identify trends in disciplinary research, and whether it is valid to claim that altmetrics is a reliable and useful tool for the evaluation of scientific domains. Production visible in Web of Science during the 2012-2016 period is taken as the reference for identifying the scientific dynamics of research in IS, based on a sample of 1,224 records to which bibliometric indicators of production, citation, or impact are applied, together with altmetric indicators retrieved from the ResearchGate (RG) and Plum Analytics (PlumX) platforms. The results show that alternative indicators are still under development and need to be standardised; it is therefore concluded that scientific evaluation requires complementing classical metric models with alternative metrics that make it possible to identify the social and communication dynamics generated in the scientific community beyond impact and citation.