61 research outputs found

    Scholar Metrics: el impacto de las revistas según Google, ¿un divertimento o un producto científico aceptable?

    This paper reviews the most significant features of Google Scholar Metrics, pointing out its strengths and weaknesses, and discusses the possibility of adopting it for journal evaluation.

    Las revistas españolas de Ciencias Sociales y Jurídicas en Google Scholar Metrics ¿están todas las que son?

    New paper by the EC3 group on Google Scholar Metrics. The coverage of relevant Spanish journals in the Social Sciences and Law is assessed. We conclude that, because of its limitations, Google Scholar Metrics should not be used to evaluate national journals in these areas.

    Google Scholar Metrics: una herramienta poco fiable para la evaluación de revistas científicas

    We introduce Google Scholar Metrics (GSM), a new bibliometric product from Google that aims to provide the h-index for scientific journals and other information sources. We conduct a critical review of GSM, showing its main characteristics and possibilities as a tool for scientific evaluation. We discuss its coverage, the inclusion of repositories alongside journals, its bibliographic control, and its options for browsing and searching. We conclude that, despite Google Scholar's value as a source for scientific assessment, GSM is an immature product with many shortcomings, and therefore we advise against its use for evaluation purposes. However, remedying these shortcomings would place GSM as a serious competitor to the existing products for evaluating scientific journals.
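
    The h-index that GSM reports for a journal is a simple rank-based statistic. As an illustration only (this is not GSM's implementation, and the citation counts below are invented), a minimal Python sketch of the computation:

```python
def h_index(citations):
    """Largest h such that at least h items have at least h citations each."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

# Example with invented citation counts for a journal's recent articles
print(h_index([12, 9, 7, 6, 3, 1, 0]))  # -> 4
```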

    Tracking the performance of an R&D programme in the biomedical sciences

    Pre-refereed version of a manuscript accepted for publication in Research Evaluation: arXiv:1602.04049. This article aims at offering an evaluation framework for an R&D programme in the biomedical sciences. It showcases the Spanish Biomedical Research Networking Centres (CIBER) initiative as an example of the effect of research policy management on performance. For this, it focuses on three specific aspects: the programme's role in the national research output in the biomedical sciences, its effect on promoting translational research through internal collaboration between research groups, and researchers' perception of the programme as reflected by the inclusion of their CIBER centres in the address field. Research output derived from this programme represents around 25% of the country's publications in the biomedical fields. After analysing a 7-year period, we find that the programme has enhanced collaborations between its members, but these do not yet seem sufficiently strong. 54.5% of the publications mentioned the programme in their address field; however, an increase in the share of papers mentioning it is only observed 2 years after the programme was launched. We suggest that finding the point at which the share of mentions stabilizes may be a good strategy for identifying the complete fulfilment of these types of R&D policies. This work was supported by the Spanish Fondo de Investigacion Sanitaria (FIS) [PI10/01122]. Peer reviewed.
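
    A hedged sketch of the kind of "share of mentions" analysis described above, with invented yearly counts (the paper's actual data and thresholds are not reproduced here): compute the yearly proportion of programme publications whose address field names a CIBER centre, and flag the first year in which that share stops changing by more than a chosen tolerance.

```python
# year -> (papers mentioning CIBER in the address field, total programme papers)
# NOTE: invented numbers, for illustration only
yearly_counts = {
    2007: (60, 400), 2008: (180, 450), 2009: (240, 480),
    2010: (265, 500), 2011: (270, 505), 2012: (272, 510),
}

def stabilisation_year(counts, tol=0.01):
    """Return the first year whose share of mentions differs from the
    previous year's share by less than `tol`, together with that share."""
    shares = {year: mentions / total for year, (mentions, total) in counts.items()}
    years = sorted(shares)
    for prev, curr in zip(years, years[1:]):
        if abs(shares[curr] - shares[prev]) < tol:
            return curr, round(shares[curr], 3)
    return None

print(stabilisation_year(yearly_counts))  # -> (2011, 0.535) with the data above
```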

    Productividad e impacto de los investigadores españoles: umbrales de referencia por áreas científicas

    Reference thresholds for the scientific production and impact of internationally visible Spanish research within the areas defined by the Spanish National Agency for Evaluation and Prospective (ANEP) are presented. These percentile reference tables are constructed from the population of researchers who applied for a project within Spain's National R&D Plan 2007 (n = 3,356) and are to serve as benchmarks, permitting comparisons between researchers' bibliometric behavior and the mean performance in their respective scientific disciplines. Data on mean production, impact and visibility for each ANEP area are also presented. The internationalization of these areas between 2000 and 2006 is discussed, with special emphasis on the Social Sciences. Finally, we suggest that funding agencies and research institutions use these reference thresholds as assessment tools in their selection processes.
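
    A minimal sketch of how such percentile reference tables can be built, assuming a per-area list of values for a single indicator (the areas and figures below are hypothetical, not the paper's data):

```python
import numpy as np

# hypothetical records: (ANEP-style area, papers published by one applicant)
applicants = [
    ("Chemistry", 42), ("Chemistry", 18), ("Chemistry", 7), ("Chemistry", 25),
    ("Social Sciences", 12), ("Social Sciences", 5), ("Social Sciences", 3),
]

def reference_thresholds(records, percentiles=(25, 50, 75, 90)):
    """Percentile thresholds of an indicator, computed separately per area."""
    by_area = {}
    for area, value in records:
        by_area.setdefault(area, []).append(value)
    return {area: dict(zip(percentiles, np.percentile(values, percentiles)))
            for area, values in by_area.items()}

# An individual researcher's value can then be placed against their area's thresholds
print(reference_thresholds(applicants))
```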

    Altmetrics: not everything that can be counted, counts

    In 2012 there were many proposals for new indicators associated with social web tools. These indicators have been termed altmetrics (or alternative metrics) and are presented as an alternative for the evaluation of scientific activity. However, despite the many proposals, the field is still at an embryonic stage. In this paper some possible limitations of the new indicators are presented: 1) a large number of information sources and indicators, which makes it difficult to establish their classification and relevance; 2) some indicators have little statistical validity because they generate poor results; 3) the difficulty of determining the meaning of the new metrics (what are we assessing, scientific or social impact?); and 4) the evanescent and ephemeral nature of these new sources and indicators.

    Indicators for usage and participation in scientific journals 2.0: the case of PLoS One

    The new publishing and scientific communication environments have led to the emergence of new Web indicators. Along with usage metrics such as downloads, there are many measures generated by Science 2.0 tools. The journals published by the Public Library of Science systematically collect many of these new metrics. The objective of this paper is to present some of these new indicators and analyze them quantitatively through a case study of 8,945 papers published in the journal PLoS One. The selected indicators were: comments, ratings, number of bookmarks, links from scientific weblogs, downloads, views and citations. Basic descriptive statistics and correlations were calculated for all of them. The results show the low participation of scientists in Web 2.0 and that most of these indicators, except for downloads and views, are poorly consolidated metrics.
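
    A minimal sketch of the kind of descriptive statistics and correlation analysis mentioned above, using a tiny invented table in place of the 8,945-article dataset:

```python
import pandas as pd

# invented values for five articles; the real study covers 8,945 PLoS One papers
metrics = pd.DataFrame({
    "downloads": [120, 300, 45, 980, 210],
    "views":     [400, 900, 150, 2500, 700],
    "citations": [2, 5, 0, 14, 3],
    "bookmarks": [0, 1, 0, 6, 1],
})

print(metrics.describe())                # basic descriptive statistics
print(metrics.corr(method="spearman"))   # rank correlations between indicators
```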

    Tools for evaluating science in universities and R&D centres: descriptions and usage

    Indicators have become essential to Spanish universities. Some of the major funding calls for proposals (Campus of international excellence, Severo Ochoa centres of excellence, etc.) rely heavily on R&D indicators. We review some of the tools that universities have to generate indicators, apart from the traditional citation indexes: 1) scientific information systems, 2) bibliometric suites from commercial companies, and 3) university rankings. The use of these tools and the need for librarians to manage them are discussed

    Google scholar citations and the emergence of new actors in research evaluation

    Google Scholar Citations, a system aimed at researchers, attempts to outline researchers' bibliometric profiles, providing citation indicators and the h-index. We review other tools designed to measure visibility and academic impact on the web, such as Microsoft Academic Search and the new initiatives grouped under the label altmetrics or alternative indicators. Finally, we discuss how the appearance of Google Scholar Citations and other products may affect the two major sources of bibliometric data, ISI Web of Science and Scopus, and how this may influence the evaluation of scientific research.

    Cómo publicar en revistas científicas de impacto: consejos y reglas sobre publicación científica

    Publishing in so-called high-impact scientific journals, identified as those indexed in the Thomson-Reuters databases, has become the main objective of researchers and R&D institutions. This paper therefore offers some advice for maximizing the chances of acceptance of manuscripts submitted to this type of journal. We first define what an impact journal is and its benefits for both researchers and institutions. We then discuss aspects to consider during the preparation of the manuscript, such as authorship, the design of tables and figures, and the preparation of bibliographic references. Once the manuscript has been prepared, we concentrate on the fundamental criteria for selecting the right journal. Finally, we review various factors to take into account during the submission process and, once the manuscript has been submitted, focus on the peer review process and the response to reviewers.