
    The IPTS Report No. 40, December 1999


    Technology Foresight: A Bibliometric Analysis to Identify Leading and Emerging Methods

    Foresight studies provide essential information used by the government, industry and academia for technology planning and knowledge expansion. They are complicated, resource-intensive, and quite expensive. The approach, methods, and techniques must be carefully identified and selected. Despite the global importance of foresight activities, there are no frameworks to help one develop and plan a proper foresight study. This paper begins to close this gap by analyzing and comparing different schools of thought and updating the literature with the most current tools and methods. Data mining techniques are used to identify articles through an extensive literature review. Social Network Analysis (SNA) techniques are used to identify and analyze leading journals, articles, and researchers. A framework is developed here to provide a guide to help in the selection of methods and tools for different approaches.
    Keywords: technology foresight; strategic foresight; adaptive foresight; Social Network Analysis (SNA); bibliometric tools; data mining; text mining.
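    The SNA step mentioned above can be sketched in miniature. The following is a hypothetical illustration only (invented journal names, plain Python rather than whatever toolchain the authors used): it ranks nodes in a small co-citation network by degree centrality, the simplest measure of which journals sit at the centre of a field.

```python
from collections import defaultdict

def degree_centrality(edges):
    """Degree centrality of each node: its degree divided by (n - 1)."""
    adj = defaultdict(set)
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    n = len(adj)
    return {node: len(nbrs) / (n - 1) for node, nbrs in adj.items()}

# Invented co-citation links between journals (illustrative only).
edges = [
    ("Technol. Forecast. Soc. Change", "Futures"),
    ("Technol. Forecast. Soc. Change", "Scientometrics"),
    ("Technol. Forecast. Soc. Change", "Foresight"),
    ("Futures", "Foresight"),
]
ranked = degree_centrality(edges)
```

    In a real study the edges would come from the mined article data, and richer measures such as betweenness or eigenvector centrality would be computed over the same graph in the same way.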

    The metric tide: report of the independent review of the role of metrics in research assessment and management

    This report presents the findings and recommendations of the Independent Review of the Role of Metrics in Research Assessment and Management. The review was chaired by Professor James Wilsdon, supported by an independent and multidisciplinary group of experts in scientometrics, research funding, research policy, publishing, university management and administration. This review has gone beyond earlier studies to take a deeper look at potential uses and limitations of research metrics and indicators. It has explored the use of metrics across different disciplines, and assessed their potential contribution to the development of research excellence and impact. It has analysed their role in processes of research assessment, including the next cycle of the Research Excellence Framework (REF). It has considered the changing ways in which universities are using quantitative indicators in their management systems, and the growing power of league tables and rankings. And it has considered the negative or unintended effects of metrics on various aspects of research culture. The report starts by tracing the history of metrics in research management and assessment, in the UK and internationally. It looks at the applicability of metrics within different research cultures, compares the peer review system with metric-based alternatives, and considers what balance might be struck between the two. It charts the development of research management systems within institutions, and examines the effects of the growing use of quantitative indicators on different aspects of research culture, including performance management, equality, diversity, interdisciplinarity, and the ‘gaming’ of assessment systems. The review looks at how different funders are using quantitative indicators, and considers their potential role in research and innovation policy. Finally, it examines the role that metrics played in REF2014, and outlines scenarios for their contribution to future exercises.

    Citation Analysis: A Comparison of Google Scholar, Scopus, and Web of Science

    When faculty members are evaluated, they are judged in part by the impact and quality of their scholarly publications. While all academic institutions look to publication counts and venues as well as the subjective opinions of peers, many hiring, tenure, and promotion committees also rely on citation analysis to obtain a more objective assessment of an author’s work. Consequently, faculty members try to identify as many citations to their published works as possible to provide a comprehensive assessment of their publication impact on the scholarly and professional communities. The Institute for Scientific Information’s (ISI) citation databases, which are widely used as a starting point if not the only source for locating citations, have several limitations that may leave gaps in the coverage of citations to an author’s work. This paper presents a case study comparing citations found in Scopus and Google Scholar with those found in Web of Science (the portal used to search the three ISI citation databases) for items published by two Library and Information Science full-time faculty members. In addition, the paper presents a brief overview of a prototype system called CiteSearch, which analyzes combined data from multiple citation databases to produce citation-based quality evaluation measures.
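    At its core, the idea of combining data from multiple citation databases, as the CiteSearch prototype does, reduces to set operations over citation records. A toy sketch with invented record keys (not data from the study):

```python
# Invented citing-record identifiers per database (illustrative only).
google_scholar = {"smith2004", "lee2005", "chan2006", "ruiz2007"}
scopus = {"lee2005", "chan2006", "park2008"}
web_of_science = {"lee2005", "chan2006"}

combined = google_scholar | scopus | web_of_science   # all known citations
missed_by_wos = combined - web_of_science             # the coverage gap
in_all_three = google_scholar & scopus & web_of_science
```

    Record matching is the hard part in practice, since the same citation appears with different metadata in each source; but once records are keyed consistently, the union gives the comprehensive citation picture the paper argues an author's evaluation should rest on.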

    How should peer-review panels behave?

    Many governments wish to assess the quality of their universities. A prominent example is the UK’s new Research Excellence Framework (REF) 2014. In the REF, peer-review panels will be provided with information on publications and citations. This paper suggests a way in which panels could choose the weights to attach to these two indicators. The analysis draws in an intuitive way on the concept of Bayesian updating (where citations gradually reveal information about the initially imperfectly-observed importance of the research). Our study should not be interpreted as an argument that only mechanistic measures ought to be used in a REF.
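    The Bayesian-updating intuition can be made concrete with the textbook normal-normal conjugate update (a sketch of the general idea, not the authors' actual model, and with invented numbers): peer review supplies a prior belief about a paper's importance, citations supply a noisy signal of it, and the precision-weighted posterior tells a panel how much weight to shift onto the citation indicator.

```python
def posterior_quality(prior_mean, prior_var, citation_signal, signal_var):
    """Combine a peer-review prior with a noisy citation signal.

    Returns (posterior_mean, weight_on_citations). The posterior mean is a
    precision-weighted average of the two sources, and the weight on the
    citation signal grows as that signal becomes less noisy.
    """
    w = (1.0 / signal_var) / (1.0 / prior_var + 1.0 / signal_var)
    return (1.0 - w) * prior_mean + w * citation_signal, w

# Equal uncertainty in both sources -> the panel splits the weight 50/50.
mean, weight = posterior_quality(50.0, 100.0, 80.0, 100.0)
```

    As citations accumulate and the citation signal's variance shrinks, the same formula pushes the weight toward the citation indicator, which is the gradual revelation of importance the paper describes.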

    Reviewers' ratings and bibliometric indicators: hand in hand when assessing over research proposals?

    The peer review system has traditionally been challenged for its many limitations, especially in allocating funding, and bibliometric indicators may well serve as a complement. We analyze the relationship between peers' ratings and bibliometric indicators for Spanish researchers in the 2007 National R&D Plan across 23 research fields, covering peers' ratings for 2333 applications. We also gathered principal investigators' research output and impact and studied the differences between accepted and rejected applications, using the Web of Science database and focusing on the 2002-2006 period. First, we analyzed the distribution of granted and rejected proposals against a given set of bibliometric indicators to test for significant differences. Then we applied a multiple logistic regression analysis to determine whether bibliometric indicators can by themselves explain the awarding of grant proposals. 63.4% of the applications were funded. Bibliometric indicators for accepted proposals showed better prior performance than for rejected ones; however, the correlation between peer review and bibliometric indicators is very heterogeneous across most areas. The logistic regression analysis showed that the bibliometric indicators that best explain the awarding of research proposals are, in most cases, output (number of published articles) and the number of papers published in journals belonging to the first quartile of the Journal Citation Reports ranking. Bibliometric indicators predict the awarding of grant proposals at least as well as peer ratings. Social Sciences and Education are the only areas where no relation was found, although this may be due to the limitations of the Web of Science's coverage. These findings encourage the use of bibliometric indicators as a complement to peer review in most of the analyzed areas.
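    A logistic regression of funding outcome on bibliometric indicators, the kind of analysis the study runs, can be sketched as follows. Everything here is invented for illustration (toy data, two indicators, pure-Python gradient descent rather than the authors' statistical software):

```python
import math

def train_logistic(X, y, lr=0.5, epochs=2000):
    """Fit p(funded) = sigmoid(w.x + b) by per-example gradient descent."""
    w, b = [0.0] * len(X[0]), 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            z = b + sum(wj * xj for wj, xj in zip(w, xi))
            grad = 1.0 / (1.0 + math.exp(-z)) - yi   # p - y
            w = [wj - lr * grad * xj for wj, xj in zip(w, xi)]
            b -= lr * grad

    def predict(xi):
        return 1 if b + sum(wj * xj for wj, xj in zip(w, xi)) >= 0 else 0
    return predict

# Invented indicators per applicant: (articles / 10, Q1-journal articles / 10).
X = [(1.2, 0.5), (1.5, 0.8), (0.9, 0.4), (0.3, 0.0), (0.5, 0.1), (0.2, 0.0)]
y = [1, 1, 1, 0, 0, 0]   # 1 = funded, 0 = rejected
predict = train_logistic(X, y)
```

    If a model like this separates funded from rejected applicants about as well as panel ratings do, the indicators carry much of the same signal as peer review, which is the paper's central finding.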