
    Evaluation of the impact of National Breast Cancer Foundation-funded research

    © Copyright 2014. The Medical Journal of Australia - reproduced with permission. Objective: To evaluate the impact of the National Breast Cancer Foundation’s (NBCF’s) research investment. Design and participants: Surveys based on the Payback Framework were sent to chief investigators involved in research funded by the NBCF during 1995–2012; a bibliometric analysis of NBCF-funded publications in 2006–2010 was conducted; and a purposive, stratified sample of case studies was obtained. Main outcome measures: Research impact on knowledge production, the research system, informing policy, product development, and broader health and economic benefits. Results: Of 242 surveys sent, 153 (63%) were returned. The average impact of journals in which NBCF publications appeared was double that of world publications. Seventy surveys (46%) reported career progression, and 185 higher degrees were obtained or expected, including 121 PhDs. One hundred and one grants (66%) produced tools that built capacity across the research system, and research teams leveraged an additional $1.40 in funding for every dollar invested. Fifteen applied grants and one basic grant influenced policy. Ten basic and four applied grants led to the development of drugs, prognostic tools or diagnostic technologies. Twenty applied and two basic grants led to changes in the practice and behaviour of health care staff, consumers and the public, with further impacts anticipated. Case studies provided illustrations of high impact. Conclusions: NBCF’s strategy of investing in a mixed portfolio of research areas and mechanisms encouraged a broad range of impacts across all Payback categories. The impacts from basic research tended to focus on knowledge production and drug development, while applied research generated greater impacts within the other Payback categories. The funding of shared infrastructure stimulated impact across the research system.

    An evaluation resource for geographic information retrieval

    In this paper we present an evaluation resource for geographic information retrieval developed within the Cross Language Evaluation Forum (CLEF). The GeoCLEF track is dedicated to the evaluation of geographic information retrieval systems. The resource encompasses more than 600,000 documents, 75 topics so far, and more than 100,000 relevance judgments for these topics. Geographic information retrieval requires an evaluation resource which represents realistic information needs and which is geographically challenging. Some experimental results and analysis are reported.
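
    Topics and relevance judgments of this kind are typically consumed in the standard TREC/CLEF way: a qrels file of judged documents per topic and a ranked run to be scored against it. The sketch below is a minimal illustration of that workflow, assuming conventional whitespace-separated qrels and run formats; the file names and exact column layouts are assumptions, not part of the GeoCLEF distribution.

```python
# Minimal sketch: score a ranked run against TREC/CLEF-style relevance
# judgments. File names and column layouts are illustrative assumptions.
from collections import defaultdict

def load_qrels(path):
    """Assumed qrels line: topic_id iteration doc_id relevance."""
    qrels = defaultdict(set)
    with open(path) as f:
        for line in f:
            topic, _, doc, rel = line.split()
            if int(rel) > 0:
                qrels[topic].add(doc)
    return qrels

def load_run(path):
    """Assumed run line: topic_id Q0 doc_id rank score tag."""
    run = defaultdict(list)
    with open(path) as f:
        for line in f:
            topic, _, doc, rank, _, _ = line.split()
            run[topic].append((int(rank), doc))
    return {t: [d for _, d in sorted(docs)] for t, docs in run.items()}

def precision_at_k(qrels, run, k=10):
    """Fraction of the top-k retrieved documents that are judged relevant."""
    return {
        topic: sum(1 for doc in ranked[:k] if doc in qrels.get(topic, set())) / k
        for topic, ranked in run.items()
    }

if __name__ == "__main__":
    qrels = load_qrels("geoclef.qrels")  # hypothetical file name
    run = load_run("myrun.txt")          # hypothetical file name
    p10 = precision_at_k(qrels, run)
    print(sum(p10.values()) / len(p10) if p10 else 0.0)
```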

    Qualitative analysis of academic group and discussion forum on Facebook

    In the present study, data was triangulated and two methods of data analysis were used. Qualitative analysis was undertaken of free-text data from students’ reflective essays to extract socially related themes. Heuristic evaluation was conducted by expert evaluators, who investigated forum contributions and discourse in line with contemporary learning theory and considered the social culture of participation. The findings of the qualitative analysis of students’ perceptions and the results of the heuristic evaluation of forum participation confirmed each other, indicating a warm social climate and a conducive, well-facilitated environment that supported individual styles of participation. It fostered interpersonal relationships between distance learners, as well as study-related benefits enhanced by peer teaching and insights acquired in a culture of social negotiation. The environment was effectively moderated, while supporting student initiative.

    The Eurovision St Andrews collection of photographs

    This report describes the Eurovision image collection compiled for the ImageCLEF (Cross Language Evaluation Forum) evaluation exercise. The image collection consists of around 30,000 photographs from the collection provided by the University of St Andrews Library. The construction and composition of this unique image collection are described, together with the necessary information to obtain and use the image collection.

    A reproducible approach with R markdown to automatic classification of medical certificates in French

    In this paper, we report the ongoing developments of our first participation in the Cross-Language Evaluation Forum (CLEF) eHealth Task 1: “Multilingual Information Extraction - ICD10 coding” (Névéol et al., 2017). The task consists of labelling death certificates written in French with international standard codes. In particular, we aim to meet the goal of the ‘Replication track’ of this task, which promotes the sharing of tools and the dissemination of solid, reproducible results.
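
    The abstract does not detail the authors’ pipeline, but the underlying task is supervised text classification: mapping each certificate line to an ICD-10 code. As a rough, hedged illustration of that task (not the authors’ method, which uses R Markdown), the sketch below trains a simple TF-IDF plus logistic-regression baseline in Python on invented toy examples; the example phrases and code labels are purely illustrative.

```python
# Illustrative baseline only -- not the authors' method. Maps French
# certificate lines to ICD-10 codes; training examples are invented.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

train_texts = [
    "arret cardiaque",            # cardiac arrest
    "insuffisance respiratoire",  # respiratory failure
    "cancer du poumon",           # lung cancer
    "infarctus du myocarde",      # myocardial infarction
]
train_codes = ["I469", "J969", "C349", "I219"]  # illustrative ICD-10 codes

model = make_pipeline(
    # Character n-grams are fairly robust to accents and spelling variants.
    TfidfVectorizer(analyzer="char_wb", ngram_range=(2, 4)),
    LogisticRegression(max_iter=1000),
)
model.fit(train_texts, train_codes)

# A held-out paraphrase; expected to land near the lung-cancer code.
print(model.predict(["cancer pulmonaire"]))
```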

    Evaluation of MIRACLE approach results for CLEF 2003

    This paper describes the MIRACLE (Multilingual Information RetrievAl for the CLEf campaign) approach and results for the monolingual, bilingual and multilingual Cross Language Evaluation Forum tasks. The approach is based on a combination of linguistic and statistical techniques to perform indexing and retrieval.
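
    The abstract does not specify how the linguistic and statistical components are combined. A common pattern, sketched below purely as an assumption and not as the MIRACLE system itself, is to apply linguistic normalisation (stopword removal and crude stemming) before statistical term weighting and ranking; the stopword list and toy documents are invented.

```python
# Generic illustration of pairing a linguistic preprocessing step with a
# statistical ranking model; this is not the MIRACLE system itself.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

STOPWORDS = {"the", "of", "a", "and", "in"}  # toy stopword list

def normalise(text):
    """Linguistic step: lowercase, drop stopwords, crude plural stripping."""
    tokens = [t for t in text.lower().split() if t not in STOPWORDS]
    return " ".join(t[:-1] if t.endswith("s") else t for t in tokens)

docs = ["Maps of Spanish rivers", "German election results", "Rivers in Germany"]
query = "rivers in Spain"

vectoriser = TfidfVectorizer()  # statistical step: TF-IDF term weighting
doc_matrix = vectoriser.fit_transform([normalise(d) for d in docs])
query_vec = vectoriser.transform([normalise(query)])

scores = cosine_similarity(query_vec, doc_matrix)[0]
for doc, score in sorted(zip(docs, scores), key=lambda p: -p[1]):
    print(f"{score:.3f}  {doc}")
```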

    Multilingual log analysis: LogCLEF

    The current lack of recent and long-term query logs makes the verifiability and repeatability of log analysis experiments very limited. A first attempt in this direction was made within the Cross-Language Evaluation Forum in 2009, in a track named LogCLEF, which aims to stimulate research on user behaviour in multilingual environments and to promote standard evaluation collections of log data. We report on the similarities and differences among the most recent LogCLEF activities.
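
    The kind of analysis LogCLEF encourages starts from a shared query log and simple aggregate statistics, for example the most frequent queries broken down by interface language. The sketch below is a toy illustration under the assumption of a tab-separated log with timestamp, language and query columns; this is not the actual LogCLEF schema.

```python
# Toy query-log analysis. The tab-separated layout (timestamp, interface
# language, query string) is an assumption, not the LogCLEF format.
import csv
from collections import Counter, defaultdict

def top_queries_by_language(log_path, n=5):
    """Return the n most frequent normalised queries for each language."""
    counts = defaultdict(Counter)
    with open(log_path, newline="", encoding="utf-8") as f:
        for timestamp, lang, query in csv.reader(f, delimiter="\t"):
            counts[lang][query.strip().lower()] += 1
    return {lang: c.most_common(n) for lang, c in counts.items()}

if __name__ == "__main__":
    for lang, queries in top_queries_by_language("queries.tsv").items():  # hypothetical file
        print(lang, queries)
```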

    The Canadian Forum on Civil Justice: Project Evaluation: Final Report

    The Department of Justice has funded the Canadian Forum on Civil Justice (CFCJ, or the Forum) in its start-up phase, from 1998/99 to 2000/01. The Forum is now seeking funding for its ongoing operations. As part of its decision-making process, the Department determined that an evaluation of the Forum’s activities to date would be desirable. This evaluation was therefore undertaken to determine whether the Forum has added value to the Canadian civil justice community. This report describes the Forum and its activities, and presents answers to the evaluation questions addressed.

    Challenges to evaluation of multilingual geographic information retrieval in GeoCLEF

    This is the third year of the evaluation of geographic information retrieval (GeoCLEF) within the Cross-Language Evaluation Forum (CLEF). GeoCLEF 2006 presented topics and documents in four languages (English, German, Portuguese and Spanish). After two years of evaluation, we are beginning to understand the challenges both of geographic information retrieval from text and of evaluating its results. This poster enumerates some of these challenges to evaluation and comments on the limitations encountered in the first two evaluations.