
    The use of evidence in public governmental reports on health policy: an analysis of 17 Norwegian official reports (NOU)

    Abstract

    Background: Governments increasingly require policy documents to be evidence-based. This paper analyses the use of scientific evidence in such documents by reviewing reports from government-appointed committees in Norway to assess the committees' handling of questions of effect.

    Methods: This study uses the 'Index of Scientific Quality' (ISQ) to analyse all Norwegian official reports (NOUs) that were: (1) published by the Norwegian Ministry of Health and Care Services during 1994-1998 (N = 20); and (2) concerned with questions of effect, either because these were included in the mandate or as a result of the committee's interpretation of the mandate. The ISQ is based on scientific criteria common to all research concerning questions of effect. The primary outcome measure is an ISQ score on a five-point scale.

    Results: Three reports were excluded because their mandates, or the committees' interpretations of them, did not address questions of effect. For the remaining 17 NOUs in our study, overall ISQ scores were low for systematic literature search and for explicit validation of research. Two reports had an average score of three or higher, while scores for five other reports were not far behind. How committees assessed the relevant factors was often unclear.

    Conclusion: The reports' evaluations of health evidence in relation to questions of effect lacked transparency and, overall, showed little use of systematic processes. A systematic, explicit and transparent approach, following the standards laid down in the ISQ, may help generate the evidence-based decision-making that Norway, the UK, the EU and the WHO desire and seek. However, policy-makers may find the ISQ criteria for assessing the scientific quality of a report too narrow to adequately inform policy-making.

    Efficacy of a training intervention on the quality of practitioners' decision support for patients deciding about place of care at the end of life: A randomized control trial: Study protocol

    Abstract

    Background: Most people prefer home palliation but die in an institution. Some experience decisional conflict when weighing options regarding place of care. Clinicians can identify patients' decisional needs and provide decision support, yet generally lack skills and confidence in doing so. This study aims to determine whether the quality of clinicians' decision support can be improved with a brief, theory-based, skills-building intervention.

    Theory: The Ottawa Decision Support Framework (ODSF) guides an evidence-based, practical approach to assist clinicians in providing high-quality decision support. The ODSF proposes that decisional needs (personal uncertainty, knowledge, values clarity, support, personal characteristics) strongly influence the quality of decisions patients make. Clinicians can improve decision quality by providing decision support that addresses these needs: clarifying decisional needs, providing facts and probabilities, clarifying values, supporting/guiding deliberation, and monitoring/facilitating progress.

    Methods/Design: The efficacy of a brief education intervention will be assessed in a two-phase study. In phase one, a focused needs assessment will be conducted with key informants. Phase two is a randomized controlled trial in which clinicians will be randomly allocated to an intervention or control group. The intervention, informed by the needs assessment, knowledge-transfer best practices and the ODSF, comprises an online tutorial; an interactive skills-building workshop; a decision support protocol; performance feedback; and educational outreach. Participants will be assessed: a) at baseline (quality of decision support); b) after the tutorial (knowledge); and c) four weeks after the other interventions (quality of decision support, intention to incorporate decision support into practice, and perceived usefulness of intervention components). Between-group differences in the primary outcome (quality of decision support scores) will be analyzed using ANOVA.

    Discussion: Few studies have investigated the efficacy of an evidence-based, theory-guided intervention aimed at assisting clinicians to strengthen their patient decision support skills. Expanding our understanding of how clinicians can best support palliative patients' decision-making will help to inform best practices in patient-centered palliative care. Lessons learned may transfer to other care situations such as chronic condition management, advance directives and anticipatory care planning. Should the efficacy evaluation reveal clear improvements in the quality of decision support provided by clinicians who received the intervention, a larger-scale implementation and effectiveness trial will be considered.

    Trial registration: This study is registered as NCT00614003.

    Evidence-informed health policy 4 – Case descriptions of organizations that support the use of research evidence

    Abstract

    Background: Previous efforts to produce case descriptions have typically not focused on the organizations that produce research evidence and support its use. External evaluations of such organizations have typically not been analyzed as a group to identify the lessons that have emerged across multiple evaluations. Case descriptions offer the potential for capturing the views and experiences of many individuals who are familiar with an organization, including staff, advocates, and critics.

    Methods: We purposively sampled a subgroup of organizations from among those that participated in the second (interview) phase of the study and, in one case, from among other organizations with which we were familiar. We developed and pilot-tested a case description data collection protocol, and conducted site visits that included both interviews and documentary analyses. Themes were identified from responses to semi-structured questions using a constant comparative method of analysis. We produced both a brief (one- to two-page) written description and a video documentary for each case.

    Results: We conducted 51 interviews as part of the eight site visits. Two organizational strengths were repeatedly cited by individuals participating in the site visits: use of an evidence-based approach (which was identified as being very time-consuming) and the existence of a strong relationship between researchers and policymakers (which can be challenged by conflicts of interest). Two organizational weaknesses, a lack of resources and the presence of conflicts of interest, were also repeatedly cited. Participants offered two main suggestions for the World Health Organization (and other international organizations and networks): 1) mobilize one or more of government support, financial resources, and the participation of both policymakers and researchers; and 2) create knowledge-related global public goods.

    Conclusion: The findings from our case descriptions, the first of their kind, intersect in interesting ways with the messages arising from two systematic reviews of the factors that increase the prospects for research use in policymaking. Strong relationships between researchers and policymakers bode well, given that such interactions appear to increase the prospects for research use. The time-consuming nature of an evidence-based approach, on the other hand, suggests the need for more efficient production processes that are 'quick and clean enough.' Our case descriptions and accompanying video documentaries provide a rich description of organizations supporting the use of research evidence, which can be drawn upon by those establishing or leading similar organizations, particularly in low- and middle-income countries.

    CONSORT 2010 explanation and elaboration: updated guidelines for reporting parallel group randomised trials.

    Overwhelming evidence shows the quality of reporting of randomised controlled trials (RCTs) is not optimal. Without transparent reporting, readers cannot judge the reliability and validity of trial findings nor extract information for systematic reviews. Recent methodological analyses indicate that inadequate reporting and design are associated with biased estimates of treatment effects. Such systematic error is seriously damaging to RCTs, which are considered the gold standard for evaluating interventions because of their ability to minimise or avoid bias. A group of scientists and editors developed the CONSORT (Consolidated Standards of Reporting Trials) statement to improve the quality of reporting of RCTs. It was first published in 1996 and updated in 2001. The statement consists of a checklist and flow diagram that authors can use for reporting an RCT. Many leading medical journals and major international editorial groups have endorsed the CONSORT statement. The statement facilitates critical appraisal and interpretation of RCTs. During the 2001 CONSORT revision, it became clear that explanation and elaboration of the principles underlying the CONSORT statement would help investigators and others to write or appraise trial reports. A CONSORT explanation and elaboration article was published in 2001 alongside the 2001 version of the CONSORT statement. After an expert meeting in January 2007, the CONSORT statement was further revised and is published as the CONSORT 2010 Statement. This update improves the wording and clarity of the previous checklist and incorporates recommendations related to topics that have only recently received recognition, such as selective outcome reporting bias. This explanatory and elaboration document, intended to enhance the use, understanding, and dissemination of the CONSORT statement, has also been extensively revised. It presents the meaning and rationale for each new and updated checklist item, providing examples of good reporting and, where possible, references to relevant empirical studies. Several examples of flow diagrams are included. The CONSORT 2010 Statement, this revised explanatory and elaboration document, and the associated website (www.consort-statement.org) should be helpful resources for improving the reporting of randomised trials.