A Guide for applying a revised version of the PARIHS framework for implementation
Background
Based on a critical synthesis of the literature on use of the Promoting Action on Research Implementation in Health Services (PARIHS) framework, revisions and a companion Guide were developed by a group of researchers independent of the original PARIHS team. The purpose of the Guide is to enhance and optimize the efforts of researchers using PARIHS in implementation trials and evaluations.
Methods
The authors used a planned, structured process to organize and synthesize critiques, discussions, and potential recommendations for refinement of the PARIHS framework arising from a systematic review. Using a templated form, each author independently recorded key components of each reviewed paper: study definitions, perceived strengths and limitations of PARIHS, other observations regarding key issues, and recommendations for needed refinements. After reaching consensus on these key components, the authors summarized the information and developed the Guide.
Results
A number of revisions, perceived as consistent with the PARIHS framework's general nature and intent, are proposed. The related Guide comprises a set of reference tools, provided in Additional files. Its core content is built upon the basic elements of PARIHS and current implementation science.
Conclusions
We invite researchers using PARIHS for targeted evidence-based practice (EBP) implementations with a strong task orientation to use this Guide as a companion and to apply the revised framework prospectively and comprehensively. Researchers are also encouraged to evaluate its use relative to perceived strengths and issues. Such evaluations and critical reflections regarding PARIHS and our Guide could thereby promote the framework's continued evolution.
An organizational framework and strategic implementation for system-level change to enhance research-based practice: QUERI Series
Background
The continuing gap between available evidence and current practice in health care reinforces the need for more effective solutions, particularly in relation to organizational context. Considerable advances have been made within the U.S. Veterans Health Administration (VA) in systematically implementing evidence into practice. These advances have been achieved through a system-level program focused on collaboration and partnerships among policy makers, clinicians, and researchers.
The Quality Enhancement Research Initiative (QUERI) was created to generate research-driven initiatives that directly enhance health care quality within the VA and, simultaneously, contribute to the field of implementation science. This paradigm-shifting effort provided a natural laboratory for exploring organizational change processes. This article describes the underlying change framework and implementation strategy used to operationalize QUERI.
Strategic approach to organizational change
QUERI used an evidence-based organizational framework focused on three contextual elements: 1) cultural norms and values, in this case related to the role of health services researchers in evidence-based quality improvement; 2) capacity, in this case among researchers and key partners to engage in implementation research; and 3) supportive infrastructures that reinforce expectations for change and sustain new behaviors as part of the norm. As part of a QUERI Series in Implementation Science, this article describes the framework's application in an innovative integration of health services research, policy, and clinical care delivery.
Conclusion
QUERI's experience and success provide a case study in organizational change, demonstrating that progress requires a strategic, systems-based effort. QUERI's evidence-based initiative involved a deliberate cultural shift, requiring ongoing commitment in multiple forms and at multiple levels. VA's commitment to QUERI came in the form of visionary leadership, targeted allocation of resources, infrastructure refinements, innovative peer review and study methods, and direct involvement of key stakeholders, including both those providing and managing clinical care and those producing relevant evidence within the health care system. The organizational framework and related implementation interventions used to achieve contextual change resulted in engaged investigators and enhanced uptake of research knowledge. QUERI's approach and progress provide working hypotheses for others pursuing similar system-wide efforts to routinely achieve evidence-based care.
Evaluating the successful implementation of evidence into practice using the PARiHS framework: theoretical and practical challenges
Background
The PARiHS framework (Promoting Action on Research Implementation in Health Services) has proved to be a useful practical and conceptual heuristic for many researchers and practitioners in framing their research or knowledge translation endeavours. However, as a conceptual framework it remains untested, and its contribution to the development and testing of theory in the field of implementation science is therefore largely unquantified.
Discussion
Given this, the paper first provides an integrated summary of our conceptual and theoretical thinking so far and introduces a typology (derived from social policy analysis) used to distinguish between the terms conceptual framework, theory, and model, a distinction that raises important definitional and conceptual issues in refining theoretical and methodological approaches to knowledge translation.
Secondly, the paper describes the next phase of our work, in particular concentrating on the conceptual thinking and mapping that has led to the generation of the hypothesis that the PARiHS framework is best utilised as a two-stage process: as a preliminary (diagnostic and evaluative) measure of the elements and sub-elements of evidence (E) and context (C), and then using the aggregated data from these measures to determine the most appropriate facilitation method. The exact nature of the intervention is thus determined by the specific actors in the specific context at a specific time and place.
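As a minimal sketch of this hypothesised two-stage process, the Python fragment below aggregates evidence (E) and context (C) sub-element ratings and maps the resulting profile to a facilitation emphasis. The sub-element names follow the published framework, but the 1-to-5 rating scale, the thresholds, and the facilitation labels are assumptions made purely for illustration, not validated instruments.

```python
# Illustrative sketch of the hypothesised two-stage PARiHS process.
# Sub-element names follow the framework; the rating scale, thresholds,
# and facilitation labels are assumptions for illustration only.

EVIDENCE_SUBELEMENTS = ["research", "clinical_experience",
                        "patient_experience", "local_data"]
CONTEXT_SUBELEMENTS = ["culture", "leadership", "evaluation"]

def aggregate(scores: dict[str, int], keys: list[str]) -> float:
    """Mean of the 1 (low) to 5 (high) ratings for the listed sub-elements."""
    return sum(scores[k] for k in keys) / len(keys)

def suggest_facilitation(scores: dict[str, int]) -> str:
    """Stage two: use the diagnostic E/C profile to pick a facilitation emphasis."""
    e = aggregate(scores, EVIDENCE_SUBELEMENTS)
    c = aggregate(scores, CONTEXT_SUBELEMENTS)
    if e >= 3.5 and c >= 3.5:
        return "light-touch, task-focused facilitation"
    if e >= 3.5:
        return "context-strengthening (more holistic) facilitation"
    if c >= 3.5:
        return "evidence-appraisal support before implementation"
    return "intensive facilitation addressing both evidence and context"

# Example: strong evidence but a weak local context.
ratings = {"research": 5, "clinical_experience": 4, "patient_experience": 4,
           "local_data": 3, "culture": 2, "leadership": 2, "evaluation": 3}
print(suggest_facilitation(ratings))
```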
In refining this next phase of our work, we have had to consider the wider issues around the use of theories to inform and shape our research activity; the ongoing challenges of developing robust and sensitive measures; facilitation as an intervention for getting research into practice; and, finally, how the current debates around evidence into practice are adopting wider notions that fit innovations more generally.
Summary
The paper concludes by suggesting that the future direction of the work on the PARiHS framework is to develop a two-stage diagnostic and evaluative approach, where the intervention is shaped and moulded by the information gathered about the specific situation and from participating stakeholders. In order to expedite the generation of new evidence and testing of emerging theories, we suggest the formation of an international research implementation science collaborative that can systematically collect and analyse experiences of using and testing the PARiHS framework and similar conceptual and theoretical approaches.
We also recommend further refinement of the definitions around conceptual framework, theory, and model, suggesting a wider discussion that embraces multiple epistemological and ontological perspectives.
Improving quality of care through routine, successful implementation of evidence-based practice at the bedside: an organizational case study protocol using the Pettigrew and Whipp model of strategic change
BACKGROUND: Evidence-based practice (EBP) is an expected approach to improving the quality of patient care and service delivery in health care systems internationally, yet it remains to be fully realized. Given the current evidence-practice gap, numerous authors describe barriers to achieving EBP. One recurrently identified barrier is the setting or context of practice, which is likewise cited as a potential part of the solution to the gap. The purpose of this study is to identify key contextual elements and related strategic processes in organizations that find and use evidence at multiple levels, in an ongoing, integrated fashion, in contrast to those that do not. METHODS: The core theoretical framework for this multi-method explanatory case study is Pettigrew and Whipp's Content, Context, and Process model of strategic change. This framework focuses data collection on three entities: the Why of strategic change, the What of strategic change, and the How of strategic change, in this case related to the implementation and normalization of EBP. The data collection plan, designed to capture relevant organizational context and related outcomes, focuses on eight interrelated factors said to characterize a receptive context. Selective, purposive sampling will provide contrasting results between two cases (departments of nursing) and three embedded units in each. Data collection methods will include quantitative tools (e.g., regarding culture) and qualitative approaches including focus groups, interviews, and document review (e.g., regarding integration and “success”) relevant to the EBP initiative. DISCUSSION: This study should provide information regarding the contextual elements and related strategic processes key to successful implementation and sustainability of EBP, specifically in terms of a pervasive pattern in an acute care hospital-based health care setting. Additionally, this study will identify key contextual elements that differentiate successful implementation and sustainability of EBP efforts, both within varying levels of a hospital-based clinical setting and across similar hospital settings interested in EBP.
A critical synthesis of literature on the promoting action on research implementation in health services (PARIHS) framework
Background
The Promoting Action on Research Implementation in Health Services framework, or PARIHS, is a conceptual framework that posits key, interacting elements that influence successful implementation of evidence-based practices. It has been widely cited and used as the basis for empirical work; however, there has not yet been a literature review to examine how the framework has been used in implementation projects and research. The purpose of the present article was to critically review and synthesize the literature on PARIHS to understand how it has been used and operationalized, and to highlight its strengths and limitations.
Methods
We conducted a qualitative, critical synthesis of peer-reviewed PARIHS literature published through March 2009. We synthesized findings through a three-step process using semi-structured data abstraction tools and group consensus.
Results
Twenty-four articles met our inclusion criteria: six core concept articles from the original PARIHS authors, and eighteen empirical articles ranging from case reports to quantitative studies. Empirical articles generally used PARIHS as an organizing framework for analyses. No studies used PARIHS prospectively to design implementation strategies, and there was generally a lack of detail about how variables were measured or mapped, or how conclusions were derived. Several studies used findings to comment on the framework in ways that could help refine or validate it. The primary issue identified with the framework was a need for greater conceptual clarity regarding the definition of sub-elements and the nature of dynamic relationships. Strengths identified included its flexibility, intuitive appeal, explicit acknowledgement of the outcome of 'successful implementation,' and a more expansive view of what can and should constitute 'evidence.'
Conclusions
While we found studies reporting empirical support for PARIHS, the single greatest need for this and other implementation models is rigorous, prospective use of the framework to guide implementation projects. There is also a need to better explain derived findings and how interventions or measures are mapped to specific PARIHS elements; greater conceptual discrimination among sub-elements may be necessary first. In general, it may be time for the implementation science community to develop consensus guidelines for reporting the use and usefulness of theoretical frameworks within implementation studies.
A realistic evaluation: the case of protocol-based care
Background
'Protocol-based care' was envisioned by policy makers as a mechanism for delivering on the service improvement agenda in England. Realistic evaluation is an increasingly popular approach, but few published examples exist, particularly in implementation research. To fill this gap, in this paper we describe the application of a realistic evaluation approach to the study of protocol-based care, whilst sharing findings of relevance about standardising care through the use of protocols, guidelines, and pathways.
Methods
Situated between positivism and relativism, realistic evaluation is concerned with the identification of underlying causal mechanisms, how they work, and under what conditions. Fundamentally it focuses attention on finding out what works, for whom, how, and in what circumstances.
Results
In this research, we were interested in understanding the relationships between the type and nature of particular approaches to protocol-based care (mechanisms), different clinical settings (context), and the resulting impacts (outcomes). An evidence review using the principles of realist synthesis resulted in a number of propositions, i.e., context, mechanism, and outcome threads (CMOs). These propositions were then 'tested' through multiple case studies using multiple methods, including non-participant observation, interviews, and document analysis, within an iterative analysis process. The initial propositions (conjectured CMOs) only partially corresponded to the findings that emerged during analysis. Through the iterative process of scrutinising mechanisms, context, and outcomes, we were able to draw out some theoretically generalisable features about what works, for whom, how, and in what circumstances in relation to the use of standardised care approaches (refined CMOs).
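As an illustration of how such conjectured and refined CMO threads might be recorded during analysis, the following sketch defines a simple data structure; the field names and example content are illustrative assumptions, not the study's actual propositions.

```python
# Sketch of a record for context-mechanism-outcome (CMO) threads in a
# realistic evaluation. Fields and example content are illustrative only.
from dataclasses import dataclass, field

@dataclass
class CMOThread:
    context: str                  # the clinical setting and its conditions
    mechanism: str                # how the standardised approach is thought to work
    outcome: str                  # the impact expected or observed
    status: str = "conjectured"   # becomes "refined" after case-study testing
    evidence: list[str] = field(default_factory=list)  # supporting data sources

conjectured = CMOThread(
    context="acute ward with high staff turnover",
    mechanism="care pathway standardises decisions for junior staff",
    outcome="more consistent assessment and documentation",
)

# After iterative analysis of the case-study data, the thread is refined.
refined = CMOThread(
    context=conjectured.context,
    mechanism="pathway used selectively, as a safety net rather than a script",
    outcome="consistency gains limited to high-risk decisions",
    status="refined",
    evidence=["non-participant observation", "staff interviews"],
)
print(refined.status, "-", refined.mechanism)
```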
Conclusions
As one of the first studies to apply realistic evaluation in implementation research, it proved a good fit, particularly given the growing emphasis on understanding how context influences evidence-based practice. The strengths and limitations of the approach are considered, including how to operationalise it and some of the challenges involved. The approach provided a useful interpretive framework with which to make sense of the multiple factors that were simultaneously at play and being observed through various data sources, and for developing explanatory theory about the use of standardised care approaches in practice.
Role of "external facilitation" in implementation of research findings: a qualitative evaluation of facilitation experiences in the Veterans Health Administration
BACKGROUND: Facilitation has been identified in the literature as a potentially key component of successful implementation. It has not, however, been either well-defined or well-studied. Significant questions remain about the operational definition of facilitation and about the relationship of facilitation to other interventions, especially to other change agent roles when used in multi-faceted implementation projects. Researchers who are part of the Quality Enhancement Research Initiative (QUERI), a quality improvement program developed by the US Department of Veterans Affairs, are actively exploring various approaches and processes, including facilitation, to enable implementation of best practices in the Veterans Health Administration health care system, the largest integrated healthcare system in the United States. This paper describes a systematic, retrospective evaluation of implementation-related facilitation experiences within QUERI. METHODS: A post-hoc evaluation was conducted through a series of semi-structured interviews to examine the concept of facilitation across several multi-site QUERI implementation studies. The interview process was based on a technique developed in the field of education, which systematically enhances learning through experience by stimulating recall and reflection regarding past complex activities. An iterative content analysis approach relative to a set of conceptually-based interview questions was used for data analysis. FINDINGS: Findings suggest that facilitation, within an implementation study initiated by a central change agency, is a deliberate and valued process of interactive problem solving and support that occurs in the context of a recognized need for improvement and a supportive interpersonal relationship. Facilitation was described primarily as a distinct role with a number of potentially crucial behaviors and activities. Data further suggest that external facilitators were likely to use or integrate other implementation interventions while performing this problem-solving and supportive role. PRELIMINARY CONCLUSIONS: This evaluation provides evidence that facilitation could be considered a distinct implementation intervention, just as audit and feedback, educational outreach, and similar methods are considered discrete interventions. As such, facilitation should be well-defined and explicitly evaluated for its perceived usefulness within multi-intervention implementation projects. Additionally, researchers should better define the specific contribution of facilitation to the success of implementation in different types of projects, at different types of sites, and with evidence and innovations of varying levels of strength and complexity.
Validation of the conceptual research utilization scale: an application of the standards for educational and psychological testing in healthcare
Background
There is a lack of acceptable, reliable, and valid survey instruments to measure conceptual research utilization (CRU). In this study, we investigated the psychometric properties of a newly developed scale (the CRU Scale).
Methods
We used the Standards for Educational and Psychological Testing as a validation framework to assess four sources of validity evidence: content, response processes, internal structure, and relations to other variables. A panel of nine international research utilization experts performed a formal content validity assessment. To determine response process validity, we conducted a series of one-on-one scale administration sessions with 10 healthcare aides. Internal structure and relations to other variables were examined using CRU Scale response data from a sample of 707 healthcare aides working in 30 urban Canadian nursing homes. Principal components analysis and confirmatory factor analyses were conducted to determine internal structure. Relations to other variables were examined using: (1) bivariate correlations; (2) change in mean values of CRU with increasing levels of other kinds of research utilization; and (3) multivariate linear regression.
Results
Content validity index scores for the five items ranged from 0.55 to 1.00. The principal components analysis suggested a 5-item, 1-factor model. This was inconsistent with the findings from the confirmatory factor analysis, which showed best fit for a 4-item, 1-factor model. Bivariate associations between CRU and other kinds of research utilization were statistically significant (p < 0.01) for the latent CRU scale score and all five CRU items. The CRU scale score was also a significant predictor of overall research utilization in multivariate linear regression.
Conclusions
The CRU Scale showed acceptable initial psychometric properties with respect to responses from healthcare aides in nursing homes. Based on our validity, reliability, and acceptability analyses, we recommend using a reduced (four-item) version of the CRU Scale to yield sound assessments of CRU by healthcare aides. Refinement to the wording of one item is also needed. Planned future research will include: latent scale scoring, identification of variables that predict, and are outcomes of, conceptual research use, and longitudinal work to determine CRU Scale sensitivity to change.
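As a hedged sketch of the kind of internal-structure and relations-to-other-variables analyses described above (principal components, internal consistency, and regression against a criterion), the following Python code runs them on simulated five-item responses; the simulated data and effect sizes are placeholders, not the study's CRU Scale data.

```python
# Sketch of internal-structure and relations-to-other-variables checks,
# run on simulated data standing in for 5-item responses from 707 aides.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
latent = rng.normal(size=707)                                    # simulated latent CRU
items = latent[:, None] + rng.normal(scale=0.8, size=(707, 5))   # 5 item responses

# Internal structure: does a single component dominate (a 1-factor model)?
pca = PCA().fit(items)
print("explained variance ratios:", pca.explained_variance_ratio_.round(2))

# Internal consistency: Cronbach's alpha for the 5-item scale.
k = items.shape[1]
alpha = k / (k - 1) * (1 - items.var(axis=0, ddof=1).sum()
                       / items.sum(axis=1).var(ddof=1))
print("Cronbach's alpha:", round(float(alpha), 2))

# Relations to other variables: scale score predicting overall research use.
overall_ru = 0.6 * latent + rng.normal(size=707)    # simulated criterion
score = items.mean(axis=1).reshape(-1, 1)
model = LinearRegression().fit(score, overall_ru)
print("slope:", round(float(model.coef_[0]), 2),
      "R^2:", round(model.score(score, overall_ru), 2))
```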
Process evaluation for complex interventions in primary care: understanding trials using the normalization process model
Background: The Normalization Process Model is a conceptual tool intended to assist in understanding the factors that affect implementation processes in clinical trials and other evaluations of complex interventions. It focuses on the ways that the implementation of complex interventions is shaped by problems of workability and integration.
Method: In this paper the model is applied to two different complex trials: (i) the delivery of problem-solving therapies for psychosocial distress, and (ii) the delivery of nurse-led clinics for heart failure treatment in primary care.
Results: Application of the model shows how process evaluations need to focus on more than the immediate contexts in which trial outcomes are generated. Problems relating to intervention workability and integration also need to be understood. The model may be used effectively to explain the implementation process in trials of complex interventions.
Conclusion: The model invites evaluators to attend equally to how a complex intervention interacts with existing patterns of service organization, professional practice, and professional-patient interaction. The justification for this may be found in the abundance of reports of clinical effectiveness for interventions that have little hope of being implemented in real healthcare settings.