85 research outputs found
Efficacious, effective, and embedded interventions: Implementation research in infectious disease control
Background: Research in infectious disease control is heavily skewed towards high-end technology: the development of new drugs, vaccines, and clinical interventions. Often ignored is the evidence needed to inform the best strategies for embedding interventions into health systems and among populations. In this paper we analyse the challenge of developing research for the sustainable implementation of disease control interventions.
Results: We highlight the fundamental differences between the research paradigms associated with the development of technologies and interventions for disease control on the one hand, and the research paradigms required for enhancing the sustainable uptake of those same interventions within communities on the other. We provide a definition of implementation research in an attempt to underscore its critical role, and explore the multidisciplinary science needed to address the challenges in disease control.
Conclusion: The greatest value for money in health research lies in the sustainable and effective implementation of already proven, efficacious solutions. The development of implementation research that can help provide solutions on how this can be achieved is sorely needed.
Systematic evaluation of implementation fidelity of complex interventions in health and social care
Background: Evaluation of an implementation process and its fidelity can give insight into the 'black box' of interventions. However, a lack of standardized methods for studying fidelity and implementation processes has been reported, which may be one reason why few prior studies in the field of health services research have systematically evaluated interventions' implementation processes. The aim of this project is to systematically evaluate implementation fidelity and possible factors influencing the fidelity of complex interventions in health and social care.
Methods: A modified version of the Conceptual Framework for Implementation Fidelity will be used as the conceptual model for the evaluation. The modification adds two moderating factors: context and recruitment. A systematic evaluation process was developed. A multiple case study method is used to investigate the implementation of three complex health service interventions. Each case will be investigated in depth and longitudinally, using both quantitative and qualitative methods.
Discussion: This study is the first attempt to empirically test the Conceptual Framework for Implementation Fidelity. The study can highlight mechanisms and factors of importance when implementing complex interventions. In particular, the role of the moderating factors on implementation fidelity can be clarified.
Trial registration: Supported Employment, SE, among people with severe mental illness -- a randomized controlled trial: NCT00960024.
Fostering implementation of health services research findings into practice: a consolidated framework for advancing implementation science
Background: Many interventions found to be effective in health services research studies fail to translate into meaningful patient care outcomes across multiple contexts. Health services researchers recognize the need to evaluate not only summative outcomes but also formative outcomes to assess the extent to which implementation is effective in a specific setting, prolongs sustainability, and promotes dissemination into other settings. Many implementation theories have been published to help promote effective implementation. However, they overlap considerably in the constructs included in individual theories, and a comparison of theories reveals that each is missing important constructs included in others. In addition, terminology and definitions are not consistent across theories. We describe the Consolidated Framework for Implementation Research (CFIR), which offers an overarching typology to promote implementation theory development and verification about what works where and why across multiple contexts.
Methods: We used a snowball sampling approach to identify published theories, which were evaluated to identify constructs based on strength of conceptual or empirical support for influence on implementation, consistency in definitions, alignment with our own findings, and potential for measurement. We combined constructs across published theories that had different labels but were redundant or overlapping in definition, and we parsed apart constructs that conflated underlying concepts.
Results: The CFIR is composed of five major domains: intervention characteristics, outer setting, inner setting, characteristics of the individuals involved, and the process of implementation.
Eight constructs were identified related to the intervention (e.g., evidence strength and quality), four related to the outer setting (e.g., patient needs and resources), 12 related to the inner setting (e.g., culture, leadership engagement), five related to individual characteristics, and eight related to process (e.g., plan, evaluate, and reflect). We present explicit definitions for each construct.
Conclusion: The CFIR provides a pragmatic structure for approaching complex, interacting, multi-level, and transient states of constructs in the real world by embracing, consolidating, and unifying key constructs from published implementation theories. It can be used to guide formative evaluations and build the implementation knowledge base across multiple studies and settings.
Feedback reporting of survey data to healthcare aides
Background: This project occurred during the course of the Translating Research in Elder Care (TREC) program of research. TREC is a multilevel and longitudinal research program being conducted in the three Canadian Prairie Provinces of Alberta, Saskatchewan, and Manitoba. The main purpose of TREC is to increase understanding of the role of organizational context in influencing knowledge use in residential long-term care settings. The purpose of this study was to evaluate healthcare aides' (HCAs) perceptions of a one-page poster designed to feed back aggregated data (including demographic information and perceptions about influences on best practice) from the TREC survey they had recently completed.
Methods: A convenience sample of 7 of the 15 nursing homes participating in the TREC research program in Alberta was invited to participate. Facility-level summary data were provided to each facility in the form of a one-page poster report. Two weeks following delivery of the report, a convenience sample of HCAs was surveyed using one-to-one structured interviews.
Results: One hundred twenty-three HCAs responded to the evaluation survey. Overall, HCAs' opinions about the presentation of the feedback report and the understandability, usability, and usefulness of its content were positive. For each report, analysis of data and production and inspection of the report took up to one hour. Information sessions to introduce and explain the reports averaged 18 minutes. A minimum of two feedback reports were supplied to each facility at a cost of CAN$2.39 per report for printing and laminating.
Conclusions: This study highlights not only the feasibility of producing understandable, usable, and useful feedback reports of survey data but also the value and importance of providing feedback to survey respondents. More broadly, the findings suggest that modest strategies may have a positive and desirable effect in participating sites.
Observational measure of implementation progress in community based settings: The Stages of implementation completion (SIC)
Background: An increasingly large body of research is focused on designing and testing strategies to improve knowledge about how to embed evidence-based programs (EBPs) into community settings. Development of strategies for overcoming barriers and increasing the effectiveness and pace of implementation is a high priority. Yet there are few research tools that measure the implementation process itself. The Stages of Implementation Completion (SIC) is an observation-based measure used to track the time to achievement of key implementation milestones for an EBP being implemented in 53 sites across 51 counties (two counties have two sites) in two states in the United States.
Methods: The SIC was developed in the context of a randomized trial comparing the effectiveness of two implementation strategies: community development teams (experimental condition) and individualized implementation (control condition). Fifty-one counties were randomized to experimental or control conditions for implementation of multidimensional treatment foster care (MTFC), an alternative to group/residential care placement for children and adolescents. Progress through eight implementation stages was tracked by noting dates of completion of specific activities in each stage. Activities were tailored to the strategies for implementing the specific EBP.
Results: Preliminary data showed that several counties ceased progress during pre-implementation and that there was a high degree of variability among sites in the duration scores per stage and in the proportion of activities completed in each stage. Progress through activities and stages for three example counties is shown.
Conclusions: By assessing the attainment time of each stage and the proportion of activities completed, the SIC measure can be used to track and compare the effectiveness of various implementation strategies. Data from the SIC will provide sites with relevant information on the time and resources needed to implement MTFC during various phases of implementation. With some modifications, the SIC could be appropriate for use in evaluating implementation strategies in head-to-head randomized implementation trials and as a monitoring tool for rolling out other EBPs.
Scaling Up Global Health Interventions: A Proposed Framework for Success
Drawing upon interviews with experts and a review of the literature, Gavin Yamey proposes a new framework for scaling up global health interventions.
Design of a randomized controlled study of a multi-professional and multidimensional intervention targeting frail elderly people
Background: Frail elderly people need integrated and coordinated care. The two-armed study "Continuum of care for frail elderly people" is a multi-professional and multidimensional intervention for frail community-dwelling elderly people. It was designed to evaluate whether the intervention programme can reduce the number of visits to hospital, increase satisfaction with health and social care, and maintain functional abilities. The implementation process is explored and analysed along with the intervention. In this paper we present the study design, the intervention, and the outcome measures, as well as the baseline characteristics of the study participants.
Methods/design: The study is a randomised two-armed controlled trial with follow-ups at 3, 6, and 12 months. The study group includes elderly people who sought care at the emergency ward and were discharged to their own homes in the community. Inclusion criteria were age 80 years and older, or 65 to 79 years with at least one chronic disease and dependence in at least one activity of daily living. Exclusion criteria were acute severe illness with an immediate need of assessment and treatment by a physician, severe cognitive impairment, and palliative care. The intention was that the study group should comprise a representative sample of frail elderly people at high risk of future health care consumption. The intervention includes an early geriatric assessment, early family support, a case manager in the community with a multi-professional team, and the involvement of the elderly people and their relatives in the planning process.
Discussion: The design of the study, the randomisation procedure, and the protocol meetings were intended to ensure the quality of the study. The implementation of the intervention programme is followed and analysed throughout the whole study, which enables us to generate knowledge on the process of implementing complex interventions. The intervention contributes to early recognition both of elderly people's needs for information, care, and rehabilitation and of informal caregivers' need for support and information. This study is expected to show positive effects on frail elderly people's health care consumption, functional abilities, and satisfaction with health and social care.
Trial registration: ClinicalTrials.gov: NCT01260493 (http://www.clinicaltrials.gov/ct2/show/NCT01260493)
Establishing an implementation network: lessons learned from community-based participatory research
Background: Implementation of evidence-based mental health assessment and intervention in community public health practice is a high priority for multiple stakeholders. Academic-community partnerships can assist in the implementation of efficacious treatments in community settings; yet little is known about the processes by which these collaborations are developed. In this paper, we discuss our application of a community-based participatory research (CBPR) approach to implementation, and we present six lessons we have learned from the establishment of an academic-community partnership.
Methods: With older adults with psychosis as a focus, we have developed a partnership between a university research center and a public mental health service system based on CBPR. The long-term goal of the partnership is to collaboratively establish an evidence-based implementation network that is sustainable within the public mental healthcare system.
Results: In building a sustainable partnership, we found the following lessons instrumental: changing attitudes; sharing staff; expecting obstacles and formalizing solutions; monitoring and evaluating; adapting and adjusting; and taking advantage of emerging opportunities. Some of these lessons were previously known principles that were modified as a result of the CBPR process, while others derived directly from the interactive process of forming the partnership.
Conclusion: The process of forming academic-public partnerships is challenging and time consuming, yet crucial for the development and implementation of state-of-the-art approaches to assessment and intervention that improve the functioning and quality of life of persons with serious mental illnesses. These partnerships provide the organizational support necessary to facilitate the implementation of clinical research findings in community practice, benefiting consumers, researchers, and providers.
Fidelity and moderating factors in complex interventions: a case study of a continuum of care program for frail elderly people in health and social care
Background: Prior studies measuring the fidelity of complex interventions have mainly evaluated adherence without taking factors affecting adherence into consideration. A need for studies that clarify the concept of fidelity and the function of factors moderating fidelity has been emphasized. The aim of this study was to systematically evaluate implementation fidelity and possible factors influencing the fidelity of a complex care continuum intervention for frail elderly people.
Methods: The intervention was a systematization of the collaboration between a nurse with geriatric expertise situated at the emergency department, the hospital ward staff, and a multi-professional team with a case manager in the municipal care services for older people. Implementation was evaluated between September 2008 and May 2010 with observations of work practices, stakeholder interviews, and document analysis, according to a modified version of the Conceptual Framework for Implementation Fidelity.
Results: A total of 16 of the 18 intervention components were to a great extent delivered as planned, while some new components were added to the model. No changes in the frequency or duration of the 18 components were observed, but the dose of the added components varied over time. Changes in fidelity were caused in a complex, interrelated fashion by all the moderating factors in the framework, i.e., context, staff and participant responsiveness, facilitation, recruitment, and complexity.
Discussion: The Conceptual Framework for Implementation Fidelity was empirically useful and included comprehensive measures of factors affecting fidelity. Future studies should focus on developing the framework with regard to how to investigate relationships between the moderating factors and fidelity over time.
Trial registration: ClinicalTrials.gov: NCT01260493 (http://www.clinicaltrials.gov/ct2/show/NCT01260493)
A comparison of policy and direct practice stakeholder perceptions of factors affecting evidence-based practice implementation using concept mapping
Background: The goal of this study was to assess potential differences between administrators/policymakers and those involved in direct practice regarding factors believed to be barriers or facilitating factors to evidence-based practice (EBP) implementation in a large public mental health service system in the United States.
Methods: Participants included mental health system county officials, agency directors, program managers, clinical staff, administrative staff, and consumers. As part of concept mapping procedures, brainstorming groups were conducted with each target group to identify specific factors believed to be barriers or facilitating factors to EBP implementation in a large public mental health system. Statements were sorted by similarity and rated by each participant with regard to their perceived importance and changeability. Multidimensional scaling, cluster analysis, descriptive statistics, and t-tests were used to analyze the data.
Results: A total of 105 statements were distilled into 14 clusters using concept mapping procedures. Perceptions of the importance of factors affecting EBP implementation varied between the two groups, with those involved in direct practice assigning significantly higher ratings to the importance of Clinical Perceptions and the impact of EBP implementation on clinical practice. Consistent with previous studies, financial concerns (costs, funding) were rated among the most important and least likely to change by both groups.
Conclusions: EBP implementation is a complex process, and different stakeholders may hold different opinions regarding the relative importance of the impact of EBP implementation. Implementation efforts must include input from stakeholders at multiple levels to bring divergent and convergent perspectives to light.
