
    Acceptability and feasibility of peer assisted supervision and support for intervention practitioners: a Q-methodology evaluation

    Evidence-based interventions often include quality improvement methods to support fidelity and improve client outcomes. Clinical supervision is promoted as an effective way of developing practitioner confidence and competence in delivery; however, supervision is often inconsistent and embedded in hierarchical line-management structures that may limit the opportunity for reflective learning. The Peer Assisted Supervision and Support (PASS) supervision model uses peer relationships to promote the self-regulatory capacity of practitioners to improve intervention delivery. The aim of the present study was to assess the acceptability and feasibility of PASS amongst parenting intervention practitioners. A Q-methodology approach was used to generate data, and 30 practitioners volunteered to participate in the study. Data were analyzed and interpreted using standard Q-methodology procedures, and by-person factor analysis yielded three factors. There was consensus that PASS was acceptable. Participants shared the view that PASS facilitated an environment of support in which negative aspects of interpersonal relationships that might develop in supervision were not evident. Two factors represented the viewpoint that PASS was also a feasible model of supervision. However, the third factor comprised practitioners who reported that PASS could be time consuming and difficult to fit into existing work demands. There were differences across the three factors in the extent to which practitioners considered PASS impacted their intervention delivery. The findings highlight the importance of organizational mechanisms that support practitioner engagement in supervision.
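The by-person factor analysis mentioned above treats participants, rather than items, as the variables: the analysis starts from correlations between people's Q-sorts, and practitioners who sort statements similarly load on the same factor. A minimal sketch of that initial correlation step, using invented Q-sorts (not study data):

```python
# Hypothetical sketch of the by-person step in Q-methodology: correlate
# participants' Q-sorts with each other. Rankings below are illustrative.
import math

def pearson(x, y):
    """Pearson correlation between two Q-sorts of equal length."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Three practitioners' sorts over five statements (-2 = disagree .. +2 = agree):
p1 = [2, 1, 0, -1, -2]
p2 = [2, 0, 1, -2, -1]   # sorts similarly to p1: shared viewpoint
p3 = [-2, -1, 0, 1, 2]   # reversed ranking: opposing viewpoint

print(round(pearson(p1, p2), 2))  # → 0.8
print(round(pearson(p1, p3), 2))  # → -1.0
```

In a full analysis the person-by-person correlation matrix is then factor analyzed, and each resulting factor is interpreted as a shared viewpoint, as in the three factors reported above.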

    Fostering implementation of health services research findings into practice: a consolidated framework for advancing implementation science

    Background: Many interventions found to be effective in health services research studies fail to translate into meaningful patient care outcomes across multiple contexts. Health services researchers recognize the need to evaluate not only summative outcomes but also formative outcomes to assess the extent to which implementation is effective in a specific setting, prolongs sustainability, and promotes dissemination into other settings. Many implementation theories have been published to help promote effective implementation. However, they overlap considerably in the constructs included in individual theories, and a comparison of theories reveals that each is missing important constructs included in other theories. In addition, terminology and definitions are not consistent across theories. We describe the Consolidated Framework for Implementation Research (CFIR), which offers an overarching typology to promote implementation theory development and verification about what works where and why across multiple contexts. Methods: We used a snowball sampling approach to identify published theories, which were evaluated to identify constructs based on strength of conceptual or empirical support for influence on implementation, consistency in definitions, alignment with our own findings, and potential for measurement. We combined constructs across published theories that had different labels but were redundant or overlapping in definition, and we parsed apart constructs that conflated underlying concepts. Results: The CFIR is composed of five major domains: intervention characteristics, outer setting, inner setting, characteristics of the individuals involved, and the process of implementation.
Eight constructs were identified related to the intervention (e.g., evidence strength and quality), four related to the outer setting (e.g., patient needs and resources), 12 related to the inner setting (e.g., culture, leadership engagement), five related to individual characteristics, and eight related to process (e.g., plan, evaluate, and reflect). We present explicit definitions for each construct. Conclusion: The CFIR provides a pragmatic structure for approaching complex, interacting, multi-level, and transient states of constructs in the real world by embracing, consolidating, and unifying key constructs from published implementation theories. It can be used to guide formative evaluations and build the implementation knowledge base across multiple studies and settings.
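The five domains and per-domain construct counts reported above can be collected into a small data structure. A minimal sketch (domain names, counts, and example constructs are taken from the abstract; the mapping itself is only an illustration):

```python
# Illustrative representation of the CFIR typology as described in the abstract.
cfir_domains = {
    "intervention characteristics": {"constructs": 8,  "examples": ["evidence strength and quality"]},
    "outer setting":                {"constructs": 4,  "examples": ["patient needs and resources"]},
    "inner setting":                {"constructs": 12, "examples": ["culture", "leadership engagement"]},
    "characteristics of individuals": {"constructs": 5, "examples": []},
    "process":                      {"constructs": 8,  "examples": ["plan", "evaluate", "reflect"]},
}

total_constructs = sum(d["constructs"] for d in cfir_domains.values())
print(total_constructs)  # 8 + 4 + 12 + 5 + 8 = 37
```

Summing the counts quoted in the abstract gives 37 constructs across the five domains.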

    The Cosmological Constant

    This is a review of the physics and cosmology of the cosmological constant. Focusing on recent developments, I present a pedagogical overview of cosmology in the presence of a cosmological constant, observational constraints on its magnitude, and the physics of a small (and potentially nonzero) vacuum energy. Comment: 50 pages. Submitted to Living Reviews in Relativity (http://www.livingreviews.org/), December 199
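For orientation, in a standard Friedmann-Robertson-Walker cosmology the cosmological constant enters the Friedmann equation as a constant term, equivalent to a vacuum energy density. These are textbook relations, not results specific to this review:

```latex
H^2 = \left(\frac{\dot{a}}{a}\right)^2
    = \frac{8\pi G}{3}\,\rho - \frac{k}{a^2} + \frac{\Lambda}{3},
\qquad
\rho_\Lambda = \frac{\Lambda}{8\pi G},
\qquad
\Omega_\Lambda \equiv \frac{\Lambda}{3H_0^2}.
```

Observational constraints on the magnitude of Λ, of the kind the review discusses, are usually quoted as bounds on the dimensionless density parameter Ω_Λ.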

    Employing external facilitation to implement cognitive behavioral therapy in VA clinics: a pilot study

    Background: Although for more than a decade healthcare systems have attempted to provide evidence-based mental health treatments, the availability and use of psychotherapies remain low. A significant need exists to identify simple but effective implementation strategies for adopting complex practices within complex systems of care. Emerging evidence suggests that facilitation may be an effective integrative implementation strategy for adoption of complex practices. The current pilot examined the use of external facilitation for adoption of cognitive behavioral therapy (CBT) in 20 Department of Veterans Affairs (VA) clinics. Methods: The 20 clinics were paired on facility characteristics, and 23 clinicians from these clinics were trained in CBT. One clinic in each pair was randomly selected to receive external facilitation. Quantitative methods were used to examine the extent of CBT implementation in the 10 clinics that received external facilitation compared with the 10 clinics that did not, and to better understand the relationship between individual providers' characteristics and attitudes and their CBT use. Costs of external facilitation were assessed by tracking the time spent by the facilitator and therapists in activities related to implementing CBT. Qualitative methods were used to explore contextual and other factors thought to influence implementation. Results: Examination of change scores showed that facilitated therapists averaged an increase of 19% [95% CI: (2, 36)] in self-reported CBT use from baseline, while control therapists averaged a 4% [95% CI: (-14, 21)] increase. Therapists in the facilitated condition who were not providing CBT at baseline showed the greatest increase (35%), compared with a control therapist who was not providing CBT at baseline (10%) or with therapists in either condition who were providing CBT at baseline (average 3%). Increased CBT use was unrelated to prior CBT training. Barriers to CBT implementation were therapists' lack of control over their clinic schedule and poor communication with clinical leaders. Conclusions: These findings suggest that facilitation may help clinicians make complex practice changes such as implementing an evidence-based psychotherapy. Furthermore, the substantial increase in CBT use among the facilitated group was achieved at a modest cost.
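The bracketed intervals reported above are ordinary 95% confidence intervals for a mean change score. A minimal sketch of the calculation, using made-up change scores rather than the study's data, and the normal approximation rather than whatever method the authors used:

```python
# Hypothetical sketch: 95% CI for a mean change score (illustrative data only).
import math

def mean_ci_95(changes):
    """Return (mean, lower, upper) using the normal approximation."""
    n = len(changes)
    mean = sum(changes) / n
    var = sum((x - mean) ** 2 for x in changes) / (n - 1)  # sample variance
    se = math.sqrt(var / n)                                # standard error of the mean
    return mean, mean - 1.96 * se, mean + 1.96 * se

# Illustrative per-therapist changes in self-reported CBT use (percentage points):
changes = [5, 30, 20, 25, 10, 24]
mean, lo, hi = mean_ci_95(changes)
print(mean)  # → 19.0
```

A wide interval that still excludes zero, like the facilitated group's (2, 36), indicates an increase that is statistically distinguishable from no change despite the small sample.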

    Establishing an implementation network: lessons learned from community-based participatory research

    Background: Implementation of evidence-based mental health assessment and intervention in community public health practice is a high priority for multiple stakeholders. Academic-community partnerships can assist in the implementation of efficacious treatments in community settings; yet, little is known about the processes by which these collaborations are developed. In this paper, we discuss our application of a community-based participatory research (CBPR) approach to implementation, and we present six lessons we have learned from the establishment of an academic-community partnership. Methods: With older adults with psychosis as a focus, we have developed a partnership between a university research center and a public mental health service system based on CBPR. The long-term goal of the partnership is to collaboratively establish an evidence-based implementation network that is sustainable within the public mental healthcare system. Results: In building a sustainable partnership, we found the following lessons instrumental: changing attitudes; sharing staff; expecting obstacles and formalizing solutions; monitoring and evaluating; adapting and adjusting; and taking advantage of emerging opportunities. Some of these lessons were previously known principles that were modified as a result of the CBPR process, while others derived directly from the interactive process of forming the partnership. Conclusion: The process of forming academic-public partnerships is challenging and time consuming, yet crucial for the development and implementation of state-of-the-art approaches to assessment and intervention to improve the functioning and quality of life of persons with serious mental illnesses. These partnerships provide the organizational support necessary to facilitate the implementation of clinical research findings in community practice, benefiting consumers, researchers, and providers.

    Hepatitis C infection: eligibility for antiviral therapies

    Background: Current treatments for chronic hepatitis C virus (HCV) infection are effective, but expensive and liable to induce significant side effects. Objectives: To evaluate the proportion of HCV patients who are eligible for treatment. Methods: From a database comprising 1726 viraemic HCV patients, the files of 299 patients who presented to the same hepatologist for an initial appointment between 1996 and 2003 were reviewed. Results: Patients' characteristics were age 43.1 +/- 15.6 years, 53% male and 92% Caucasian. The main risk factors were transfusion (43%) and drug use (22%). Genotypes were mostly genotype 1 (66%), genotype 3 (12%) and genotype 2 (10%). These characteristics did not differ from those of the whole series of 1726 patients. A total of 176 patients (59%) were not treated, the reasons for non-treatment being medical contraindications (34%), non-compliance (25%) and normal transaminases (24%). In addition, 17% of patients declined therapy despite being considered eligible, mainly for fear of adverse events. Medical contraindications were psychiatric (27%), age (22%), end-stage liver disease (15%), desire for pregnancy (13%), cardiac contraindication (7%) and others (16%). Only 123 patients (41%) were treated. A sustained viral response was observed in 41%. Treatment was interrupted in 16% for adverse events. Conclusions: The majority of HCV patients are not eligible for treatment. This implies that, with current therapies, only 17% of patients referred for chronic HCV infection become sustained responders. Some modification of guidelines could extend the treatment rate (e.g., to patients with normal transaminases), but an important remaining barrier is patients' and doctors' fear of adverse events.
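The headline 17% overall sustained-response rate follows directly from the counts reported in the abstract; a quick arithmetic check using only the figures quoted above:

```python
# Reproducing the abstract's overall sustained-response figure from its counts.
referred = 299                 # patients reviewed
treated = 123                  # 41% of those referred were treated
svr_rate_among_treated = 0.41  # sustained viral response among the treated

sustained_responders = treated * svr_rate_among_treated  # ~50 patients
overall_rate = sustained_responders / referred
print(round(overall_rate * 100))  # → 17
```

So roughly 50 of the 299 referred patients became sustained responders, which is where the abstract's "only 17%" comes from.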

    Understanding the implementation and effectiveness of a group-based early parenting intervention: a process evaluation protocol

    BACKGROUND: Group-based early parenting interventions delivered through community-based services may be a potentially effective means of promoting infant and family health and wellbeing. Process evaluations of these complex interventions provide vital information on how they work, as well as the conditions which shape and influence outcomes. This information is critical to decision makers and service providers who wish to embed prevention and early interventions in usual care settings. In this paper, a process evaluation protocol for an early years parenting intervention, the Parent and Infant (PIN) program, is described. This program combines a range of developmentally-appropriate supports, delivered in a single intervention process, for parents and infants (0–2 years) and aimed at enhancing parental competence, strengthening parent-infant relationships and improving infant wellbeing and adjustment. METHODS: The process evaluation is embedded within a controlled trial and accompanying cost-effectiveness evaluation. Building from extant frameworks and evaluation methods, this paper presents a systematic approach to the process evaluation of the PIN program and its underlying change principles, the implementation of the program, the context of implementation and the change mechanisms which influence and shape parent and infant outcomes. We will use a multi-method strategy, including semi-structured interviews and group discussions with key stakeholders, documentary analysis and survey methodology. DISCUSSION: The integration of innovations into existing early years systems and services is a challenging multifaceted undertaking. This process evaluation will make an important contribution to knowledge about the implementation of such programs, while also providing an example of how theory-based research can be embedded within the evaluation of community-based interventions. 
We discuss the strengths of the research, such as the adoption of a collaborative approach to data collection, and we also identify potential challenges, including capturing and assessing complex aspects of the intervention. TRIAL REGISTRATION: ISRCTN17488830 (Date of registration: 27/11/15). This trial was retrospectively registered. ELECTRONIC SUPPLEMENTARY MATERIAL: The online version of this article (doi:10.1186/s12913-016-1737-3) contains supplementary material, which is available to authorized users.

    Measuring persistence of implementation: QUERI Series

    As more quality improvement programs are implemented to achieve gains in performance, the need to evaluate their lasting effects has become increasingly evident. However, such long-term follow-up evaluations are scarce in healthcare implementation science, being largely relegated to the "need for further research" section of most project write-ups. This article explores the variety of conceptualizations of implementation sustainability, as well as behavioral and organizational factors that influence the maintenance of gains. It highlights the finer points of design considerations and draws on our own experiences with measuring sustainability, framed within the rich theoretical and empirical contributions of others. In addition, recommendations are made for designing sustainability analyses.