    Enhancing access to alcohol use disorder pharmacotherapy and treatment in primary care settings: ADaPT-PC

    Background: Only 7.8% of individuals meeting diagnostic criteria for alcohol use disorder (AUD) receive treatment in a given year. Most individuals with AUDs are identified in primary care (PC) settings and referred to substance use disorders (SUD) clinics; however, only a minority of those referred attend treatment services. Safe and effective pharmacological treatments for AUD exist, but they are rarely prescribed by PC providers. The objective of this study is to refine, implement, and evaluate an intervention to integrate pharmacological AUD treatment options into PC settings. This paper provides a detailed description of the intervention design and the evaluation components. Methods/design: Three large Veterans Health Administration (VHA) facilities are participating in the intervention. The intervention targets stakeholder groups with tailored strategies based on implementation theory and prior research identifying barriers to implementation of AUD pharmacotherapy. Local SUD providers and primary care mental health integration (PCMHI) providers are trained to serve as local implementation/clinical champions and receive external facilitation. PC providers receive access to consultation from local and national clinical champions, educational materials, and a dashboard of patients with AUD on their caseloads for case identification. Veterans with AUD diagnoses receive educational information in the mail just prior to a scheduled PC visit. Effectiveness of the intervention will be evaluated through an interrupted time series with matched controls to monitor change in facility-level AUD pharmacotherapy prescribing rates. Following Stetler's four-phase formative evaluation (FE) strategy, FE methods include (1) developmental FE (pre-implementation interviews with champions, PC providers, and Veterans), (2) implementation-focused FE (tracking attendance at facilitation meetings, academic detailing efforts by local champions, and patient dashboard utilization), (3) progress-focused FE (tracking rates of AUD pharmacotherapy prescribing and rates of referral to PCMHI and SUD specialty care), and (4) interpretive FE (post-implementation interviews with champions and PC providers). Analysis of FE data will be guided by the Consolidated Framework for Implementation Research (CFIR). Discussion: If demonstrated to be successful, this implementation strategy will provide a replicable, feasible, and relatively low-cost method for integrating AUD treatment services into PC settings, thereby increasing access to AUD treatment.
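
    The evaluation design described above, an interrupted time series with matched control facilities, lends itself to segmented regression. The sketch below is illustrative only, not the study's analysis code; the input file, column names, and model form are assumptions made for this example.

        # Segmented-regression sketch for an interrupted time series with matched
        # controls. Hypothetical monthly panel columns:
        #   rate              - facility-level AUD pharmacotherapy prescribing rate
        #   month             - months since the start of the observation window
        #   post              - 1 for months after the intervention start, else 0
        #   months_post       - months since the intervention start (0 beforehand)
        #   intervention_site - 1 for intervention facilities, 0 for matched controls
        import pandas as pd
        import statsmodels.formula.api as smf

        panel = pd.read_csv("aud_prescribing_panel.csv")  # hypothetical input file

        # Level and slope changes at the intervention point; the interaction terms
        # separate changes at intervention sites from secular trends at controls.
        model = smf.ols(
            "rate ~ month + post + months_post + intervention_site"
            " + intervention_site:post + intervention_site:months_post",
            data=panel,
        ).fit()
        print(model.summary())

        # intervention_site:post estimates the immediate level change, and
        # intervention_site:months_post the trend change, beyond what the
        # matched control facilities experienced.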

    Telephone care coordination for smokers in VA mental health clinics: protocol for a hybrid type-2 effectiveness-implementation trial

    BACKGROUND: This paper describes an innovative protocol for a type-II hybrid effectiveness-implementation trial that is evaluating a smoking cessation telephone care coordination program for Veterans Health Administration (VA) mental-health clinic patients. As a hybrid trial, the protocol combines implementation science and clinical trial methods and outcomes that can inform future cessation studies and the implementation of tobacco cessation programs into routine care. The primary objectives of the trial are (1) to evaluate the process of adapting, implementing, and sustaining a smoking cessation telephone care coordination program in VA mental health clinics, (2) to determine the effectiveness of the program in promoting long-term abstinence from smoking among mental health patients, and (3) to compare the effectiveness of telephone counseling delivered by VA staff with that delivered by state quitlines. METHODS/DESIGN: The care coordination program is being implemented at six VA facilities. VA mental health providers refer patients to the program via an electronic medical record consult. Program staff call referred patients to offer enrollment. All patients who enroll receive a self-help booklet, mailed smoking cessation medications, and proactive multi-call telephone counseling. Participants are randomized to receive this counseling from VA staff or their state's quitline. Four primary implementation strategies are being used to optimize program implementation and sustainability: blended facilitation, provider training, informatics support, and provider feedback. A three-phase formative evaluation is being conducted to identify barriers to, and facilitators for, program implementation and sustainability. A mixed-methods approach is being used to collect quantitative clinical effectiveness data (e.g., self-reported abstinence at six months) and both quantitative and qualitative implementation data (e.g., provider referral rates, coded interviews with providers). Summative data will be analyzed using the Reach Effectiveness Adoption Implementation Maintenance (RE-AIM) framework. DISCUSSION: This paper describes the rationale and methods of a trial designed to simultaneously study the clinical effectiveness and implementation of a telephone smoking cessation program for smokers using VA mental health clinics. Such hybrid designs are an important methodological approach that can shorten the time between the development of an intervention and its translation into routine clinical care.
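
    Objective (3) above is a head-to-head comparison of counseling delivery arms on six-month self-reported abstinence. A minimal sketch of that kind of comparison follows; the counts are hypothetical placeholders, not trial results, and the abstract does not specify this exact test.

        # Two-proportion comparison of six-month self-reported abstinence between
        # the VA-staff counseling arm and the state quitline arm.
        # All counts below are hypothetical placeholders, not study data.
        import numpy as np
        from statsmodels.stats.proportion import proportion_confint, proportions_ztest

        abstinent = np.array([62, 48])   # abstinent at six months: [VA staff, quitline]
        enrolled = np.array([250, 245])  # randomized participants per arm

        z_stat, p_value = proportions_ztest(count=abstinent, nobs=enrolled)

        for arm, k, n in zip(["VA staff", "State quitline"], abstinent, enrolled):
            low, high = proportion_confint(k, n, method="wilson")
            print(f"{arm}: {k / n:.1%} abstinent (95% CI {low:.1%}-{high:.1%})")
        print(f"Two-proportion z-test: z = {z_stat:.2f}, p = {p_value:.3f}")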

    A critical synthesis of literature on the promoting action on research implementation in health services (PARIHS) framework

    Background: The Promoting Action on Research Implementation in Health Services framework, or PARIHS, is a conceptual framework that posits key, interacting elements that influence successful implementation of evidence-based practices. It has been widely cited and used as the basis for empirical work; however, there has not yet been a literature review to examine how the framework has been used in implementation projects and research. The purpose of the present article was to critically review and synthesize the literature on PARIHS to understand how it has been used and operationalized, and to highlight its strengths and limitations. Methods: We conducted a qualitative, critical synthesis of peer-reviewed PARIHS literature published through March 2009. We synthesized findings through a three-step process using semi-structured data abstraction tools and group consensus. Results: Twenty-four articles met our inclusion criteria: six core concept articles from original PARIHS authors, and eighteen empirical articles ranging from case reports to quantitative studies. Empirical articles generally used PARIHS as an organizing framework for analyses. No studies used PARIHS prospectively to design implementation strategies, and there was generally a lack of detail about how variables were measured or mapped, or how conclusions were derived. Several studies used findings to comment on the framework in ways that could help refine or validate it. The primary issue identified with the framework was a need for greater conceptual clarity regarding the definition of sub-elements and the nature of dynamic relationships. Strengths identified included its flexibility, intuitive appeal, explicit acknowledgement of the outcome of 'successful implementation,' and a more expansive view of what can and should constitute 'evidence.' Conclusions: While we found studies reporting empirical support for PARIHS, the single greatest need for this and other implementation models is rigorous, prospective use of the framework to guide implementation projects. There is also a need to better explain derived findings and how interventions or measures are mapped to specific PARIHS elements; greater conceptual discrimination among sub-elements may be necessary first. In general, it may be time for the implementation science community to develop consensus guidelines for reporting the use and usefulness of theoretical frameworks within implementation studies.

    The relationship between baseline Organizational Readiness to Change Assessment subscale scores and implementation of hepatitis prevention services in substance use disorders treatment clinics: a case study

    Background: The Organizational Readiness to Change Assessment (ORCA) is a measure of organizational readiness for implementing practice change in healthcare settings that is organized based on the core elements and sub-elements of the Promoting Action on Research Implementation in Health Services (PARIHS) framework. General support for the reliability and factor structure of the ORCA has been reported. However, no published study has examined the utility of the ORCA in a clinical setting. The purpose of the current study was to examine the relationship between baseline ORCA scores and implementation of hepatitis prevention services in substance use disorders (SUD) clinics. Methods: Nine clinic teams from Veterans Health Administration SUD clinics across the United States participated in a six-month training program to promote evidence-based practices for hepatitis prevention. A representative from each team completed the ORCA evidence and context subscales at baseline. Results: Eight of nine clinics reported implementation of at least one new hepatitis prevention practice after completing the six-month training program. Clinic teams were categorized by level of implementation, high (n = 4) versus low (n = 5), based on how many hepatitis prevention practices were integrated into their clinics after completing the training program. High implementation teams had significantly higher scores on the patient experience and leadership culture subscales of the ORCA compared to low implementation teams. While not reaching significance in this small sample, high implementation clinics also had higher scores on the research, clinical experience, staff culture, leadership behavior, and measurement subscales as compared to low implementation clinics. Conclusions: The results of this study suggest that the ORCA was able to measure differences in organizational factors at baseline between clinics that reported high and low implementation of practice recommendations at follow-up. This supports the use of the ORCA to describe factors related to implementing practice recommendations in clinical settings. Future research utilizing larger sample sizes will be essential to support these preliminary findings.
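
    As a rough illustration of the baseline comparison described above (high- versus low-implementation teams on one ORCA subscale), the sketch below uses a rank-based test, which is one reasonable choice for nine teams; the scores are invented, and the abstract does not report which test the authors used.

        # Small-sample comparison of a baseline ORCA subscale (e.g., leadership
        # culture) between high- and low-implementation clinic teams.
        # Scores are hypothetical; the study's actual values are not shown here.
        from scipy.stats import mannwhitneyu

        high_implementation = [4.2, 4.5, 3.9, 4.4]      # n = 4 teams
        low_implementation = [3.1, 3.6, 3.4, 2.9, 3.3]  # n = 5 teams

        u_stat, p_value = mannwhitneyu(
            high_implementation, low_implementation, alternative="two-sided"
        )
        print(f"Mann-Whitney U = {u_stat:.1f}, p = {p_value:.3f}")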

    A Guide for applying a revised version of the PARIHS framework for implementation

    Background: Based on a critical synthesis of literature on use of the Promoting Action on Research Implementation in Health Services (PARIHS) framework, revisions and a companion Guide were developed by a group of researchers independent of the original PARIHS team. The purpose of the Guide is to enhance and optimize efforts of researchers using PARIHS in implementation trials and evaluations. Methods: Authors used a planned, structured process to organize and synthesize critiques, discussions, and potential recommendations for refinements of the PARIHS framework arising from a systematic review. Using a templated form, each author independently recorded key components for each reviewed paper; that is, study definitions, perceived strengths/limitations of PARIHS, other observations regarding key issues, and recommendations regarding needed refinements. After reaching consensus on these key components, the authors summarized the information and developed the Guide. Results: A number of revisions, perceived as consistent with the PARIHS framework's general nature and intent, are proposed. The related Guide is composed of a set of reference tools, provided in Additional files. Its core content is built upon the basic elements of PARIHS and current implementation science. Conclusions: We invite researchers using PARIHS for targeted evidence-based practice (EBP) implementations with a strong task-orientation to use this Guide as a companion and to apply the revised framework prospectively and comprehensively. Researchers also are encouraged to evaluate its use relative to perceived strengths and issues. Such evaluations and critical reflections regarding PARIHS and our Guide could thereby promote the framework's continued evolution.

    Role of "external facilitation" in implementation of research findings: a qualitative evaluation of facilitation experiences in the Veterans Health Administration

    BACKGROUND: Facilitation has been identified in the literature as a potentially key component of successful implementation. It has not, however, been either well-defined or well-studied. Significant questions remain about the operational definition of facilitation and about the relationship of facilitation to other interventions, especially to other change agent roles when used in multi-faceted implementation projects. Researchers who are part of the Quality Enhancement Research Initiative (QUERI) are actively exploring various approaches and processes, including facilitation, to enable implementation of best practices in the Veterans Health Administration health care system – the largest integrated healthcare system in the United States. This paper describes a systematic, retrospective evaluation of implementation-related facilitation experiences within QUERI, a quality improvement program developed by the US Department of Veterans Affairs. METHODS: A post-hoc evaluation was conducted through a series of semi-structured interviews to examine the concept of facilitation across several multi-site QUERI implementation studies. The interview process is based on a technique developed in the field of education, which systematically enhances learning through experience by stimulating recall and reflection regarding past complex activities. An iterative content analysis approach relative to a set of conceptually-based interview questions was used for data analysis. FINDINGS: Findings suggest that facilitation, within an implementation study initiated by a central change agency, is a deliberate and valued process of interactive problem solving and support that occurs in the context of a recognized need for improvement and a supportive interpersonal relationship. Facilitation was described primarily as a distinct role with a number of potentially crucial behaviors and activities. Data further suggest that external facilitators were likely to use or integrate other implementation interventions while performing this problem-solving and supportive role. PRELIMINARY CONCLUSIONS: This evaluation provides evidence to suggest that facilitation could be considered a distinct implementation intervention, just as audit and feedback, educational outreach, or similar methods are considered to be discrete interventions. As such, facilitation should be well-defined and explicitly evaluated for its perceived usefulness within multi-intervention implementation projects. Additionally, researchers should better define the specific contribution of facilitation to the success of implementation in different types of projects, different types of sites, and with evidence and innovations of varying levels of strength and complexity.

    Predicting implementation from organizational readiness for change: a study protocol

    Background: There is widespread interest in measuring organizational readiness to implement evidence-based practices in clinical care. However, there are a number of challenges to validating organizational measures, including inferential bias arising from the halo effect and method bias, two threats to validity that, while well-documented by organizational scholars, are often ignored in health services research. We describe a protocol to comprehensively assess the psychometric properties of a previously developed survey, the Organizational Readiness to Change Assessment. Objectives: Our objective is to conduct a comprehensive assessment of the psychometric properties of the Organizational Readiness to Change Assessment incorporating methods specifically to address threats from halo effect and method bias. Methods and Design: We will conduct three sets of analyses using longitudinal, secondary data from four partner projects, each testing interventions to improve the implementation of an evidence-based clinical practice. Partner projects field the Organizational Readiness to Change Assessment at baseline (n = 208 respondents; 53 facilities) and prospectively assess the degree to which the evidence-based practice is implemented. We will conduct predictive and concurrent validity analyses using hierarchical linear modeling and multivariate regression, respectively. For predictive validity, the outcome is the change from baseline to follow-up in the use of the evidence-based practice. We will use intra-class correlations derived from hierarchical linear models to assess inter-rater reliability. Two partner projects will also field measures of job satisfaction for convergent and discriminant validity analyses, and will field Organizational Readiness to Change Assessment measures at follow-up for concurrent validity (n = 158 respondents; 33 facilities). Convergent and discriminant validities will test associations between organizational readiness and different aspects of job satisfaction: satisfaction with leadership, which should be highly correlated with readiness, versus satisfaction with salary, which should be less correlated with readiness. Content validity will be assessed using an expert panel and modified Delphi technique. Discussion: We propose a comprehensive protocol for validating a survey instrument for assessing organizational readiness to change that specifically addresses key threats of bias related to halo effect, method bias, and questions of construct validity that often go unexplored in research using measures of organizational constructs.
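
    One concrete step in the protocol above is deriving intra-class correlations from hierarchical linear models to assess inter-rater reliability. The sketch below shows that calculation for a random-intercept model with respondents nested in facilities; the file name and column names are hypothetical, not the study's actual data.

        # ICC(1) from a random-intercept (hierarchical linear) model of ORCA scores,
        # with respondents nested within facilities. File and column names are
        # hypothetical placeholders.
        import pandas as pd
        import statsmodels.formula.api as smf

        responses = pd.read_csv("orca_baseline.csv")  # one row per respondent

        # Intercept-only mixed model with a random intercept for each facility.
        model = smf.mixedlm(
            "orca_total ~ 1", data=responses, groups=responses["facility"]
        )
        fit = model.fit()

        between_var = fit.cov_re.iloc[0, 0]  # facility-level variance component
        within_var = fit.scale               # residual (respondent-level) variance
        icc = between_var / (between_var + within_var)
        print(f"ICC(1) = {icc:.3f}")  # share of variance attributable to facility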

    Proceedings of the 3rd Biennial Conference of the Society for Implementation Research Collaboration (SIRC) 2015: advancing efficient methodologies through community partnerships and team science

    It is well documented that the majority of adults, children and families in need of evidence-based behavioral health interventions do not receive them [1, 2] and that few robust empirically supported methods for implementing evidence-based practices (EBPs) exist. The Society for Implementation Research Collaboration (SIRC) represents a burgeoning effort to advance the innovation and rigor of implementation research and is uniquely focused on bringing together researchers and stakeholders committed to evaluating the implementation of complex evidence-based behavioral health interventions. Through its diverse activities and membership, SIRC aims to foster the promise of implementation research to better serve the behavioral health needs of the population by identifying rigorous, relevant, and efficient strategies that successfully transfer scientific evidence to clinical knowledge for use in real world settings [3]. SIRC began as a National Institute of Mental Health (NIMH)-funded conference series in 2010 (previously titled the “Seattle Implementation Research Conference”; $150,000 USD for 3 conferences in 2011, 2013, and 2015) with the recognition that there were multiple researchers and stakeholders working in parallel on innovative implementation science projects in behavioral health, but that formal channels for communicating and collaborating with one another were relatively unavailable. There was a significant need for a forum within which implementation researchers and stakeholders could learn from one another, refine approaches to science and practice, and develop an implementation research agenda using common measures, methods, and research principles to improve both the frequency and quality with which behavioral health treatment implementation is evaluated. SIRC’s membership growth is a testament to this identified need, with more than 1000 members from 2011 to the present. SIRC’s primary objectives are to: (1) foster communication and collaboration across diverse groups, including implementation researchers, intermediaries, as well as community stakeholders (SIRC uses the term “EBP champions” for these groups) – and to do so across multiple career levels (e.g., students, early career faculty, established investigators); and (2) enhance and disseminate rigorous measures and methodologies for implementing EBPs and evaluating EBP implementation efforts. These objectives are well aligned with Glasgow and colleagues’ [4] five core tenets deemed critical for advancing implementation science: collaboration, efficiency and speed, rigor and relevance, improved capacity, and cumulative knowledge. SIRC advances these objectives and tenets through in-person conferences, which bring together multidisciplinary implementation researchers and those implementing evidence-based behavioral health interventions in the community to share their work and create professional connections and collaborations.