Implementation and sustainment of diverse practices in a large integrated health system: a mixed methods study
Background: One goal of health systems seeking to evolve into learning health systems is to accelerate the implementation and sustainment of evidence-based practices (EBPs). As part of this evolution, the Veterans Health Administration (VHA) developed the Innovation Ecosystem, which includes the Diffusion of Excellence (DoE), a program that identifies and diffuses Gold Status Practices (GSPs) across facilities. The DoE hosts an annual Shark Tank competition in which leaders bid on the opportunity to implement a GSP with 6 months of implementation support. Over 750 diverse practices were submitted in cohorts 2 and 3 of Shark Tank; 23 were designated GSPs and were implemented in 31 VA networks or facilities. As part of a national evaluation of the DoE, we identified factors contributing to GSP implementation and sustainment.
Methods: Our sequential mixed methods evaluation of cohorts 2 and 3 of Shark Tank included semi-structured interviews with at least one representative from 30/31 implementing teams (N = 78/105 people invited) and survey responses from 29/31 teams (N = 39/47 invited). Interviews focused on factors influencing implementation and future sustainment. Surveys focused on sustainment 1.5-2 years after implementation. The Consolidated Framework for Implementation Research (CFIR) informed data collection and directed content analysis. Ordinal scales were developed inductively to rank implementation and sustainment outcomes.
Results: Over 50% of teams (17/30) successfully implemented their GSP within the 6-month implementation period. Despite extensive implementation support, significant barriers related to centralized decision-making, staffing, and resources led to partial (n = 6) or no (n = 7) implementation for the remaining teams. While 12/17 initially successful implementation teams reported sustained use of their GSP, over half of the initially unsuccessful teams (7/13) also reported sustained GSP use 1.5 years after the initial implementation period. Among the 27 teams with complete data, 18 accurately predicted at 6 months whether they would sustain their GSP, based on sustainment reported an average of 1.5 years later.
Conclusions: Most teams implemented within 6 months and/or sustained their GSP 1.5 years later. High levels of implementation and sustainment across diverse practices and teams suggest that VHA's DoE is a successful large-scale model of diffusion. Team predictions about sustainability after the first 6 months of implementation provide a promising early assessment and point of intervention to increase sustainability.
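The outcome tallies reported in this abstract can be cross-checked with a minimal sketch; the counts come from the abstract, but the data structure and variable names are illustrative, not the study's actual dataset.

```python
# Tally implementation and sustainment outcomes for the 30 interviewed teams.
# Counts are taken from the abstract: 17 implemented (12 of which sustained),
# 13 with partial/no implementation (7 of which nonetheless sustained).
teams = (
    [{"implemented": True,  "sustained": True}]  * 12   # implemented & sustained
    + [{"implemented": True,  "sustained": False}] * 5  # implemented, not sustained
    + [{"implemented": False, "sustained": True}]  * 7  # partial/no impl., sustained
    + [{"implemented": False, "sustained": False}] * 6  # partial/no impl., not sustained
)

implemented = sum(t["implemented"] for t in teams)
sustained = sum(t["sustained"] for t in teams)
print(implemented, "/", len(teams), "implemented;", sustained, "sustained")
# prints: 17 / 30 implemented; 19 sustained
```

The sketch makes the abstract's point concrete: more teams (19/30) sustained the practice at 1.5 years than implemented it within the initial 6-month window (17/30).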
Diffusion of excellence: evaluating a system to identify, replicate, and spread promising innovative practices across the Veterans health administration
Introduction: The Veterans Health Administration (VHA) Diffusion of Excellence (DoE) program provides a system to identify, replicate, and spread promising practices across the largest integrated healthcare system in the United States. DoE identifies innovations that have been successfully implemented in the VHA through a Shark Tank style competition. VHA facility and regional directors bid resources needed to replicate promising practices. Winning facilities/regions receive external facilitation to aid in replication/implementation over the course of a year. DoE staff then support diffusion of successful practices across the nationwide VHA. Methods: Organized around the Reach, Effectiveness, Adoption, Implementation, and Maintenance (RE-AIM) framework, we summarize results of an ongoing long-term mixed-methods implementation evaluation of DoE. Data sources include: Shark Tank application and bid details, tracking of practice adoptions through a Diffusion Marketplace, characteristics of VHA facilities, focus groups with Shark Tank bidders, structured observations of DoE events, surveys of DoE program participants, and semi-structured interviews of national VHA program office leaders, VHA healthcare system/facility executives, practice developers, implementation teams, and facilitators. Results: In the first eight Shark Tanks (2016–2022), 3,280 Shark Tank applications were submitted; 88 were designated DoE Promising Practices (i.e., practices that receive facilitated replication). DoE has effectively spread practices across the VHA, with 1,440 documented instances of adoption/replication of practices across the VHA. This includes 180 adoptions/replications in facilities located in rural areas. Leadership decisions to adopt innovations are often based on big-picture considerations such as constituency support and linkage to organizational goals.
The DoE Promising Practices with the greatest national spread have been successfully replicated at new sites during the facilitated replication process, have close partnerships with VHA national program offices, and tend to be less expensive to implement. By two indicators of sustainment, 56 of the 88 Promising Practices are still being diffused across the VHA, and 56% of facilities that originally replicated the practices have sustained them, even up to 6 years after the first Shark Tank. Conclusion: DoE has developed a sustainable process for the identification, replication, and spread of promising practices as part of a learning health system committed to providing equitable access to high quality care.
Proceedings of the 3rd Biennial Conference of the Society for Implementation Research Collaboration (SIRC) 2015: advancing efficient methodologies through community partnerships and team science
It is well documented that the majority of adults, children and families in need of evidence-based behavioral health interventions do not receive them [1, 2] and that few robust empirically supported methods for implementing evidence-based practices (EBPs) exist. The Society for Implementation Research Collaboration (SIRC) represents a burgeoning effort to advance the innovation and rigor of implementation research and is uniquely focused on bringing together researchers and stakeholders committed to evaluating the implementation of complex evidence-based behavioral health interventions. Through its diverse activities and membership, SIRC aims to foster the promise of implementation research to better serve the behavioral health needs of the population by identifying rigorous, relevant, and efficient strategies that successfully transfer scientific evidence to clinical knowledge for use in real world settings [3]. SIRC began as a National Institute of Mental Health (NIMH)-funded conference series in 2010 (previously titled the “Seattle Implementation Research Conference”; $150,000 USD for 3 conferences in 2011, 2013, and 2015) with the recognition that there were multiple researchers and stakeholders working in parallel on innovative implementation science projects in behavioral health, but that formal channels for communicating and collaborating with one another were relatively unavailable. There was a significant need for a forum within which implementation researchers and stakeholders could learn from one another, refine approaches to science and practice, and develop an implementation research agenda using common measures, methods, and research principles to improve both the frequency and quality with which behavioral health treatment implementation is evaluated.
SIRC’s membership growth is a testament to this identified need, with more than 1,000 members from 2011 to the present. SIRC’s primary objectives are to: (1) foster communication and collaboration across diverse groups, including implementation researchers, intermediaries, and community stakeholders (SIRC uses the term “EBP champions” for these groups) – and to do so across multiple career levels (e.g., students, early career faculty, established investigators); and (2) enhance and disseminate rigorous measures and methodologies for implementing EBPs and evaluating EBP implementation efforts. These objectives are well aligned with Glasgow and colleagues’ [4] five core tenets deemed critical for advancing implementation science: collaboration, efficiency and speed, rigor and relevance, improved capacity, and cumulative knowledge. SIRC advances these objectives and tenets through in-person conferences, which bring together multidisciplinary implementation researchers and those implementing evidence-based behavioral health interventions in the community to share their work and create professional connections and collaborations.
The Availability and Utility of Services to Address Risk Factors for Recidivism among Justice-Involved Veterans
The availability and utility of services to address recidivism risk factors among justice-involved veterans is unknown. We explored these issues through qualitative interviews with 63 Specialists from the Department of Veterans Affairs' (VA) Veterans Justice Programs. To guide the interviews, we utilized the Risk-Need-Responsivity (RNR) model of offender rehabilitation. Specialists reported that justice-involved veterans generally have access to services to address most RNR-based risk factors (substance abuse; lack of positive school/work involvement; family/marital dysfunction; lack of prosocial activities/interests), but have less access to services targeting risk factors of antisocial tendencies and associates and empirically-based treatments for recidivism in VA. Peer-based services, motivational interviewing/cognitive-behavioral therapy, and Veterans Treatment Courts were perceived as useful to address multiple risk factors. These findings highlight potential gaps in provision of evidence-based care to address recidivism among justice-involved veterans, as well as promising policy-based solutions that may have widespread impact on reducing recidivism in this population.
Rapid versus traditional qualitative analysis using the Consolidated Framework for Implementation Research (CFIR)
BACKGROUND: Qualitative approaches, alone or in mixed methods, are prominent within implementation science. However, traditional qualitative approaches are resource intensive, which has led to the development of rapid qualitative approaches. Published rapid approaches are often inductive in nature and rely on transcripts of interviews. We describe a deductive rapid analysis approach using the Consolidated Framework for Implementation Research (CFIR) that uses notes and audio recordings. This paper compares our rapid versus traditional deductive CFIR approach.
METHODS: Semi-structured interviews were conducted for two cohorts of the Veterans Health Administration (VHA) Diffusion of Excellence (DoE). The CFIR guided data collection and analysis. In cohort A, we used our traditional CFIR-based deductive analysis approach (directed content analysis), in which two analysts completed independent in-depth manual coding of interview transcripts using qualitative software. In cohort B, we used our new rapid CFIR-based deductive analysis approach (directed content analysis), in which the primary analyst wrote detailed notes during interviews and immediately coded the notes into a CFIR construct-by-facility matrix in MS Excel; a secondary analyst then listened to audio recordings and edited the matrix. We tracked time for both approaches using a spreadsheet and captured transcription costs from invoices. We retrospectively compared the approaches in terms of effectiveness and rigor.
RESULTS: Cohorts A and B were similar in terms of the amount of data collected. However, our rapid deductive CFIR approach required 409.5 analyst hours, compared to 683 hours for the traditional deductive CFIR approach. The rapid deductive approach also eliminated $7,250 in transcription costs. The facility-level analysis phase provided the greatest savings: 14 hours/facility for the traditional analysis versus 3.92 hours/facility for the rapid analysis. Data interpretation required the same number of hours for both approaches.
CONCLUSION: Our rapid deductive CFIR approach was less time intensive and eliminated transcription costs, yet was effective in meeting evaluation objectives and establishing rigor. Researchers should consider the following when employing our approach: (1) team expertise in the CFIR and qualitative methods, (2) the level of detail needed to meet project aims, (3) the mode of data to analyze, and (4) the advantages and disadvantages of using the CFIR.
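The time and cost comparison reported in the results above reduces to simple arithmetic. A minimal sketch, using the figures from the abstract; the function and parameter names are illustrative, not from the study:

```python
def summarize_savings(traditional_hours, rapid_hours,
                      traditional_per_facility, rapid_per_facility,
                      transcription_cost):
    """Summarize savings of the rapid approach over the traditional one:
    total analyst-hours saved, per-facility hours saved, and transcription
    dollars eliminated."""
    return {
        "analyst_hours_saved": traditional_hours - rapid_hours,
        "per_facility_hours_saved": round(
            traditional_per_facility - rapid_per_facility, 2),
        "transcription_savings_usd": transcription_cost,
    }

savings = summarize_savings(
    traditional_hours=683.0,        # traditional deductive CFIR approach
    rapid_hours=409.5,              # rapid deductive CFIR approach
    traditional_per_facility=14.0,  # facility-level analysis, traditional
    rapid_per_facility=3.92,        # facility-level analysis, rapid
    transcription_cost=7250,        # USD eliminated by skipping transcripts
)
print(savings)
# analyst_hours_saved: 273.5; per_facility_hours_saved: 10.08
```

As the sketch shows, the rapid approach saved 273.5 analyst hours overall, with most of the per-facility gain (about 10 hours/facility) coming from the facility-level analysis phase.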
Sustainment of diverse evidence-informed practices disseminated in the Veterans Health Administration (VHA): initial development and piloting of a pragmatic survey tool
Background: There are challenges associated with measuring sustainment of evidence-informed practices (EIPs). First, the terms sustainability and sustainment are often falsely conflated: sustainability assesses the likelihood of an EIP being in use in the future, while sustainment assesses the extent to which an EIP is (or is not) in use. Second, grant funding often ends before sustainment can be assessed. The Veterans Health Administration (VHA) Diffusion of Excellence (DoE) program is one of few large-scale models of diffusion; it seeks to identify and disseminate practices across the VHA system. The DoE sponsors “Shark Tank” competitions, in which leaders bid on the opportunity to implement a practice with approximately 6 months of implementation support. As part of an ongoing evaluation of the DoE, we sought to develop and pilot a pragmatic survey tool to assess sustainment of DoE practices. Methods: In June 2020, surveys were sent to 64 facilities that were part of the DoE evaluation. We began analysis by comparing alignment of quantitative and qualitative responses; some facility representatives reported in the open-text box of the survey that their practice was on a temporary hold due to COVID-19 but answered the primary outcome question differently. As a result, the team reclassified the primary outcome of these facilities to Sustained: Temporary COVID-Hold. Following this reclassification, we calculated the number and percentage of facilities in each category. We used directed content analysis, guided by the Consolidated Framework for Implementation Research (CFIR), to analyze open-text responses. Results: Representatives from 41 of the 64 facilities (64%) completed the survey. Among responding facilities, 29/41 sustained their practice, 1/41 partially sustained their practice, 8/41 had not sustained their practice, and 3/41 had never implemented their practice. Sustainment rates increased between Cohorts 1–4.
Conclusions: The initial development and piloting of our pragmatic survey allowed us to assess sustainment of DoE practices. Planned updates to the survey will enable flexibility in assessing sustainment and its determinants at any phase after adoption. This assessment approach can flex with the longitudinal and dynamic nature of sustainment, including capturing nuances in outcomes when practices are on a temporary hold. If additional piloting illustrates the survey is useful, we plan to assess the reliability and validity of this measure for broader use in the field.
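The reclassification step described in the methods — overriding a reported outcome when the open-text response indicates a temporary COVID-19 hold — can be sketched as a simple rule. The category labels and field names here are illustrative assumptions, not the study's actual survey schema:

```python
def classify_outcome(reported_outcome: str, open_text: str) -> str:
    """Reclassify a facility that reported 'Not sustained' as a temporary
    COVID hold when its open-text response says the practice is paused
    due to COVID-19; otherwise keep the reported outcome."""
    text = open_text.lower()
    on_covid_hold = "covid" in text and "hold" in text
    if reported_outcome == "Not sustained" and on_covid_hold:
        return "Sustained: Temporary COVID-Hold"
    return reported_outcome

# A facility whose quantitative answer conflicts with its open-text note:
print(classify_outcome("Not sustained",
                       "Practice on temporary hold due to COVID-19"))
# prints: Sustained: Temporary COVID-Hold
```

Encoding the rule explicitly is one way to keep the quantitative outcome and the qualitative open-text data aligned, which is the alignment check the abstract describes.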
Women's experiences of living with neurogenic bladder and bowel after spinal cord injury: life controlled by bladder and bowel
Proceedings from the 9th annual conference on the science of dissemination and implementation
Proceedings of the 3rd Biennial Conference of the Society for Implementation Research Collaboration (SIRC) 2015: advancing efficient methodologies through community partnerships and team science
Table of contents
Introduction to the 3rd Biennial Conference of the Society for Implementation Research Collaboration: advancing efficient methodologies through team science and community partnerships
Cara Lewis, Doyanne Darnell, Suzanne Kerns, Maria Monroe-DeVita, Sara J. Landes, Aaron R. Lyon, Cameo Stanick, Shannon Dorsey, Jill Locke, Brigid Marriott, Ajeng Puspitasari, Caitlin Dorsey, Karin Hendricks, Andria Pierson, Phil Fizur, Katherine A. Comtois
A1: A behavioral economic perspective on adoption, implementation, and sustainment of evidence-based interventions
Lawrence A. Palinkas
A2: Towards making scale up of evidence-based practices in child welfare systems more efficient and affordable
Patricia Chamberlain
A3: Mixed method examination of strategic leadership for evidence-based practice implementation
Gregory A. Aarons, Amy E. Green, Mark G. Ehrhart, Elise M. Trott, Cathleen E. Willging
A4: Implementing practice change in Federally Qualified Health Centers: Learning from leaders’ experiences
Maria E. Fernandez, Nicholas H. Woolf, Shuting (Lily) Liang, Natalia I. Heredia, Michelle Kegler, Betsy Risendal, Andrea Dwyer, Vicki Young, Dayna Campbell, Michelle Carvalho, Yvonne Kellar-Guenther
A5: Efficient synthesis: Using qualitative comparative analysis and the Consolidated Framework for Implementation Research across diverse studies
Laura J. Damschroder, Julie C. Lowery
A6: Establishing a veterans engagement group to empower patients and inform Veterans Affairs (VA) health services research
Sarah S. Ono, Kathleen F. Carlson, Erika K. Cottrell, Maya E. O’Neil, Travis L. Lovejoy
A7: Building patient-practitioner partnerships in community oncology settings to implement behavioral interventions for anxious and depressed cancer survivors
Joanna J. Arch, Jill L. Mitchell
A8: Tailoring a Cognitive Behavioral Therapy implementation protocol using mixed methods, conjoint analysis, and implementation teams
Cara C. Lewis, Brigid R. Marriott, Kelli Scott
A9: Wraparound Structured Assessment and Review (WrapSTAR): An efficient, yet comprehensive approach to Wraparound implementation evaluation
Jennifer Schurer Coldiron, Eric J. Bruns, Alyssa N. Hook
A10: Improving the efficiency of standardized patient assessment of clinician fidelity: A comparison of automated actor-based and manual clinician-based ratings
Benjamin C. Graham, Katelin Jordan
A11: Measuring fidelity on the cheap
Rochelle F. Hanson, Angela Moreland, Benjamin E. Saunders, Heidi S. Resnick
A12: Leveraging routine clinical materials to assess fidelity to an evidence-based psychotherapy
Shannon Wiltsey Stirman, Cassidy A. Gutner, Jennifer Gamarra, Dawne Vogt, Michael Suvak, Jennifer Schuster Wachen, Katherine Dondanville, Jeffrey S. Yarvis, Jim Mintz, Alan L. Peterson, Elisa V. Borah, Brett T. Litz, Alma Molino, Stacey Young McCaughan, Patricia A. Resick
A13: The video vignette survey: An efficient process for gathering diverse community opinions to inform an intervention
Nancy Pandhi, Nora Jacobson, Neftali Serrano, Armando Hernandez, Elizabeth Zeidler-Schreiter, Natalie Wietfeldt, Zaher Karp
A14: Using integrated administrative data to evaluate implementation of a behavioral health and trauma screening for children and youth in foster care
Michael D. Pullmann, Barbara Lucenko, Bridget Pavelle, Jacqueline A. Uomoto, Andrea Negrete, Molly Cevasco, Suzanne E. U. Kerns
A15: Intermediary organizations as a vehicle to promote efficiency and speed of implementation
Robert P. Franks, Christopher Bory
A16: Applying the Consolidated Framework for Implementation Research constructs directly to qualitative data: The power of implementation science in action
Edward J. Miech, Teresa M. Damush
A17: Efficient and effective scaling-up of screening, brief interventions, and referrals to treatment (SBIRT) training: a snowball implementation model
Jason Satterfield, Derek Satre, Maria Wamsley, Patrick Yuan, Patricia O’Sullivan
A18: Matching models of implementation to system needs and capacities: addressing the human factor
Helen Best, Susan Velasquez
A19: Agency characteristics that facilitate efficient and successful implementation efforts
Miya Barnett, Lauren Brookman-Frazee, Jennifer Regan, Nicole Stadnick, Alison Hamilton, Anna Lau
A20: Rapid assessment process: Application to the Prevention and Early Intervention transformation in Los Angeles County
Jennifer Regan, Alison Hamilton, Nicole Stadnick, Miya Barnett, Anna Lau, Lauren Brookman-Frazee
A21: The development of the Evidence-Based Practice-Concordant Care Assessment: An assessment tool to examine treatment strategies across practices
Nicole Stadnick, Anna Lau, Miya Barnett, Jennifer Regan, Scott Roesch, Lauren Brookman-Frazee
A22: Refining a compilation of discrete implementation strategies and determining their importance and feasibility
Byron J. Powell, Thomas J. Waltz, Matthew J. Chinman, Laura Damschroder, Jeffrey L. Smith, Monica M. Matthieu, Enola K. Proctor, JoAnn E. Kirchner
A23: Structuring complex recommendations: Methods and general findings
Thomas J. Waltz, Byron J. Powell, Matthew J. Chinman, Laura J. Damschroder, Jeffrey L. Smith, Monica M. Matthieu, Enola K. Proctor, JoAnn E. Kirchner
A24: Implementing prolonged exposure for post-traumatic stress disorder in the Department of Veterans Affairs: Expert recommendations from the Expert Recommendations for Implementing Change (ERIC) project
Monica M. Matthieu, Craig S. Rosen, Thomas J. Waltz, Byron J. Powell, Matthew J. Chinman, Laura J. Damschroder, Jeffrey L. Smith, Enola K. Proctor, JoAnn E. Kirchner
A25: When readiness is a luxury: Co-designing a risk assessment and quality assurance process with violence prevention frontline workers in Seattle, WA
Sarah C. Walker, Asia S. Bishop, Mariko Lockhart
A26: Implementation potential of structured recidivism risk assessments with justice-involved veterans: Qualitative perspectives from providers
Allison L. Rodriguez, Luisa Manfredi, Andrea Nevedal, Joel Rosenthal, Daniel M. Blonigen
A27: Developing empirically informed readiness measures for providers and agencies for the Family Check-Up using a mixed methods approach
Anne M. Mauricio, Thomas D. Dishion, Jenna Rudo-Stern, Justin D. Smith
A28: Pebbles, rocks, and boulders: The implementation of a school-based social engagement intervention for children with autism
Jill Locke, Courtney Benjamin Wolk, Colleen Harker, Anne Olsen, Travis Shingledecker, Frances Barg, David Mandell, Rinad S. Beidas
A29: Problem Solving Teletherapy (PST.Net): A stakeholder analysis examining the feasibility and acceptability of teletherapy in community based aging services
Marissa C. Hansen, Maria P. Aranda, Isabel Torres-Vigil
A30: A case of collaborative intervention design eventuating in behavior therapy sustainment and diffusion
Bryan Hartzler
A31: Implementation of suicide risk prevention in an integrated delivery system: Mental health specialty services
Bradley Steinfeld, Tory Gildred, Zandrea Harlin, Fredric Shephard
A32: Implementation team, checklist, evaluation, and feedback (ICED): A step-by-step approach to Dialectical Behavior Therapy program implementation
Matthew S. Ditty, Andrea Doyle, John A. Bickel III, Katharine Cristaudo
A33: The challenges in implementing multiple evidence-based practices in a community mental health setting
Dan Fox, Sonia Combs
A34: Using electronic health record technology to promote and support evidence-based practice assessment and treatment intervention
David H. Lischner
A35: Are existing frameworks adequate for measuring implementation outcomes? Results from a new simulation methodology
Richard A. Van Dorn, Stephen J. Tueller, Jesse M. Hinde, Georgia T. Karuntzos
A36: Taking global local: Evaluating training of Washington State clinicians in a modularized cognitive behavioral therapy approach designed for low-resource settings
Maria Monroe-DeVita, Roselyn Peterson, Doyanne Darnell, Lucy Berliner, Shannon Dorsey, Laura K. Murray
A37: Attitudes toward evidence-based practices across therapeutic orientations
Yevgeny Botanov, Beverly Kikuta, Tianying Chen, Marivi Navarro-Haro, Anthony DuBose, Kathryn E. Korslund, Marsha M. Linehan
A38: Predicting the use of an evidence-based intervention for autism in birth-to-three programs
Colleen M. Harker, Elizabeth A. Karp, Sarah R. Edmunds, Lisa V. Ibañez, Wendy L. Stone
A39: Supervision practices and improved fidelity across evidence-based practices: A literature review
Mimi Choy-Brown
A40: Beyond symptom tracking: clinician perceptions of a hybrid measurement feedback system for monitoring treatment fidelity and client progress
Jack H. Andrews, Benjamin D. Johnides, Estee M. Hausman, Kristin M. Hawley
A41: A guideline decision support tool: From creation to implementation
Beth Prusaczyk, Alex Ramsey, Ana Baumann, Graham Colditz, Enola K. Proctor
A42: Dabblers, bedazzlers, or total makeovers: Clinician modification of a common elements cognitive behavioral therapy approach
Rosemary D. Meza, Shannon Dorsey, Shannon Wiltsey-Stirman, Georganna Sedlar, Leah Lucid
A43: Characterization of context and its role in implementation: The impact of structure, infrastructure, and metastructure
Caitlin Dorsey, Brigid Marriott, Nelson Zounlome, Cara Lewis
A44: Effects of consultation method on implementation of cognitive processing therapy for post-traumatic stress disorder
Cassidy A. Gutner, Candice M. Monson, Norman Shields, Marta Mastlej, Meredith SH Landy, Jeanine Lane, Shannon Wiltsey Stirman
A45: Cross-validation of the Implementation Leadership Scale factor structure in child welfare service organizations
Natalie K. Finn, Elisa M. Torres, Mark G. Ehrhart, Gregory A. Aarons
A46: Sustainability of integrated smoking cessation care in Veterans Affairs posttraumatic stress disorder clinics: A qualitative analysis of focus group data from learning collaborative participants
Carol A. Malte, Aline Lott, Andrew J. Saxon
A47: Key characteristics of effective mental health trainers: The creation of the Measure of Effective Attributes of Trainers (MEAT)
Meredith Boyd, Kelli Scott, Cara C. Lewis
A48: Coaching to improve teacher implementation of evidence-based practices (EBPs)
Jennifer D. Pierce
A49: Factors influencing the implementation of peer-led health promotion programs targeting seniors: A literature review
Agathe Lorthios-Guilledroit, Lucie Richard, Johanne Filiatrault
A50: Developing treatment fidelity rating systems for psychotherapy research: Recommendations and lessons learned
Kevin Hallgren, Shirley Crotwell, Rosa Muñoz, Becky Gius, Benjamin Ladd, Barbara McCrady, Elizabeth Epstein
A51: Rapid translation of alcohol prevention science
John D. Clapp, Danielle E. Ruderman
A52: Factors implicated in successful implementation: evidence to inform improved implementation from high and low-income countries
Melanie Barwick, Raluca Barac, Stanley Zlotkin, Laila Salim, Marnie Davidson
A53: Tracking implementation strategies prospectively: A practical approach
Alicia C. Bunger, Byron J. Powell, Hillary A. Robertson
A54: Trained but not implementing: the need for effective implementation planning tools
Christopher Botsko
A55: Evidence, context, and facilitation variables related to implementation of Dialectical Behavior Therapy: Qualitative results from a mixed methods inquiry in the Department of Veterans Affairs
Sara J. Landes, Brandy N. Smith, Allison L. Rodriguez, Lindsay R. Trent, Monica M. Matthieu
A56: Learning from implementation as usual in children’s mental health
Byron J. Powell, Enola K. Proctor
A57: Rates and predictors of implementation after Dialectical Behavior Therapy Intensive Training
Melanie S. Harned, Marivi Navarro-Haro, Kathryn E. Korslund, Tianying Chen, Anthony DuBose, André Ivanoff, Marsha M. Linehan
A58: Socio-contextual determinants of research evidence use in public-youth systems of care
Antonio R. Garcia, Minseop Kim, Lawrence A. Palinkas, Lonnie Snowden, John Landsverk
A59: Community resource mapping to integrate evidence-based depression treatment in primary care in Brazil: A pilot project
Annika C. Sweetland, Maria Jose Fernandes, Edilson Santos, Cristiane Duarte, Afrânio Kritski, Noa Krawczyk, Caitlin Nelligan, Milton L. Wainberg
A60: The use of concept mapping to efficiently identify determinants of implementation in the National Institutes of Health–President’s Emergency Plan for AIDS Relief Prevention of Mother to Child HIV Transmission Implementation Science Alliance
Gregory A. Aarons, David H. Sommerfeld, Benjamin Chi, Echezona Ezeanolue, Rachel Sturke, Lydia Kline, Laura Guay, George Siberry
A61: Longitudinal remote consultation for implementing collaborative care for depression
Ian M. Bennett, Rinad Beidas, Rachel Gold, Johnny Mao, Diane Powers, Mindy Vredevoogd, Jurgen Unutzer
A62: Integrating a peer coach model to support program implementation and ensure long-term sustainability of the Incredible Years in community-based settings
Jennifer Schroeder, Lane Volpe, Julie Steffen
A63: Efficient sustainability: Existing community based supervisors as evidence-based treatment supports
Shannon Dorsey, Michael D Pullmann, Suzanne E. U. Kerns, Nathaniel Jungbluth, Lucy Berliner, Kelly Thompson, Eliza Segell
A64: Establishment of a national practice-based implementation network to accelerate adoption of evidence-based and best practices
Pearl McGee-Vincent, Nancy Liu, Robyn Walser, Jennifer Runnals, R. Keith Shaw, Sara J. Landes, Craig Rosen, Janet Schmidt, Patrick Calhoun
A65: Facilitation as a mechanism of implementation in a practice-based implementation network: Improving care in a Department of Veterans Affairs post-traumatic stress disorder outpatient clinic
Ruth L. Varkovitzky, Sara J. Landes
A66: The ACT SMART Toolkit: An implementation strategy for community-based organizations providing services to children with autism spectrum disorder
Amy Drahota, Jonathan I. Martinez, Brigitte Brikho, Rosemary Meza, Aubyn C. Stahmer, Gregory A. Aarons
A67: Supporting Policy In Health with Research: An intervention trial (SPIRIT) - protocol and early findings
Anna Williamson
A68: From evidence based practice initiatives to infrastructure: Lessons learned from a public behavioral health system’s efforts to promote evidence based practices
Ronnie M. Rubin, Byron J. Powell, Matthew O. Hurford, Shawna L. Weaver, Rinad S. Beidas, David S. Mandell, Arthur C. Evans
A69: Applying the policy ecology model to Philadelphia’s behavioral health transformation efforts
Byron J. Powell, Rinad S. Beidas, Ronnie M. Rubin, Rebecca E. Stewart, Courtney Benjamin Wolk, Samantha L. Matlin, Shawna Weaver, Matthew O. Hurford, Arthur C. Evans, Trevor R. Hadley, David S. Mandell
A70: A model for providing methodological expertise to advance dissemination and implementation of health discoveries in Clinical and Translational Science Award institutions
Donald R. Gerke, Beth Prusaczyk, Ana Baumann, Ericka M. Lewis, Enola K. Proctor
A71: Establishing a research agenda for the Triple P Implementation Framework
Jenna McWilliam, Jacquie Brown, Michelle Tucker
A72: Cheap and fast, but what is “best?”: Examining implementation outcomes across sites in a state-wide scaled-up evidence-based walking program, Walk With Ease
Kathleen P Conte
A73: Measurement feedback systems in mental health: Initial review of capabilities and characteristics
Aaron R. Lyon, Meredith Boyd, Abigail Melvin, Cara C. Lewis, Freda Liu, Nathaniel Jungbluth
A74: A qualitative investigation of case managers’ attitudes toward implementation of a measurement feedback system in a public mental health system for youth
Amelia Kotte, Kaitlin A. Hill, Albert C. Mah, Priya A. Korathu-Larson, Janelle R. Au, Sonia Izmirian, Scott Keir, Brad J. Nakamura, Charmaine K. Higa-McMillan
A75: Multiple pathways to sustainability: Using Qualitative Comparative Analysis to uncover the necessary and sufficient conditions for successful community-based implementation
Brittany Rhoades Cooper, Angie Funaiole, Eleanor Dizon
A76: Prescribers’ perspectives on opioids and benzodiazepines and medication alerts to reduce co-prescribing of these medications
Eric J. Hawkins, Carol A. Malte, Hildi J. Hagedorn, Douglas Berger, Anissa Frank, Aline Lott, Carol E. Achtmeyer, Anthony J. Mariano, Andrew J. Saxon
A77: Adaptation of Coordinated Anxiety Learning and Management for comorbid anxiety and substance use disorders: Delivery of evidence-based treatment for anxiety in addictions treatment centers
Kate Wolitzky-Taylor, Richard Rawson, Richard Ries, Peter Roy-Byrne, Michelle Craske
A78: Opportunities and challenges of measuring program implementation with online surveys
Dena Simmons, Catalina Torrente, Lori Nathanson, Grace Carroll
A79: Observational assessment of fidelity to a family-centered prevention program: Effectiveness and efficiency
Justin D. Smith, Kimbree Brown, Karina Ramos, Nicole Thornton, Thomas J. Dishion, Elizabeth A. Stormshak, Daniel S. Shaw, Melvin N. Wilson
A80: Strategies and challenges in housing first fidelity: A multistate qualitative analysis
Mimi Choy-Brown, Emmy Tiderington, Bikki Tran Smith, Deborah K. Padgett
A81: Procurement and contracting as an implementation strategy: Getting To Outcomes® contracting
Ronnie M. Rubin, Marilyn L. Ray, Abraham Wandersman, Andrea Lamont, Gordon Hannah, Kassandra A. Alia, Matthew O. Hurford, Arthur C. Evans
A82: Web-based feedback to aid successful implementation: The interactive Stages of Implementation Completion (SIC)TM tool
Lisa Saldana, Holle Schaper, Mark Campbell, Patricia Chamberlain
A83: Efficient methodologies for monitoring fidelity in routine implementation: Lessons from the Allentown Social Emotional Learning Initiative
Valerie B. Shapiro, B.K. Elizabeth Kim, Jennifer L. Fleming, Paul A. LeBuffe
A84: The Society for Implementation Research Collaboration (SIRC) implementation development workshop: Results from a new methodology for enhancing implementation science proposals
Sara J. Landes, Cara C. Lewis, Allison L. Rodriguez, Brigid R. Marriott, Katherine Anne Comtois
A85: An update on the Society for Implementation Research Collaboration (SIRC) Instrument Review Project