Intentional research design in implementation science: implications for the use of nomothetic and idiographic assessment
The advancement of implementation science depends on identifying assessment strategies that can address implementation and clinical outcome variables in ways that are valid, relevant to stakeholders, and scalable. This paper presents a measurement agenda for implementation science that integrates the previously disparate assessment traditions of idiographic and nomothetic approaches. Although both approaches are used in implementation science, a review of the literature on this topic suggests that their selection can be indiscriminate, driven by convenience, and not explicitly tied to research study design. As a result, they are not typically combined deliberately or effectively. Thoughtful integration may simultaneously enhance both the rigor and relevance of assessments across multiple levels within health service systems. Background on nomothetic and idiographic assessment is provided, along with their potential to support research in implementation science. Drawing from an existing framework, seven structures (with various sequencing and weighting options) and five functions (Convergence, Complementarity, Expansion, Development, Sampling) for integrating conceptually distinct research methods are articulated as they apply to the deliberate, design-driven integration of nomothetic and idiographic assessment approaches. Specific examples and practical guidance are provided to inform research consistent with this framework. Selection and integration of idiographic and nomothetic assessments for implementation science research designs can be improved; the current paper argues for the deliberate application of a clear framework to improve the rigor and relevance of contemporary assessment strategies.
An updated protocol for a systematic review of implementation-related measures
Background: Implementation science is the study of strategies used to integrate evidence-based practices into real-world settings (Eccles and Mittman, Implement Sci. 1(1):1, 2006). Central to the identification of replicable, feasible, and effective implementation strategies is the ability to assess the impact of contextual constructs and intervention characteristics that may influence implementation, but several measurement issues make this work difficult. For instance, it is unclear which constructs lack measures altogether and which measures have any evidence of psychometric properties such as reliability and validity. As part of a larger set of studies to advance implementation science measurement (Lewis et al., Implement Sci. 10:102, 2015), we will complete systematic reviews of measures that map onto the Consolidated Framework for Implementation Research (Damschroder et al., Implement Sci. 4:50, 2009) and the Implementation Outcomes Framework (Proctor et al., Adm Policy Ment Health. 38(2):65-76, 2011); this manuscript describes the protocol for those reviews.
Methods: Our primary databases will be PubMed and Embase. Our search strings will comprise five levels: (1) the outcome or construct term; (2) terms for measure; (3) terms for evidence-based practice; (4) terms for implementation; and (5) terms for mental health. Two trained research specialists will independently review all titles and abstracts, followed by full-text review for inclusion. The research specialists will then conduct measure-forward searches using the "cited by" function to identify all published empirical studies using each measure. Each measure and its associated publications will be compiled in a packet for data extraction. Data relevant to our Psychometric and Pragmatic Evidence Rating Scale (PAPERS) will be independently extracted and then rated using a worst-score-counts methodology reflecting "poor" to "excellent" evidence.
Discussion: We will build a centralized, accessible, searchable repository through which researchers, practitioners, and other stakeholders can identify psychometrically and pragmatically strong measures of implementation contexts, processes, and outcomes. By facilitating the use of psychometrically and pragmatically strong measures identified through this systematic review, the repository would enhance the cumulativeness, reproducibility, and applicability of research findings in the rapidly growing field of implementation science.
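To make the search-string design and the rating rule concrete, here is a minimal sketch in Python. The term lists are hypothetical placeholders rather than the protocol's actual vocabulary, and the worst-score-counts aggregation is shown under the assumption that each piece of evidence receives an ordinal rating from "poor" to "excellent" and the lowest rating determines the overall score.

```python
# Illustrative sketch of the five-level search-string design and the
# "worst score counts" aggregation described in the protocol.
# All term lists and ratings below are hypothetical placeholders.

# Synonyms within a level are OR'ed; the five levels are AND'ed together.
levels = {
    "construct": ["acceptability", "feasibility"],
    "measure": ["measure", "instrument", "scale", "survey"],
    "evidence-based practice": ["evidence-based practice", "EBP"],
    "implementation": ["implementation", "dissemination"],
    "mental health": ["mental health", "behavioral health"],
}

def build_search_string(levels: dict) -> str:
    """Combine OR'ed synonym groups with AND across the five levels."""
    groups = ["(" + " OR ".join(f'"{t}"' for t in terms) + ")"
              for terms in levels.values()]
    return " AND ".join(groups)

# Worst score counts: the overall rating is the lowest rating observed
# across the available evidence for a measure.
RATING_ORDER = ["poor", "fair", "good", "excellent"]

def worst_score(ratings: list) -> str:
    return min(ratings, key=RATING_ORDER.index)

if __name__ == "__main__":
    print(build_search_string(levels))
    print(worst_score(["good", "excellent", "poor"]))  # -> "poor"
```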
Proceedings of the 3rd Biennial Conference of the Society for Implementation Research Collaboration (SIRC) 2015: advancing efficient methodologies through community partnerships and team science: Seattle, WA, USA, 24-26 September 2015.
Introduction to the 3rd Biennial Conference of the Society for Implementation Research Collaboration: advancing efficient methodologies through team science and community partnerships Cara Lewis, Doyanne Darnell, Suzanne Kerns, Maria Monroe-DeVita, Sara J. Landes, Aaron R. Lyon, Cameo Stanick, Shannon Dorsey, Jill Locke, Brigid Marriott, Ajeng Puspitasari, Caitlin Dorsey, Karin Hendricks, Andria Pierson, Phil Fizur, Katherine A. Comtois
A1: A behavioral economic perspective on adoption, implementation, and sustainment of evidence-based interventions Lawrence A. Palinkas
A2: Towards making scale up of evidence-based practices in child welfare systems more efficient and affordable Patricia Chamberlain
A3: Mixed method examination of strategic leadership for evidence-based practice implementation Gregory A. Aarons, Amy E. Green, Mark G. Ehrhart, Elise M. Trott, Cathleen E. Willging
A4: Implementing practice change in Federally Qualified Health Centers: Learning from leaders' experiences Maria E. Fernandez, Nicholas H. Woolf, Shuting (Lily) Liang, Natalia I. Heredia, Michelle Kegler, Betsy Risendal, Andrea Dwyer, Vicki Young, Dayna Campbell, Michelle Carvalho, Yvonne Kellar-Guenther
A5: Efficient synthesis: Using qualitative comparative analysis and the Consolidated Framework for Implementation Research across diverse studies Laura J. Damschroder, Julie C. Lowery
A6: Establishing a veterans engagement group to empower patients and inform Veterans Affairs (VA) health services research Sarah S. Ono, Kathleen F. Carlson, Erika K. Cottrell, Maya E. O'Neil, Travis L. Lovejoy
A7: Building patient-practitioner partnerships in community oncology settings to implement behavioral interventions for anxious and depressed cancer survivors Joanna J. Arch, Jill L. Mitchell
A8: Tailoring a Cognitive Behavioral Therapy implementation protocol using mixed methods, conjoint analysis, and implementation teams Cara C. Lewis, Brigid R. Marriott, Kelli Scott
A9: Wraparound Structured Assessment and Review (WrapSTAR): An efficient, yet comprehensive approach to Wraparound implementation evaluation Jennifer Schurer Coldiron, Eric J. Bruns, Alyssa N. Hook
A10: Improving the efficiency of standardized patient assessment of clinician fidelity: A comparison of automated actor-based and manual clinician-based ratings Benjamin C. Graham, Katelin Jordan
A11: Measuring fidelity on the cheap Rochelle F. Hanson, Angela Moreland, Benjamin E. Saunders, Heidi S. Resnick
A12: Leveraging routine clinical materials to assess fidelity to an evidence-based psychotherapy Shannon Wiltsey Stirman, Cassidy A. Gutner, Jennifer Gamarra, Dawne Vogt, Michael Suvak, Jennifer Schuster Wachen, Katherine Dondanville, Jeffrey S. Yarvis, Jim Mintz, Alan L. Peterson, Elisa V. Borah, Brett T. Litz, Alma Molino, Stacey Young McCaughan, Patricia A. Resick
A13: The video vignette survey: An efficient process for gathering diverse community opinions to inform an intervention Nancy Pandhi, Nora Jacobson, Neftali Serrano, Armando Hernandez, Elizabeth Zeidler-Schreiter, Natalie Wietfeldt, Zaher Karp
A14: Using integrated administrative data to evaluate implementation of a behavioral health and trauma screening for children and youth in foster care Michael D. Pullmann, Barbara Lucenko, Bridget Pavelle, Jacqueline A. Uomoto, Andrea Negrete, Molly Cevasco, Suzanne E. U. Kerns
A15: Intermediary organizations as a vehicle to promote efficiency and speed of implementation Robert P. Franks, Christopher Bory
A16: Applying the Consolidated Framework for Implementation Research constructs directly to qualitative data: The power of implementation science in action Edward J. Miech, Teresa M. Damush
A17: Efficient and effective scaling-up, screening, brief interventions, and referrals to treatment (SBIRT) training: a snowball implementation model Jason Satterfield, Derek Satre, Maria Wamsley, Patrick Yuan, Patricia O'Sullivan
A18: Matching models of implementation to system needs and capacities: addressing the human factor Helen Best, Susan Velasquez
A19: Agency characteristics that facilitate efficient and successful implementation efforts Miya Barnett, Lauren Brookman-Frazee, Jennifer Regan, Nicole Stadnick, Alison Hamilton, Anna Lau
A20: Rapid assessment process: Application to the Prevention and Early Intervention transformation in Los Angeles County Jennifer Regan, Alison Hamilton, Nicole Stadnick, Miya Barnett, Anna Lau, Lauren Brookman-Frazee
A21: The development of the Evidence-Based Practice-Concordant Care Assessment: An assessment tool to examine treatment strategies across practices Nicole Stadnick, Anna Lau, Miya Barnett, Jennifer Regan, Scott Roesch, Lauren Brookman-Frazee
A22: Refining a compilation of discrete implementation strategies and determining their importance and feasibility Byron J. Powell, Thomas J. Waltz, Matthew J. Chinman, Laura Damschroder, Jeffrey L. Smith, Monica M. Matthieu, Enola K. Proctor, JoAnn E. Kirchner
A23: Structuring complex recommendations: Methods and general findings Thomas J. Waltz, Byron J. Powell, Matthew J. Chinman, Laura J. Damschroder, Jeffrey L. Smith, Monica J. Matthieu, Enola K. Proctor, JoAnn E. Kirchner
A24: Implementing prolonged exposure for post-traumatic stress disorder in the Department of Veterans Affairs: Expert recommendations from the Expert Recommendations for Implementing Change (ERIC) project Monica M. Matthieu, Craig S. Rosen, Thomas J. Waltz, Byron J. Powell, Matthew J. Chinman, Laura J. Damschroder, Jeffrey L. Smith, Enola K. Proctor, JoAnn E. Kirchner
A25: When readiness is a luxury: Co-designing a risk assessment and quality assurance process with violence prevention frontline workers in Seattle, WA Sarah C. Walker, Asia S. Bishop, Mariko Lockhart
A26: Implementation potential of structured recidivism risk assessments with justice-involved veterans: Qualitative perspectives from providers Allison L. Rodriguez, Luisa Manfredi, Andrea Nevedal, Joel Rosenthal, Daniel M. Blonigen
A27: Developing empirically informed readiness measures for providers and agencies for the Family Check-Up using a mixed methods approach Anne M. Mauricio, Thomas D. Dishion, Jenna Rudo-Stern, Justin D. Smith
A28: Pebbles, rocks, and boulders: The implementation of a school-based social engagement intervention for children with autism Jill Locke, Courtney Benjamin Wolk, Colleen Harker, Anne Olsen, Travis Shingledecker, Frances Barg, David Mandell, Rinad S. Beidas
A29: Problem Solving Teletherapy (PST.Net): A stakeholder analysis examining the feasibility and acceptability of teletherapy in community based aging services Marissa C. Hansen, Maria P. Aranda, Isabel Torres-Vigil
A30: A case of collaborative intervention design eventuating in behavior therapy sustainment and diffusion Bryan Hartzler
A31: Implementation of suicide risk prevention in an integrated delivery system: Mental health specialty services Bradley Steinfeld, Tory Gildred, Zandrea Harlin, Fredric Shephard
A32: Implementation team, checklist, evaluation, and feedback (ICED): A step-by-step approach to Dialectical Behavior Therapy program implementation Matthew S. Ditty, Andrea Doyle, John A. Bickel III, Katharine Cristaudo
A33: The challenges in implementing multiple evidence-based practices in a community mental health setting Dan Fox, Sonia Combs
A34: Using electronic health record technology to promote and support evidence-based practice assessment and treatment intervention David H. Lischner
A35: Are existing frameworks adequate for measuring implementation outcomes? Results from a new simulation methodology Richard A. Van Dorn, Stephen J. Tueller, Jesse M. Hinde, Georgia T. Karuntzos
A36: Taking global local: Evaluating training of Washington State clinicians in a modularized cognitive behavioral therapy approach designed for low-resource settings Maria Monroe-DeVita, Roselyn Peterson, Doyanne Darnell, Lucy Berliner, Shannon Dorsey, Laura K. Murray
A37: Attitudes toward evidence-based practices across therapeutic orientations Yevgeny Botanov, Beverly Kikuta, Tianying Chen, Marivi Navarro-Haro, Anthony DuBose, Kathryn E. Korslund, Marsha M. Linehan
A38: Predicting the use of an evidence-based intervention for autism in birth-to-three programs Colleen M. Harker, Elizabeth A. Karp, Sarah R. Edmunds, Lisa V. Ibañez, Wendy L. Stone
A39: Supervision practices and improved fidelity across evidence-based practices: A literature review Mimi Choy-Brown
A40: Beyond symptom tracking: clinician perceptions of a hybrid measurement feedback system for monitoring treatment fidelity and client progress Jack H. Andrews, Benjamin D. Johnides, Estee M. Hausman, Kristin M. Hawley
A41: A guideline decision support tool: From creation to implementation Beth Prusaczyk, Alex Ramsey, Ana Baumann, Graham Colditz, Enola K. Proctor
A42: Dabblers, bedazzlers, or total makeovers: Clinician modification of a common elements cognitive behavioral therapy approach Rosemary D. Meza, Shannon Dorsey, Shannon Wiltsey-Stirman, Georganna Sedlar, Leah Lucid
A43: Characterization of context and its role in implementation: The impact of structure, infrastructure, and metastructure Caitlin Dorsey, Brigid Marriott, Nelson Zounlome, Cara Lewis
A44: Effects of consultation method on implementation of cognitive processing therapy for post-traumatic stress disorder Cassidy A. Gutner, Candice M. Monson, Norman Shields, Marta Mastlej, Meredith SH Landy, Jeanine Lane, Shannon Wiltsey Stirman
A45: Cross-validation of the Implementation Leadership Scale factor structure in child welfare service organizations Natalie K. Finn, Elisa M. Torres, Mark G. Ehrhart, Gregory A. Aarons
A46: Sustainability of integrated smoking cessation care in Veterans Affairs posttraumatic stress disorder clinics: A qualitative analysis of focus group data from learning collaborative participants Carol A. Malte, Aline Lott, Andrew J. Saxon
A47: Key characteristics of effective mental health trainers: The creation of the Measure of Effective Attributes of Trainers (MEAT) Meredith Boyd, Kelli Scott, Cara C. Lewis
A48: Coaching to improve teacher implementation of evidence-based practices (EBPs) Jennifer D. Pierce
A49: Factors influencing the implementation of peer-led health promotion programs targeting seniors: A literature review Agathe Lorthios-Guilledroit, Lucie Richard, Johanne Filiatrault
A50: Developing treatment fidelity rating systems for psychotherapy research: Recommendations and lessons learned Kevin Hallgren, Shirley Crotwell, Rosa Muñoz, Becky Gius, Benjamin Ladd, Barbara McCrady, Elizabeth Epstein
A51: Rapid translation of alcohol prevention science John D. Clapp, Danielle E. Ruderman
A52: Factors implicated in successful implementation: evidence to inform improved implementation from high and low-income countries Melanie Barwick, Raluca Barac, Stanley Zlotkin, Laila Salim, Marnie Davidson
A53: Tracking implementation strategies prospectively: A practical approach Alicia C. Bunger, Byron J. Powell, Hillary A. Robertson
A54: Trained but not implementing: the need for effective implementation planning tools Christopher Botsko
A55: Evidence, context, and facilitation variables related to implementation of Dialectical Behavior Therapy: Qualitative results from a mixed methods inquiry in the Department of Veterans Affairs Sara J. Landes, Brandy N. Smith, Allison L. Rodriguez, Lindsay R. Trent, Monica M. Matthieu
A56: Learning from implementation as usual in children's mental health Byron J. Powell, Enola K. Proctor
A57: Rates and predictors of implementation after Dialectical Behavior Therapy Intensive Training Melanie S. Harned, Marivi Navarro-Haro, Kathryn E. Korslund, Tianying Chen, Anthony DuBose, André Ivanoff, Marsha M. Linehan
A58: Socio-contextual determinants of research evidence use in public-youth systems of care Antonio R. Garcia, Minseop Kim, Lawrence A. Palinkas, Lonnie Snowden, John Landsverk
A59: Community resource mapping to integrate evidence-based depression treatment in primary care in Brazil: A pilot project Annika C. Sweetland, Maria Jose Fernandes, Edilson Santos, Cristiane Duarte, Afrânio Kritski, Noa Krawczyk, Caitlin Nelligan, Milton L. Wainberg
A60: The use of concept mapping to efficiently identify determinants of implementation in the National Institutes of Health--President's Emergency Plan for AIDS Relief Prevention of Mother to Child HIV Transmission Implementation Science Alliance Gregory A. Aarons, David H. Sommerfeld, Benjamin Chi, Echezona Ezeanolue, Rachel Sturke, Lydia Kline, Laura Guay, George Siberry
A61: Longitudinal remote consultation for implementing collaborative care for depression Ian M. Bennett, Rinad Beidas, Rachel Gold, Johnny Mao, Diane Powers, Mindy Vredevoogd, Jurgen Unutzer
A62: Integrating a peer coach model to support program implementation and ensure long-term sustainability of the Incredible Years in community-based settings Jennifer Schroeder, Lane Volpe, Julie Steffen
A63: Efficient sustainability: Existing community based supervisors as evidence-based treatment supports Shannon Dorsey, Michael D. Pullmann, Suzanne E. U. Kerns, Nathaniel Jungbluth, Lucy Berliner, Kelly Thompson, Eliza Segell
A64: Establishment of a national practice-based implementation network to accelerate adoption of evidence-based and best practices Pearl McGee-Vincent, Nancy Liu, Robyn Walser, Jennifer Runnals, R. Keith Shaw, Sara J. Landes, Craig Rosen, Janet Schmidt, Patrick Calhoun
A65: Facilitation as a mechanism of implementation in a practice-based implementation network: Improving care in a Department of Veterans Affairs post-traumatic stress disorder outpatient clinic Ruth L. Varkovitzky, Sara J. Landes
A66: The ACT SMART Toolkit: An implementation strategy for community-based organizations providing services to children with autism spectrum disorder Amy Drahota, Jonathan I. Martinez, Brigitte Brikho, Rosemary Meza, Aubyn C. Stahmer, Gregory A. Aarons
A67: Supporting Policy In Health with Research: An intervention trial (SPIRIT) - protocol and early findings Anna Williamson
A68: From evidence based practice initiatives to infrastructure: Lessons learned from a public behavioral health system's efforts to promote evidence based practices Ronnie M. Rubin, Byron J. Powell, Matthew O. Hurford, Shawna L. Weaver, Rinad S. Beidas, David S. Mandell, Arthur C. Evans
A69: Applying the policy ecology model to Philadelphia's behavioral health transformation efforts Byron J. Powell, Rinad S. Beidas, Ronnie M. Rubin, Rebecca E. Stewart, Courtney Benjamin Wolk, Samantha L. Matlin, Shawna Weaver, Matthew O. Hurford, Arthur C. Evans, Trevor R. Hadley, David S. Mandell
A70: A model for providing methodological expertise to advance dissemination and implementation of health discoveries in Clinical and Translational Science Award institutions Donald R. Gerke, Beth Prusaczyk, Ana Baumann, Ericka M. Lewis, Enola K. Proctor
A71: Establishing a research agenda for the Triple P Implementation Framework Jenna McWilliam, Jacquie Brown, Michelle Tucker
A72: Cheap and fast, but what is "best?": Examining implementation outcomes across sites in a state-wide scaled-up evidence-based walking program, Walk With Ease Kathleen P Conte
A73: Measurement feedback systems in mental health: Initial review of capabilities and characteristics Aaron R. Lyon, Meredith Boyd, Abigail Melvin, Cara C. Lewis, Freda Liu, Nathaniel Jungbluth
A74: A qualitative investigation of case managers' attitudes toward implementation of a measurement feedback system in a public mental health system for youth Amelia Kotte, Kaitlin A. Hill, Albert C. Mah, Priya A. Korathu-Larson, Janelle R. Au, Sonia Izmirian, Scott Keir, Brad J. Nakamura, Charmaine K. Higa-McMillan
A75: Multiple pathways to sustainability: Using Qualitative Comparative Analysis to uncover the necessary and sufficient conditions for successful community-based implementation Brittany Rhoades Cooper, Angie Funaiole, Eleanor Dizon
A76: Prescribers' perspectives on opioids and benzodiazepines and medication alerts to reduce co-prescribing of these medications Eric J. Hawkins, Carol A. Malte, Hildi J. Hagedorn, Douglas Berger, Anissa Frank, Aline Lott, Carol E. Achtmeyer, Anthony J. Mariano, Andrew J. Saxon
A77: Adaptation of Coordinated Anxiety Learning and Management for comorbid anxiety and substance use disorders: Delivery of evidence-based treatment for anxiety in addictions treatment centers Kate Wolitzky-Taylor, Richard Rawson, Richard Ries, Peter Roy-Byrne, Michelle Craske
A78: Opportunities and challenges of measuring program implementation with online surveys Dena Simmons, Catalina Torrente, Lori Nathanson, Grace Carroll
A79: Observational assessment of fidelity to a family-centered prevention program: Effectiveness and efficiency Justin D. Smith, Kimbree Brown, Karina Ramos, Nicole Thornton, Thomas J. Dishion, Elizabeth A. Stormshak, Daniel S. Shaw, Melvin N. Wilson
A80: Strategies and challenges in housing first fidelity: A multistate qualitative analysis Mimi Choy-Brown, Emmy Tiderington, Bikki Tran Smith, Deborah K. Padgett
A81: Procurement and contracting as an implementation strategy: Getting To Outcomes® contracting Ronnie M. Rubin, Marilyn L. Ray, Abraham Wandersman, Andrea Lamont, Gordon Hannah, Kassandra A. Alia, Matthew O. Hurford, Arthur C. Evans
A82: Web-based feedback to aid successful implementation: The interactive Stages of Implementation Completion (SIC)™ tool Lisa Saldana, Holle Schaper, Mark Campbell, Patricia Chamberlain
A83: Efficient methodologies for monitoring fidelity in routine implementation: Lessons from the Allentown Social Emotional Learning Initiative Valerie B. Shapiro, B.K. Elizabeth Kim, Jennifer L. Fleming, Paul A. LeBuffe
A84: The Society for Implementation Research Collaboration (SIRC) implementation development workshop: Results from a new methodology for enhancing implementation science proposals Sara J. Landes, Cara C. Lewis, Allison L. Rodriguez, Brigid R. Marriott, Katherine Anne Comtois
A85: An update on the Society for Implementation Research Collaboration (SIRC) Instrument Review Project
Implementation outcome instruments for use in physical healthcare settings: a systematic review
BACKGROUND: Implementation research aims to facilitate the timely and routine implementation and sustainment of evidence-based interventions and services. A glaring gap in this endeavour is the capability of researchers, healthcare practitioners and managers to quantitatively evaluate implementation efforts using psychometrically sound instruments. To encourage and support the use of precise and accurate implementation outcome measures, this systematic review aimed to identify and appraise studies that assess the measurement properties of quantitative implementation outcome instruments used in physical healthcare settings.
METHOD: The following data sources were searched from inception to March 2019, with no language restrictions: MEDLINE, EMBASE, PsycINFO, HMIC, CINAHL and the Cochrane Library. Studies that evaluated the measurement properties of implementation outcome instruments in physical healthcare settings were eligible for inclusion. Proctor et al.'s taxonomy of implementation outcomes was used to guide the inclusion of implementation outcomes: acceptability, appropriateness, feasibility, adoption, penetration, implementation cost and sustainability. Methodological quality of the included studies was assessed using the COnsensus-based Standards for the selection of health Measurement INstruments (COSMIN) checklist. Psychometric quality of the included instruments was assessed using the Contemporary Psychometrics checklist (ConPsy). Usability was determined by the number of items per instrument.
RESULTS: Fifty-eight publications reporting on the measurement properties of 55 implementation outcome instruments (65 scales) were identified. The majority of instruments assessed acceptability (n = 33), followed by appropriateness (n = 7), adoption (n = 4), feasibility (n = 4), penetration (n = 4) and sustainability (n = 3) of evidence-based practice. The methodological quality of individual scales was low, with few studies rated as 'excellent' for reliability (6/62) and validity (7/63), and both studies that assessed responsiveness rated as 'poor' (2/2). The psychometric quality of the scales was also low, with 12/65 scales scoring 7 or more out of 22, indicating greater psychometric strength. Six scales (6/65) rated as 'excellent' for usability.
CONCLUSION: Investigators assessing implementation outcomes quantitatively should select instruments based on their methodological and psychometric quality to promote consistent and comparable implementation evaluations. Rather than developing ad hoc instruments, we encourage further psychometric testing of instruments with promising methodological and psychometric evidence.
SYSTEMATIC REVIEW REGISTRATION: PROSPERO 2017 CRD42017065348.
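As a rough illustration of how the review's appraisal outputs might be used to screen instruments, the sketch below filters hypothetical instrument records by ConPsy score (7 or more out of 22 read as greater psychometric strength, per the results above) and sorts by item count as a usability proxy. The record fields, threshold constant, and example data are assumptions for illustration, not the review's published dataset.

```python
# Hypothetical sketch: screening instruments with the thresholds reported
# in the review (ConPsy score >= 7/22 indicating greater psychometric
# strength; number of items as a usability proxy). Example data invented.
from dataclasses import dataclass

@dataclass
class InstrumentRecord:
    name: str          # instrument name (hypothetical examples below)
    outcome: str       # implementation outcome assessed
    conpsy_score: int  # ConPsy psychometric quality score, 0-22
    n_items: int       # number of items (usability proxy)

CONPSY_THRESHOLD = 7  # >= 7/22 read as greater psychometric strength

def screen(records: list) -> list:
    """Keep psychometrically stronger instruments, shortest first."""
    strong = [r for r in records if r.conpsy_score >= CONPSY_THRESHOLD]
    return sorted(strong, key=lambda r: r.n_items)

if __name__ == "__main__":
    demo = [
        InstrumentRecord("Acceptability Scale A", "acceptability", 9, 12),
        InstrumentRecord("Feasibility Scale B", "feasibility", 4, 20),
        InstrumentRecord("Adoption Scale C", "adoption", 8, 30),
    ]
    for r in screen(demo):
        print(r.name, r.conpsy_score, r.n_items)
```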
Proceedings of the 3rd Biennial Conference of the Society for Implementation Research Collaboration (SIRC) 2015: advancing efficient methodologies through community partnerships and team science
It is well documented that the majority of adults, children, and families in need of evidence-based behavioral health interventions do not receive them [1, 2] and that few robust, empirically supported methods for implementing evidence-based practices (EBPs) exist. The Society for Implementation Research Collaboration (SIRC) represents a burgeoning effort to advance the innovation and rigor of implementation research and is uniquely focused on bringing together researchers and stakeholders committed to evaluating the implementation of complex evidence-based behavioral health interventions. Through its diverse activities and membership, SIRC aims to foster the promise of implementation research to better serve the behavioral health needs of the population by identifying rigorous, relevant, and efficient strategies that successfully transfer scientific evidence to clinical knowledge for use in real-world settings [3]. SIRC began as a National Institute of Mental Health (NIMH)-funded conference series in 2010 (previously titled the "Seattle Implementation Research Conference"; $150,000 USD for three conferences in 2011, 2013, and 2015), with the recognition that multiple researchers and stakeholders were working in parallel on innovative implementation science projects in behavioral health but that formal channels for communicating and collaborating with one another were relatively unavailable. There was a significant need for a forum within which implementation researchers and stakeholders could learn from one another, refine approaches to science and practice, and develop an implementation research agenda using common measures, methods, and research principles to improve both the frequency and quality with which behavioral health treatment implementation is evaluated. SIRC's membership growth is a testament to this identified need, with more than 1,000 members from 2011 to the present. SIRC's primary objectives are to: (1) foster communication and collaboration across diverse groups, including implementation researchers, intermediaries, and community stakeholders (SIRC uses the term "EBP champions" for these groups), and to do so across multiple career levels (e.g., students, early career faculty, established investigators); and (2) enhance and disseminate rigorous measures and methodologies for implementing EBPs and evaluating EBP implementation efforts. These objectives are well aligned with Glasgow and colleagues' [4] five core tenets deemed critical for advancing implementation science: collaboration, efficiency and speed, rigor and relevance, improved capacity, and cumulative knowledge. SIRC advances these objectives and tenets through in-person conferences, which bring together multidisciplinary implementation researchers and those implementing evidence-based behavioral health interventions in the community to share their work and create professional connections and collaborations.
Reactivity of the monoclonal antibody B72.3 with fetal antigen: correlation with expression of TAG-72 in human carcinomas.
The monoclonal antibody (MAb) B72.3 recognizes a mucin-like glycoprotein, TAG-72, which has been detected in a spectrum of human carcinomas but not in their normal tissue counterparts. Using avidin-biotin-peroxidase complex (ABC) immunohistochemical techniques, MAb B72.3 was reacted with formalin-fixed, paraffin-embedded fetal and pediatric tissue sections to determine the extent of expression of the recognized antigen in these tissues. First-trimester fetal tissues failed to express detectable antigen. Gastrointestinal epithelia from 13 to 34 weeks' gestation demonstrated the most immunoreactivity with B72.3, although bronchial respiratory epithelium of the lung, transitional epithelium from the kidney, Hassall's corpuscles of the thymus, and gonadal tissues from fetuses of both sexes were also reactive. The TAG-72 antigen was not detected in fetal breast, pancreas, liver, spleen, adrenal, or heart. Expression of the TAG-72 antigen in malignancy appears to correlate well with fetal tissue reactivity with B72.3.
What Gets Measured Gets Done: How Mental Health Agencies can Leverage Measurement-Based Care for Better Patient Care, Clinician Supports, and Organizational Goals
Mental health clinicians and administrators are increasingly asked to collect and report treatment outcome data despite numerous challenges in selecting and using instruments in routine practice. Measurement-based care (MBC) is an evidence-based practice for improving patient care. We propose that data collected through MBC processes with patients can be strategically leveraged by agencies to also support clinicians and respond to accountability requirements. MBC data elements are outlined using the Precision Mental Health Framework (Bickman et al., Adm Policy Ment Health 43:271-276, 2016), practical guidance is provided for agency administrators, and conceptual examples illustrate strategic applications of one or more instruments to meet various needs throughout the organization.