23 research outputs found

    TREATING TRAUMA WITHIN RURAL SCHOOLS: AN IMPLEMENTATION SCIENCE PERSPECTIVE

    High rates of childhood trauma exposure (65-75%) are concerning given the negative outcomes associated with trauma-related symptoms. Numerous evidence-based practices (EBPs) have been developed to treat posttraumatic stress symptoms; however, schools often experience barriers to implementing these interventions with fidelity. Given the scarcity of service options within rural areas, this qualitative study explored factors that might influence the adoption and implementation of trauma-focused interventions within rural schools using the Consolidated Framework for Implementation Research (CFIR; Damschroder et al., 2009) and the Implementation Outcomes Framework (IOF; Proctor et al., 2011). A semi-structured protocol was used to interview clinicians working in rural schools (N = 12) about their use of trauma-focused interventions. Transcripts were double coded using a deductive content analysis approach and a CFIR- and IOF-based coding manual. Every participant reported adopting a mental health intervention to treat posttraumatic stress symptoms, though only 25% had adopted an EBP to treat trauma-related symptoms. One participant worked in a school that declined an opportunity to adopt trauma-informed care. Thematic analyses revealed that most participants reported the same IOF constructs (i.e., acceptability, appropriateness, feasibility) as both facilitators and barriers to adopting trauma-informed interventions. Implementation constructs across all CFIR domains (i.e., intervention characteristics, outer setting, inner setting, characteristics of individuals, process) were commonly identified as influencing implementation success within rural schools. These results can inform the selection of implementation strategies to enhance the adoption and implementation of trauma-focused EBPs within schools, thereby increasing the accessibility of trauma-focused care in rural areas.

    PSYX 100S.50: Introduction to Psychology

    PSYX 100S.50: Introduction to Psychology - Online

    Examining the Feasibility of a Rural School-Family Initiative

    Increased prevalence of child psychological difficulties demonstrates a need for feasible mental health interventions that are available to children and families. Previous research shows evidence for the effectiveness of family-focused, school-based mental health programs in addressing child academic and behavioral problems. However, various barriers exist that prevent such programs from being implemented with fidelity: ability to identify high-risk children and families; school staff and caregiver attitudes, motivation, and satisfaction regarding use of the program; and program costs. The current study examined the feasibility of a rural school-family initiative that contained aspects of the Positive Family Support (PFS) program, including an examination of the previously listed implementation barriers. Participants included administrators, mental health support staff, teachers, and caregivers (e.g., parents) who were involved in implementation of PFS in a public middle school. Participants completed measures developed to assess attitudes, motivation, and satisfaction regarding use of PFS. Additionally, participants who were willing to complete a follow-up interview were asked specific questions regarding their involvement in and perceptions of PFS. Results suggested that the PFS program was not feasible within the target school setting, though the school was able to use aspects of the PFS program to develop a school-family initiative that appeared to positively impact participants’ perceptions of school-family partnerships. Results showed that participants held attitudes that complement PFS program goals, as well as generally positive perceptions that the school was able and motivated to implement the school-family initiative. Qualitative interview results provided insight into the barriers that prevented the school from implementing all aspects of the PFS program. The current study contributes to the field by initiating dissemination and implementation research examining the effectiveness and sustainability of PFS in a public middle school setting.

    Toward criteria for pragmatic measurement in implementation research and practice: a stakeholder-driven approach using concept mapping

    Background: Advancing implementation research and practice requires valid and reliable measures of implementation determinants, mechanisms, processes, strategies, and outcomes. However, researchers and implementation stakeholders are unlikely to use measures if they are not also pragmatic. The purpose of this study was to establish a stakeholder-driven conceptualization of the domains that comprise the pragmatic measure construct. It built upon a systematic review of the literature and semi-structured stakeholder interviews that generated 47 criteria for pragmatic measures, and aimed to further refine that set of criteria by identifying conceptually distinct categories of the pragmatic measure construct and providing quantitative ratings of the criteria’s clarity and importance. Methods: Twenty-four stakeholders with expertise in implementation practice completed a concept mapping activity wherein they organized the initial list of 47 criteria into conceptually distinct categories and rated their clarity and importance. Multidimensional scaling, hierarchical cluster analysis, and descriptive statistics were used to analyze the data. Findings: The 47 criteria were meaningfully grouped into four distinct categories: (1) acceptable, (2) compatible, (3) easy, and (4) useful. Average ratings of clarity and importance at the category and individual criteria level will be presented. Conclusions: This study advances the field of implementation science and practice by providing clear and conceptually distinct domains of the pragmatic measure construct. Next steps will include a Delphi process to develop consensus on the most important criteria and the development of quantifiable pragmatic rating criteria that can be used to assess measures.
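    The quantitative backbone of a concept mapping activity like the one described above (multidimensional scaling of stakeholders' sorting data, followed by hierarchical cluster analysis) can be sketched in a few lines. Everything below is illustrative: the six criterion labels, the three-sorter pile data, and the two-cluster cut are invented assumptions, not the study's 47 criteria or its four categories.

    ```python
    import numpy as np
    from sklearn.manifold import MDS
    from scipy.cluster.hierarchy import linkage, fcluster
    from scipy.spatial.distance import squareform

    # Hypothetical subset of criteria (labels are illustrative only)
    criteria = ["low cost", "brief", "feasible", "useful", "actionable", "informative"]

    # co[i, j] = how many of 3 (toy) stakeholders sorted criteria i and j
    # into the same pile during the card-sorting task
    co = np.array([
        [3, 3, 2, 0, 0, 1],
        [3, 3, 3, 0, 0, 0],
        [2, 3, 3, 1, 0, 0],
        [0, 0, 1, 3, 3, 2],
        [0, 0, 0, 3, 3, 3],
        [1, 0, 0, 2, 3, 3],
    ], dtype=float)

    # Convert agreement to dissimilarity: never co-sorted -> maximally distant
    dist = 1.0 - co / 3.0
    np.fill_diagonal(dist, 0.0)

    # 2-D point map of the criteria (the classic concept-mapping display)
    coords = MDS(n_components=2, dissimilarity="precomputed",
                 random_state=0).fit_transform(dist)

    # Hierarchical clustering on the same dissimilarities to carve categories
    Z = linkage(squareform(dist, checks=False), method="average")
    labels = fcluster(Z, t=2, criterion="maxclust")
    for name, lab in zip(criteria, labels):
        print(f"cluster {lab}: {name}")
    ```

    In practice the number of clusters is not fixed in advance; analysts inspect cuts at several levels of the dendrogram and choose the partition that stakeholders find most interpretable.
    
    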

    Operationalizing the ‘pragmatic’ measures construct using a stakeholder feedback and a multi-method approach

    Context: Implementation science measures are rarely used by stakeholders to inform and enhance clinical program change. Little is known about what makes implementation measures pragmatic (i.e., practical) for use in community settings; thus, the present study’s objective was to generate a clinical stakeholder-driven operationalization of a pragmatic measures construct. Evidence acquisition: The pragmatic measures construct was defined using: 1) a systematic literature review to identify dimensions of the construct using PsycINFO and PubMed databases, and 2) interviews with an international stakeholder panel (N = 7) who were asked about their perspectives of pragmatic measures. Evidence synthesis: Combined results from the systematic literature review and stakeholder interviews revealed a final list of 47 short statements (e.g., feasible, low cost, brief) describing pragmatic measures, which will allow for the development of a rigorous, stakeholder-driven conceptualization of the pragmatic measures construct. Conclusions: Results revealed significant overlap between terms related to the pragmatic construct in the existing literature and stakeholder interviews. However, a number of terms were unique to each methodology. This underscores the importance of understanding stakeholder perspectives of criteria measuring the pragmatic construct. These results will be used to inform future phases of the project where stakeholders will determine the relative importance and clarity of each dimension of the pragmatic construct, as well as their priorities for the pragmatic dimensions. Taken together, these results will be incorporated into a pragmatic rating system for existing implementation science measures to support implementation science and practice.

    An updated protocol for a systematic review of implementation-related measures

    Background: Implementation science is the study of strategies used to integrate evidence-based practices into real-world settings (Eccles and Mittman, Implement Sci. 1(1):1, 2006). Central to the identification of replicable, feasible, and effective implementation strategies is the ability to assess the impact of contextual constructs and intervention characteristics that may influence implementation, but several measurement issues make this work quite difficult. For instance, it is unclear which constructs have no measures and which measures have any evidence of psychometric properties like reliability and validity. As part of a larger set of studies to advance implementation science measurement (Lewis et al., Implement Sci. 10:102, 2015), we will complete systematic reviews of measures that map onto the Consolidated Framework for Implementation Research (Damschroder et al., Implement Sci. 4:50, 2009) and the Implementation Outcomes Framework (Proctor et al., Adm Policy Ment Health. 38(2):65-76, 2011), the protocol for which is described in this manuscript. Methods: Our primary databases will be PubMed and Embase. Our search strings will comprise five levels: (1) the outcome or construct term; (2) terms for measure; (3) terms for evidence-based practice; (4) terms for implementation; and (5) terms for mental health. Two trained research specialists will independently review all titles and abstracts, followed by full-text review for inclusion. The research specialists will then conduct measure-forward searches using the “cited by” function to identify all published empirical studies using each measure. The measure and associated publications will be compiled in a packet for data extraction. Data relevant to our Psychometric and Pragmatic Evidence Rating Scale (PAPERS) will be independently extracted and then rated using a worst-score-counts methodology reflecting “poor” to “excellent” evidence. Discussion: We will build a centralized, accessible, searchable repository through which researchers, practitioners, and other stakeholders can identify psychometrically and pragmatically strong measures of implementation contexts, processes, and outcomes. By facilitating the use of psychometrically and pragmatically strong measures identified through this systematic review, the repository would enhance the cumulativeness, reproducibility, and applicability of research findings in the rapidly growing field of implementation science.
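    The worst-score-counts rating rule the protocol describes can be made concrete with a short sketch: for each psychometric property, a measure's rating is capped at the weakest evidence observed across the studies that used it. The 0-4 scale labels, property names, and scores below are illustrative assumptions, not the actual PAPERS anchors or data.

    ```python
    # Hypothetical 0-4 evidence scale ("poor" .. "excellent"); the real PAPERS
    # anchors differ, and this only illustrates the worst-score-counts idea.
    SCALE = {0: "poor", 1: "fair", 2: "good", 3: "very good", 4: "excellent"}

    def worst_score_counts(scores_by_property: dict[str, list[int]]) -> dict[str, str]:
        """For each property, the measure's rating is its weakest observed evidence."""
        return {prop: SCALE[min(scores)] for prop, scores in scores_by_property.items()}

    # One made-up measure, with evidence extracted from three studies
    evidence = {
        "internal consistency": [3, 4, 2],   # capped at the 2 -> "good"
        "convergent validity":  [1, 3],      # capped at the 1 -> "fair"
    }
    print(worst_score_counts(evidence))
    ```

    Taking the minimum rather than the mean is deliberately conservative: a single study with poor evidence pulls the whole rating down, so a high rating signals consistently strong evidence across every study reviewed.
    
    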

    Proceedings of the 3rd Biennial Conference of the Society for Implementation Research Collaboration (SIRC) 2015: advancing efficient methodologies through community partnerships and team science

    It is well documented that the majority of adults, children and families in need of evidence-based behavioral health interventions do not receive them [1, 2] and that few robust empirically supported methods for implementing evidence-based practices (EBPs) exist. The Society for Implementation Research Collaboration (SIRC) represents a burgeoning effort to advance the innovation and rigor of implementation research and is uniquely focused on bringing together researchers and stakeholders committed to evaluating the implementation of complex evidence-based behavioral health interventions. Through its diverse activities and membership, SIRC aims to foster the promise of implementation research to better serve the behavioral health needs of the population by identifying rigorous, relevant, and efficient strategies that successfully transfer scientific evidence to clinical knowledge for use in real world settings [3]. SIRC began as a National Institute of Mental Health (NIMH)-funded conference series in 2010 (previously titled the “Seattle Implementation Research Conference”; $150,000 USD for 3 conferences in 2011, 2013, and 2015) with the recognition that there were multiple researchers and stakeholders working in parallel on innovative implementation science projects in behavioral health, but that formal channels for communicating and collaborating with one another were relatively unavailable. There was a significant need for a forum within which implementation researchers and stakeholders could learn from one another, refine approaches to science and practice, and develop an implementation research agenda using common measures, methods, and research principles to improve both the frequency and quality with which behavioral health treatment implementation is evaluated.
SIRC’s membership growth is a testament to this identified need, with more than 1000 members from 2011 to the present. SIRC’s primary objectives are to: (1) foster communication and collaboration across diverse groups, including implementation researchers, intermediaries, as well as community stakeholders (SIRC uses the term “EBP champions” for these groups), and to do so across multiple career levels (e.g., students, early career faculty, established investigators); and (2) enhance and disseminate rigorous measures and methodologies for implementing EBPs and evaluating EBP implementation efforts. These objectives are well aligned with Glasgow and colleagues’ [4] five core tenets deemed critical for advancing implementation science: collaboration, efficiency and speed, rigor and relevance, improved capacity, and cumulative knowledge. SIRC advances these objectives and tenets through in-person conferences, which bring together multidisciplinary implementation researchers and those implementing evidence-based behavioral health interventions in the community to share their work and create professional connections and collaborations.

    Improving Educator's Understanding of Rural Children's Mental Health

    No full text