
    Expert recommendations for implementing change (ERIC): Protocol for a mixed methods study

    BACKGROUND: Identifying feasible and effective implementation strategies that are contextually appropriate is a challenge for researchers and implementers, exacerbated by the lack of conceptual clarity surrounding terms and definitions for implementation strategies, as well as a literature that provides imperfect guidance regarding how one might select strategies for a given healthcare quality improvement effort. In this study, we will engage an Expert Panel comprising implementation scientists and mental health clinical managers to: establish consensus on a common nomenclature for implementation strategy terms, definitions and categories; and develop recommendations to enhance the match between implementation strategies selected to facilitate the use of evidence-based programs and the context of certain service settings, in this case the U.S. Department of Veterans Affairs (VA) mental health services.

    METHODS/DESIGN: This study will use purposive sampling to recruit an Expert Panel comprising implementation science experts and VA mental health clinical managers. A novel, four-stage sequential mixed methods design will be employed. During Stage 1, the Expert Panel will participate in a modified Delphi process in which a published taxonomy of implementation strategies will be used to establish consensus on terms and definitions for implementation strategies. In Stage 2, the panelists will complete a concept mapping task, which will yield conceptually distinct categories of implementation strategies as well as ratings of the feasibility and effectiveness of each strategy. Utilizing the common nomenclature developed in Stages 1 and 2, panelists will complete an innovative menu-based choice task in Stage 3 that involves matching implementation strategies to hypothetical implementation scenarios with varying contexts. This allows for quantitative characterizations of the relative necessity of each implementation strategy for a given scenario. In Stage 4, a live web-based facilitated expert recommendation process will be employed to establish expert recommendations about which implementation strategies are essential for each phase of implementation in each scenario.

    DISCUSSION: Using a novel method of selecting implementation strategies for use within specific contexts, this study contributes to our understanding of implementation science and practice by sharpening conceptual distinctions among a comprehensive collection of implementation strategies.
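
    As a rough illustration of the Stage 2 concept-mapping idea described above, the sketch below aggregates hypothetical panelists' pile sorts of strategies into a co-occurrence matrix and applies hierarchical clustering to suggest conceptually distinct strategy categories. The strategy names, sorting data, and choice of clustering method are illustrative assumptions only, not taken from the study protocol.

    # Hypothetical sketch of a concept-mapping analysis (assumptions, not the
    # study's actual code): panelists sort strategies into piles, the sorts are
    # aggregated into a similarity matrix, and hierarchical clustering suggests
    # conceptually distinct categories.
    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster

    strategies = ["audit and feedback", "clinical supervision",
                  "learning collaborative", "train-the-trainer"]

    # Each panelist's sort: lists of strategy indices grouped into piles.
    panelist_sorts = [
        [[0, 1], [2, 3]],          # panelist A
        [[0], [1, 2], [3]],        # panelist B
        [[0, 1], [2], [3]],        # panelist C
    ]

    n = len(strategies)
    co_occurrence = np.zeros((n, n))
    for sort in panelist_sorts:
        for pile in sort:
            for i in pile:
                for j in pile:
                    co_occurrence[i, j] += 1

    # Convert co-sorting frequency into a distance matrix.
    similarity = co_occurrence / len(panelist_sorts)
    distance = 1.0 - similarity

    # Condensed upper-triangle distances for scipy, then average-linkage clustering.
    condensed = distance[np.triu_indices(n, k=1)]
    clusters = fcluster(linkage(condensed, method="average"), t=2, criterion="maxclust")

    for name, label in zip(strategies, clusters):
        print(f"cluster {label}: {name}")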

    Building effective service linkages in primary mental health care: a narrative review part 2

    Background: Primary care services have not generally been effective in meeting mental health care needs. There is evidence that collaboration between primary care and specialist mental health services can improve clinical and organisational outcomes. It is not clear, however, what factors enable or hinder effective collaboration. The objective of this study was to examine the factors that enable effective collaboration between specialist mental health services and primary mental health care.

    Methods: A narrative and thematic review of English language papers published between 1998 and 2009. An expert reference group helped formulate strategies for policy makers. Studies of descriptive and qualitative design from Australia, New Zealand, the UK, Europe, the USA and Canada were included. Data were extracted on factors reported as enablers of or barriers to the development of service linkages. These were tabulated by theme at clinical and organisational levels, and the inter-relationships between themes were explored.

    Results: A thematic analysis of 30 papers found that the most frequently cited group of factors was "partnership formation", specifically role clarity between health care workers. Other factor groups supporting clinical partnership formation were staff support, clinician attributes, clinic physical features, and evaluation and feedback. At the organisational level, a supportive institutional environment of leadership and change management was important. The expert reference group then proposed strategies for collaboration judged to be important, acceptable and feasible. Because of the variability of study types, we did not exclude studies on quality grounds, and findings are weighted by the number of studies. Variability in local service contexts limits the generalisation of the findings.

    Conclusion: The findings provide a framework for health planners to develop effective service linkages in primary mental health care. Our expert reference group proposed five areas of strategy for policy makers that address organisational-level support, joint clinical problem solving, local joint care guidelines, staff training and supervision, and feedback.

    Predicting implementation from organizational readiness for change: a study protocol

    Background: There is widespread interest in measuring organizational readiness to implement evidence-based practices in clinical care. However, there are a number of challenges to validating organizational measures, including inferential bias arising from the halo effect and method bias - two threats to validity that, while well-documented by organizational scholars, are often ignored in health services research. We describe a protocol to comprehensively assess the psychometric properties of a previously developed survey, the Organizational Readiness to Change Assessment.

    Objectives: Our objective is to conduct a comprehensive assessment of the psychometric properties of the Organizational Readiness to Change Assessment, incorporating methods specifically designed to address threats from the halo effect and method bias.

    Methods and Design: We will conduct three sets of analyses using longitudinal, secondary data from four partner projects, each testing interventions to improve the implementation of an evidence-based clinical practice. Partner projects field the Organizational Readiness to Change Assessment at baseline (n = 208 respondents; 53 facilities) and prospectively assess the degree to which the evidence-based practice is implemented. We will assess predictive and concurrent validity using hierarchical linear modeling and multivariate regression, respectively. For predictive validity, the outcome is the change from baseline to follow-up in the use of the evidence-based practice. We will use intra-class correlations derived from hierarchical linear models to assess inter-rater reliability. Two partner projects will also field measures of job satisfaction for convergent and discriminant validity analyses, and will field Organizational Readiness to Change Assessment measures at follow-up for concurrent validity (n = 158 respondents; 33 facilities). Convergent and discriminant validity will be tested through associations between organizational readiness and different aspects of job satisfaction: satisfaction with leadership, which should be highly correlated with readiness, versus satisfaction with salary, which should be less correlated with readiness. Content validity will be assessed using an expert panel and a modified Delphi technique.

    Discussion: We propose a comprehensive protocol for validating a survey instrument for assessing organizational readiness to change that specifically addresses key threats of bias related to the halo effect and method bias, as well as questions of construct validity that often go unexplored in research using measures of organizational constructs.
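
    One analysis named in the protocol is the use of intra-class correlations derived from hierarchical linear models to assess inter-rater reliability. The sketch below shows a minimal version of that idea on simulated data: a random-intercept model of readiness scores nested within facilities, with the ICC computed from the between-facility and residual variance components. The simulated data, column names, and use of statsmodels are assumptions for illustration only, not the study's actual analysis code.

    # Minimal ICC sketch (illustrative assumptions, not the protocol's code):
    # fit a random-intercept hierarchical linear model and compute ICC(1).
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(0)
    n_facilities, n_respondents = 20, 8
    facility = np.repeat(np.arange(n_facilities), n_respondents)
    facility_effect = rng.normal(0, 0.6, n_facilities)[facility]
    readiness = 3.5 + facility_effect + rng.normal(0, 1.0, facility.size)
    df = pd.DataFrame({"facility": facility, "readiness": readiness})

    # Random-intercept model: readiness scores nested within facilities.
    model = smf.mixedlm("readiness ~ 1", data=df, groups=df["facility"]).fit()

    between_var = float(model.cov_re.iloc[0, 0])  # facility-level variance
    within_var = float(model.scale)               # residual (respondent) variance
    icc = between_var / (between_var + within_var)
    print(f"ICC(1) = {icc:.2f}")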

    Twenty-three unsolved problems in hydrology (UPH) – a community perspective

    This paper is the outcome of a community initiative to identify major unsolved scientific problems in hydrology motivated by a need for stronger harmonisation of research efforts. The procedure involved a public consultation through on-line media, followed by two workshops through which a large number of potential science questions were collated, prioritised, and synthesised. In spite of the diversity of the participants (230 scientists in total), the process revealed much about community priorities and the state of our science: a preference for continuity in research questions rather than radical departures or redirections from past and current work. Questions remain focussed on process-based understanding of hydrological variability and causality at all space and time scales. Increased attention to environmental change drives a new emphasis on understanding how change propagates across interfaces within the hydrological system and across disciplinary boundaries. In particular, the expansion of the human footprint raises a new set of questions related to human interactions with nature and water cycle feedbacks in the context of complex water management problems. We hope that this reflection and synthesis of the 23 unsolved problems in hydrology will help guide research efforts for some years to come.

    Proceedings of the 3rd Biennial Conference of the Society for Implementation Research Collaboration (SIRC) 2015: advancing efficient methodologies through community partnerships and team science

    It is well documented that the majority of adults, children and families in need of evidence-based behavioral health interventions do not receive them [1, 2] and that few robust empirically supported methods for implementing evidence-based practices (EBPs) exist. The Society for Implementation Research Collaboration (SIRC) represents a burgeoning effort to advance the innovation and rigor of implementation research and is uniquely focused on bringing together researchers and stakeholders committed to evaluating the implementation of complex evidence-based behavioral health interventions. Through its diverse activities and membership, SIRC aims to foster the promise of implementation research to better serve the behavioral health needs of the population by identifying rigorous, relevant, and efficient strategies that successfully transfer scientific evidence to clinical knowledge for use in real world settings [3]. SIRC began as a National Institute of Mental Health (NIMH)-funded conference series in 2010 (previously titled the “Seattle Implementation Research Conference”; $150,000 USD for 3 conferences in 2011, 2013, and 2015) with the recognition that there were multiple researchers and stakeholders working in parallel on innovative implementation science projects in behavioral health, but that formal channels for communicating and collaborating with one another were relatively unavailable. There was a significant need for a forum within which implementation researchers and stakeholders could learn from one another, refine approaches to science and practice, and develop an implementation research agenda using common measures, methods, and research principles to improve both the frequency and quality with which behavioral health treatment implementation is evaluated. SIRC’s membership growth is a testament to this identified need, with more than 1000 members from 2011 to the present. SIRC’s primary objectives are to: (1) foster communication and collaboration across diverse groups, including implementation researchers, intermediaries, as well as community stakeholders (SIRC uses the term “EBP champions” for these groups) – and to do so across multiple career levels (e.g., students, early career faculty, established investigators); and (2) enhance and disseminate rigorous measures and methodologies for implementing EBPs and evaluating EBP implementation efforts. These objectives are well aligned with Glasgow and colleagues’ [4] five core tenets deemed critical for advancing implementation science: collaboration, efficiency and speed, rigor and relevance, improved capacity, and cumulative knowledge. SIRC advances these objectives and tenets through in-person conferences, which bring together multidisciplinary implementation researchers and those implementing evidence-based behavioral health interventions in the community to share their work and create professional connections and collaborations.

    Implementation strategies

    As the field of implementation science moves beyond studying barriers to and facilitators of implementation to the comparative effectiveness of different strategies, it is essential that we create a common taxonomy to define the strategies that we study. Similarly, we must clearly document the implementation strategies that are applied, the factors that influence their selection, and any adaptation of the strategy during the course of implementation and sustainment of the innovation being implemented. By incorporating this type of rigor into our work, we will be able not only to advance the science of implementation but also to improve our ability to place evidence-based innovations into the hands of practitioners in a timely and efficient manner.
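
    The passage above calls for clearly documenting which strategies are applied, why they were selected, and how they are adapted over time. The sketch below shows one hypothetical way a study team could structure such records; the field names and layout are illustrative assumptions, not a published reporting standard tied to this work.

    # Illustrative record structure only (assumed field names, not a standard):
    # capture the strategy applied, the rationale for selecting it, and any
    # adaptations made during implementation and sustainment.
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class StrategyAdaptation:
        date: str            # when the strategy was modified
        description: str     # what was changed and why

    @dataclass
    class ImplementationStrategyRecord:
        name: str                          # label from a shared taxonomy
        definition: str                    # agreed definition for that label
        selection_rationale: List[str]     # contextual factors behind the choice
        phase: str                         # e.g., "implementation" or "sustainment"
        adaptations: List[StrategyAdaptation] = field(default_factory=list)

    record = ImplementationStrategyRecord(
        name="audit and feedback",
        definition="Summarize clinical performance data and return it to clinicians.",
        selection_rationale=["existing performance dashboard", "leadership support"],
        phase="implementation",
    )
    record.adaptations.append(
        StrategyAdaptation(date="2015-06", description="Shifted from monthly to quarterly reports.")
    )
    print(record.name, len(record.adaptations))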