6 research outputs found

    Diffusion of excellence: evaluating a system to identify, replicate, and spread promising innovative practices across the Veterans Health Administration

    Introduction: The Veterans Health Administration (VHA) Diffusion of Excellence (DoE) program provides a system to identify, replicate, and spread promising practices across the largest integrated healthcare system in the United States. DoE identifies innovations that have been successfully implemented in the VHA through a Shark Tank-style competition. VHA facility and regional directors bid resources needed to replicate promising practices. Winning facilities/regions receive external facilitation to aid in replication/implementation over the course of a year. DoE staff then support diffusion of successful practices across the nationwide VHA.
    Methods: Organized around the Reach, Effectiveness, Adoption, Implementation, and Maintenance (RE-AIM) framework, we summarize results of an ongoing long-term mixed-methods implementation evaluation of DoE. Data sources include Shark Tank application and bid details, tracking of practice adoptions through a Diffusion Marketplace, characteristics of VHA facilities, focus groups with Shark Tank bidders, structured observations of DoE events, surveys of DoE program participants, and semi-structured interviews of national VHA program office leaders, VHA healthcare system/facility executives, practice developers, implementation teams, and facilitators.
    Results: In the first eight Shark Tanks (2016–2022), 3,280 Shark Tank applications were submitted; 88 were designated DoE Promising Practices (i.e., practices receiving facilitated replication). DoE has effectively spread practices across the VHA, with 1,440 documented instances of adoption/replication, including 180 in facilities located in rural areas. Leadership decisions to adopt innovations are often based on big-picture considerations such as constituency support and linkage to organizational goals. The DoE Promising Practices with the greatest national spread were successfully replicated at new sites during the facilitated replication process, have close partnerships with VHA national program offices, and tend to be less expensive to implement. Two indicators of sustainment show that 56 of the 88 Promising Practices are still being diffused across the VHA; 56% of facilities originally replicating the practices have sustained them, even up to 6 years after the first Shark Tank.
    Conclusion: DoE has developed a sustainable process for the identification, replication, and spread of promising practices as part of a learning health system committed to providing equitable access to high-quality care.

    Implementation science: Helping healthcare systems improve

    Using current evidence in practice is critical for physician assistants (PAs). During training and after, PAs focus on learning the process and practice of evidence-based medicine. Yet despite this focus, fewer than 20% of original research findings are translated into practice to benefit patients [1]. Research findings that do get adopted take about 17 years to become common clinical practice. Implementation science is the research field that focuses on shortening the time between evidence generation and service delivery; knowledge of implementation science is therefore essential for those interested in improving healthcare delivery.

    Balancing reality in embedded research and evaluation: Low vs high embeddedness

    Embedding research and evaluation into organizations is one way to generate "practice-based" evidence needed to accelerate implementation of evidence-based innovations within learning health systems. Organizations and researchers/evaluators vary greatly in how they structure and operationalize these collaborations. One key aspect is the degree of embeddedness: from low embeddedness, where researchers/evaluators are located outside organizations (e.g., outside evaluation consultants), to high embeddedness, where researchers/evaluators are employed by organizations and thus more deeply involved in program evolution and operations. Pros and cons related to the degree of embeddedness (low vs high) must be balanced when developing these relationships. We reflect on this process within the context of an embedded, mixed-methods evaluation of the Veterans Health Administration (VHA) Diffusion of Excellence (DoE) program. Considerations that must be balanced include: (a) low vs high alignment of goals; (b) low vs high involvement in strategic planning; (c) observing what is happening vs being integrally involved with programmatic activities; (d) reporting findings at the project's end vs providing iterative findings and recommendations that contribute to program evolution; and (e) adhering to predetermined aims vs adapting aims in response to evolving partner needs.

    Rapid versus traditional qualitative analysis using the Consolidated Framework for Implementation Research (CFIR)

    BACKGROUND: Qualitative approaches, alone or in mixed methods, are prominent within implementation science. However, traditional qualitative approaches are resource intensive, which has led to the development of rapid qualitative approaches. Published rapid approaches are often inductive in nature and rely on transcripts of interviews. We describe a deductive rapid analysis approach using the Consolidated Framework for Implementation Research (CFIR) that uses notes and audio recordings. This paper compares our rapid versus traditional deductive CFIR approach. METHODS: Semi-structured interviews were conducted for two cohorts of the Veterans Health Administration (VHA) Diffusion of Excellence (DoE). The CFIR guided data collection and analysis. In cohort A, we used our traditional CFIR-based deductive analysis approach (directed content analysis), in which two analysts completed independent in-depth manual coding of interview transcripts using qualitative software. In cohort B, we used our new rapid CFIR-based deductive analysis approach (directed content analysis), in which the primary analyst wrote detailed notes during interviews and immediately coded the notes into a CFIR-construct-by-facility matrix in MS Excel; a secondary analyst then listened to the audio recordings and edited the matrix. We tracked time for both approaches using a spreadsheet and captured transcription costs from invoices. We retrospectively compared the approaches in terms of effectiveness and rigor. RESULTS: Cohorts A and B were similar in the amount of data collected. However, our rapid deductive CFIR approach required 409.5 analyst hours, compared with 683 hours for the traditional deductive CFIR approach. The rapid deductive approach also eliminated $7,250 in transcription costs. The facility-level analysis phase provided the greatest savings: 14 hours/facility for the traditional analysis versus 3.92 hours/facility for the rapid analysis. Data interpretation required the same number of hours for both approaches. CONCLUSION: Our rapid deductive CFIR approach was less time intensive and eliminated transcription costs, yet was effective in meeting evaluation objectives and establishing rigor. Researchers should consider the following when employing our approach: (1) team expertise in the CFIR and qualitative methods, (2) the level of detail needed to meet project aims, (3) the mode of data to analyze, and (4) the advantages and disadvantages of using the CFIR.

    The Veterans Health Administration (VHA) Innovators Network: Evaluation design, methods and lessons learned through an embedded research approach

    BACKGROUND: Collaboration between researchers, implementers, and policymakers improves uptake of health systems research. In 2018, researchers and VHA Innovators Network (iNET) leadership used an embedded research model to conduct an evaluation of iNET. We describe our evaluation design, early results, and lessons learned. METHODS: This mixed-methods evaluation incorporated primary data collection via electronic survey, descriptive analysis using existing VA datasets (examining associations between facility characteristics and iNET participation), and qualitative interviews to support real-time program implementation and to probe perceived impacts, benefits, and challenges of participation. RESULTS: We developed reporting tools and collected data regarding site participation, providing iNET leadership rapid access to needed information on projects (e.g., target populations reached, milestones achieved, and barriers encountered). Secondary data analyses indicated iNET membership was greater among larger, more complex VA facilities. Of the 37 iNET member sites, over half (n = 22) did not have any of the six major types of VA research centers; thus iNET is supporting VA sites not traditionally served by research innovation pathways. Qualitative findings highlighted enhanced engagement and perceived value of social and informational networks. CONCLUSIONS: Working alongside our iNET partners, we supported and influenced iNET's development through our embedded evaluation's preliminary findings. We also provided training and guidance aimed at building capacity among iNET participants. IMPLICATIONS: Embedded research can yield successful collaborative efforts between researchers and partners. An embedded research team can help programs pivot to ensure effective use of limited resources. Such models inform program development and expansion, supporting strategic planning and demonstrating value.

    Factors Associated With Having a Physician, Nurse Practitioner, or Physician Assistant as Primary Care Provider for Veterans With Diabetes Mellitus

    Expanded use of nurse practitioners (NPs) and physician assistants (PAs) is a potential solution to workforce issues, but little is known about how NPs and PAs can best be used. Our study examines whether patients' medical and social complexity is associated with their primary care provider (PCP) type: physician, NP, or PA. In this national retrospective cohort study, we use 2012-2013 national Veterans Administration (VA) electronic health record data from 374,223 veterans to examine whether PCP type is associated with patient-, clinic-, and state-level factors representing medical and social complexity, adjusting for all variables simultaneously using a generalized logit model. Results indicate that patients with physician PCPs are modestly more medically complex than those with NP or PA PCPs. For the group having a Diagnostic Cost Group (DCG) score >2.0 compared with the group having a DCG score <0.5, the odds of having an NP or a PA were lower than the odds of having a physician PCP (NP odds ratio [OR] = 0.83, 95% confidence interval [CI]: 0.79-0.88; PA OR = 0.85, CI: 0.80-0.89). Social complexity is not consistently associated with PCP type. Overall, we found minor differences in provider type assignment. This study improves on previous work by using a large national dataset that accurately ascribes the work of NPs and PAs, analyzing at the patient level, analyzing NPs and PAs separately, and addressing social as well as medical complexity. This is a requisite step toward studies that compare patient outcomes by provider type.