
    Potential Unintended Consequences Due to Medicare’s “No Pay for Errors Rule”? A Randomized Controlled Trial of an Educational Intervention with Internal Medicine Residents

    Medicare has selected 10 hospital-acquired conditions for which it will not reimburse hospitals unless the condition was documented as “present on admission.” This “no pay for errors” rule may have a profound effect on the clinical practice of physicians. Our objective was to determine how physicians might change their behavior after learning about the Medicare rule. We conducted a randomized trial of a brief educational intervention embedded in an online survey, using clinical vignettes to estimate behavioral changes. At a university-based internal medicine residency program, 168 internal medicine residents were eligible to participate. Residents were randomized to receive either a one-page description of Medicare’s “no pay for errors” rule with pre-vignette reminders (intervention group) or no information (control group). Residents responded to five clinical vignettes in which “no pay for errors” conditions might be present on admission. The primary outcome was selection of the single most clinically appropriate option from the three clinical practice choices presented for each vignette. The survey was administered from December 2008 to March 2009. There were 119 responses (71%). In four of five vignettes, the intervention group was less likely to select the most clinically appropriate response; this difference was statistically significant in two of the cases. Most residents were aware of the rule but not of its specifics or impact. Residents acknowledged a responsibility to know Medicare documentation rules but felt poorly trained to do so. Residents educated about Medicare’s “no pay for errors” rule were less likely to select the most clinically appropriate responses to clinical vignettes. Such choices, if implemented in practice, have the potential to cause patient harm through unnecessary tests, procedures, and other interventions.

    Using the Electronic Health Record to Identify Educational Gaps for Internal Medicine Interns

    Background: An important component of internal medicine residency is clinical immersion in core rotations to expose first-year residents to common diagnoses.
    Objective: To quantify intern experience with common diagnoses through clinical documentation in an electronic health record.
    Methods: We analyzed all clinical notes written by postgraduate year (PGY) 1, PGY-2, and PGY-3 residents on the medicine service at an academic medical center from July 1, 2012, through June 30, 2014. We quantified the experience of PGY-1s at 1 of the 3 hospitals where they rotate by the number of notes written about patients with a specific principal billing diagnosis, which we defined as diagnosis-days. We used the International Classification of Diseases 9 (ICD-9) and the Clinical Classification Software (CCS) to group the diagnoses.
    Results: We analyzed 53 066 clinical notes covering 10 022 hospitalizations with 1436 different ICD-9 diagnoses spanning 217 CCS diagnostic categories. The 10 most common ICD-9 diagnoses accounted for 23% of diagnosis-days, while the 10 most common CCS groupings accounted for more than 40% of diagnosis-days. Of 122 PGY-1s, 107 (88%) spent at least 2 months on the service; 3% were exposed to all of the top 10 ICD-9 diagnoses, while 31% had experience with fewer than 5 of them. In addition, 17% of PGY-1s saw all top 10 CCS diagnoses, and 5% had exposure to fewer than 5 CCS diagnoses.
    Conclusions: Automated detection of clinical experience may help programs review the inpatient clinical experiences of PGY-1s.
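    The diagnosis-day counting and CCS grouping described above can be sketched as follows. This is a minimal illustration, not the study's actual pipeline: the note records, intern identifiers, and the two-entry ICD-9-to-CCS mapping are all hypothetical (real CCS tables are distributed by AHRQ and cover thousands of codes).

    ```python
    from collections import defaultdict

    # Hypothetical note records: (intern, date, principal ICD-9 billing code).
    # Illustrative values only — not data from the study.
    notes = [
        ("pgy1_A", "2013-01-05", "428.0"),  # heart failure, note 1
        ("pgy1_A", "2013-01-05", "428.0"),  # heart failure, note 2, same day
        ("pgy1_A", "2013-01-06", "486"),    # pneumonia
        ("pgy1_B", "2013-01-05", "486"),    # pneumonia
    ]

    # Assumed ICD-9 -> CCS category mapping (two example entries).
    ICD9_TO_CCS = {"428.0": "Congestive heart failure", "486": "Pneumonia"}

    def diagnosis_days(notes, grouping=None):
        """Map each intern to a set of (date, diagnosis) pairs: one
        diagnosis-day per day the intern documented a given diagnosis,
        regardless of how many notes were written that day."""
        days = defaultdict(set)
        for intern, date, code in notes:
            dx = grouping.get(code, code) if grouping else code
            days[intern].add((date, dx))
        return days

    ccs_days = diagnosis_days(notes, ICD9_TO_CCS)
    # Collapse to each intern's overall diagnostic exposure.
    exposure = {intern: {dx for _, dx in dd} for intern, dd in ccs_days.items()}
    ```

    A program could then compare each intern's exposure set against the 10 most common CCS categories to flag educational gaps automatically.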

    Physical Examination Education in Graduate Medical Education—A Systematic Review of the Literature

    Objectives: There is widespread recognition that physical examination (PE) should be taught in Graduate Medical Education (GME), but little is known regarding how best to teach PE to residents. Deliberate practice fosters expertise in other fields, but its utility in teaching PE is unknown. We systematically reviewed the literature to determine the effectiveness of methods to teach PE in GME, with attention to usage of deliberate practice.
    Data sources: We searched PubMed, ERIC, and EMBASE for English-language studies regarding PE education in GME published between January 1951 and December 2012.
    Study eligibility criteria: Seven eligibility criteria were applied to studies of PE education: (1) English language; (2) subjects in GME; (3) description of the study population; (4) description of the intervention; (5) assessment of efficacy; (6) inclusion of a control group; and (7) report of data analysis.
    Study appraisal and synthesis methods: We extracted data regarding study quality, type of PE, study population, curricular features, use of deliberate practice, outcomes, and assessment methods. Tabulated summaries of the studies were reviewed for narrative synthesis.
    Results: Fourteen studies met the inclusion criteria. The mean Medical Education Research Study Quality Instrument (MERSQI) score was 9.0 out of 18. Most studies (n = 8) included internal medicine residents. Half of the studies used resident interaction with a human examinee as the primary means of teaching PE. Three studies "definitely" and four studies "possibly" used deliberate practice; all but one of these studies demonstrated improved educational outcomes.
    Limitations: We used a non-validated deliberate practice assessment. Given the heterogeneity of assessment modalities, we did not perform a meta-analysis.
    Conclusions and implications of key findings: No single strategy for teaching PE in GME is clearly superior to another. Following the principles of deliberate practice and interaction with human examinees may be beneficial in teaching PE; controlled studies including these educational features should be performed to investigate these exploratory findings.

    Development of a Multi-Domain Assessment Tool for Quality Improvement Projects.

    Background: Improving the quality of health care and education has become a mandate at all levels within the medical profession. While several published quality improvement (QI) assessment tools exist, all have limitations in addressing the range of QI projects undertaken by learners in undergraduate medical education, graduate medical education, and continuing medical education.
    Objective: We developed and validated a tool to assess QI projects with learner engagement across the educational continuum.
    Methods: After reviewing existing tools, we interviewed local faculty who taught QI to understand how learners were engaged and what these faculty wanted in an ideal assessment tool. We then developed a list of competencies associated with QI, established items linked to these competencies, revised the items using an iterative process, and collected validity evidence for the tool.
    Results: The resulting Multi-Domain Assessment of Quality Improvement Projects (MAQIP) rating tool contains 9 items, with criteria that may be completely fulfilled, partially fulfilled, or not fulfilled. Interrater reliability was 0.77. Untrained local faculty were able to use the tool with minimal guidance.
    Conclusions: The MAQIP is a 9-item, user-friendly tool that can be used to assess QI projects at various stages and to provide formative and summative feedback to learners at all levels.

    What Happened to My Patient? An Educational Intervention to Facilitate Postdischarge Patient Follow-Up

    Background: Following up on patients' clinical courses after hospital discharge may enhance physicians' learning and care of future patients. Barriers to this practice for residents include time constraints, discontinuous training environments, and difficulty accessing patient information.
    Objective: We designed an educational intervention facilitating informed self-assessment and reflection through structured postdischarge follow-up of patients' longitudinal clinical courses. We then examined the experience of interns who received this intervention in a mixed-methods study.
    Methods: Internal medicine interns on a 4-week patient safety rotation received lists of hospitalized patients they had cared for earlier in the year. They selected patients for chart review and completed a guided reflection worksheet for each patient reviewed. Interns then discussed lessons learned in a faculty-led group debrief session.
    Results: Of 62 eligible interns, 62 (100%) participated in this intervention and completed 293 reflection worksheets. We analyzed worksheets and transcripts from 6 debrief sessions. Interns reported that postdischarge patient follow-up was valuable for their professional development and helped them understand the natural history of disease and patients' illness experiences. After reviewing their patients' clinical courses, interns stated that they would advocate for earlier end-of-life counseling, improve care transitions, and adjust their clinical decision-making for similar patients in the future.
    Conclusions: Our educational intervention created the time, space, and structure for postdischarge patient follow-up. It was well received by participants and offers an opportunity for experiential learning.