
    Providing feedback: exploring a model (emotion, content, outcomes) for facilitating multisource feedback

    Background:  Multi-source feedback (MSF) aims to raise self-awareness of performance and encourage improvement. The ECO model (emotions, content, outcome) is a three-step process, developed from the counselling literature, to facilitate feedback acceptance and use in MSF.  Aims:  The purpose of this study was to explore the acceptability, usefulness and educational impact of the model.  Methods:  This was a qualitative study using interviews to explore general practice (GP) trainer and trainee experiences and perceptions of the ECO facilitation model. Interviews were conducted by telephone, recorded, transcribed and analysed using a thematic framework.  Results:  In total, 13 GP trainers and trainees participated in the interviews following their MSF discussions using the ECO model. They agreed that the model was useful, simple to use and engaged trainees in reflection upon their feedback and performance. Exploring emotions and clarifying content appeared integral to accepting and using the feedback. Positive feedback was often surprising. Most trainees reported performance improvements following their MSF-ECO session.  Conclusions:  The model appeared acceptable and simple to use. Engaging the learner as a partner in the feedback discussion appeared effective. Further research is needed to fully understand the influence of each step in facilitating MSF acceptance and use, and to determine the impact of the ECO model alone upon performance outcomes compared to more traditional provision of MSF feedback.

    Enhancing the Effectiveness of Significant Event Analysis: Exploring Personal Impact and Applying Systems Thinking in Primary Care

    Introduction:  Significant event analysis (SEA) is well established in many primary care settings but can be poorly implemented. Reasons include the emotional impact on clinicians and limited knowledge of systems thinking in establishing why events happen and formulating improvements. To enhance SEA effectiveness, we developed and tested "guiding tools" based on human factors principles.  Methods:  Mixed-methods development of guiding tools (Personal Booklet, to help with emotional demands and apply a human factors analysis at the individual level; Desk Pad, to guide a team-based systems analysis; and a written Report Format) by a multiprofessional "expert" group, with testing by Scottish primary care practitioners who submitted completed enhanced SEA reports. Evaluation data were collected through a questionnaire, telephone interviews, and thematic analysis of the SEA reports.  Results:  Overall, 149/240 care practitioners tested the guiding tools and submitted completed SEA reports (62.1%). Reported understanding of how to undertake SEA improved postintervention (P < .001), while most agreed that the Personal Booklet was practical (88/123, 71.5%) and relevant to dealing with related emotions (93/123, 75.6%). The Desk Pad tool helped focus the SEA on systems issues (85/123, 69.1%), while most found the Report Format clear (94/123, 76.4%) and would recommend it (88/123, 71.5%). Most SEA reports adopted a systems approach to analyses (125/149, 83.9%), care improvement (74/149, 49.7%), or planned actions (42/149, 28.2%).  Discussion:  Applying human factors principles to SEA potentially enables care teams to gain a systems-based understanding of why things go wrong, which may help with related emotional demands and with more effective learning and improvement.  Additional co-authors: Suzanne Stirling, Judy Wakeling, Angela Inglis, John McKay, and Joan Sargean