    Virtual OSCE Delivery and Quality Assurance During a Pandemic: Implications for the Future

    Background: During 2020, the COVID-19 pandemic caused worldwide disruption to the delivery of clinical assessments, requiring medical schools to rapidly adjust the design of established tools. Derived from the traditional face-to-face Objective Structured Clinical Examination (OSCE), the virtual OSCE (vOSCE) was delivered online, using a range of school-dependent designs. The quality of these new formats was evaluated remotely through virtual quality assurance (vQA). This study synthesizes the vOSCE and vQA experiences of stakeholders from participating Australian medical schools based on a Quality framework. Methods: This study utilized a descriptive phenomenological qualitative design. Focus group discussions (FGDs) were held with 23 stakeholders, including examiners, academics, simulated patients, professional staff, students and quality assurance examiners. The data were analyzed using a theory-driven conceptual Quality framework. Results: The vOSCE was perceived as a relatively fit-for-purpose assessment during pandemic physical-distancing mandates. Additionally, the vOSCE was identified as being value for money and was noted to provide procedural benefits that led to an enhanced experience for those involved. However, despite being largely delivered fault-free, the current designs are considered limited in the scope of skills they can assess, and thus do not meet the established quality of the traditional OSCE. Conclusions: Whilst virtual clinical assessments are limited in their scope for assessing clinical competency when compared with the traditional OSCE, their integration into programs of assessment has significant potential. Scholarly review of stakeholder experiences has elucidated quality aspects that can inform iterative improvements to the design and implementation of future vOSCEs.

    Training our future doctors to deliver public health education

    Delivering health education aims to encourage people, either individually or as a community, to change their behaviour with the intention of improving their health and well-being and preventing lifestyle-related illnesses. Changing behaviour is difficult, and it often takes repeated health messages and much encouragement to effect behavioural change in a person or community. With the aging population and the rise in preventable illnesses in our communities, health education has become the responsibility of all health professionals in all forms of contact with individuals or groups. Many studies have shown that doctors have enormous credibility in the eyes of the general public, and this credibility should be used to its maximum potential. Yet doctors often do not deliver health education as well as they could, and studies show it is delivered only a third as often as it should be. We need to ask why this is so, and how we could better prepare our future doctors for this aspect of their work.

    Incorporating assessor expertise to improve workplace-based assessment efficacy

    Introduction: The failure to fail and elevated grades in Workplace-Based Assessment (WBA) are well-documented phenomena, and assessors play a major role in this process. The resulting WBA appears to be of little real use to any of the stakeholders involved. What do assessors think is the problem, and what do they think would improve the situation for them? If we develop and trial WBA designed to suit the assessor and use their expertise, will this result in WBA that is more useful for everyone? Summary of Methodology: To explore this question, a search of the literature was first conducted, aiming to summarise understanding of the way clinical assessors make decisions regarding WBA results. The research was conducted in the context of undergraduate medical student training and assessment in the workplace at the University of Wollongong, Australia. To begin with, clinician assessors of medical students were surveyed regarding their attitudes to failing a student and other difficulties they face in WBA. Following analysis of the initial survey, qualitative interviews of 16 experienced clinician assessors were conducted to further tease out survey responses. Suggestions from both the literature and the assessors were then used to design a WBA system intended to respect both the assessors' expertise and their relationship with their student. The new WBA trial incorporated the following changes: all WBA by the student's preceptor was formative and focussed on feedback for learning; summative WBA grades were awarded by senior supervisors and were not delivered face to face to the student; grades included a 'conditional pass' grade allowing assessors to pass the student but define a problem for early remediation; assessment of developing skills was graded using the clinical language of entrustment; and preceptors were asked to write confidential narrative observations on student performance for supervisors. WBA student results were compared before and after the new system was implemented. The opinions of all stakeholders (including students, assessors and the medical school) were also surveyed before and after the implementation of the new WBA for comparison. Results: Assessors claimed to have little trouble identifying an underperforming student, but said that to fail a student face to face was 'simply too hard'. Assessment within the mentor relationship was described as further compounding the difficulty, rendering it virtually impossible. The new WBA processes reduced the hyper-inflated results. While the failure to fail remained a problem, there was good utilisation of the 'conditional pass' grade, and preceptors felt their expertise was better utilised. Conclusions: Assessors identified the failure to fail as the fault of the system, not the assessor. Assessors said they were able and willing to identify a student with problems and assist with remediation. They saw the mentor relationship as so important for learning that they felt WBA within this relationship should be used to enhance learning. Training institutions need to rethink the value of pass/fail decisions in WBA, especially when made by the mentor, and when results are delivered face to face to the student.

    Do doctors have a role in public health education

    A study of people’s attitudes and responses to the presentation of health education by a doctor

    An OSCE clinical log station: driving reflection on clinical competence development

    Hudson JN, Rienits H, Graduate School of Medicine, University of Wollongong. Background: An electronic clinical log was introduced at an Australian 4-year graduate-entry medical school with the first intake of students in 2007. While some students embraced this log to record and reflect on their early clinical experiences, initial uptake of the log was low. Among the strategies used to encourage student use of this learning resource was the introduction of an innovative clinical log OSCE station. What was done: The clinical log station aimed to foster longitudinal recording of and reflection on clinical experience, and identification of significant learning issues in relation to patient and self-care, health promotion, teamwork, and quality and safety. The scoring process sought evidence of educational use of the log. Demonstration of the marking sheet and standard-setting procedure will illustrate how assessors scored performance using the following three main criteria: quantity and diversity of recorded experiences; case presentation; and reflection on development issues in relation to the presentation. Evaluation findings: Student log use increased following introduction of the OSCE log station, but decreased following the final examinations. Student performance improved with experience of station expectations. Conclusions: While student 'logging' of clinical experience is important for quality assurance and to facilitate support of individual student development, assessment of this activity appears crucial to drive student engagement.

    An innovative OSCE clinical log station: a quantitative study of its influence on Log use by medical students

    Background: A Clinical Log was introduced as part of a medical student learning portfolio, aiming to develop a habit of critical reflection while learning was taking place, and to provide feedback to students and the institution on learning progress. It was designed as a longitudinal, self-directed, structured record of student learning events, with reflection on these for personal and professional development, and actions planned or taken for learning. As an incentive was needed to encourage student engagement, an innovative Clinical Log station was introduced in the OSCE, an assessment format with established acceptance at the School. This study questions: How does an OSCE Clinical Log station influence Log use by students? Methods: The Log station was introduced into the formative, and subsequent summative, OSCEs with careful attention to student and assessor training, marking rubrics and the standard-setting procedure. The scoring process sought evidence of educational use of the log, and an ability to present and reflect on key learning issues in a concise and coherent manner. Results: Analysis of the first cohort's Log use over the four-year course (quantified as the number of patient visits entered by all students) revealed limited initial use. Usage was stimulated after introduction of the Log station early in third year, with some improvement during the subsequent year-long integrated community-based clerkship. Student reflection, quantified by the mean number of characters in the 'reflection' fields per entry, peaked just prior to the final OSCE (mid-Year 4). Following this, very few students continued to enter and reflect on clinical experience using the Log. Conclusion: While the current study suggested that we cannot assume students will self-reflect unless such an activity is included in an assessment, ongoing work has focused on building learner and faculty confidence in the value of self-reflection as part of being a competent physician.

    Patient-directed clinical skills: valued by students

    Abstract of an oral presentation at the ANZAHPE/AMEA 2015 Conference, 29-31 March, Newcastle, Australia

    Self-affirmation: Medical students develop healthy confidence by using video-analysis

    Background: Health professional education uses video recordings as a self-analysis and reflection tool in clinical skill and communication skill development. Described benefits encompass better skill development, greater skill retention, an enhanced ability to self-assess and a reduced need for faculty input in the learning process. Summary of work: We offered Year 1 and Year 2 medical students who showed deficits in their skills development video-analysis of their skill performances as an additional remediation tool. The students were free to analyse the video on their own, with a peer or with a teacher. We collected qualitative data from consenting students and conducted a thematic analysis on ten semi-structured interviews. Summary of results: The results confirmed findings from previous research, but also revealed self-affirmation as a significant theme. Students identified this confirmation of achievement as a valuable counter to assessment anxiety and stress. Conclusion: Video self-analysis is a useful tool to affirm students' skills and to build early confidence, provided clear process guidelines are available to the students. Take home messages: Students who lack confidence may use video-analysis of skills in exam preparation and for remediation work to control their assessment anxiety and stress.
