CPR and ECC
This practical workshop will demonstrate the following Emergency and Critical Care techniques:
Cardiopulmonary Resuscitation.
Thoracocentesis.
Intermittent Positive Pressure Ventilation.
Delegates will then have the opportunity to practise the techniques themselves on professional mannequins.
The Cost of Components of a Fourth Year OSCE
The value of using Objective Structured Clinical Examinations (OSCEs) to assess clinical skills is widely accepted across medicine, nursing and veterinary programmes. The OSCE is seen as an extremely valuable tool that allows students to “show how” they would act in a clinical situation with good validity and reliability. Cost-effectiveness is one component of the assessment utility index, which defines the usefulness of an assessment as a product of its reliability, validity, cost-effectiveness, acceptability, feasibility and educational impact (van der Vleuten 1996). OSCEs are undoubtedly expensive for the institutions that utilise them, especially as “high stakes” final exams. Few alternative tools exist for assessing clinical skills, so institutions should be encouraged to understand the associated costs in order to inform resource allocation and plan realistic economies.
Running an OSCE is similar to staging a theatre production: significant costs arise in the development, production, administration and post-production phases (Reznick et al. 1993). We used Reznick et al.’s (1993) four phases of the OSCE process to categorise our data: development, production, administration, and post-examination reporting and analysis. At the University of Glasgow School of Veterinary Medicine, multiple years have summative OSCEs, but we focused on the three-day, 15-station, summative fourth year OSCE for students on the five-year BVMS programme. The poster will outline some of the more interesting and thought-provoking findings. Are OSCE costs modifiable without jeopardising reliability or validity? What potential cost-saving changes can we make?
Reznick R, Smee S, Baumber J, Cohen R, Rothman A, Blackmore D, Berard M. 1993. Guidelines for estimating the real cost of an objective structured clinical examination. Acad Med 68(7):513-517.
Van der Vleuten CPM. 1996. The assessment of professional competence: Developments, research and practical implications. Adv Health Sci Educ 1:41-67.
Can an Evaluation of Students’ Stress Levels Help Us Manage Anxiety During OSCEs and Other Assessment Modalities?
With an increased awareness of mental health issues, in both the student population and the veterinary profession in general, it is important that we obtain a greater understanding of the stress experienced by students so as to better prepare them to deal with stress and ameliorate any negative effects it may have on performance.
This study aims to characterise various measures of stress (e.g. HRV, EEG, cortisol, self-report questionnaire) in students within the School of Veterinary Medicine across familiar test modalities, focussing on OSCE assessment. We would also investigate how performance is affected and what factors may influence stress levels. Ultimately, our aim would be to evaluate intervention strategies to assess whether students’ stress levels and performance can be improved.
How to Improve Your OSCEs
Glasgow Veterinary School has been successfully administering multiple formative and summative OSCEs for over 10 years. During this time, the introduction of a new curriculum and the considerable expansion of clinical skills teaching have led to an increased amount of time, staff experience, and training devoted to OSCEs. Glasgow’s Chief OSCE Examiners will be leading this workshop, which is aimed at educators who are planning to introduce or expand OSCEs within their curriculum, or who wish to explore how another institution runs its OSCEs.
The workshop would provide useful advice on the following:
Writing and Reviewing OSCE stations
Choosing suitable OSCE stations for your institution
How to choose and train assessors
Setting up and running a multiple-station OSCE
ILOs:
By the end of the workshop, participants should be able to:
Formulate an OSCE scenario
Design and run an OSCE using multiple stations
How to Run a Successful OSCE with Peer Assessors
Glasgow Veterinary School administers multiple formative and summative OSCEs each year, which are costly in both time and staff involvement. Clinical staff are reluctant to take time away from the hospitals to assess mock exams, so we recruit senior students to peer assess junior undergraduates during their formative OSCEs. In a recent internal study, >97% of Glasgow students highly rated the peer assessors’ ability to provide constructive feedback, believed they had been given helpful advice on improving future performance, and felt that they had been fairly assessed. Peer assessors believed that peer assessing would benefit them in their own OSCEs because of their inside knowledge of the OSCE scenarios, felt that they would be more confident, and reported an increased understanding of the assessment process. Peer assessment of formative OSCEs has now been used successfully at Glasgow for several years.
The workshop would provide useful advice on the following:
How to choose your peer assessors
How to train your peer assessors
How to ensure peer assessment is accepted by your students
How to run an OSCE entirely with peer assessors
During the workshop we would discuss the following questions:
Why should we use peer assessors?
Who should they be?
How do we train peer assessors?
What do we include in the training?
Is it acceptable to use peer assessors in summative exams?
What are the pros and cons of peer assessment?
What are the benefits of peer assessment for staff, students and peer assessors?
OSCEs – ‘O’ is for the overall stress that I feel!
Glasgow Vet School has been using the Objective Structured Clinical Examination to assess practical skills since 1994. Senior academic staff responsible for the co-ordination of the OSCEs in undergraduate years I to IV report anecdotally that students’ stress levels in the OSCE exams seem to be markedly higher than in more traditional written exams. In 2019, it was decided that station titles would be released the night before the OSCE with the aim of alleviating the stress and anxiety surrounding not knowing which skills would be assessed each day.
However, there has been only minimal student feedback to suggest that this has had a positive impact on stress levels. In an effort to address this, the authors decided to investigate students’ perceptions of stress in different types of examination within the curriculum, the impact of releasing station titles, the factors during OSCEs that contribute to stress, and potential methods of mitigating it. In February 2023, during the BVMS IV summative OSCE, 141 fourth year undergraduates were asked to fill in a mixed methods questionnaire exploring OSCE stress. Findings reveal that students find the OSCE marginally more stressful than written examinations, and that students are less stressed when they know the station titles of the OSCEs the night before. Factors contributing to OSCE stress include the timing of revision sessions, the waiting room environment and assessor non-verbal communication. Suggestions for alleviating stress include implementing strategies to improve the exam waiting room environment, addressing assessor non-verbal communication, and prioritising summative OSCE year groups for revision sessions, appropriate lecture scheduling and formative training.
A Cross-years Course Leader Collaboration to Improve Veterinary Students' Assessment and Feedback Literacy
Since 2022, course leaders (CLs) from across years 1-4 of the BVMS programme have collaborated to standardise and improve assessment and feedback practices. Given the spiral nature of the curriculum, where topics are revisited from the Foundation Phase (years 1 and 2) to the Clinical Phase (years 3 and 4), it was determined that there should be greater consistency in assessment and feedback, and there were multifaceted reasons driving the need to advance the existing practices.
From the student perspective, a recurrent theme in student feedback was the limited nature of exam feedback and a lack of clarity about how students could improve their exam performance. In addition, following Covid-19 and the pivot to online examinations, students were unhappy that they had lost the opportunity to review their paper exam scripts. For CLs, students appeared to be starting university with poorer assessment literacy and less effective study techniques, and many students were passing at the borderline level, which suggested a need for better guidance on identifying knowledge gaps. Individualised post-exam support was also being offered only to failing students, which was deemed inequitable for those who were passing but whose performance could still improve.
A collaborative approach by the CLs was deemed integral to adopting practices that were consistent, fair and sustainable, both within and between years. The current challenges and differences across the programme were identified, and the following changes implemented:
- Assessment literacy training was embedded within all years.
- Standardised processes were developed for the provision of detailed individual and class assessment feedback.
- Online exam script review sessions were run, which adopted a cohort-wide rather than individual student approach.
To evaluate student perceptions of these changes, a mixed-methods survey was administered in semester 2 to all students in years 2-4. Results from 117 respondents showed that the majority utilised their individual (78.4%) and class (69.9%) feedback and found the level of detail provided to be useful. Over half (61.2%) of the students agreed that the exam review sessions helped them to understand the marks they received, 59.5% thought the sessions would help them to improve their performance in the next exam, and 66.4% said they encouraged them to identify material they needed to revise. When asked what they could do as individuals to improve their engagement with or understanding of assessment feedback, themes included using feedback to target revision of past material, modifying approaches to study planning, engaging more with staff, academic support services and peers, and making better use of ILOs to plan revision. The survey will be administered annually to determine longitudinal changes in perceptions and behaviours.
These collaborative developments have established a cohesive programme approach to ensure that students have improved assessment literacy and are developing study techniques that can be effectively applied across all years.
Professional and Clinical Experience (PACE): a Program for Developing Professional Practice Attributes
No abstract available
