Inducing expertise effects in clinical case recall
BACKGROUND This study was directed at illuminating a well-known phenomenon in the medical expertise literature, the 'intermediate effect' in clinical case recall. This robust phenomenon consists of the finding that medical students at intermediate levels of expertise outperform both experts and novices in clinical case recall after diagnosing cases. It deals in particular with the findings of OME researchers, who have reported monotonically increasing recall with level of expertise. PURPOSE To address possible causes for this anomaly in medical expertise and to demonstrate experimentally how data elaboration can induce expertise effects in clinical case recall. METHOD Expert nephrologists, intermediate-level students, and novices were presented with 6 medical cases under 3 different conditions: laboratory data cases without special instructions, laboratory data cases with instructions to elaborate, and cases with laboratory data embedded in a relevant clinical context. RESULTS Only when participants were required to elaborate on each of the information units presented to them did case recall show an expertise effect. If laboratory data are framed within the context of a patient's history and physical examination data, the 'intermediate effect' appears. CONCLUSIONS The instructions used in the elaboration condition seem to have induced a deeper, more detailed analysis of the patient case. It is therefore interesting to note that these instructions affected only the recall of the experts and had no effect on the novices' or intermediates' recall. We might conclude from this that expertise effects in clinical case recall are produced only when the normal processing of patient information is disrupted.
Becoming familiar with competency-based student assessment: an evaluation of workshop outcomes
The identification and specification of competency-based standards in speech-language pathology has provided practitioners, educators, employers, and government regulators with information and guidance. This paper reports the outcomes of workshops that provided familiarization with the new competency-based assessment tool, COMPASS®, which was introduced for the assessment of speech-language pathology (SLP) students across all 13 SLP professional preparation programs in Australia during 2007. An anonymous evaluation was administered before and after the first eight familiarization workshops held nationally, involving 240 clinical educators. Quantitative data were analysed descriptively, and qualitative data were entered into NVivo qualitative analysis software for content analysis. Post-workshop results indicated partial or full uptake of the main concepts involved in the new approach to assessment. Least uptake was observed for the need for direct observation of competence in workplace performance. Qualitatively, formative assessment was more apparent post-workshop within student goals formulated in response to a hypothetical scenario. A possible contributor to this outcome is the alignment between the tool and the professional community of practice, arising from the collaborative process of its development. Research into the longer-term impact of the new assessment in the context of everyday practice is suggested.