    Workplace-based assessment: effects of rater expertise

    Traditional psychometric approaches towards assessment tend to focus exclusively on quantitative properties of assessment outcomes. This may limit more meaningful educational approaches towards workplace-based assessment (WBA). Cognition-based models of WBA argue that assessment outcomes are determined by cognitive processes in raters that are very similar to reasoning, judgment, and decision making in professional domains such as medicine. The present study explores the cognitive processes that underlie judgment and decision making by raters when observing performance in the clinical workplace. It specifically focuses on how differences in rating experience influence information processing by raters. Verbal protocol analysis was used to investigate how experienced and non-experienced raters select and use observational data to arrive at judgments and decisions about trainees' performance in the clinical workplace. Differences between experienced and non-experienced raters were assessed with respect to time spent on information analysis and representation of trainee performance; performance scores; and information processing, using qualitative-based quantitative analysis of verbal data. Results showed expert-novice differences in the time needed for representation of trainee performance, depending on the complexity of the rating task. Experts paid more attention to situation-specific cues in the assessment context, and they generated significantly more interpretations and fewer literal descriptions of observed behaviors. There were no significant differences in rating scores. Overall, our findings seem consistent with other findings in expertise research, supporting the theories underlying cognition-based models of assessment in the clinical workplace. Implications for WBA are discussed.

    Workplace-Based Assessment Instruments in the Health Sciences

    A historical overview of the development of assessment instruments in the health sciences is presented here, with specific attention paid to workplace-based assessment instruments. Three instruments are reviewed in detail: the mini clinical evaluation exercise (mCEX), direct observation of procedural skills (DOPS), and multi-source feedback (MSF). Features common to these instruments include their authenticity, their use in assessing professional skills, and the opportunities they afford for the provision of feedback. Although almost exclusively used in graduate medical training, they are likely to play an increasingly important role in the assessment of veterinary undergraduate students in preparation for professional practice. However, the time and cost associated with implementing these instruments raise questions about their feasibility. The continued search for the holy grail of assessment instruments, together with the challenges relating to the need for trained assessors, leads us to conclude that, ultimately, the competence of health professionals should continue to be measured using several complementary instruments.