The importance of feedback as an aid to self‐assessment is widely acknowledged. One form of feedback widely used in e‐learning is the model answer. However, model answers are deficient in many respects. In particular, the notion of a ‘model’ answer implies a single correct answer applicable across multiple contexts, with no scope for permissible variation. This reductive assumption rarely holds for the complex problems intended to test students’ higher‐order learning. Nevertheless, the challenge remains of how to support students as they assess their own performance using model answers and other forms of non‐verificational ‘feedback’. To explore this challenge, the research investigated a management development e‐learning application and examined the effectiveness of model answers that followed problem‐based questions. The research was exploratory, using semi‐structured interviews with 29 adult learners employed in a global organisation. Given interviewees’ generally negative perceptions of the model answers, they were asked to describe their ideal form of self‐assessment materials and to evaluate nine alternative designs. The results suggest that, as support for higher‐order learning, self‐assessment materials that merely present an idealised model answer are inadequate. As alternatives, learners preferred materials that helped them understand which behaviours to avoid (not just which to ‘do’), how to think through the problem (i.e. critical thinking skills), and the key issues that provide a framework for thinking. These findings have broader relevance within higher education, particularly in postgraduate programmes for business students, where the importance of prior business experience is emphasised and the student profile is similar to that of the participants in this research.