Beyond model answers: learners’ perceptions of self-assessment materials in e-learning applications

By Karen Handley and Benita Cox

Abstract

The importance of feedback as an aid to self‐assessment is widely acknowledged. Model answers are a common form of feedback in e‐learning, yet they are deficient in many respects. In particular, the notion of a ‘model’ answer implies a single correct answer that applies across multiple contexts, with no scope for permissible variation. This reductive assumption rarely holds for complex problems intended to test students’ higher‐order learning. Nevertheless, the challenge remains of how to support students as they assess their own performance using model answers and other forms of non‐verificational ‘feedback’. To explore this challenge, the research examined a management development e‐learning application and investigated the effectiveness of model answers that followed problem‐based questions. The study was exploratory, using semi‐structured interviews with 29 adult learners employed in a global organisation. Given interviewees’ generally negative perceptions of the model answers, they were asked to describe their ideal form of self‐assessment materials and to evaluate nine alternative designs. The results suggest that, as support for higher‐order learning, self‐assessment materials that merely present an idealised model answer are inadequate. As alternatives, learners preferred materials that helped them understand which behaviours to avoid (and not just which to ‘do’), how to think through the problem (i.e. critical thinking skills), and the key issues that provide a framework for thinking. These findings have broader relevance within higher education, particularly in postgraduate programmes for business students, where the importance of prior business experience is emphasised and the student profile resembles that of the participants in this research.

Topics: LB Theory and practice of education, LC1022 - 1022.25 Computer-assisted Education
Publisher: Taylor and Francis Ltd
Year: 2007
DOI identifier: 10.1080/09687760601129539
OAI identifier: oai:generic.eprints.org:712/core5


