The University of Edinburgh. College of Medicine and Veterinary Medicine. Directorate of Undergraduate Learning and Teaching
Abstract
The use of technology-supported teaching and learning in higher education has moved from a position
of peripheral interest a few years ago to become a fundamental ingredient in the experience of many if
not most students today. A major part of that change has been wrought by the widespread introduction
and use of ‘virtual learning environments’ (VLEs). A defining characteristic of VLEs is that they
combine a variety of tools and resources into a single integrated system. To use a VLE is not just to
employ a single intervention but to change the very fabric of the students’ experience of study and the
university. Despite this, much of the literature on VLEs has concentrated on producing typologies by
listing and comparing system functions, describing small-scale, short-duration applications, or
providing speculative theories and predictions. Little attention has so far been paid to analysing what
effects a VLE’s use has on the participants and the context of use, particularly across a large group of
users and over a substantial period of time.
This work presents the evaluation of a VLE developed and used to support undergraduate medical
education at the University of Edinburgh since 1999. This system is called ‘EEMeC’ and was
developed specifically within and in support of its context of use. EEMeC provides a large number of
features and functions to many different kinds of user; it has evolved continuously since its
introduction, and it has had a significant impact on teaching and learning in the undergraduate medical
degree programme (MBChB). In such circumstances, evaluation methodologies that depend on
controls and single variables are neither applicable nor practical.
In order to approach the task of evaluating such a complex entity a multi-modal evaluation framework
has been developed based on taking a series of metaphor-informed perspectives derived from the
organisational theories of Gareth Morgan (Morgan 1997). The framework takes seven approaches to the
evaluation of EEMeC, covering a range of quantitative and qualitative methodologies; the results are
combined in a dialectical analysis of EEMeC from these different evaluation perspectives.
This work provides a detailed and multi-faceted account of a VLE-in-use and the ways in which it
interacts with its user community in its context of use. Furthermore, the method of taking different
metaphor-based evaluation perspectives of a complex problem space is presented as a viable approach
for studying and evaluating similar learning support systems. The evaluation framework that has been
developed would be particularly useful to those practitioners who have a pressing and practical need
for meaningful evaluation techniques to inform and shape how complex systems such as VLEs are
deployed and used. As such, this work can provide insights not just into EEMeC, but into the way
VLEs are changing the environments and contexts in which they are used across the tertiary sector as
a whole.