8,760 research outputs found
An HCI-principles-based framework to assess the user perception of web-based Virtual Research Environments. Special issue on Capacity building for post disaster infrastructure development and management
Due to various challenges and opportunities, such as the globalisation of research agendas and advances in information and communication technologies, research collaborations (both international and national) have become more popular during the last decade than ever before. Within this context, the Virtual Research Environment (VRE) is an emerging concept aimed at addressing the complex challenges associated with conducting collaborative research. Even though the VRE concept is in its infancy, it is important to assess user perception of VREs, both to establish the success of their uptake and to inform future development strategies. However, to date, no formal method has been established to evaluate VREs. This paper reports a strategy adopted within an international collaborative research project (EURASIA) to evaluate its custom-built VRE, VEBER, using well-known Human-Computer Interaction principles
Evaluating complex digital resources
Squires (1999) discussed the gap between the HCI (Human-Computer Interaction) and educational computing communities in their very different approaches to evaluating educational software. This paper revisits that issue in the context of evaluating digital resources, focusing on two approaches to evaluation: an HCI perspective and an educational perspective. Squires and Preece's HCI evaluation model is a predictive model: it helps teachers decide whether or not to use educational software, whilst our own concern is with evaluating the use of learning technologies. It is suggested that in part the different approaches of the two communities relate to the different focus that each takes: in HCI the focus is typically on development and hence usability, whilst in education the concern is with learner and teacher use
Learner-Centred Design and Evaluation of Web-Based E-Learning Environments
Designing E-learning is a combination of pedagogical design, usability and information architecture. E-learning environments should have intuitive interfaces and clear information design, allowing learners to focus on learning. However, there is often a mismatch between what an on-line educator thinks the learner will learn, and what the learner expects to learn and then actually learns from the course. In addition, there is sometimes a mismatch between how an educator wants to teach and what the instructional designers represent on the interface. Such mismatches affect the learner's experience and motivation for E-learning. In this paper, we first discuss the source and nature of these mismatches. Next, we discuss whether usability techniques in the HCI literature are appropriate for evaluating the learner experience of E-learning environments. We then propose a combination of requirements elicitation and usability techniques for the learner-centred design and evaluation of Web-based E-learning environments. The proposed methodology is based on our experience of conducting empirical studies evaluating user-system interactions in E-Commerce contexts
A Comparison of Quantitative and Qualitative Data from a Formative Usability Evaluation of an Augmented Reality Learning Scenario
The proliferation of augmented reality (AR) technologies creates opportunities for the development of new learning scenarios. More recently, advances in the design and implementation of desktop AR systems have made it possible to deploy such scenarios in primary and secondary schools. Usability evaluation is a precondition for the pedagogical effectiveness of these new technologies and requires a systematic approach to finding and fixing usability problems. In this paper we present an approach to formative usability evaluation based on heuristic evaluation and user testing. The basic idea is to compare and integrate quantitative and qualitative measures in order to increase confidence in results and enhance the descriptive power of the usability evaluation report. Keywords: augmented reality, multimodal interaction, e-learning, formative usability evaluation, user testing, heuristic evaluation
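The integration of heuristic evaluation and user testing described in this abstract can be illustrated with a small sketch. All data, problem identifiers, and thresholds below are illustrative assumptions, not taken from the paper itself:

```python
# Hypothetical sketch: merging quantitative severity ratings from a
# heuristic evaluation with corroborating observations from user testing.
# Problem ids, ratings, and counts are invented for illustration.

# Heuristic evaluation: problem id -> expert severity ratings (0-4 scale)
heuristic = {
    "H1-feedback": [3, 4, 3],
    "H2-consistency": [2, 2, 1],
}

# User testing: problem id -> number of participants who encountered it
user_testing = {
    "H1-feedback": 7,
    "H3-error-recovery": 4,
}

participants = 10

def merged_report(heuristic, user_testing, n):
    """Integrate both sources: mean expert severity (quantitative)
    plus observed frequency in testing (corroboration)."""
    problems = sorted(set(heuristic) | set(user_testing))
    report = []
    for p in problems:
        sevs = heuristic.get(p, [])
        mean_sev = sum(sevs) / len(sevs) if sevs else None
        freq = user_testing.get(p, 0) / n
        # A problem found by experts AND observed in testing carries
        # more confidence than one found by either method alone.
        corroborated = mean_sev is not None and freq > 0
        report.append({"problem": p, "mean_severity": mean_sev,
                       "frequency": freq, "corroborated": corroborated})
    return report

for row in merged_report(heuristic, user_testing, participants):
    print(row)
```

The point of the sketch is the set union: each method surfaces problems the other misses, and the merged report flags which findings are corroborated by both.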
The evolution of a cooperative work framework for e-Learning
This paper details the evolution of a Framework for e-Learning into a Cooperative Work Framework for e-Learning, as presented at the IASK conference (Graham 2008a) and annotated accordingly. It begins by discussing the development of the original Framework for e-Learning, and how this study led to a further study investigating whether the use of Blended Learning could fulfill, or at least accommodate, some of the human requirements presently neglected by current e-Learning systems, as identified by the original Framework. This second study evaluated an in-house system, Teachmat, and discussed how the use of Blended Learning had become increasingly prevalent as a result of the enhancement and expansion of Teachmat. It looked at the employment of Blended Learning and Teachmat's relationship to human and pedagogical issues, as well as both the positive and negative implications of this reality. PESTE factors from Sociology were then applied to appraise the adoption of e-Learning, leading to the proposal of PESTE factors for educational software and e-Learning in particular. Finally, the study evolved to reconsider e-Learning in relation to a Cooperative Work Framework, revealing a critical weakness in the fundamental nature of e-Learning and its consequent propensity for failure
Mixed-methods research: a new approach to evaluating the motivation and satisfaction of university students using advanced visual technologies
The final publication is available at link.springer.com. A mixed-methods study evaluating the motivation and satisfaction of Architecture degree students using interactive visualization methods is presented in this paper. New technology implementations in the teaching field have been widely extended to all educational levels and frameworks. However, these innovations require validation and evaluation by the final users, the students. In this paper, the advantages and disadvantages of applying a mixed evaluation methodology are discussed in a case study of the use of interactive and collaborative tools for the visualization of 3D architectural models. The main objective was to evaluate Architecture and Building Science students' motivation to use, and satisfaction with, this type of technology, and to obtain adequate feedback that allows for the optimization of this type of experiment in future iterations. Postprint (author's final draft)
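The quantitative side of a mixed-methods evaluation like the one above typically reduces to summarising Likert-scale responses, which are then paired with qualitative comments. A minimal sketch, in which the item names and responses are illustrative assumptions rather than data from the study:

```python
# Hypothetical sketch: summarising 5-point Likert responses on
# motivation and satisfaction for the quantitative strand of a
# mixed-methods evaluation. Items and values are invented.
from statistics import mean, stdev

responses = {
    "motivation_to_use": [5, 4, 4, 3, 5, 4],
    "satisfaction": [4, 4, 5, 3, 4, 4],
}

def summarise(items):
    """Mean, sample standard deviation, and n per Likert item."""
    return {name: {"mean": round(mean(vals), 2),
                   "sd": round(stdev(vals), 2),
                   "n": len(vals)}
            for name, vals in items.items()}

print(summarise(responses))
```

In a mixed-methods design, these descriptive statistics would then be triangulated against interview or open-comment data rather than interpreted in isolation.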
deep|think: A Second Life environment for part-time research students at a distance
This paper reports on the design of a Second Life campus for an innovative new postgraduate research programme at the Open University, UK, a world leader in supported distance higher education. The programme, launched in October 2009, is a part-time Master of Philosophy (MPhil) to be delivered at a distance, supported by a blend of synchronous and asynchronous Internet technologies. This paper briefly discusses the pedagogical thinking behind the Second Life campus, and the way the implementation was designed to meet the pedagogy. The paper also reports on the outcome of an early evaluation we have conducted
- …