Serious Game Evaluation as a Meta-game
Purpose – This paper aims to briefly outline the seamless evaluation approach and its application during an evaluation of ORIENT, a serious game aimed at young adults.
Design/methodology/approach – In this paper, the authors detail an unobtrusive, embedded evaluation approach that occurs within the game context, adding value and entertainment to the player experience whilst accumulating useful data for the development team.
Findings – The key result from this study was that during the "seamless evaluation" approach, users were unaware that they had been participating in an evaluation, with instruments enhancing rather than detracting from the in-role game experience.
Practical implications – This approach, seamless evaluation, was devised in response to player expectations, perspectives and requirements, recognising that in the evaluation of games the whole process of interaction, including its evaluation, must be enjoyable and fun for the user.
Originality/value – Through using seamless evaluation, the authors created an evaluation completely embedded within the "magic circle" of an in-game experience that added value to the user experience whilst also yielding relevant results for the development team.
Werewolves, cheats, and cultural sensitivity
This paper discusses the design and evaluation of the system MIXER (Moderating Interactions for Cross-Cultural Empathic Relationships), which applies a novel approach to the education of children in cultural sensitivity. MIXER incorporates intelligent affective and interactive characters, including a model of a Theory of Mind mechanism, in a simulated virtual world. We discuss the relevant pedagogical approaches, related work, the underlying mind model used for MIXER agents, as well as its innovative interaction interface utilising a tablet computer and a pictorial interaction language. We then consider the evaluation of the system, whether this shows it met its pedagogical objectives, and what can be learned from our results.
Enhancing Questionnaire Design Through Participant Engagement to Improve the Outputs of Evaluation.
Questionnaires are habitual choices for many user experience evaluators,
providing a well-recognised and accepted, fast and cost effective method of
collecting and analysing data. However, despite frequent and widespread use
in evaluation, reliance on questionnaires can be problematic. Satisficing,
acquiescence bias and straight lining are common response biases
associated with questionnaires, typically resulting in suboptimal responses
and provision of poor quality data. These problems can relate to a lack of
engagement with evaluation tasks, yet there is a lack of previous research
that has attempted to alleviate these limitations by making questionnaires
more fun or enjoyable to enhance participant engagement.
This research seeks to address whether user evaluation questionnaires can
be designed to be engaging in order to improve optimal responding. The aim of this
research is to investigate if response quality can be improved through
enhancing questionnaire design both to reduce common response biases and
to maintain participant engagement. The evaluation context for this study was
provided by MIXER, an interactive, narrative-based application for intercultural
sensitivity learning, used and evaluated by 9-11 year old children in the
classroom context.
A series of Participatory Design studies with children investigated
engagement and optimal responding with questionnaires. These initial studies
informed the design of a series of questionnaires created in the form of three
workbooks that were used to evaluate MIXER with over 400 children.
A mixed-methods approach was used to evaluate the questionnaires. Results
demonstrate that making questionnaire completion more enjoyable improves data
quality: response biases are reduced, quantitative data are more
complete, and qualitative responses are more verbose and meaningful
compared to standard questionnaires. Further, children reported that
completing the questionnaires was a fun and enjoyable activity that they
would wish to repeat in the future.
As a discipline in its own right, evaluation is under-investigated. Similarly,
user evaluation is itself rarely evaluated, with few papers considering this
issue in this millennium. Thus, this research provides a significant contribution to the field
of evaluation, highlighting that the outputs of user evaluation with
questionnaires are improved when participant engagement informs
questionnaire design. The result is a more positive evaluation experience for
participants and, in return, a higher standard of data provision for evaluators
and R&D teams.