
    Experience Evaluations for Human-Computer Co-Creative Processes: Planning and Conducting an Evaluation in Practice

    In human–computer co-creativity, humans and creative computational algorithms create together. Too often, only the creative algorithms and their outcomes are evaluated when studying these co-creative processes, paying little attention to the human participants. This paper presents a case study emphasising the human experiences when evaluating the use of a co-creative poetry writing system called the Poetry Machine. The co-creative process was evaluated using seven metrics: Fun, Enjoyment, Expressiveness, Outcome satisfaction, Collaboration, Ease of writing, and Ownership. The metrics were studied in a comparative setting using three co-creation processes: human–computer, human–human, and human–human–computer co-creation. Twelve pupils aged 10–11 took part in the study in six pairs, trying out all three writing processes. The study methods included observation in paired-user testing, questionnaires, and interviews. The observations were complemented with analyses of the video recordings of the evaluation sessions. According to the statistical analyses, Collaboration was strongest in human–human–computer co-creation and weakest in human–computer co-creation. Ownership showed the opposite pattern: weakest in human–human–computer co-creation and strongest in human–computer co-creation. The other metrics did not produce statistically significant results. In addition to the results, this paper presents the lessons learned from conducting the evaluations with children using the selected methods. Peer reviewed.
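    The abstract reports statistical comparisons of the metrics across the three writing conditions but does not state which test was used. Purely as an illustrative sketch, a within-subjects comparison of ordinal ratings such as the Collaboration scores could be run with a Friedman test; the ratings below are invented placeholders, not the study's data or its reported analysis.

```python
# Illustrative only: invented ratings and a generic non-parametric test,
# not the study's actual data or its reported analysis.
from scipy.stats import friedmanchisquare

# Hypothetical Collaboration ratings (1-5) from six pairs, one per writing condition.
human_computer       = [2, 3, 2, 3, 2, 3]
human_human          = [4, 3, 4, 4, 3, 4]
human_human_computer = [5, 4, 4, 5, 4, 5]

# Friedman test: non-parametric comparison of the same participants
# (here, pairs) across the three co-creation conditions.
stat, p = friedmanchisquare(human_computer, human_human, human_human_computer)
print(f"Friedman chi-square = {stat:.2f}, p = {p:.3f}")
```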

    A study of effective evaluation models and practices for technology supported physical learning spaces (JELS)

    The aim of the JELS project was to identify and review the tools, methods and frameworks used to evaluate technology-supported or technology-enhanced physical learning spaces. A key objective was to develop the sector knowledge base on innovation and emerging practice in the evaluation of learning spaces, identifying innovative methods and approaches beyond the traditional post-occupancy evaluations and surveys that have dominated this area to date. The intention was that the frameworks and guidelines discovered or developed in this study could inform all stages of implementing a technology-supported physical learning space. The study was primarily targeted at the UK higher education (HE) sector and, where appropriate, the further education (FE) sector, and ran from September 2008 to March 2009.

    A human factors methodology for real-time support applications

    A general approach to the human factors (HF) analysis of new or existing projects at NASA/Goddard is delineated. Because the methodology evolved from HF evaluations of the Mission Planning Terminal (MPT) and the Earth Radiation Budget Satellite Mission Operations Room (ERBS MOR), it is directed specifically to the HF analysis of real-time support applications. Major topics included for discussion are the process of establishing a working relationship between the Human Factors Group (HFG) and the project, orientation of HF analysts to the project, human factors analysis and review, and coordination with major cycles of system development. Sub-topics include specific areas for analysis and appropriate HF tools. Management support functions are outlined. References provide a guide to sources of further information.

    Self-Evaluation in Youth Media and Technology Programs: A Report to the Time Warner Foundation

    This 2003 report documents the self-evaluation practices, challenges, and concerns of the Time Warner Foundation's community grantees; reviews the resources available to youth media programs wishing to conduct program and outcome evaluations; and begins to identify useful directions for further exploration.

    Harmonised Principles for Public Participation in Quality Assurance of Integrated Water Resources Modelling

    The main purpose of public participation in integrated water resources modelling is to improve decision-making by ensuring that decisions are soundly based on shared knowledge, experience and scientific evidence. The present paper describes stakeholder involvement in the modelling process. The point of departure is the guidelines for quality assurance of 'scientific' water resources modelling developed under the EU research project HarmoniQuA, which has developed a computer-based Modelling Support Tool (MoST) to provide user-friendly guidance and a quality assurance framework aimed at enhancing the credibility of river basin modelling. MoST prescribes interaction, a form of participation above consultation but below engagement, with stakeholders and the public in the early phases of the modelling cycle and during review tasks throughout the process. MoST is a flexible tool which supports different types of users and facilitates interaction between modellers, managers and stakeholders. The perspective of using MoST for engagement of stakeholders, i.e. higher-level participation throughout the modelling process as part of integrated water resources management, is evaluated.

    Applying a User-centred Approach to Interactive Visualization Design

    Analysing users in their context of work and finding out how and why they use different information resources is essential to provide interactive visualisation systems that match their goals and needs. Designers should actively involve the intended users throughout the whole process. This chapter presents a user-centred approach for the design of interactive visualisation systems. We describe three phases of the iterative visualisation design process: the early envisioning phase, the global specification phase, and the detailed specification phase. The whole design cycle is repeated until some criterion of success is reached. We discuss different techniques for the analysis of users, their tasks and domain. Subsequently, the design of prototypes and evaluation methods in visualisation practice are presented. Finally, we discuss the practical challenges in the design and evaluation of collaborative visualisation environments. Our own case studies and those of others are used throughout the whole chapter to illustrate various approaches.
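    As a purely illustrative sketch (not taken from the chapter), the iterative cycle described above can be pictured as a loop over the three phases, repeated until a chosen success criterion is met; the function names and the numeric criterion below are assumptions, not the chapter's method.

```python
# Illustrative sketch of the iterative, user-centred design cycle described above.
# Phase names come from the chapter; the callables and success criterion are
# hypothetical placeholders.

PHASES = ["early envisioning", "global specification", "detailed specification"]

def design_cycle(build_prototype, evaluate_with_users, max_iterations=5, target_score=0.8):
    """Repeat the design cycle until user evaluations meet a success criterion."""
    prototype = None
    for _ in range(max_iterations):
        for phase in PHASES:
            prototype = build_prototype(phase, prototype)   # refine the design in each phase
        score = evaluate_with_users(prototype)              # e.g. task success rate in [0, 1]
        if score >= target_score:                           # stop once the criterion is reached
            break
    return prototype
```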

    Designing for a Repository of Virtual Crisis Management Tabletop Exercises – Lessons Learned from a Scandinavian R&D Project

    Crisis training exercises play a vital role in preparing local and regional governments for the management of crises and disasters. Unfortunately, conducting sufficient training is demanding, especially in small municipalities, due to constrained time and personnel resources as well as the complex planning and scheduling required by the dominant on-site training methods. Virtual training has been suggested as a resource-efficient and flexible complement. However, despite numerous specifications of digital technology for training, research on organisational implementation and usage is lacking, indicating low uptake. This article presents a cross-border R&D effort to facilitate the digitalisation of crisis management training by developing generic virtual tabletop exercises (VTTXs) to be shared via a repository and (re-)used in, and adapted for, diverse contexts. The purpose of this article is to identify essential aspects of designing and conducting VTTXs for collaborative crisis management training.

    A Method Impact Assessment Framework for User Experience Evaluations with Children

    Based upon a review of the literature, this paper presents a Method Impact Assessment Framework. Theoretically synthesized, the framework offers five dimensions: (1) the role of the child, (2) the user experience construct, (3) the system, (4) the epistemological perspective, and (5) practical and ethical concerns. Although other dimensions could have been construed, these were judged to be the most pertinent to understanding evaluation methods with children. The framework thus provides a critical lens through which evaluation methods can be assessed by the Child–Computer Interaction (CCI) community to inform method selection.
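    As a hedged illustration of how the five dimensions might be recorded when comparing candidate evaluation methods, the sketch below captures them as fields of a simple data structure; the field names, the example method and its characterisation are assumptions for illustration, not the framework's actual specification.

```python
# Illustrative sketch: field names are assumptions derived from the abstract's
# five dimensions, not the framework's published specification.
from dataclasses import dataclass

@dataclass
class MethodAssessment:
    method_name: str
    role_of_child: str                 # e.g. "tester", "informant", "design partner"
    ux_construct: str                  # which user experience construct the method targets
    system: str                        # kind of system or prototype the method suits
    epistemological_perspective: str   # e.g. "positivist", "interpretivist"
    practical_ethical_concerns: str    # notes on consent, burden, setting

# Hypothetical entry for one well-known child UX instrument, characterised loosely.
example = MethodAssessment(
    method_name="Smileyometer",
    role_of_child="tester",
    ux_construct="fun",
    system="interactive prototype",
    epistemological_perspective="positivist",
    practical_ethical_concerns="quick to administer; possible ceiling effects",
)
print(example)
```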

    Chapter 5: Evaluation

    The OTiS (Online Teaching in Scotland) programme, run under the now defunct Scotcit programme, held an International e-Workshop on Developing Online Tutoring Skills on 8–12 May 2000. It was organised by Heriot–Watt University, Edinburgh, and The Robert Gordon University, Aberdeen, UK. Out of this workshop came the seminal Online Tutoring E-Book, a generic primer on e-learning pedagogy and methodology, full of practical implementation guidelines. Although the Scotcit programme ended some years ago, the E-Book has been copied to the SONET site as a series of PDF files, which are now available via the ALT Open Access Repository. The editor, Carol Higgison, is currently working in e-learning at the University of Bradford (see her staff profile) and is the Chair of the Association for Learning Technology (ALT).