3 research outputs found

    Online Collaborative Learning in a Project-Based Learning Environment in Taiwan: A Case Study on Undergraduate Students’ Perspectives

    This case study investigated undergraduate students' first experience of online collaborative learning in a project-based learning (PBL) environment in Taiwan. Data were collected through interviews with 48 students, the instructor's field notes, researchers' online observations, students' online discourse, and group artifacts. The findings revealed interesting phenomena resulting from cultural influences as well as the impact of the educational system. Students experienced firsthand the various learning benefits of PBL during the intensive six-week period, yet voiced serious concerns about the changed role of the instructor, as well as strong reservations about peer collaboration stemming from the competitive tradition in education. Clearly, online collaborative learning and PBL critically challenged some culturally rooted traditions in Taiwan. The study generates practical insights into the application of online collaborative learning and PBL in Taiwan's higher education, as well as implications for the cross-cultural implementation of online learning.

    Design teamwork in distributed intercultural teams : competition, collaboration, cooperation


    Enhancing Questionnaire Design Through Participant Engagement to Improve the Outputs of Evaluation.

    Questionnaires are habitual choices for many user experience evaluators, providing a well-recognised and accepted, fast, and cost-effective method of collecting and analysing data. However, despite their frequent and widespread use in evaluation, reliance on questionnaires can be problematic. Satisficing, acquiescence bias, and straightlining are common response biases associated with questionnaires, typically resulting in suboptimal responses and poor-quality data. These problems can stem from a lack of engagement with evaluation tasks, yet little previous research has attempted to alleviate these limitations by making questionnaires more fun or enjoyable in order to enhance participant engagement. This research seeks to address whether user evaluation questionnaires can be designed to be engaging so as to improve optimal responding. The aim of this research is to investigate whether response quality can be improved through enhanced questionnaire design, both to reduce common response biases and to maintain participant engagement. The evaluation context for this study was provided by MIXER, an interactive, narrative-based application for intercultural sensitivity learning, used and evaluated by 9–11-year-old children in the classroom context. A series of Participatory Design studies with children investigated engagement and optimal responding with questionnaires. These initial studies informed the design of a series of questionnaires created in the form of three workbooks that were used to evaluate MIXER with over 400 children. A mixed-methods approach was used to evaluate the questionnaires. Results demonstrate that making questionnaire completion more enjoyable improves data quality: response biases are reduced, quantitative data are more complete, and qualitative responses are more verbose and meaningful compared with standard questionnaires.
Further, children reported that completing the questionnaires was a fun and enjoyable activity that they would wish to repeat in the future. As a discipline in its own right, evaluation is under-investigated. Similarly, user evaluation is itself rarely evaluated, with few papers considering this issue in this millennium. Thus, this research provides a significant contribution to the field of evaluation, highlighting that the outputs of user evaluation with questionnaires are improved when participant engagement informs questionnaire design. The result is a more positive evaluation experience for participants and, in return, a higher standard of data provision for evaluators and R&D teams.