17 research outputs found

    Towards a Time Series Approach for the Classification and Evaluation of Collaborative Activities

    Get PDF
    The analysis and evaluation of computer-supported collaborative activities is a complex and tedious task. However, it is necessary in order to support collaborative scenarios, to scaffold collaborative knowledge building and to evaluate the learning outcome. Various automated techniques have been proposed to minimize the workload of human evaluators and speed up the process. In this study, we propose a memory-based learning model for the analysis, classification and evaluation of collaborative activities that makes use of time series techniques along with logfile analysis. We argue that the classification of collaborative sessions, with respect to their time series attributes, may be related to their qualitative aspects. Based on this rationale, we explore the use of the model under various settings. The results of the model are compared to assessments made by expert evaluators using a rating scheme, and correlation and error analyses are further conducted.
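    As an illustration of the memory-based (instance-based) idea described above, the sketch below labels a new collaborative session by comparing its logfile-derived time series against stored, already-rated sessions and taking a nearest-neighbour vote. The events-per-minute representation, the DTW distance, the labels and the example data are assumptions made for this sketch; the abstract does not disclose the model's exact features or distance measure.

    # Minimal sketch of memory-based classification of collaborative sessions
    # represented as time series. Features, distance and labels are illustrative
    # assumptions, not the paper's actual configuration.
    import numpy as np

    def dtw_distance(a: np.ndarray, b: np.ndarray) -> float:
        """Classic dynamic-time-warping distance between two 1-D series."""
        n, m = len(a), len(b)
        cost = np.full((n + 1, m + 1), np.inf)
        cost[0, 0] = 0.0
        for i in range(1, n + 1):
            for j in range(1, m + 1):
                d = abs(a[i - 1] - b[j - 1])
                cost[i, j] = d + min(cost[i - 1, j],      # insertion
                                     cost[i, j - 1],      # deletion
                                     cost[i - 1, j - 1])  # match
        return float(cost[n, m])

    def knn_classify(query: np.ndarray,
                     memory: list[tuple[np.ndarray, str]],
                     k: int = 3) -> str:
        """Label a session by majority vote of its k nearest stored sessions."""
        ranked = sorted(memory, key=lambda item: dtw_distance(query, item[0]))
        votes = [label for _, label in ranked[:k]]
        return max(set(votes), key=votes.count)

    # Hypothetical logfile-derived series: collaborative events per minute.
    memory = [
        (np.array([5, 8, 7, 6, 9], dtype=float), "balanced"),
        (np.array([1, 0, 2, 1, 1], dtype=float), "low-activity"),
        (np.array([9, 1, 0, 8, 0], dtype=float), "unbalanced"),
    ]
    print(knn_classify(np.array([4, 7, 6, 5, 8], dtype=float), memory, k=1))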

    Delving into instructor‐led feedback interventions informed by learning analytics in massive open online courses

    Get PDF
    Background: Providing feedback in massive open online courses (MOOCs) is challenging due to the massiveness and heterogeneity of the learner population. Learning analytics (LA) solutions aim at scaling up feedback interventions and supporting instructors in this endeavour. Objectives: This paper focuses on instructor-led feedback mediated by LA tools in MOOCs. Our goal is to answer how and to what extent data-driven feedback is provided to learners, and what its impact is. Methods: We conducted a systematic literature review on the state-of-the-art LA-informed instructor-led feedback in MOOCs. From a pool of 227 publications, we selected 38 articles that address the topic of LA-informed feedback in MOOCs mediated by instructors. We applied etic content analysis to the collected data. Results and Conclusions: The results revealed a lack of empirical studies exploring LA to deliver feedback and limited attention to pedagogy to inform feedback practices. Our findings suggest the need for systematization and evaluation of feedback. Additionally, there is a need for conceptual tools to guide instructors in the design of LA-based feedback. Takeaways: We point out the need for systematization and evaluation of feedback. We envision that this research can support the design of LA-based feedback, thus contributing to bridging the gap between pedagogy and data-driven practice in MOOCs. Funding: Estonian Research Council (PSG286); Ministerio de Ciencia e Innovación / Fondo Europeo de Desarrollo Regional and Agencia Nacional de Investigación (grants PID2020-112584RB-C32 and TIN2017-85179-C3-2-R); Junta de Castilla y León / Fondo Social Europeo and the Regional Education Council (grant E-47-2018-0108488).

    Building Arguments Together or Alone? Using Learning Analytics to Study the Collaborative Construction of Argument Diagrams

    Get PDF
    Research has shown that the construction of visual representations may have a positive effect on cognitive skills, including argumentation. In this paper we present a study on learning argumentation through computer-supported argument diagramming. We specifically focus on whether students, when provided with an argument-diagramming tool, create better diagrams, are more motivated, and learn more when working with other students or on their own. We use learning analytics to evaluate a variety of student activities: pre- and post-questionnaires to explore motivational changes; the argument diagrams created by students to evaluate richness, complexity and completion; and pre- and post-knowledge tests to evaluate learning gains.
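    For the pre- and post-knowledge tests mentioned above, one common way to express learning gains is Hake's normalized gain; the minimal sketch below computes it for hypothetical scores. The abstract does not state which gain metric the study actually reports, so both the metric and the numbers are assumptions for illustration.

    # Hake's normalized gain: g = (post - pre) / (max - pre).
    # Illustrative only; the study's actual gain metric is not specified here.
    def normalized_gain(pre: float, post: float, max_score: float) -> float:
        if max_score == pre:
            raise ValueError("Pre-test already at ceiling; gain undefined.")
        return (post - pre) / (max_score - pre)

    # Hypothetical scores for a pair condition vs. an individual condition.
    print(normalized_gain(pre=12, post=18, max_score=25))  # ~0.46
    print(normalized_gain(pre=11, post=14, max_score=25))  # ~0.21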

    Exploring Deviation in Inquiry Learning: Degrees of Freedom or Source of Problems?

    Get PDF
    The European Go-Lab project aims to promote Inquiry-based Learning (IBL) with online laboratories. To support teachers and students in this endeavor, the project provides an IBL model (a sequence of inquiry phases) as well as the technological infrastructure to implement it: the Graasp platform and the Golabz repository. Using these technologies, teachers create Inquiry Learning Spaces (ILSs) in which they adapt the proposed IBL model to their needs and enrich each of its phases with online resources, apps or labs to build a web-based learning environment and distribute it to the students. The aim of this paper is to reflect on the deviations of the teachers' designs from the model proposed within the project, and on the way students adapt these designs in turn. For that purpose, we analyzed the 102 most frequently used ILSs with respect to the perspectives of teachers and students. The results show deviations of the authored spaces from the pedagogical model of inquiry learning as well as deviations of the actual learning process models from the teachers' specifications. Additionally, the analysis points out best practices for learning design, particularly the inclusion of resources and apps into the spaces.
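    One simple way to quantify such deviations, sketched below, is to compare the phase sequence a teacher designed with the phase sequence a student actually traversed using Levenshtein edit distance. The phase names follow the Go-Lab inquiry cycle, but the distance measure and the example sequences are assumptions made for this sketch, not the analysis reported in the paper.

    # Edit distance between a designed inquiry phase sequence and an observed
    # one: each insertion, deletion or substitution counts as one deviation.
    def edit_distance(designed: list[str], observed: list[str]) -> int:
        n, m = len(designed), len(observed)
        dp = [[0] * (m + 1) for _ in range(n + 1)]
        for i in range(n + 1):
            dp[i][0] = i
        for j in range(m + 1):
            dp[0][j] = j
        for i in range(1, n + 1):
            for j in range(1, m + 1):
                sub = 0 if designed[i - 1] == observed[j - 1] else 1
                dp[i][j] = min(dp[i - 1][j] + 1,        # skipped designed phase
                               dp[i][j - 1] + 1,        # extra observed phase
                               dp[i - 1][j - 1] + sub)  # (mis)matched phase
        return dp[n][m]

    # Hypothetical designed vs. observed phase sequences.
    designed = ["Orientation", "Conceptualisation", "Investigation", "Conclusion", "Discussion"]
    observed = ["Orientation", "Investigation", "Conceptualisation", "Investigation", "Conclusion"]
    print(edit_distance(designed, observed))  # number of edits separating the two paths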

    augMENTOR-D2.1-RRI Survey Data M6

    Full text link
    This dataset contains the responses to the RRI survey that took place in M6 of the augMENTOR project. The data were collected with the participants' informed consent; no personal or sensitive data were collected. More information about the survey, its purpose and its results can be found in deliverable D2.1 of the project.

    Correction to: “From Making to Learning”: introducing Dev Camps as an educational paradigm for Re-inventing Project-based Learning

    Full text link
    Correction: After the publication of this work [1] an error was noticed in the title of the article. The title in the original article reads: “From Making to Learning”: introducing Dev Camps as an educational paradigm for Re-inventing Problem-based Learning. However, the correct article title should read: “From Making to Learning”: introducing Dev camps as an educational paradigm for Re-inventing Project-based Learning. This has been corrected here.

    Global Trends in Scientific Debates on Trustworthy and Ethical Artificial Intelligence and Education

    Full text link
    This paper presents a systematic review of the scientific literature on trustworthy and ethical Artificial Intelligence (AI) and Education (AI&ED), including both AI applied in education to support teaching and learning (AIED) and education about AI (AI literacy). The key interest is the identification of global trends, with a special focus on unbalanced disparities. Strictly following the standardised protocol and the underlying PRISMA approach, 324 records were identified and selected according to the pre-defined protocol for the systematic review. Finally, 62 articles were included in the quantitative and qualitative analysis in response to four research questions: which (i) journals, (ii) disciplines, and (iii) regions are leading scientific debates and sustainable developments in education and trustworthy/ethical AI, and (iv) what are the past trends? The articles revealed an unbalanced distribution across the various dimensions, together with exponential growth over recent years. Building upon our analysis, we argue for an increase in interdisciplinary research that shifts attention from the currently dominant technological focus towards a more human-centered (educational and societal) one. Only through such a development can AI contribute effectively to the UN Sustainable Development Goal no. 4 of a world with equitable and universal access to quality education. The results of our systematic review provide a basis for addressing and facilitating equality in future AI&ED progress across regions worldwide.

    “From Making to Learning”: introducing Dev Camps as an educational paradigm for Re-inventing Problem-based Learning

    Full text link
    Dev Camps are events that enable participants to tackle challenges using software tools and different kinds of hardware devices in collaborative project-style activities. The participants conceptualize and develop their solutions in a self-directed way, involving technical, organizational and social skills. In this sense, they are autonomous producers or “makers”. The Dev Camp activity format resonates with skills such as communication, critical thinking, creativity, decision-making and planning, and can be considered a bridge between education and industry. In this paper we present and analyse our experience from a series of such events that were co-organized between an industrial partner acting as a host and several university partners. We take this as an indication to envision new opportunities for project-based learning in more formal educational scenarios.