
    Contextual Inquiry Reflection Tools

    Contextual Inquiry Reflection Tools, Deliverable 5.3, weSPOT Project, IST (FP7/2007-2013) under grant agreement N° 318499.

    Analyzing the behavior of students regarding learning activities, badges, and academic dishonesty in MOOC environment

    International Mention in the doctoral degree.
    The ‘big data’ scene has brought new improvement opportunities to most products and services, including education. Web-based learning has become very widespread over the last decade, which, in conjunction with the Massive Open Online Course (MOOC) phenomenon, has enabled the collection of large and rich data samples on the interaction of students with these online learning environments. We have detected several areas in the literature that still need improvement and further research, particularly in the context of MOOCs and Small Private Online Courses (SPOCs), where we focus our data analysis on the platforms Khan Academy, Open edX and Coursera. More specifically, we work on learning analytics visualization dashboards, carrying out an evaluation of these visual analytics tools. Additionally, we delve into the activity and behavior of students with regular and optional activities, badges, and academically dishonest conduct online. The analysis of student activity and behavior is divided into, first, an exploratory analysis providing descriptive and inferential statistics, such as correlations and group comparisons, as well as numerous visualizations that convey understandable information. Second, we apply clustering analysis to find different student profiles for different purposes, e.g., to analyze the potential adaptation of learning experiences and its pedagogical implications. Third, we provide three machine learning models, two of them to predict learning outcomes (learning gains and certificate accomplishment) and one to classify submissions as illicit or not. We also use these models to discuss the importance of variables.
    Finally, we discuss our results in terms of student motivation, student profiling, instructional design, potential actuators, and the evaluation of visual analytics dashboards, providing recommendations to improve future educational experiments.
    Official Doctoral Programme in Telematics Engineering. Examination committee: Davinia Hernández Leo (President), Luis Sánchez Fernández (Secretary), Adolfo Ruiz Callej (Member)
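The thesis abstract above does not include code; as an illustrative sketch only, the clustering step it describes (grouping students into profiles from their activity) can be approximated with a minimal k-means over per-student features. The feature names and data below are hypothetical, not taken from the thesis.

```python
import random

def kmeans(points, k, iters=20, seed=0):
    """Minimal k-means: group students into k activity profiles."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    for _ in range(iters):
        # assign each student to the nearest centroid
        clusters = [[] for _ in range(k)]
        for p in points:
            i = min(range(k),
                    key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centroids[c])))
            clusters[i].append(p)
        # recompute each centroid as the mean of its cluster
        for i, cl in enumerate(clusters):
            if cl:
                centroids[i] = tuple(sum(xs) / len(cl) for xs in zip(*cl))
    return centroids, clusters

# hypothetical features per student: (fraction of videos watched, fraction of exercises solved)
students = [(0.9, 0.8), (0.85, 0.9), (0.1, 0.05), (0.2, 0.1), (0.5, 0.45)]
centroids, clusters = kmeans(students, k=2)
```

In practice such profiling would use far richer features (attempts, badges earned, time on task) and a library implementation with principled selection of the number of clusters.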

    Evaluation of a Learning Analytics Application for Open edX Platform

    Massive open online courses (MOOCs) have recently emerged as a revolution in education. Due to the huge number of users, it is difficult for teachers to provide personalized instruction. Learning analytics computer applications have emerged as a solution. At present, MOOC platforms provide little support for learning analytics visualizations, and a challenge is to provide useful and effective visualization applications about the learning process. In this paper we review the learning analytics functionality of Open edX and give an overview of our learning analytics application, ANALYSE. We present a usability and effectiveness evaluation of the ANALYSE tool with 40 students taking a Design of Telematics Applications course. The evaluation obtained very positive results on a System Usability Scale (SUS) questionnaire (78.44/100), on the perceived usefulness of the visualizations (3.68/5), and on the effectiveness ratio (92/100) of the actions requested of the respondents. Therefore, we can conclude that the implemented learning analytics application is usable and effective. Acknowledgements: This work has been supported by the "eMadrid" project (Regional Government of Madrid) under grant S2013/ICE-2715, the "RESET" project (Ministry of Economy and Competitiveness) under grant RESET TIN2014-53199-C3-1-R, and the European Erasmus+ SHEILA project under grant 562080-EPP-1-2015-BE-EPPKA3-PI-FORWARD.
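The SUS figure reported above (78.44/100) comes from the standard System Usability Scale scoring rule: ten Likert items on a 1-5 scale, odd-numbered (positively worded) items contribute score minus 1, even-numbered (negatively worded) items contribute 5 minus score, and the sum is scaled by 2.5. A sketch of that rule, with hypothetical responses:

```python
def sus_score(responses):
    """Compute the System Usability Scale score (0-100) for one respondent.

    Odd-numbered items contribute (score - 1); even-numbered items
    contribute (5 - score); the total is multiplied by 2.5.
    """
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("expected ten answers on a 1-5 scale")
    total = sum(r - 1 if i % 2 == 0 else 5 - r
                for i, r in enumerate(responses))
    return total * 2.5

# a respondent agreeing with every positive item and disagreeing with every negative one
print(sus_score([5, 1, 5, 1, 5, 1, 5, 1, 5, 1]))  # 100.0
```

The reported course-level value would be the mean of these per-respondent scores.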

    Expectation-Centered Analytics for Instructors and Students

    Learning analytics is the measurement, collection, analysis, and reporting of data about learners and their contexts. An outcome and primary goal of learning analytics should be to inform instructors, who are primary stakeholders, so that they can make effective decisions in their courses. To support instructor inquiry, I apply theory on reflective practice to learning analytic development. Articulating an instructor’s pedagogical expectations is one way to begin facilitating a reflective practice. Expectations based on instructor goals serve as a natural next step and the springboard from which data can be collected. I hypothesize that a learning analytic that encodes and reifies instructors’ individual expectations will better support reflective practice for instructors and allow students to more reliably meet set expectations. I took a user-centered approach to learning analytic research and development. First, I triangulated empirical analysis of analytic use with focus groups to understand how instructors interacted with analytics. Instructors had a wide range of behaviors, needs, and expectations. For most instructors, analytics were used very briefly (less than one minute). Instructors also requested a way to aggregate data from different analytics to better support their information needs. Based on these findings, I developed learning analytics within TrACE that allow instructors to specify expectations and see student progress related to those expectations. Students could also view their progress towards completing expectations. Finally, I conducted a field study to compare both instructor analytic use and student compliance with expectations without and with the presence of these analytics. The results of the field study did not support the hypothesis. Instructors for the most part did not change their behaviors with the introduction of these analytics.
    Students also did not meet expectations more reliably, but one course saw a significant improvement in performance. Without visible expectations, students met significantly fewer posting expectations than other expectations. With explicit expectations, posting performance was no longer significantly lower.
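The abstract does not publish the TrACE analytics code. As a minimal sketch, assuming an expectation is encoded as a minimum count of some logged action (e.g. "post at least three times"), checking a student's activity against instructor-set expectations could look like this; all names and data are hypothetical.

```python
from collections import Counter

def unmet_expectations(events, expectations):
    """Compare logged student actions against instructor-set expectations.

    `events` is a list of action names (e.g. "post", "annotate");
    `expectations` maps an action to the minimum count the instructor set.
    Returns, for each action falling short, the (actual, required) pair,
    so both instructor and student can see progress toward expectations.
    """
    counts = Counter(events)
    return {action: (counts[action], minimum)
            for action, minimum in expectations.items()
            if counts[action] < minimum}

# hypothetical weekly activity log for one student
log = ["post", "annotate", "post", "view"]
rules = {"post": 3, "annotate": 1}
print(unmet_expectations(log, rules))  # {'post': (2, 3)}
```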

    Game-Based Learning, Gamification in Education and Serious Games

    The aim of this book is to present and discuss new advances in serious games and to show how they can enhance the effectiveness and outreach of education, advertising, social awareness, health, policies, and more. We present their use in structured learning activities, with a focus not only on game-based learning but also on the use of game elements and game design techniques to gamify the learning process. The published contributions demonstrate the wide scope of application of game-based approaches in terms of purpose, target groups, technologies, and domains; one aspect they have in common is that they provide evidence of how effective serious games, game-based learning, and gamification can be.

    Quantified Self Analytics Tools for Self-regulated Learning with myPAL

    One of the major challenges in higher education is developing self-regulation skills for lifelong learning. We address this challenge within the myPAL project, in the context of medical education, utilising the vast amount of student assessment and feedback data collected throughout the programme. The underlying principle of myPAL is Quantified Self: the use of personal data to enable students to become lifelong learners. myPAL facilitates this with learning analytics combined with interactive nudges. This paper reviews the state of the art in Quantified Self analytics tools to identify which approaches can be adopted in myPAL and which gaps require further research. The paper contributes to awareness and reflection in technology-enhanced learning by: (i) identifying requirements for intelligent personal adaptive learning systems that foster self-regulation (using myPAL as an example); (ii) analysing the state of the art in text analytics and visualisation related to Quantified Self for self-regulated learning; and (iii) identifying open issues and suggesting possible ways to address them.

    The Big Five: Addressing Recurrent Multimodal Learning Data Challenges

    The analysis of multimodal data in learning is a growing field of research, which has led to the development of different analytics solutions. However, there is no standardised approach to handling multimodal data. In this paper, we describe and outline solutions for five recurrent challenges in the analysis of multimodal data: data collection, storage, annotation, processing, and exploitation. For each of these challenges, we envision possible solutions. The prototypes for some of the proposed solutions will be discussed during the Multimodal Challenge of the fourth Learning Analytics & Knowledge Hackathon, a two-day hands-on workshop in which the authors will open up the prototypes for trials, validation, and feedback.
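As a sketch of the processing challenge only (not the authors' prototype), a common first step with raw multimodal recordings is to segment each sensor stream into fixed-size windows and extract simple statistical features per window; the signal below is invented for illustration.

```python
import math

def window_features(stream, size):
    """Segment a 1-D sensor stream into non-overlapping fixed windows and
    extract (mean, standard deviation) per window, a typical starting
    point before applying data mining or machine learning techniques."""
    features = []
    for start in range(0, len(stream) - size + 1, size):
        window = stream[start:start + size]
        mean = sum(window) / size
        var = sum((x - mean) ** 2 for x in window) / size
        features.append((mean, math.sqrt(var)))
    return features

# hypothetical accelerometer magnitudes sampled over time
signal = [0.0, 0.0, 1.0, 1.0, 2.0, 2.0]
print(window_features(signal, size=2))  # [(0.0, 0.0), (1.0, 0.0), (2.0, 0.0)]
```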

    Multimodal Challenge: Analytics Beyond User-computer Interaction Data

    This contribution describes one of the challenges explored in the Fourth LAK Hackathon. This challenge aims to shift the focus from learning situations that can be easily traced through user-computer interaction data and to concentrate more on user-world interaction events, typical of co-located and practice-based learning experiences. This mission, pursued by the multimodal learning analytics (MMLA) community, seeks to bridge the gap between digital and physical learning spaces. The “multimodal” approach consists in combining learners’ motoric actions with physiological responses and data about the learning contexts. These data can be collected through multiple wearable sensors and Internet of Things (IoT) devices. This Hackathon table will confront three main challenges arising from the analysis and valorisation of multimodal datasets: 1) data collection and storage; 2) data annotation; 3) data processing and exploitation. Some research questions to be considered in this Hackathon challenge are the following: How can the raw sensor data streams be processed and relevant features extracted? Which data mining and machine learning techniques can be applied? How can we compare two action recordings? How can sensor data be combined with the Experience API (xAPI)? What are meaningful visualisations for these data?
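On the question of combining sensor data with xAPI: an xAPI statement is a JSON object with actor, verb, and object parts, and sensor readings can travel in the statement's result extensions. The sketch below uses the standard ADL "experienced" verb, but the activity id and extension IRI are hypothetical placeholders, not identifiers from the Hackathon.

```python
import json
from datetime import datetime, timezone

def sensor_reading_to_xapi(student_email, sensor, value, activity_id):
    """Wrap one sensor reading in a minimal xAPI statement so user-world
    events can sit alongside user-computer interaction logs in an LRS."""
    return {
        "actor": {"mbox": f"mailto:{student_email}", "objectType": "Agent"},
        "verb": {"id": "http://adlnet.gov/expapi/verbs/experienced",
                 "display": {"en-US": "experienced"}},
        "object": {"id": activity_id, "objectType": "Activity"},
        # sensor payloads go into result extensions, keyed by an IRI
        "result": {"extensions": {
            f"http://example.org/xapi/extensions/{sensor}": value}},
        "timestamp": datetime.now(timezone.utc).isoformat(),
    }

stmt = sensor_reading_to_xapi("learner@example.org", "heart-rate", 72,
                              "http://example.org/activities/lab-session-1")
print(json.dumps(stmt, indent=2))
```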