    The Multimodal Tutor: Adaptive Feedback from Multimodal Experiences

    This doctoral thesis describes the journey of ideation, prototyping and empirical testing of the Multimodal Tutor, a system designed to provide digital feedback that supports psychomotor skills acquisition using machine learning and multimodal data capture. The feedback is given in real time, based on machine-driven assessment of the learner's task execution. The predictions are produced by supervised machine learning models trained on human-annotated samples. The main contributions of this thesis are: a literature survey on multimodal data for learning, a conceptual model (the Multimodal Learning Analytics Model), a technological framework (the Multimodal Pipeline), a data annotation tool (the Visual Inspection Tool) and a case study in Cardiopulmonary Resuscitation training (CPR Tutor). The CPR Tutor generates real-time, adaptive feedback from kinematic and myographic data using neural networks.
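    The following is a minimal, illustrative sketch of the core idea described above, not the thesis implementation: a supervised model is trained on human-annotated windows of multimodal (kinematic and myographic) features and then maps each new execution window to a feedback message. The feature layout, label scheme and feedback rule are placeholder assumptions.

# Minimal sketch (illustrative, not the CPR Tutor code): supervised feedback
# from human-annotated multimodal samples. Features and labels are synthetic.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)

# Each row summarises one execution window: assumed kinematic statistics
# (e.g. compression depth/rate) and myographic (EMG) statistics.
X = rng.normal(size=(500, 6))            # placeholder feature matrix
y = rng.integers(0, 2, size=500)         # human annotation: 1 = correct execution

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

model = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=500, random_state=0)
model.fit(X_train, y_train)
print("held-out accuracy:", model.score(X_test, y_test))

def feedback_for_window(window_features):
    """Turn the model's prediction for one execution window into a feedback message."""
    correct = model.predict(window_features.reshape(1, -1))[0]
    return "Good compression" if correct else "Adjust compression depth/rate"

print(feedback_for_window(rng.normal(size=6)))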

    Massive Open Online Courses as affinity spaces for connected learning: Exploring effective learning interactions in one massive online community

    This paper describes a participatory online culture – Connected Learning Massive Open Online Collaboration (CLMOOC) – and asks how its ethos of reciprocity and creative playfulness arises. By analysing Twitter interactions over a four-week period, we conclude that this is due to the supportive nature of participants, who describe themselves as belonging to, or connected with, the community. We suggest that Gee’s concept of an affinity space is an appropriate model for CLMOOC and ask how this might be replicated in a higher education setting.

    Generating actionable predictions regarding MOOC learners' engagement in peer reviews

    Peer review is one approach to facilitating formative feedback exchange in MOOCs; however, it is often undermined by low participation. To support effective implementation of peer reviews in MOOCs, this work proposes several predictive models that accurately classify learners according to their expected engagement levels in an upcoming peer-review activity, which offers various pedagogical utilities (e.g. improving peer reviews and collaborative learning activities). Two approaches were used for training the models: in situ learning (in which an engagement indicator available at the time of the predictions is used as a proxy label to train a model within the same course) and transfer across courses (in which a model is trained using labels obtained from past course data). These techniques make it possible to produce predictions that the instructor can act on while the course is still running, which is not possible with post-hoc approaches that require true labels. According to the results, both the transfer-across-courses and the in situ learning approaches produced predictions that were actionable yet as accurate as those obtained with cross-validation, suggesting that they deserve further attention for creating impact in MOOCs with real-world interventions. Potential pedagogical uses of the predictions are illustrated with several examples.
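    A minimal sketch of the two training set-ups described above, under placeholder assumptions (synthetic data; the feature layout and proxy-label rule are illustrative, not the paper's pipeline): one model is transferred from a past course whose labels are known, and one is trained in situ on a proxy engagement indicator that is already observable while the course runs.

# Minimal sketch of "transfer across courses" vs "in situ learning" as described
# above; all data, features and the proxy-label rule are illustrative assumptions.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(1)

def make_course(n=400, n_features=5):
    """Placeholder per-learner activity features plus engagement signals."""
    X = rng.normal(size=(n, n_features))
    # True label: will the learner engage in the upcoming peer-review activity?
    true_label = (X[:, 0] + rng.normal(scale=0.5, size=n) > 0).astype(int)
    # Proxy label: an engagement indicator already observable at prediction time.
    proxy = (X[:, 1] > 0).astype(int)
    return X, true_label, proxy

X_past, y_past, _ = make_course()         # completed course: true labels are known
X_now, y_now, proxy_now = make_course()   # running course: true labels not yet available

# (1) Transfer across courses: train on the past course, predict on the current one.
transfer_model = RandomForestClassifier(random_state=1).fit(X_past, y_past)
pred_transfer = transfer_model.predict(X_now)

# (2) In situ learning: train within the running course using the proxy label.
in_situ_model = RandomForestClassifier(random_state=1).fit(X_now, proxy_now)
pred_in_situ = in_situ_model.predict(X_now)

# Both yield predictions the instructor can act on mid-course; y_now is used here
# only for an after-the-fact accuracy check on the synthetic data.
print("transfer accuracy:", (pred_transfer == y_now).mean())
print("in situ accuracy: ", (pred_in_situ == y_now).mean())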

    Influence of employer support for professional development on MOOCs enrolment and completion: Results from a cross-course survey

    Although the potential of open education and MOOCs for professional development is usually recognized, it has not yet been explored extensively. How far employers support non-formal learning is still an open question. This paper presents the findings of a survey-based study which focuses on the influence of employer support for (general) professional development on employees’ use of MOOCs. Findings show that employers are usually unaware that their employees are participating in MOOCs. In addition, employer support for general professional development is positively associated with employees completing MOOCs and obtaining certificates for them. However, the relationship between employer support and MOOC enrollment is less clear: workers who have more support from their employers tend to enroll in either a low or a high number of MOOCs. Finally, the promotion of a minimum of ICT skills by employers is shown to be an effective way of encouraging employee participation in the open education ecosystem.

    An Examination of Online Learning Security Requirements Within a Virtual Learning Environment of an Irish University

    As the adoption of e-learning and the need for lifelong learning increase, it is vital that the administrator of a virtual learning environment continually ensures reliable and secure data. This case study carried out the initial steps of analyzing the use and security needs of a virtual learning service within an Irish university. The university provided two virtual learning services, which were comparatively analyzed from a security and data protection perspective. In addition, survey results obtained from the university user community for one of the e-learning services were examined. Findings from the study were presented as user security requirements and recommendations for planning future security initiatives for the e-learning services within the university.

    Applying science of learning in education: Infusing psychological science into the curriculum

    The field of specialization known as the science of learning is not, in fact, one field. Science of learning is a term that serves as an umbrella for many lines of research, theory, and application. A term with an even wider reach is Learning Sciences (Sawyer, 2006). The present book represents a sliver, albeit a substantial one, of the scholarship on the science of learning and its application in educational settings (Science of Instruction, Mayer, 2011). Although much, but not all, of what is presented in this book is focused on learning in college and university settings, teachers of all academic levels may find the recommendations made by the chapter authors of service. The overarching theme of this book is the interplay between the science of learning, the science of instruction, and the science of assessment (Mayer, 2011). The science of learning is a systematic and empirical approach to understanding how people learn. More formally, Mayer (2011) defined the science of learning as the “scientific study of how people learn” (p. 3). The science of instruction (Mayer, 2011), informed in part by the science of learning, is also on display throughout the book. Mayer defined the science of instruction as the “scientific study of how to help people learn” (p. 3). Finally, the assessment of student learning (e.g., learning, remembering, transferring knowledge) during and after instruction helps us determine the effectiveness of our instructional methods. Mayer defined the science of assessment as the “scientific study of how to determine what people know” (p. 3). Most of the research and applications presented in this book are completed within a science of learning framework. Researchers first conducted research to understand how people learn in certain controlled contexts (i.e., in the laboratory) and then they, or others, began to consider how these understandings could be applied in educational settings. Work on the cognitive load theory of learning, which is discussed in depth in several chapters of this book (e.g., Chew; Lee and Kalyuga; Mayer; Renkl), provides an excellent example that documents how the science of learning has led to valuable work on the science of instruction. Most of the work described in this book is based on theory and research in cognitive psychology. We might have selected other topics (and, thus, other authors) that have their research base in behavior analysis, computational modeling and computer science, neuroscience, etc. We made the selections we did because the work of our authors ties together nicely and seemed to us to have direct applicability in academic settings.

    Learning Analytics: Translating Data into “Just-in-Time” Interventions

    Despite the burgeoning studies on student attrition and retention, many institutions continue to deal with related issues, including rates of D, F, and W grades. The emerging and rapidly developing Learning Analytics (LA) field shows great potential for improving learning outcomes by monitoring and analyzing student performance, allowing instructors to recommend specific interventions based on key performance indicators. Unfortunately, higher education has been slow to adopt it. We therefore provide the rationale for, and benefits of, increased LA integration into courses and curriculum. We further identify and suggest ready-to-implement best practices, as well as tools available in Learning Management Systems (LMSs) and other helpful resources.
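    As a purely illustrative sketch of the "just-in-time" use described above (the thresholds, indicator names and messages are assumptions, not taken from the paper), a few LMS-style key performance indicators can be checked per student to flag candidates for instructor outreach:

# Minimal sketch (illustrative only): flagging at-risk students from simple
# LMS-style key performance indicators so an instructor can intervene early.
# Column names, thresholds and messages are assumptions, not from the paper.
import pandas as pd

lms = pd.DataFrame({
    "student":             ["a01", "a02", "a03"],
    "current_grade":       [0.58, 0.83, 0.71],   # running course grade (0-1)
    "logins_last_7d":      [1, 6, 0],            # recent LMS activity
    "missing_assignments": [3, 0, 1],
})

def risk_flags(row):
    """Return the indicators this student currently falls short on."""
    flags = []
    if row["current_grade"] < 0.60:
        flags.append("grade in D/F range")
    if row["logins_last_7d"] < 2:
        flags.append("low recent LMS activity")
    if row["missing_assignments"] >= 2:
        flags.append("multiple missing assignments")
    return flags

for _, row in lms.iterrows():
    flags = risk_flags(row)
    if flags:
        print(f"{row['student']}: recommend outreach ({'; '.join(flags)})")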