18 research outputs found

    Stability and sensitivity of Learning Analytics based prediction models

    Learning analytics seeks to enhance learning processes through systematic measurement of learning-related data and to provide informative feedback to learners and educators. Track data from Learning Management Systems (LMS) constitute a main data source for learning analytics. This empirical contribution provides an application of Buckingham Shum and Deakin Crick’s theoretical framework of dispositional learning analytics: an infrastructure that combines learning dispositions data with data extracted from computer-assisted, formative assessments and LMSs. Across two cohorts of a large introductory quantitative methods module, 2,049 students were enrolled in a module based on principles of blended learning, combining face-to-face Problem-Based Learning sessions with e-tutorials. We investigated the predictive power of learning dispositions, outcomes of continuous formative assessments and other system-generated data in modelling student performance, as well as their potential to generate informative feedback. From a dynamic, longitudinal perspective, computer-assisted formative assessments appear to be the best predictor both for detecting underperforming students and for academic performance, while basic LMS data did not substantially predict learning. If timely feedback is crucial, both use-intensity-related track data from e-tutorial systems and learning dispositions are valuable sources for feedback generation.
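    The abstract's headline finding is that averaged formative-assessment scores flag underperforming students better than raw LMS activity. A minimal, hypothetical sketch of such a flagging rule is below; the function name, data shape, and 55-point threshold are assumptions for illustration, not the authors' actual model.

    ```python
    # Hypothetical early-warning rule: flag students whose mean formative
    # quiz score falls below a threshold. Names and threshold are assumed.

    def flag_at_risk(quiz_scores: dict, threshold: float = 55.0) -> list:
        """Return sorted ids of students whose mean quiz score is below threshold."""
        flagged = []
        for student, scores in quiz_scores.items():
            if scores and sum(scores) / len(scores) < threshold:
                flagged.append(student)
        return sorted(flagged)

    cohort = {
        "s001": [80.0, 72.0, 90.0],
        "s002": [40.0, 52.0, 48.0],   # consistently low formative scores
        "s003": [60.0, 58.0, 61.0],
    }
    print(flag_at_risk(cohort))  # -> ['s002']
    ```

    In practice the paper combines such assessment outcomes with learning-dispositions data; this sketch only illustrates the simplest score-based component.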

    Researchers' perceptions of DH trends and topics in the English- and Spanish-speaking community: DayofDH data as a case study

    Defining the state of the art in Digital Humanities (DH) is a genuinely challenging task, given the range of contents that this tag covers. One of the most successful efforts in this sense has been the international blogging event known as DayofDH, or A Day in the Life of the Digital Humanities, promoted and sponsored by centerNet ( http://www.dhcenternet.org/), which has brought together digital humanists from around the world to document once a year what they do (Rockwell et al., 2012). The DayofDH websites were hosted in North America until 2015, when the event was coordinated in Europe by LINHD ( http://linhd.uned.es), the Digital Innovation Lab at UNED in Madrid. Participants come from several countries around the world. The rise of DH in non-English-speaking countries has been rapid and significant in the last decade, and especially marked in the Spanish-speaking world (Spence and González-Blanco, 2014; González-Blanco, 2013; Del Rio Riande, 2014a; Del Rio Riande, 2014b; Galina et al., 2015). Technological projects for the humanities have existed in the Spanish-speaking world for many years; however, the discipline called "Digital Humanities" arose in 2011 with the first meeting that originated the Spanish Digital Humanities Association, HDH. This relevance is reflected in the creation of a parallel Spanish version of the DayofDH, the DíaHD, which was hosted by UNAM in Mexico in 2013 and 2014 and converged with the latest initiative at UNED, transforming both blogging events into a bilingual version of the Day. Consejo Nacional de Investigaciones Científicas y Técnicas (CONICET)

    Integration of Multiple Data Sources for predicting the Engagement of Students in Practical Activities

    This work presents the integration of an automatic assessment system for virtual/remote laboratories with the institutional Learning Management System (LMS), in order to analyze students' progress and their collaborative learning in virtual/remote laboratories. As a result of this integration, it is feasible to extract useful information for characterizing the students' learning process and for detecting the students' engagement with the practical activities of our subjects. From this integration, a dashboard has been created to graphically present the analyzed results to lecturers. Thanks to this, faculty can use the analyzed information to guide the learning/teaching process of each student. As an example, a subject focused on the configuration of network services has been chosen to implement our proposal.

    TURNITIN? TURNITOFF: The Deskilling of Information Literacy

    CC-BY-NC-SA
    Plagiarism is a folk devil into which are poured many of the challenges, problems and difficulties confronting higher education. This article investigates how software (Turnitin in particular) is ‘solving’ a particular ‘crisis’ in universities. However, I investigate how alternative strategies for the development of information literacy offer concrete, productive and imaginative trajectories for university staff and students.

    Analytics4Action Evaluation Framework: A Review of Evidence-Based Learning Analytics Interventions at the Open University UK

    There is an urgent need to develop an evidence-based framework for learning analytics whereby stakeholders can manage, evaluate, and make decisions about which types of interventions work well and under which conditions. In this article, we work towards developing the foundation of an Analytics4Action Evaluation Framework (A4AEF) that is currently being tested and validated at the Open University UK. By working with 18 introductory large-scale modules over a period of two years across the five faculties and disciplines within the OU, Analytics4Action provides a bottom-up approach for working together with key stakeholders within their respective contexts. A holistic A4AEF has been developed to unpack, understand and map the six key steps in the evidence-based intervention process. By means of an exemplar in health and social science, a practical illustration of A4AEF is provided. In the next 3-5 years, we hope that a rich, robust evidence base will be presented to show how learning analytics can help teachers make informed, timely and successful interventions that help each learner achieve the module’s learning outcomes.

    Who are the top contributors in a MOOC? Relating participants' performance and contributions

    The role of social tools in massive open online courses (MOOCs) is essential, as they connect participants. Of all the participants in a MOOC, top contributors are the ones who contribute most actively via social tools. This article analyses and reports empirical data from five different social tools pertaining to an actual MOOC in order to characterize top contributors and provide some insights aimed at facilitating their early detection. The results of this analysis show that top contributors have better final scores than the rest. In addition, there is a moderate positive correlation between participants' overall performance (measured in terms of final scores) and the number of posts submitted to the five social tools. This article also studies the effect of participants' gender and scores as factors that can be used for the early detection of top contributors. The analysis shows that gender is not a good predictor, and that taking the scores of the first assessment activities of each type (test and peer assessment in the case study) results in a prediction that is not substantially improved by adding subsequent activities. Finally, better predictions based on scores are obtained for aggregate contributions across the five social tools than for individual contributions in each social tool. This work has been partially funded by the Madrid Regional Government eMadrid Excellence Network (S2013/ICE-2715), the Spanish Ministry of Economy and Competitiveness Project RESET (TIN2014-53199-C3-1-R) and the European Erasmus+ projects MOOC-Maker (561533-EPP-1-2015-1-ES-EPPKA2-CBHE-JP) and SHEILA (562080-EPP-1-2015-BE-EPPKA3-PI-FORWARD).
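    The correlation the abstract reports between post counts and final scores is the kind of quantity a Pearson coefficient captures. A minimal sketch with stdlib Python follows; the sample data are invented for illustration and do not reproduce the paper's dataset or its reported effect size.

    ```python
    import math

    def pearson(xs, ys):
        """Pearson correlation coefficient between two equal-length samples."""
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
        sy = math.sqrt(sum((y - my) ** 2 for y in ys))
        return cov / (sx * sy)

    # Hypothetical participants: posts across social tools vs. final scores.
    posts  = [2, 15, 7, 30, 1, 22]
    scores = [55, 78, 70, 85, 50, 80]
    print(round(pearson(posts, scores), 2))
    ```

    Aggregating posts across all five tools before correlating, as the authors do, smooths out tool-specific noise, which is consistent with their finding that aggregate contributions predict better than per-tool counts.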

    Developing a Multidimensional Framework for Analyzing Student Comments in Wikis
