2,585 research outputs found

    Which Data Sets Are Preferred by University Students in Learning Analytics Dashboards? A Situated Learning Theory Perspective


    Learning Analytics Dashboards for Advisors -- A Systematic Literature Review

    A learning analytics dashboard for advisors is designed to provide data-driven insights and visualizations that support advisors in decision-making about student academic progress, engagement, targeted support, and overall success. This study explores the current state of the art in learning analytics dashboards, focusing on the specific requirements of advisors. By examining the existing literature and case studies, it identifies the key features and functionalities essential to an effective learning analytics dashboard tailored to advisor needs. The study also provides a comprehensive view of the landscape of learning analytics dashboards for advisors, synthesizing current trends from the 21 research papers analysed to highlight advancements, opportunities, and challenges in their development. The findings will contribute to the design and implementation of new dashboard features that empower advisors to provide proactive, individualized support, ultimately fostering student retention and academic success.

    Student-Centered Learning: Functional Requirements for Integrated Systems to Optimize Learning

    The realities of the 21st-century learner require that schools and educators fundamentally change their practice. "Educators must produce college- and career-ready graduates that reflect the future these students will face. And, they must facilitate learning through means that align with the defining attributes of this generation of learners."
    Today, we know more than ever about how students learn, acknowledging that the process isn't the same for every student and doesn't remain the same for each individual, depending on maturation and the content being learned. We know that students want to progress at a pace that allows them to master new concepts and skills, to access a variety of resources, to receive timely feedback on their progress, to demonstrate their knowledge in multiple ways, and to get direction, support, and feedback from, as well as collaborate with, experts, teachers, tutors, and other students. The result is a growing demand for student-centered, transformative digital learning with competency education as its underpinning.
    iNACOL released this paper to illustrate the technical requirements and functionalities that learning management systems need in order to shift toward student-centered instructional models. This comprehensive framework will help districts and schools determine which systems to use and integrate as they begin their journey toward student-centered learning, and how systems integration aligns with their organizational vision, educational goals, and strategic plans.
    Educators can use this report to optimize student learning and promote innovation in their own student-centered learning environments. It will help school leaders understand the complex technologies needed to optimize personalized learning and how to use data and analytics to improve practice, and it can assist technology leaders in re-engineering systems to support the key nuances of student-centered learning.

    The Effects of Student Activity Dashboards on Student Participation, Performance, and Persistence

    Researchers have turned their attention to the use of learning analytics and dashboard systems in education. Schools are using knowledge gained in this area to address the issue of persistence and to increase graduation rates. While dashboard systems have been developed and are starting to be implemented, it is not yet clear how activity and performance data from dashboards influence student behavior. In addition, much of the research has focused on instructor-facing rather than student-facing dashboards. The current study implemented a student-facing dashboard in the learning management system and measured how information on the dashboard may have influenced participation in discussions, performance on graded items, and persistence in future courses. A dashboard tool was developed for this study, and activity, performance, and persistence data were collected from all participating students. The study followed an experimental design in which a random group of students from multiple courses was assigned a dashboard tool that showed each student's activity and performance compared with that of their peers. Activity indicators included frequency of posting, average length of posts, percent of posts made to peers, and percent of posts made to the instructor. The student's current score, as a measure of performance, was also shown on the dashboard alongside the current class average. Multivariate analysis of variance (MANOVA) was used to determine whether there were statistically significant differences in participation (number of posts, word count of posts, and percent of posts to peers) or in performance (final grade). Chi-squared analysis was used to determine whether there were significant differences in persistence, measured by whether students registered for and attended the following session.
The analysis indicated no significant differences in participation or performance between the experimental and control groups (F(4, 59) = .947, p = .443). Similarly, no significant differences were found in persistence between the two groups (χ²(2) = .960, p = .619). Further research is needed to more fully understand student dashboard interfaces and their impact on student behavior. Future studies using a similar methodology should incorporate larger sample sizes and include all students in the class, rather than relying on self-selected samples. A better understanding of how dashboard use influences participation, performance, and persistence is needed in order to develop effective strategies for supporting students.
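The persistence comparison described above follows the standard pattern of a chi-squared test of independence on a group-by-outcome contingency table. A minimal sketch of that pattern is shown below; the counts are invented for illustration and are not the study's data (a 2×3 table yields the same 2 degrees of freedom as the reported χ²(2) statistic).

```python
# Chi-squared test of independence for a persistence outcome.
# The counts below are hypothetical, not the study's actual data.
from scipy.stats import chi2_contingency

# Rows: experimental vs. control group.
# Columns: persistence outcome, e.g. (registered and attended,
# registered only, did not register). A 2x3 table gives df = 2.
observed = [
    [20, 6, 5],   # experimental group
    [19, 7, 6],   # control group
]

chi2, p, df, expected = chi2_contingency(observed)
print(f"chi2({df}) = {chi2:.3f}, p = {p:.3f}")
```

A non-significant p-value here, as in the study, means the observed persistence counts are consistent with the two groups sharing the same outcome distribution.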

    Investigating a learning analytics interface for automatically marked programming assessments

    Student numbers at the University of Cape Town continue to grow, with an increasing number of students enrolling in programming courses. With this increase, it becomes difficult for lecturers to provide individualised feedback on programming assessments submitted by students. To address this, the university uses an automatic marking tool that marks assignments and provides feedback: students can submit assignments and receive instant feedback on the marks allocated or on errors in their submissions. The tool saves time, as lecturers spend less time marking, and gives students an immediate opportunity to correct errors in their submitted code. However, many students have identified areas where the interface between the automatic marker and the submitted programs could be improved. This study investigates the potential of a learning analytics inspired dashboard interface to improve the feedback provided to students on their submitted programs. A focus group of computer science class representatives was organised, and its feedback was used to create dashboard mock-ups. These mock-ups were then developed into high-fidelity learning analytics inspired dashboard prototypes, which were tested by first-year computer science students to determine whether the interfaces were useful and usable. The prototypes were built using the Python programming language and the Plotly Python library. User-centred design methods were employed, with constant feedback elicited from students throughout the prototyping and design of the interface. In a usability study, students used the dashboard and then provided feedback on its use by completing a questionnaire designed around Nielsen's usability heuristics and AttrakDiff. These methods also assisted in the evaluation of the dashboard design.
The research showed that students considered a learning analytics dashboard an essential tool that could help them as they learn to program. Students found the dashboard useful and had a clear sense of the specific features they would like to see implemented in a dashboard for the automatic marking tool, including overall performance, the duly performed (DP) status needed to qualify for exams, highest score, assignment due dates, class average score, and most common errors. This research hopes to provide insight into how automatically marked programming assessments could be displayed to students in a way that supports learning.
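As a rough illustration of the aggregation behind the features students requested (overall performance, highest score, class average, most common errors), the sketch below summarises hypothetical submission records. The record format, field names, and the `dashboard_summary` helper are assumptions for illustration, not the automatic marker's actual output or API.

```python
# Sketch of dashboard-style aggregation over submission records.
# Record format and field names are hypothetical, not the actual
# output of the university's automatic marking tool.
from collections import Counter

submissions = [
    {"student": "s1", "score": 80, "errors": ["IndentationError"]},
    {"student": "s2", "score": 65, "errors": ["SyntaxError", "NameError"]},
    {"student": "s3", "score": 90, "errors": []},
    {"student": "s1", "score": 70, "errors": ["SyntaxError"]},
]

def dashboard_summary(records, student):
    """Compute per-student and class-wide figures a dashboard could show."""
    own = [r["score"] for r in records if r["student"] == student]
    everyone = [r["score"] for r in records]
    errors = Counter(e for r in records for e in r["errors"])
    return {
        "overall_performance": sum(own) / len(own),
        "highest_score": max(own),
        "class_average": sum(everyone) / len(everyone),
        "most_common_errors": errors.most_common(2),
    }

summary = dashboard_summary(submissions, "s1")
print(summary)
```

In the prototypes described above, figures like these would then be rendered as Plotly charts rather than printed; the aggregation step is the same either way.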

    Why Feedback Literacy Matters for Learning Analytics

    Learning analytics (LA) provides data-driven feedback that aims to improve learning and inform action. For learners, LA-based feedback may scaffold the self-regulated learning skills that are crucial to learning success. For teachers, LA-based feedback may help in evaluating teaching effects and the need for interventions. However, the current development of LA has presented problems related to the cognitive, social-affective, and structural dimensions of feedback. In light of these problems, this position paper argues that attention needs to shift from the design of LA as a feedback product to LA as a facilitator of a process in which both teachers and students play active roles in meaning-making. To this end, implications for feedback literacy in the context of LA are discussed.
    Comment: 8 pages. Accepted at the 2022 International Conference of the Learning Sciences (ICLS). https://2022.isls.org/proceedings