Examining the designs of computer-based assessment and its impact on student engagement, satisfaction, and pass rates
Many researchers who study the impact of computer-based assessment (CBA) focus on the affordances or complexities of CBA approaches in comparison to traditional assessment methods. This study examines how CBA approaches were configured within and between modules, and the impact of assessment design on students' engagement, satisfaction, and pass rates. The analysis was conducted using a combination of longitudinal visualisations, correlational analysis, and fixed-effect models on 74 undergraduate modules and their 72,377 students. Our findings indicate that educators designed very different assessment strategies, which significantly influenced student engagement as measured by time spent in the virtual learning environment (VLE). Weekly analyses indicated that assessment activities were balanced with other learning activities, which suggests that educators tended to aim for a consistent workload when designing assessment strategies. Since most of the assessments were computer-based, students spent more time on the VLE during assessment weeks. By controlling for heterogeneity within and between modules, learning design could explain up to 69% of the variability in students' time spent on the VLE. Furthermore, assessment activities were significantly related to pass rates, but no clear relation with satisfaction was found. Our findings highlight the importance of CBA and learning design to how students learn online.
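The fixed-effect modelling mentioned above works by absorbing stable between-module differences so that only within-module variation in the design identifies its effect on VLE time. This can be sketched with the classic "within" (demeaning) transformation. Everything below is a minimal illustration on synthetic data with invented variable names and effect sizes, not the study's actual model:

```python
import numpy as np

# Synthetic data: each module has its own baseline VLE time (unobserved
# heterogeneity), plus a true effect of 2.0 VLE hours per unit of
# assessment load. All names and numbers are invented for illustration.
rng = np.random.default_rng(0)
n_modules, n_per_module = 20, 50
module = np.repeat(np.arange(n_modules), n_per_module)
module_baseline = rng.normal(0, 5, n_modules)[module]
assessment_load = rng.uniform(0, 10, module.size)
vle_time = module_baseline + 2.0 * assessment_load + rng.normal(0, 1, module.size)

def demean_within(x, groups):
    """Subtract each group's mean: the 'within' transformation that
    removes fixed module-level differences."""
    sums = np.zeros(groups.max() + 1)
    np.add.at(sums, groups, x)
    counts = np.bincount(groups)
    return x - (sums / counts)[groups]

# OLS on the demeaned data = the fixed-effects (within) estimator.
x_w = demean_within(assessment_load, module)
y_w = demean_within(vle_time, module)
beta = (x_w @ y_w) / (x_w @ x_w)
print(round(beta, 2))  # recovers a slope close to the true 2.0
```

Demeaning is numerically equivalent to including a dummy variable for every module, but avoids building a 74-column design matrix.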
A review of ten years of implementation and research in aligning learning design with learning analytics at the Open University UK
There is an increased recognition that learning design drives both student learning experience and quality enhancements of teaching and learning. The Open University UK (OU) has been one of few institutions that have explicitly and systematically captured the designs for learning at a large scale. By applying advanced analytical techniques on large and fine-grained datasets, the OU has been unpacking the complexity of instructional practices, as well as providing conceptual and empirical evidence of how learning design influences student behaviour, satisfaction, and performance. This study discusses the implementation of learning design at the OU in the last ten years, and critically reviews empirical evidence from eight recent large-scale studies that have linked learning design with learning analytics. Four future research themes are identified to support future adoptions of learning design approaches.
Scholarly insight Spring 2018: a Data wrangler perspective
In the movie classic Back to the Future, a young Michael J. Fox is able to explore the past via a time machine developed by the slightly bizarre but brilliant Dr Brown. Unexpectedly, small interventions along Fox's adventures changed the course of history a little. In this fourth Scholarly Insight Report we have explored two innovative approaches to learning from OU data of the past, which hopefully will, in the future, make a large difference in how we support our students and design and implement our teaching and learning practices. In Chapter 1, we provide an in-depth analysis of 50 thousand comments expressed by students through the Student Experience on a Module (SEAM) questionnaire. By analysing over 2.5 million words using big data approaches, our Scholarly insights indicate that not all student voices are heard. Furthermore, our big data analysis indicates useful potential insights into how student voices change over time, and for which particular modules emergent themes might arise.
In Chapter 2 we provide our second innovative approach: a proof-of-concept of qualification pathways using graph approaches. By exploring existing data of one qualification (i.e., Psychology), we show that students make a range of pathway choices during their qualification, some of which are more successful than others. As highlighted in our previous Scholarly Insight Reports, getting data from a qualification perspective within the OU is a difficult and challenging process, and the proof-of-concept provided in Chapter 2 might provide a way forward to better understand and support the complex choices our students make.
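The graph/pathway idea can be sketched by treating each student's ordered module choices as a path, then counting how often each path is taken and how often it ends in qualification completion. The module codes and records below are invented placeholders, not OU data:

```python
from collections import Counter

# Each record: (ordered tuple of module choices, completed the qualification?)
records = [
    (("MOD-A", "MOD-B", "MOD-C"), True),
    (("MOD-A", "MOD-B", "MOD-C"), True),
    (("MOD-A", "MOD-B", "MOD-C"), False),
    (("MOD-A", "MOD-X", "MOD-C"), False),
    (("MOD-A", "MOD-X", "MOD-C"), True),
]

taken = Counter(path for path, _ in records)            # how often each path occurs
completed = Counter(path for path, ok in records if ok)  # how often it succeeds
success_rate = {path: completed[path] / n for path, n in taken.items()}

# The most 'successful' pathway under this toy data.
best = max(success_rate, key=success_rate.get)
print(best, round(success_rate[best], 2))
```

A real analysis would additionally represent shared module-to-module transitions as edges of a directed graph, so that common sub-paths across many students can be compared rather than only complete paths.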
In Chapter 3, we provide a slightly more practically-oriented and perhaps down-to-earth approach, focussing on the lessons learned with Analytics4Action. Over the last four years nearly a hundred modules have made more active use of data and insights into module presentation to support their students. In Chapter 3 several good practices are described by the LTI/TEL learning design team, as well as three innovative case studies which we hope will inspire you to try something new as well.
Working organically in various Faculty sub-group meetings and LTI Units, and in a Google Doc with various key stakeholders in the Faculties, we hope that our Scholarly insights can help to inform our staff, but also spark some ideas on how to further improve our module designs and qualification pathways. Of course we are keen to hear what other topics require Scholarly insight. We hope that you see some potential in the two innovative approaches, and perhaps you might want to try some new ideas in your module. While a time machine has not really been invented yet, with the increasingly rich and fine-grained data about our students and our learning practices we are getting closer to understanding what really drives our students.
Scholarly insight Autumn 2017: a Data wrangler perspective
As the OU is going through several fundamental changes, it is important that strategic decisions made by Faculties and senior management are informed by evidence-based research and insights. One way Data Wranglers provide insight into the longitudinal development and performance of OU modules is the Key Metric Report 2017. A particular new element is that data can now also be unpacked and visualised at a Nation level. As evidenced by the Nation-level reporting, there are substantial variations in success across the four Nations, and we hope that our interactive dashboards allow OU staff to unpack the underlying data.
The second way Data Wranglers provide insight to Faculties and Units is through the Scholarly insight report series. Building on the previous two reports, in which we reported on substantial variation and inconsistencies in learning designs and assessment practices within qualifications across the OU, in this Scholarly insight Autumn 2017 report we address four big pedagogical questions that were framed and co-constructed together with the Faculties and LTI units. Many Faculties and colleagues have responded positively to our Scholarly insight Spring 2017 report, in which for the first time we were able to show empirically that students experienced substantial variations in success within 12 large OU qualifications. As evidenced in our previous report, 55% of variation in students' success over time was explained by OU institutional factors (i.e., how students were assessed within their respective module; how students were able to effectively transition from the learning design of one module to the next), rather than students' characteristics, engagement, and behaviour.
We have received several queries and questions from Faculties and Units about how to better understand these students' journeys, and how qualification and module designs could be better aligned within their respective qualification(s). As these are complex conceptual and Big Pedagogy questions, in Chapter 1 we continued these complex analyses by looking at the transitional processes of the first two modules that OU students take, and how well aligned these modules and qualification paths are. In Chapter 2, we explored the more fine-grained, qualitative, lived experiences of 19 students across a range of qualifications to understand how OU grading practices and (in)consistencies of assessment and feedback influenced their affect, behaviour, and cognition. In addition to building on previous topics, we introduced two new Scholarly insights in Chapter 3 and Chapter 4. As the OU is increasingly using learning analytics to support our staff and students, in Chapter 3 we analysed the impact on student retention of giving Predictive Learning Analytics to over 500 Associate Lecturers across 31 modules. Finally, in Chapter 4 we explored the impact of first presentations of new modules on pass rates and satisfaction, whereby we were able to bust another myth that may have profound implications for Student First Transformation.
Working organically in various Faculty sub-group meetings and LTI Units, and in a Google Doc with various key stakeholders in the Faculties, we hope that our Scholarly insights can help to inform our staff, but also spark some ideas on how to further improve our module designs and qualification pathways. Of course we are keen to hear what other topics require Scholarly insight.
Linking students' timing of engagement to learning design and academic performance
In recent years, the connection between Learning Design (LD) and Learning Analytics (LA) has been emphasized by many scholars, as it could enhance our interpretation of LA findings and translate them into meaningful interventions. Together with numerous conceptual studies, a gradual accumulation of empirical evidence has indicated a strong connection between how instructors design for learning and student behaviour. Nonetheless, students' timing of engagement and its relation to LD and academic performance have received limited attention. Therefore, this study investigates to what extent students' timing of engagement aligned with the instructor's learning design, and how engagement varied across different levels of performance. The analysis was conducted over 28 weeks using trace data on 387 students, and replicated over two semesters in 2015 and 2016. Our findings revealed a mismatch between how instructors designed for learning and how students studied in reality. In most weeks, students spent less time studying the assigned materials on the VLE than the number of hours recommended by instructors. The timing of engagement also varied, from studying in advance to catching-up patterns. High-performing students spent more time studying in advance, while low-performing students spent a higher proportion of their time on catching-up activities. This study reinforced the importance of pedagogical context in transforming analytics into actionable insights.
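The mismatch described above, where actual weekly study time falls short of the instructor-recommended hours, can be sketched as a simple week-by-week comparison. The numbers below are invented; in the study, actual hours came from VLE trace data and the recommendations from the learning design:

```python
# Recommended workload per week (from the learning design) vs. hours a
# student actually spent (from trace data); all values are illustrative.
recommended = [8, 8, 10, 8, 12]
studied = [5, 6, 11, 4, 15]

gap = [s - r for s, r in zip(studied, recommended)]
under_weeks = [week for week, g in enumerate(gap, start=1) if g < 0]
print(under_weeks)  # weeks in which the student studied less than recommended
```

Aggregating such per-week gaps across students, and noting whether study time lands before or after the scheduled week, is one simple way to separate in-advance from catching-up engagement patterns.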
Unravelling the Temporal Process of Learning Design and Student Engagement in Distance Education using Learning Analytics
Designing a curriculum in online and distance education can be challenging because the processes of what, when, and how students study are not always visible to teachers due to the limited opportunities for face-to-face interactions. The aim of this thesis is to explore how teachers design for learning, together with how the learning design impacts upon the students' actual engagement with the learning materials, with the subsequent effect on their academic performance. One way forward is to build on the intersection between the most recent work in learning analytics and learning design research. I have therefore argued for and investigated the potential of incorporating the design of learning activities into the analysis of student learning behaviour. On the one hand, the visualisation of learning activities designed by teachers provides the pedagogical context to improve the interpretation of the observed learning behaviour and its effect on academic performance. On the other hand, the analysis of online digital traces of learning activities offers a dynamic account of how students learn in practice in a distance learning environment. As a result, this thesis sheds new light on the implicit process of how learning design influences student engagement in distance education.
By employing a mixed-method research design, I first examined how teachers design for learning using visualisations and network analysis of 37 modules over 30 weeks at The Open University. In the next step, I conducted an in-depth qualitative investigation with 12 teachers into the underlying factors that influenced their design decisions, as well as the perceived barriers and affordances of adopting approaches from the Open University Learning Design Initiative. The findings revealed common patterns as well as variations in learning design across modules and their disciplines of study. Analysis of the interviews revealed underlying tensions between teachersâ autonomy and the influence of management and institutional policies in the design process and the adoption of learning design tools.
After laying out the foundation for understanding the learning design processes, I carried out a large-scale analysis of 37 modules and 45,190 students to examine how learning design influences student engagement, satisfaction, and performance. The findings indicated that learning design explained up to 69% of the variance in student engagement, which was strongly driven by assimilative, assessment, and communication activities. Finally, I conducted a fine-grained analysis exploring the (in)consistencies between learning design and student behaviour and how different engagement patterns impact academic performance. The analysis found misalignments between how teachers designed for learning and how students actually studied. In most weeks, students spent less time studying the assigned materials compared to the number of hours recommended by instructors. High-performing students not only studied 'harder' by spending more time, but also 'smarter' by engaging in a timely manner.
Altogether, this thesis has contributed new scientific insights into the dynamic temporal aspects of how teachers design for learning and the relations between learning design, engagement, and academic performance in distance education. As an implication, the findings reported here demonstrated how learning design could improve the accuracy and interpretability of learning analytics models, and how learning analytics could help teachers identify potential inconsistencies between learning design and student behaviour.
How do students engage with computer-based assessments: impact of study breaks on intertemporal engagement and pass rates
Study breaks and exam revision weeks are increasingly embedded in learning design under the assumption that students will make use of this time to catch up with their study or prepare for upcoming assessment tasks. However, there remains a paucity of empirical evidence evaluating to what extent the implementation of study breaks, preparation, and exam revision weeks impacts students' engagement and academic performance. By applying learning analytics in a Computer-Based Assessment (CBA) setting, this study investigates how study break weeks and assessment preparation weeks impacted the odds of passing a module, using a mixed-effect logistic regression on 123,916 undergraduate students in 205 modules over several semesters from 2015-2017 at the Open University. Furthermore, we investigated the intertemporal characteristics of student engagement during preparation weeks for a final assessment in an Introductory Business course over three semesters. A mixed-effect logistic regression was used to model behavioural engagement of 3,385 students on the VLE (i.e. click counts) over three semesters during the assessment preparation weeks. Our findings indicated a positive association between study breaks and the odds of passing a course, while there was no statistically significant effect in relation to the number of assessment preparation and revision weeks. Analysis of behavioural engagement on the VLE suggested that a higher proportion of students who passed remained active during preparation and exam revision weeks compared to students who failed. Compared to the pass group, the fail group also exhibited a stronger pattern of procrastination. This study offers new insights that could help institutional management and course designers evaluate the efficacy of study breaks and exam preparation weeks in improving student retention.
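The study's model is a mixed-effect logistic regression with module-level effects; as a deliberately simplified, self-contained stand-in, the sketch below fits a plain (non-mixed) logistic regression by gradient ascent on synthetic data and reads off the odds ratio for a study-break indicator. All data and the true effect size are invented:

```python
import numpy as np

# Synthetic data: whether a module includes a study-break week shifts the
# log-odds of passing by a true +0.6 (odds ratio ~1.8). Invented numbers.
rng = np.random.default_rng(1)
n = 4000
has_break = rng.integers(0, 2, n)
X = np.column_stack([np.ones(n), has_break])  # intercept + break indicator
true_beta = np.array([-0.2, 0.6])
p_pass = 1 / (1 + np.exp(-(X @ true_beta)))
passed = rng.random(n) < p_pass

# Maximum-likelihood fit via gradient ascent on the log-likelihood.
beta = np.zeros(2)
for _ in range(1000):
    pred = 1 / (1 + np.exp(-(X @ beta)))
    beta += 0.5 * X.T @ (passed - pred) / n

odds_ratio = np.exp(beta[1])  # odds of passing with vs. without a study break
print(round(odds_ratio, 2))   # recovers something near the true exp(0.6)
```

A mixed-effect version would add a random intercept per module (for example via statsmodels' mixed GLM tooling), which the 205-module analysis requires but this sketch deliberately omits.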
The engagement of mature distance students
This is an Accepted Manuscript of an article published by Taylor & Francis in Higher Education Research and Development in 2013, available online: http://www.tandfonline.com/10.1080/07294360.2013.777036
Scholarly insight Winter 2019: a Data wrangler perspective
Henry Ford famously said that "Any customer can have a car painted any colour that he wants so long as it is black." Similarly, our Prime Minister Theresa May said in 2016 that she aimed for a "red, white and blue Brexit". While the Open University (OU) has been open for 50 years to all learners, we are aware that our students have unique and different learning needs, experiences, and expertise. The OU recognises that we need to carefully listen to our students, and focus on their needs. Nonetheless, in some of our narratives we tend to simplify and generalise these multiple, complex student voices into one common voice. As highlighted in all three chapters in this fifth Scholarly Insight report, working intensively together with the Faculties our Data wranglers have found strong empirical evidence that our students indeed have very unique and distinct voices, which influence their engagement, behaviour, and study success.
In Chapter 1 we worked closely together with the four Faculties to further unpack the qualitative feedback and students' comments of the Student experience on a module (SEAM) survey (e.g., do Open degree students have different narratives when providing feedback; do high performing students 'talk' differently from low performing students). Indeed our text analytics toolkit has highlighted that Open degree students speak differently from others (e.g., needing enough study time). Furthermore, higher achieving students report on different topics (e.g., content, feedback, group) than lower achievers (e.g., help, problem, experience). The OU needs to carefully balance these different voices, as addressing one concern from a high achieving student might not necessarily benefit other students, and vice-versa.
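One minimal version of the frequency comparison behind such text analytics: count how often each word appears in comments from higher- vs lower-achieving students and surface the words most skewed towards each group. The comments below are invented, and this is a crude stand-in for the toolkit's actual methods:

```python
from collections import Counter

# Invented example comments for the two groups of students.
high = "great content and useful feedback on the group work clear feedback good feedback"
low = "needed more help with a problem my experience was that help came late"

def rel_freqs(text):
    """Relative frequency of each word within one group's comments."""
    counts = Counter(text.split())
    total = sum(counts.values())
    return {word: c / total for word, c in counts.items()}

f_high, f_low = rel_freqs(high), rel_freqs(low)
vocab = set(f_high) | set(f_low)
# Positive skew = word is more typical of high achievers' comments.
skew = {w: f_high.get(w, 0.0) - f_low.get(w, 0.0) for w in vocab}

top_high = max(skew, key=skew.get)
print(top_high)
```

At the scale of 2.5 million words one would normally use smoothed log-odds ratios or topic models rather than raw frequency differences, but the underlying comparison is the same.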
Chapter 2 describes three approaches to students selecting different module pathways towards qualification completion. For one Open Degree programme in Creative Writing we find that 268 unique paths are taken by students, whereby some paths are more successful than others. Follow-up analyses in QUAL2F3 indicate substantial differences in pass rates and success depending on the respective route, specialism, and pathways students are taking. Sign-posting these 'successful' paths to OU staff and students may help students to make more informed decisions about what to study next.
Finally, in Chapter 3 we explore how students make timing decisions about when to study for a module, and how so-called study break and assessment preparation weeks could help to provide more flexibility for our students. Study breaks are weeks during which no learning activities are planned or take place, and students are not expected to study for a module. Our big data analyses with 123,916 students and 205 OU modules indicate that the way the OU designs study weeks has a substantial impact on how students study over time. Study break weeks substantially increase students' chances of passing a module, while assessment preparation weeks are not related to pass rates.
We hope that our Scholarly insights can help to inform our staff, but also spark some ideas on how to further improve our understanding of the different student voices and qualification pathways.