Evaluating a Learned Admission-Prediction Model as a Replacement for Standardized Tests in College Admissions
The growing number of college applications presents an annual challenge for college admissions in the United States. Admission offices have historically relied on standardized test scores to organize large applicant pools into viable subsets for review. However, this approach may be subject to bias in test scores and to selection bias in test-taking amid recent trends toward test-optional admission. We explore a machine learning-based approach to replace the role of standardized tests in subset generation while taking into account a wide range of factors extracted from student applications to support a more holistic review. We evaluate the approach on data from an undergraduate admission office at a selective US institution (13,248 applications). We find that a prediction model trained on past admission data outperforms an SAT-based heuristic and matches the demographic composition of the last admitted class. We discuss the risks and opportunities for how such a learned model could be leveraged to support human decision-making in college admissions.
Comment: In Proceedings of the ACM Conference on Learning at Scale (L@S) 202
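For illustration, a learned subset-generation step of the kind this abstract describes could be prototyped roughly as follows. Everything in this sketch (the logistic-regression model, the two features, the synthetic labels, and the 30% review budget) is an assumption made for the example; the paper's actual features, model, and evaluation are not reproduced here.

```python
# Illustrative sketch only: the model, features, thresholds, and synthetic
# data below are assumptions for the example, not the paper's actual setup.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 13_248  # applicant pool size reported in the abstract

# Synthetic stand-in features: SAT score and GPA.
sat = rng.normal(1200, 150, n)
gpa = rng.normal(3.4, 0.4, n)
X = np.column_stack([sat, gpa])

# Synthetic past admit/deny labels that depend on both features plus noise.
y = (sat + 400 * gpa + rng.normal(0, 200, n) > 2850).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Model-based subset generation: rank applicants by predicted admit probability.
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
scores = model.predict_proba(X_test)[:, 1]
k = int(0.3 * len(scores))  # shortlist the top 30% (arbitrary review budget)
model_subset = np.argsort(-scores)[:k]

# SAT-based heuristic baseline: shortlist the top k applicants by SAT alone.
sat_subset = np.argsort(-X_test[:, 0])[:k]

# Share of eventual admits captured by each shortlist (recall).
print("model shortlist recall:", y_test[model_subset].sum() / y_test.sum())
print("SAT shortlist recall:  ", y_test[sat_subset].sum() / y_test.sum())
```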
Estimating peer effects in networks with peer encouragement designs
Peer effects, in which the behavior of an individual is affected by the behavior of their peers, are central to social science. Because peer effects are often confounded with homophily and common external causes, recent work has used randomized experiments to estimate effects of specific peer behaviors. These experiments have often relied on the experimenter being able to randomly modulate mechanisms by which peer behavior is transmitted to a focal individual. We describe experimental designs that instead randomly assign individuals’ peers to encouragements to behaviors that directly affect those individuals. We illustrate this method with a large peer encouragement design on Facebook for estimating the effects of receiving feedback from peers on posts shared by focal individuals. We find evidence for substantial effects of receiving marginal feedback on multiple behaviors, including giving feedback to others and continued posting. These findings provide experimental evidence for the role of behaviors directed at specific individuals in the adoption and continued use of communication technologies. In comparison, observational estimates differ substantially, both underestimating and overestimating effects, suggesting that researchers and policy makers should be cautious in relying on them.
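As a toy illustration of the encouragement-design logic, the sketch below simulates randomly encouraging peers and recovers the effect of actually receiving feedback with a simple Wald (instrumental-variables) ratio. The simulated data, variable names, and effect sizes are assumptions for the example, not the paper's data or estimator.

```python
# Minimal sketch of an encouragement-design (instrumental-variables) estimate.
# The simulated data and the simple Wald estimator are illustrative
# assumptions; they are not the paper's actual analysis.
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

# Z: random encouragement assigned to an individual's peers.
Z = rng.integers(0, 2, n)

# D: whether the focal individual actually receives peer feedback.
# Encouraged peers give feedback more often (the first stage).
D = (rng.random(n) < 0.2 + 0.3 * Z).astype(int)

# Y: the focal individual's later behavior (e.g., continued posting),
# causally affected by receiving feedback with a true effect of 0.1.
Y = (rng.random(n) < 0.3 + 0.1 * D).astype(float)

# Intention-to-treat effect and first-stage effect of the encouragement.
itt = Y[Z == 1].mean() - Y[Z == 0].mean()
first_stage = D[Z == 1].mean() - D[Z == 0].mean()

# Wald estimator: effect of receiving feedback among compliers.
print("estimated effect of receiving feedback:", itt / first_stage)
```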
Investigating Variation in Learning Processes in a FutureLearn MOOC
Studies on engagement and learning design in Massive Open Online Courses (MOOCs) have laid the groundwork for understanding how people learn in this relatively new type of informal learning environment. To advance our understanding of how people learn in MOOCs, we investigate the intersection between learning design and the temporal process of engagement in the course. This study investigates the detailed processes of engagement using educational process mining (EPM) in a FutureLearn science course (N = 2086 learners) and applying an established taxonomy of learning design to classify learning activities. The analyses were performed on three groups of learners categorised based upon their clicking behaviour. The process-mining results show at least one dominant pathway in each of the three groups, though multiple popular additional pathways were identified within each group. All three groups remained interested and engaged in the various learning and assessment activities. The findings from this study suggest that in the analysis of voluminous MOOC data there is value in first clustering learners and then investigating detailed progressions within each cluster that take the order and type of learning activities into account. The approach is promising because it provides insight into variation in behavioural sequences based on learners’ intentions for earning a course certificate. These insights can inform the targeting of analytics-based interventions to support learners and inform MOOC designers about adapting learning activities to different groups of learners based on their goals.
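The two-step analysis described here (cluster learners by clicking behaviour, then mine activity pathways within each cluster) can be sketched roughly as below. The tiny event log, activity names, and cluster count are invented for the example; the study's FutureLearn data and its specific EPM tooling are not shown.

```python
# Illustrative sketch: (1) cluster learners by overall clicking behaviour,
# then (2) mine the dominant activity-to-activity transitions per cluster.
# The event log and parameters are assumptions, not the study's data.
from collections import Counter, defaultdict
import numpy as np
from sklearn.cluster import KMeans

# Hypothetical event log: learner id -> ordered list of activities.
logs = {
    "a": ["video", "article", "quiz", "discussion"],
    "b": ["video", "quiz", "video", "quiz"],
    "c": ["article", "article", "discussion"],
    "d": ["video", "article", "quiz"],
}
activities = sorted({act for seq in logs.values() for act in seq})

# Step 1: represent each learner by activity counts and cluster them.
def counts(seq):
    c = Counter(seq)
    return [c[a] for a in activities]

learners = list(logs)
X = np.array([counts(logs[l]) for l in learners])
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)

# Step 2: within each cluster, count directly-follows transitions
# (the basis of a simple process map).
transitions = defaultdict(Counter)
for learner, label in zip(learners, labels):
    seq = logs[learner]
    for a, b in zip(seq, seq[1:]):
        transitions[label][(a, b)] += 1

for label, counter in sorted(transitions.items()):
    print(f"cluster {label}:", counter.most_common(3))
```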
Effects of Automated Interventions in Programming Assignments: Evidence from a Field Experiment
A typical problem in MOOCs is the missing opportunity for course conductors to individually support students in overcoming their problems and misconceptions. This paper presents the results of automatic interventions for struggling students during programming exercises, offering peer feedback and tailored bonus exercises. To improve learning success, we do not want to abolish instructionally desired trial and error, but rather reduce extensive struggle and demotivation. Therefore, we developed adaptive, automatic, just-in-time interventions that encourage students to ask for help if they require considerably more than the average working time to solve an exercise. Additionally, we offered students bonus exercises tailored to their individual weaknesses. The approach was evaluated within a live course with over 5,000 active students via a survey and metrics gathered alongside. Results show that we can increase calls for help by up to 66% and reduce the dwelling time before action is taken. Findings from the experiments can further be used to pinpoint course material that needs improvement and to tailor content to specific audiences.
Comment: 10 pages
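A minimal sketch of the kind of working-time-based trigger this abstract describes is given below. The 2x-average threshold, the data structures, and the message wording are illustrative assumptions, not the study's actual intervention rules.

```python
# Sketch of a just-in-time intervention trigger: if a student's working time
# on an exercise far exceeds the average, offer help (e.g., peer feedback)
# or a tailored bonus exercise. All thresholds and names are assumptions.
from dataclasses import dataclass

@dataclass
class WorkingTimeStats:
    exercise_id: str
    mean_seconds: float  # average working time across prior students

def should_intervene(elapsed_seconds: float,
                     stats: WorkingTimeStats,
                     multiplier: float = 2.0) -> bool:
    """Trigger an intervention when a student needs considerably more
    than the average working time for this exercise."""
    return elapsed_seconds > multiplier * stats.mean_seconds

def intervention_message(stats: WorkingTimeStats) -> str:
    return (f"You have been working on {stats.exercise_id} for a while. "
            "Would you like to request help from a peer, or try a "
            "shorter bonus exercise on the same topic?")

# Example: a student has spent 25 minutes on an exercise that usually
# takes about 10 minutes.
stats = WorkingTimeStats("loops_basics", mean_seconds=600)
if should_intervene(elapsed_seconds=1500, stats=stats):
    print(intervention_message(stats))
```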
Are MOOC learning designs culturally inclusive (enough)?
Background: Extensive research on massive open online courses (MOOCs) has focused on analysing learners' behavioural trace data to understand navigation and activity patterns, which are known to vary systematically across geo-cultural contexts. However, the perception of learners regarding the role of different learning design elements in sustaining their engagement in the course is still unclear. Objectives: This study aimed to examine learners' perceptions of learning design elements in MOOCs and explore the ways in which these perceptions differ between geo-cultural contexts. Methods: We conducted interviews with 22 learners from seven geo-cultural regions to gather insights into their learning design preferences. Results: Our findings indicate that learners from regions such as South Asia exhibit a strong inclination towards video-based content and a lesser preference for reading textual resources. In contrast, learners from Anglo-Saxon regions demonstrate a high preference for reading texts such as articles and video transcripts. Conclusion: The observed variations in self-reported interest in various learning design elements raise intriguing questions about the nature and extent of participation of various geo-cultural groups. This study underscores the need to develop inclusive MOOC designs and to implement learning analytics approaches that adapt to the cultural preferences of learners.
Aligning the Goals of Learning Analytics with its Research Scholarship: An Open Peer Commentary Approach
To promote cross-community dialogue on matters of significance within the field of learning analytics, we as editors-in-chief of the Journal of Learning Analytics have introduced a section for papers that are open to peer commentary. The first of these papers, “A LAK of Direction: Misalignment Between the Goals of Learning Analytics and its Research Scholarship” by Motz et al. (2023), appeared in the journal’s early access section in March 2023, a few days before the start of the 13th International Learning Analytics and Knowledge Conference (LAK ’23). “A LAK of Direction” takes as its starting point the definition of learning analytics used in the call for papers of the first LAK conference (LAK ’11) and used since then by the Society for Learning Analytics Research (SoLAR): “Learning analytics is the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs” (Long & Siemens, 2011, p. 24). Following the conference, an invitation to submit proposals for commentaries on the paper was released, and 12 of these proposals were accepted. This paper brings those commentaries together.
Aligning the Goals of Learning Analytics with its Research Scholarship: An Open Peer Commentary Approach
To promote cross-community dialogue on matters of significance within the field of learning analytics (LA), we as editors-in-chief of the Journal of Learning Analytics (JLA) have introduced a section for papers that are open to peer commentary. An invitation to submit proposals for commentaries on the paper was released, and 12 of these proposals were accepted. The 26 authors of the accepted commentaries are based in Europe, North America, and Australia. They range in experience from PhD students and early-career researchers to some of the longest-standing, most senior members of the learning analytics community. This paper brings those commentaries together, and we recommend reading it as a companion piece to the original paper by Motz et al. (2023), which also appears in this issue.