Recall or transfer? How assessment types drive text-marking behavior
Introduction: Text marking is a widely used study technique, valued for its simplicity and perceived benefits for recall and comprehension. This exploratory study investigates its role as an encoding mechanism, focusing on how marking affects recall and transfer when learners are oriented toward different posttest items (recall or transfer).

Method: We gathered detailed data describing what learners were studying and how much they marked while studying. Participants were randomly assigned to one of four groups in a 2 × 2 factorial design. One independent variable, examples, determined whether participants were trained using examples of the types of information required to answer posttest items. The other independent variable, orientation, determined whether participants were instructed to prepare for a recall test or for an application (transfer) test.

Results: Statistical analysis revealed a detectable effect of study orientation (transfer vs. recall), F = 2.076, p = 0.043, partial η² = 0.114. Compared to learners oriented to study for recall, learners oriented to study for transfer marked more information identified as examples (F = 3.881, p = 0.051, partial η² = 0.028), main ideas (F = 7.348, p = 0.008, partial η² = 0.051), and reasons (F = 5.440, p = 0.021, partial η² = 0.038). A statistically detectable proportional relationship was also found between total marking and transfer performance (F = 5.885, p = 0.017, partial η² = 0.042): learners who marked more scored higher on transfer questions. Prior knowledge mediated approximately 52% of the effect, indicating that as prior knowledge increased, so did the frequency of marking.

Discussion: Orienting to study for a particular type of posttest item affected studying processes, specifically how much learners marked and the categories of information they marked. Although the frequency of marking was proportional to achievement, orienting to study for recall versus transfer posttest items had no effect on recall or transfer. Prior knowledge powerfully predicted how much learners marked text.
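For readers interpreting the effect sizes reported above, the standard definition of partial eta squared is shown below; this is general statistical background, not a formula stated in the abstract itself.

```latex
\eta_p^2 \;=\; \frac{SS_{\mathrm{effect}}}{SS_{\mathrm{effect}} + SS_{\mathrm{error}}}
         \;=\; \frac{F \cdot df_{\mathrm{effect}}}{F \cdot df_{\mathrm{effect}} + df_{\mathrm{error}}}
```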
Effects of Using a Study Diary on Learners' Motivational Profiles and Metacognition (Poster 17)
Author response for "Automatic identification of knowledge‐transforming content in argument essays developed from multiple sources"
What if learning analytics were based on learning science?
Learning analytics are often formatted as visualisations developed from traced data collected as students study in online learning environments. Optimal analytics inform and motivate students' decisions about adaptations that improve their learning. We observe that designs for learning often neglect theories and empirical findings in learning science that explain how students learn. We present six learning analytics that reflect what is known in six areas (we call them cases) of theory and research findings in the learning sciences: setting goals and monitoring progress, distributed practice, retrieval practice, prior knowledge for reading, comparative evaluation of writing, and collaborative learning. Our designs demonstrate learning analytics can be grounded in research on self-regulated learning and self-determination. We propose designs for learning analytics in general should guide students toward more effective self-regulated learning and promote motivation through perceptions of autonomy, competence, and relatedness.
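As one concrete illustration of the kind of analytic the six cases describe, the sketch below summarizes distributed practice from timestamped study events. The event shape and function names are assumptions made for exposition, not the authors' actual design or data model.

```typescript
// Illustrative sketch only: a distributed-practice summary of the kind a
// learning analytic might visualize, computed from timestamped study events.

interface StudyEvent {
  learnerId: string;
  topicId: string;
  timestamp: Date; // when the learner studied this topic
}

// Describe how a learner's study sessions for one topic are spread over time,
// so a visualization can contrast massed versus distributed practice.
function practiceSpacing(events: StudyEvent[]): { sessions: number; meanGapHours: number } {
  const times = events
    .map(e => e.timestamp.getTime())
    .sort((a, b) => a - b);
  if (times.length < 2) {
    return { sessions: times.length, meanGapHours: 0 };
  }
  // Gaps between consecutive sessions, converted from milliseconds to hours.
  const gaps = times.slice(1).map((t, i) => (t - times[i]) / 3_600_000);
  const meanGapHours = gaps.reduce((sum, g) => sum + g, 0) / gaps.length;
  return { sessions: times.length, meanGapHours };
}
```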
nStudy: Software for Learning Analytics about Processes for Self-Regulated Learning
Data used in learning analytics rarely provide strong and clear signals about how learners process content. As a result, learning as a process is not clearly described for learners or for learning scientists. Gašević, Dawson, and Siemens (2015) urged that data be sought that more straightforwardly describe processes in terms of events within learning episodes. They recommended building on Winne's (1982) characterization of traces — ambient data gathered as learners study that more clearly represent which operations learners apply to which information — and his COPES model of a learning event — conditions, operations, products, evaluations, standards (Winne, 1997). We designed and describe an open source, open access, scalable software system called nStudy that responds to their challenge. nStudy gathers data that trace cognition, metacognition, and motivation as processes that are operationally captured as learners operate on information using nStudy's tools. nStudy can be configured to support learners' evolving self-regulated learning, a process akin to personally focused, self-directed learning science.
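To make the COPES characterization of a traced learning event concrete, here is a minimal data-structure sketch. The field names and the list of operations are assumptions for illustration only, not nStudy's actual schema or API.

```typescript
// Illustrative sketch only: one way to model a traced learning event in COPES
// terms (conditions, operations, products, evaluations, standards).

type Operation = "highlight" | "tag" | "note" | "search" | "quote"; // example operations a study tool might log

interface TraceEvent {
  learnerId: string;
  timestamp: Date;
  conditions: { resourceId: string; taskOrientation?: string }; // context the learner is working in
  operation: Operation;                                         // what the learner did
  targetInformation: string;                                    // which information the operation was applied to
  product?: string;                                             // what the operation created, e.g. the note's text
  evaluation?: { standard: string; met: boolean };              // a judgment of the product against a standard
}

// A trace log is an ordered series of such events within a study session.
type TraceLog = TraceEvent[];
```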
Designs for learning analytics to support information problem solving
The chapter is part of a book that provides a multidisciplinary view into how individuals and groups interact with the information environments that surround them. The book discusses how informational environments shape our daily lives, and how digital technologies can improve the ways in which people make use of informational environments. The chapter focuses on learning designs in education (from an educational psychology perspective) that enhance learning.
Nashaat-Sobhy, N.; Winne, PH.; Vytasek, JM.; Patzak, A.; Rakovic, M.; Marzouk, Z.; Pakdaman-Savoji, A.... (2017). Designs for learning analytics to support information problem solving. In Informational Environments: Effects of Use, Effective Designs. Springer. 249-272. http://hdl.handle.net/10251/20399224927
