Effects of self-assessment feedback on self-assessment and task-selection accuracy
Effective self-regulated learning in settings in which students can decide what tasks to work on requires accurate self-assessment (i.e., a judgment of one's own level of performance) as well as accurate task selection (i.e., choosing a subsequent task that fits the current level of performance). Because self-assessment accuracy is often low, task-selection accuracy suffers as well and, consequently, self-regulated learning can lead to suboptimal learning outcomes. Recent studies have shown that training with video modeling examples enhanced self-assessment accuracy on problem-solving tasks, but the training was not equally effective for every student and, overall, there was room for further improvement in self-assessment accuracy. Therefore, we investigated whether training with video examples followed by feedback focused on self-assessment accuracy would improve subsequent self-assessment and task-selection accuracy in the absence of the feedback. Experiment 1 showed, contrary to our hypothesis, that self-assessment feedback led to less accurate future self-assessments. In Experiment 2, we provided students with feedback focused on self-assessment accuracy plus information on the correct answers, or feedback focused on self-assessment accuracy plus the correct answers and the opportunity to contrast those with their own answers. Again, however, we found no beneficial effect of feedback on subsequent self-assessment accuracy. In sum, we found no evidence that feedback on self-assessment accuracy improves subsequent accuracy. Therefore, future research should address other ways of improving accuracy, for instance by taking into account the cues upon which students base their self-assessments.
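As a rough illustration of the two constructs (a minimal sketch under assumed operationalizations and scales, not the measures reported in the study), self-assessment accuracy can be expressed as the deviation between a self-assessed score and the actual score, and task-selection accuracy as whether the chosen next task matches the task warranted by actual performance; all names and values below are hypothetical:

```python
# Minimal sketch of common accuracy measures; the scale (e.g., 0-5 per problem)
# and the names used here are illustrative assumptions, not the study's materials.

def self_assessment_accuracy(self_assessed: float, actual: float) -> dict:
    """Absolute accuracy (unsigned deviation) and bias (signed deviation)."""
    return {
        "absolute_deviation": abs(self_assessed - actual),  # 0 = perfectly accurate
        "bias": self_assessed - actual,  # > 0 overconfidence, < 0 underconfidence
    }

def task_selection_accuracy(chosen_level: int, recommended_level: int) -> int:
    """1 if the chosen next task matches the level warranted by actual performance."""
    return int(chosen_level == recommended_level)

# Example: a student solves 3 of 5 steps correctly but judges it as 5 of 5,
# then selects a more complex task than the actual performance warrants.
print(self_assessment_accuracy(self_assessed=5, actual=3))           # {'absolute_deviation': 2, 'bias': 2}
print(task_selection_accuracy(chosen_level=4, recommended_level=3))  # 0
```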
Training self-assessment and task-selection skills to foster self-regulated learning: Do trained skills transfer across domains?
Students' ability to accurately self-assess their performance and select a suitable subsequent learning task in response is imperative for effective self-regulated learning. Video modeling examples have proven effective for training self-assessment and task-selection skills, and, importantly, such training fostered self-regulated learning outcomes. It is unclear, however, whether trained skills would transfer across domains. We investigated whether skills acquired from training with either a specific, algorithmic task-selection rule or a more general heuristic task-selection rule in biology would transfer to self-regulated learning in math. A manipulation check performed after the training confirmed that both algorithmic and heuristic training improved task-selection skills on the biology problems compared with the control condition. However, we found no evidence that students subsequently applied the acquired skills during self-regulated learning in math. Future research should investigate how to support transfer of task-selection skills across domains.
S-COL: A Copernican turn for the development of flexibly reusable collaboration scripts
Collaboration scripts are usually implemented as parts of a particular collaborative-learning platform. Therefore, scripts of demonstrated effectiveness are hardly used with learning platforms at other sites, and replication studies are rare. The approach of a platform-independent description language for scripts that allows for easy implementation of the same script on different platforms has not succeeded yet in making the transfer of scripts feasible. We present an alternative solution that treats the problem as a special case of providing support on top of diverse Web pages: In this case, the challenge is to trigger support based on the recognition of a Web page as belonging to a specific type of functionally equivalent pages such as the search query form or the results page of a search engine. The solution suggested has been implemented by means of a tool called S-COL (Scripting for Collaborative Online Learning) and allows for the sustainable development of scripts and scaffolds that can be used with a broad variety of content and platforms. The tool's functions are described. In order to demonstrate the feasibility and ease of script reuse with S-COL, we describe the flexible re-implementation of a collaboration script for argumentation in S-COL and its adaptation to different learning platforms. To demonstrate that a collaboration script implemented in S-COL can actually foster learning, an empirical study about the effects of a specific script for collaborative online search on learning activities is presented. The further potentials and the limitations of the S-COL approach are discussed.
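To make the page-recognition idea concrete, here is a minimal sketch of classifying a page as one of a set of functionally equivalent page types and mapping it to a scripted prompt. This is not the S-COL implementation; the class names, URL patterns, and prompt texts are assumptions for illustration only.

```python
# Illustrative sketch of platform-independent page-type recognition, loosely
# modeled on the idea described in the abstract; not the S-COL code itself.
import re
from dataclasses import dataclass
from typing import Optional

@dataclass
class PageTypeRule:
    name: str           # functional page type, e.g., "search_results"
    url_pattern: str    # regex over the URL; a real tool would also inspect the DOM
    script_prompt: str  # collaboration-script support shown on top of the page

RULES = [
    PageTypeRule("search_query_form", r"^https?://[^/]*google\.[a-z]+/?$",
                 "Agree with your partner on search terms before submitting."),
    PageTypeRule("search_results", r"[?&]q=",
                 "Each partner picks one result and justifies the choice with an argument."),
]

def recognize(url: str) -> Optional[PageTypeRule]:
    """Return the first page-type rule whose URL pattern matches, if any."""
    for rule in RULES:
        if re.search(rule.url_pattern, url):
            return rule
    return None

rule = recognize("https://www.google.com/search?q=collaboration+scripts")
if rule:
    print(f"[{rule.name}] {rule.script_prompt}")
```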
Training self-regulated learning skills with video modeling examples: Do task-selection skills transfer?
Self-assessment and task-selection skills are crucial in self-regulated learning situations in which students can choose their own tasks. Prior research suggested that training with video modeling examples, in which another person (the model) demonstrates and explains the cyclical process of problem-solving task performance, self-assessment, and task selection, is effective for improving adolescents' problem-solving posttest performance after self-regulated learning. In these examples, the models used a specific task-selection algorithm in which perceived mental effort and self-assessed performance scores were combined to determine the complexity and support level of the next task, selected from a task database. In the present study we aimed to replicate prior findings and to investigate whether transfer of task-selection skills would be facilitated even more by a more general, heuristic task-selection training than by the task-specific algorithm. Transfer of task-selection skills was assessed by having students select a new task in another domain for a fictitious peer student. Results showed that both heuristic and algorithmic training of self-assessment and task-selection skills improved problem-solving posttest performance after a self-regulated learning phase, as well as transfer of task-selection skills. Heuristic training was not more effective for transfer than algorithmic training. These findings show that example-based self-assessment and task-selection training can be an effective and relatively easy-to-implement method for improving students' self-regulated learning outcomes. Importantly, our data suggest that the effect on task-selection skills may transfer beyond the trained tasks, although future research should establish whether this also applies when trained students perform novel tasks themselves.
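The following sketch illustrates the kind of task-selection algorithm described above, in which self-assessed performance and perceived mental effort are combined into a recommended step through a task database ordered by complexity and support level. It is a hedged illustration only: the scales, weights, clamping, and database layout are assumptions and may differ from those used in the study.

```python
# Hedged sketch of an effort-plus-performance task-selection rule.
# Scales (performance 0-5, effort 1-9), weights, and task levels are illustrative.

def recommend_step(self_assessed_performance: int, mental_effort: int) -> int:
    """Map performance (0-5) and invested effort (1-9) to a step (-2 .. +2) through
    the ordered task levels: high performance with low effort suggests moving up,
    low performance with high effort suggests moving down."""
    performance_band = self_assessed_performance - 2  # roughly -2 .. +3
    effort_band = (5 - mental_effort) // 2            # low effort -> positive
    step = performance_band + effort_band
    return max(-2, min(2, step))

def select_next_task(current_index: int, performance: int, effort: int,
                     task_levels: list[str]) -> str:
    """Pick the next entry from a task database ordered from easiest/most supported
    to hardest/least supported (a hypothetical layout)."""
    new_index = max(0, min(len(task_levels) - 1,
                           current_index + recommend_step(performance, effort)))
    return task_levels[new_index]

# Hypothetical database: complexity level x support level, easiest first.
levels = ["c1_worked_example", "c1_completion", "c1_autonomous",
          "c2_worked_example", "c2_completion", "c2_autonomous"]
print(select_next_task(current_index=2, performance=5, effort=2, task_levels=levels))
# -> "c2_completion": high performance at low effort moves the student up two levels.
```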
E-learning interventions are comparable to user's manual in a randomized trial of training strategies for the AGREE II
Background: Practice guidelines (PGs) are systematically developed statements intended to assist in patient and practitioner decisions. The AGREE II is the revised tool for PG development, reporting, and evaluation, comprised of 23 items, two global rating scores, and a new User's Manual. In this study, we sought to develop, execute, and evaluate the impact of two internet interventions designed to accelerate the capacity of stakeholders to use the AGREE II.
Methods: Participants were randomized to one of three training conditions. 'Tutorial': participants proceeded through the online tutorial with a virtual coach and reviewed a PDF copy of the AGREE II. 'Tutorial + Practice Exercise': in addition to the Tutorial, participants also appraised a 'practice' PG; for the practice PG appraisal, participants received feedback on how their scores compared to expert norms and formative feedback if scores fell outside the predefined range. 'AGREE II User's Manual PDF (control condition)': participants reviewed a PDF copy of the AGREE II only. All participants evaluated a test PG using the AGREE II. Outcomes of interest were learners' performance, satisfaction, self-efficacy, mental effort, time-on-task, and perceptions of the AGREE II.
Results: No differences emerged between training conditions on any of the outcome measures.
Conclusions: We believe these results can be explained by better than anticipated performance of the AGREE II PDF materials (control condition) or the participants' level of health methodology and PG experience rather than the failure of the online training interventions. Some data suggest the online tools may be useful for trainees new to this field; however, this requires further study.