Generation of assessment questions from textbooks enriched with knowledge models
Augmenting digital textbooks with assessment material improves their effectiveness as learning tools, but doing so is a laborious task requiring a considerable amount of time and expertise. This paper presents an automated assessment generation tool that works as a component of the Intextbooks platform. Intextbooks extracts fine-grained knowledge models from PDF textbooks and converts them into semantically annotated learning resources. With the help of the developed assessment components, these textbooks become interactive educational tools capable of assessing students' knowledge of relevant concepts. The results of an expert-based pilot evaluation show that the generated questions are properly worded and cover a good range of difficulty. In terms of assessment value, some generated question types fall behind manually constructed assessments, while others obtain comparable results.
EMG-based Feedback Modulation for Increased Transparency in Teleoperation
When interacting with stiff environments through teleoperated systems, time delays cause a mismatch between the haptic feedback and the feedback the operator expects. This mismatch introduces artefacts into the feedback that decrease transparency, but filtering these artefacts out decreases transparency as well. By modelling the operator's stiffness and the expected feedback force with EMG, the artefacts can be selectively filtered without loss of transparency. We developed several feedback modulation techniques to bring the feedback force closer to the expected force: 1) averaging the modelled operator force and the feedback force, 2) a low-pass filter, and 3) a scaling modulation. To control for overdamping, a transparency check is included. We show that the averaging approach yields significantly better contacts than unmodulated feedback, while none of the modulation algorithms differ significantly from unmodulated feedback in transparency.
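The three modulation strategies listed above can be sketched as follows. This is a minimal illustration only: the function names, the exponential-smoothing form assumed for the low-pass filter, and the `alpha` and `gain` parameters are assumptions for the sketch, not the paper's actual implementation.

```python
def average_modulation(f_feedback: float, f_expected: float) -> float:
    # 1) Average of the EMG-modelled (expected) operator force
    #    and the raw haptic feedback force.
    return 0.5 * (f_feedback + f_expected)

def lowpass_modulation(f_feedback: float, prev_output: float,
                       alpha: float = 0.1) -> float:
    # 2) First-order low-pass filter, here written as exponential
    #    smoothing; one call per sample, carrying the previous output.
    return alpha * f_feedback + (1.0 - alpha) * prev_output

def scaling_modulation(f_feedback: float, f_expected: float,
                       gain: float = 0.5) -> float:
    # 3) Scale the feedback toward the expected force: gain = 1
    #    returns the raw feedback, gain = 0 the expected force.
    return f_expected + gain * (f_feedback - f_expected)
```

In each case the modulated force replaces the raw feedback sent to the operator; a transparency check, as described above, would guard against overdamping the contact.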