
    Question Generation for Adaptive Assessment for Student Knowledge Modeling in Probabilistic Domains

    Abstract. In this paper, a question generation approach for adaptive assessment is proposed to estimate the student knowledge model in a probabilistic domain within an intelligent tutoring system. Assessment questions are generated adaptively according to the student's knowledge, based on two factors: (i) the misconceptions entailed in the student knowledge model, and (ii) the maximization of information gain. The student model is updated and verified by matching the student's answers against the model answers to the assessment questions. A comparison between the adaptively generated questions and random questions is investigated. Results suggest that using the adaptively generated questions increases the approximation accuracy of the student model by 40% and decreases the number of required assessment questions by 50% compared to using fixed questions.
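
    To make the information-gain criterion concrete, below is a minimal Python sketch of one plausible reading of the approach: each concept carries a Bernoulli belief P(known), a question probing that concept is scored by its expected entropy reduction, and the belief is updated with Bayes' rule after the student's answer is matched against the model answer. The binary knowledge states, the single-concept questions, and the SLIP/GUESS response-noise parameters are all assumptions for illustration; the paper's actual student model and question generator are not specified in the abstract.

    import math

    SLIP, GUESS = 0.1, 0.2  # assumed response-noise parameters (not from the paper)

    def entropy(p: float) -> float:
        """Binary entropy of a Bernoulli(p) belief over 'concept is known'."""
        if p in (0.0, 1.0):
            return 0.0
        return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

    def p_correct(p_known: float) -> float:
        """Marginal probability of a correct answer under the current belief."""
        return p_known * (1 - SLIP) + (1 - p_known) * GUESS

    def posterior(p_known: float, correct: bool) -> float:
        """Bayes update of P(known) after observing the student's answer."""
        like_known = (1 - SLIP) if correct else SLIP
        like_unknown = GUESS if correct else (1 - GUESS)
        num = p_known * like_known
        return num / (num + (1 - p_known) * like_unknown)

    def expected_gain(p_known: float) -> float:
        """Expected entropy reduction from asking a question on this concept."""
        pc = p_correct(p_known)
        h_after = (pc * entropy(posterior(p_known, True))
                   + (1 - pc) * entropy(posterior(p_known, False)))
        return entropy(p_known) - h_after

    def select_concept(beliefs: dict[str, float]) -> str:
        """Choose the concept whose assessment question is most informative."""
        return max(beliefs, key=lambda c: expected_gain(beliefs[c]))

    # Usage: beliefs over three hypothetical concepts; ask about the most
    # informative one, then fold the observed answer back into the model.
    beliefs = {"fractions": 0.5, "decimals": 0.9, "ratios": 0.3}
    concept = select_concept(beliefs)  # "fractions": the most uncertain concept
    beliefs[concept] = posterior(beliefs[concept], correct=False)
    print(concept, round(beliefs[concept], 3))

    Selecting by expected gain rather than at random is what lets an adaptive assessor reach a given model accuracy with fewer questions, which matches the direction of the reported 50% reduction; the misconception-driven factor from the abstract would enter as an additional term or filter in the selection step and is omitted here.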