Distractors and item response theory
The focus of this work is the evaluation and modeling of an educational or psychological test designed to measure an intrinsically unobservable ability, U, which may represent one or more dimensions and is referred to as the latent trait. In particular, there is a clear need for diagnostic tools for assessing the quality of multiple-choice items and, by extension, of tests. A central aim of the work is the development of methods for determining whether a given set of distractors for an item displays appropriate behavior in examinee responses. A nonparametric latent variable structure for multi-categorical item responses is proposed, allowing a more precise specification of appropriate distractor behavior in a variety of circumstances. We study a class of latent variable representations for responses to multiple-choice test items. Under the standard assumptions of conditional independence and monotone item characteristic curves, we consider several possible criteria for good distractors. A criterion for wrong answers, based on simple distractor selection ratios, is proposed and defended. The main result allows the rising-selection-ratios criterion to be tested without first specifying a parametric form for the characteristic curves. A series of examples illustrates the methods. Finally, some study is made of a two-distractor selection ratio in light of several nonparametric assumptions that may be made about it.
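As a rough illustration of the kind of diagnostic this suggests, the sketch below estimates, for each distractor, the share of incorrect responses it attracts within groups of examinees with similar total scores (a crude proxy for the latent trait). The rising-selection-ratios idea could then be probed by inspecting whether these shares move monotonically across score groups. This is a hypothetical sketch under simplifying assumptions, not the abstract's actual nonparametric testing procedure; the function name, grouping scheme, and data layout are all illustrative.

```python
import numpy as np

def selection_ratios(responses, scores, correct, n_groups=5):
    """Estimate distractor selection ratios per score group.

    responses : chosen option per examinee (e.g. 'A', 'B', 'C')
    scores    : total test score per examinee (ability proxy)
    correct   : the keyed (correct) option for this item
    Returns {distractor: [ratio in group 1, ..., ratio in group n_groups]},
    where each ratio is the share of that distractor among wrong answers.
    """
    responses = np.asarray(responses)
    scores = np.asarray(scores)
    # Split examinees into score groups of roughly equal size, low to high.
    order = np.argsort(scores, kind="stable")
    groups = np.array_split(order, n_groups)
    options = sorted(set(responses))
    ratios = {opt: [] for opt in options if opt != correct}
    for g in groups:
        wrong = g[responses[g] != correct]  # examinees who missed the item
        for opt in ratios:
            # Share of this distractor among the group's wrong answers.
            p = float(np.mean(responses[wrong] == opt)) if len(wrong) else np.nan
            ratios[opt].append(p)
    return ratios
```

For example, with low scorers favoring distractor C and mid scorers favoring B, the per-group shares of C would fall while those of B rise, the sort of pattern a criterion on selection ratios is meant to flag or permit.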