A Differential Response Functioning Framework for Understanding Item, Bundle, and Test Bias

Abstract

This dissertation extends the parametric sampling method and area-based statistics for differential test functioning (DTF) proposed by Chalmers, Counsell, and Flora (2016). Measures for differential item and bundle functioning are first introduced as special cases of the DTF statistics. Next, these extensions are presented in concert with the original DTF measures as a unified framework for quantifying differential response functioning (DRF) of items, bundles, and tests. To evaluate the utility of the new family of measures, the DRF framework is compared to the previously established simultaneous item bias test (SIBTEST) and differential functioning of items and tests (DFIT) frameworks. A series of Monte Carlo simulations was designed to estimate the power to detect compensatory and non-compensatory differential effects and to evaluate Type I error control. Benefits inherent to the DRF framework are discussed, extensions are suggested, and alternative methods for generating composite-level sampling variability are presented. Finally, it is argued that the area-based measures in the DRF framework provide an intuitive and meaningful quantification of marginal and conditional response bias over and above what has been offered by the previously established statistical frameworks.