
    Improving the modelling of response variation in international large-scale assessments

    International large-scale assessments (ILSAs) play a major role in the evaluation of educational systems. These projects are characterized by the standardized assessment of student achievement and the collection of contextual data by means of curriculum, student, teacher, school, and home questionnaires. Together, the resulting high-quality data on student achievement and contextual factors provide great opportunities for more theory-oriented educational effectiveness research, particularly in international contexts. To ensure the validity of analyses based on these data, particularly with respect to measurement invariance across (sub)populations, efforts must be made to evaluate response behaviour across the (sub)populations of interest. A lack of measurement invariance arising from such differences in response behaviour is called differential item functioning (DIF). This thesis presents five studies that contribute to research in the field of education by deploying ILSA data in research areas where the availability of standardized data from multiple countries offers new opportunities. The topics addressed are: computer and information literacy, parental involvement and reading literacy, and language demand in testing mathematics. In addition, each chapter explores methods for identifying and handling potential DIF within the framework of item response theory. The studies in this thesis show how DIF analyses can benefit from the synergy between a methodological focus on validity and a focus on more substantive research questions. More than simply a task to tick off before the "real" questions are investigated, DIF analyses can lead to insights into the effects underlying test results. The studies therefore show how, in research with a substantive interest in comparing groups, the validity of both test and questionnaire items should be examined as an integral part of the methodology. Though no clear-cut one-method-fits-all strategy is presented here, the thesis shows that there are many ways to approach the issue.
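The DIF idea the abstract describes can be made concrete with a small sketch. The Mantel-Haenszel procedure is one classical way to flag a potentially biased item: respondents are stratified by ability (usually a rest-score on the remaining items), and a common odds ratio is pooled across strata; a ratio far from 1 suggests the item functions differently for the two groups at the same ability level. The function name, the synthetic data, and the 0.6 disadvantage factor below are all invented for illustration and are not taken from the thesis.

```python
# Hedged sketch: a minimal Mantel-Haenszel DIF check on synthetic
# item responses. All names and data here are illustrative.
from collections import defaultdict
import random

def mh_odds_ratio(records):
    """Common Mantel-Haenszel odds ratio for one studied item.

    records: iterable of (stratum, group, correct), where group is
    'ref' or 'focal' and correct is 0/1. Strata would normally be
    formed from the rest-score on the remaining items.
    """
    tables = defaultdict(lambda: [[0, 0], [0, 0]])  # stratum -> 2x2 table
    for stratum, group, correct in records:
        row = 0 if group == "ref" else 1
        col = 0 if correct else 1
        tables[stratum][row][col] += 1
    num = den = 0.0
    for (a, b), (c, d) in tables.values():
        n = a + b + c + d
        if n == 0:
            continue
        num += a * d / n  # reference-correct x focal-incorrect
        den += b * c / n  # reference-incorrect x focal-correct
    return num / den if den else float("nan")

random.seed(1)
# Simulate an item that disadvantages the focal group at every
# ability stratum, so the pooled odds ratio should exceed 1.
records = []
for _ in range(5000):
    stratum = random.randint(0, 4)
    group = random.choice(["ref", "focal"])
    p = 0.3 + 0.1 * stratum
    if group == "focal":
        p *= 0.6  # same ability, lower success probability -> DIF
    records.append((stratum, group, int(random.random() < p)))

print(mh_odds_ratio(records))
```

Because the comparison is made within ability strata, an overall performance gap between groups does not by itself produce a large odds ratio; only item-level deviations at equal ability do, which is exactly the distinction DIF analysis is after.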

    Gender differences in computer and information literacy: An exploration of the performances of girls and boys in ICILS 2013

    IEA’s International Computer and Information Literacy Study (ICILS) 2013 showed that, in the majority of the participating countries, 14-year-old girls outperformed boys in computer and information literacy (CIL): a result that seems to contrast with the common view that boys have better computer skills. This study used the ICILS data to explore whether the achievement test addressed distinct dimensions of CIL and, if so, whether girls and boys performed differently on the corresponding subscales. We investigated the hypothesis that gender differences on computer literacy items would be slightly in favour of boys, whereas gender differences on information literacy items would be slightly in favour of girls. Furthermore, we examined whether such differences varied across European countries and whether item bias was present. The data were analysed with a confirmatory factor analysis model, i.e. a multidimensional item response theory model, to identify the subscales, explore gender and national differences, and detect possible item bias. To a large extent, the results support our postulated hypothesis and shed new light on the commonly assumed disadvantaged position of girls and women in the modern information society.
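A minimal sketch of the building block behind the multidimensional IRT model mentioned above: the two-parameter logistic (2PL) item response function. The parameter values and group labels below are invented for illustration; the point is that item bias (DIF) means the same item has different parameters for two groups even at identical ability.

```python
# Hedged sketch: a 2PL item response function, the unidimensional
# building block of multidimensional IRT models. Parameter values
# are invented for illustration only.
import math

def p_correct(theta, a, b):
    """2PL probability of a correct response.

    theta: latent ability (e.g. computer or information literacy)
    a: item discrimination; b: item difficulty.
    """
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

# DIF shows up when, at the SAME theta, the item's parameters
# differ between groups (hypothetical values):
theta = 0.0
p_girls = p_correct(theta, a=1.2, b=-0.3)  # item easier for this group
p_boys = p_correct(theta, a=1.2, b=0.1)    # same item, harder here
print(p_girls, p_boys)
```

In a measurement-invariant test, both groups would share one set of item parameters, and any performance gap would be attributed entirely to the latent ability distributions rather than to the items themselves.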