The effects of item preview on video-based multiple-choice listening assessments

Abstract

Multiple-choice formats remain a popular design for assessing listening comprehension, yet no consensus has been reached on how they should be employed. Some researchers argue that test takers must be provided with a preview of the items prior to the input (Buck, 1995; Sherman, 1997); others argue that a preview may decrease the authenticity of the task by changing the way the input is processed (Hughes, 2003). Using stratified random sampling, higher- and lower-proficiency Japanese university English learners (N = 206) were assigned to one of three test conditions: preview of question stem and answer options (n = 67), preview of question stem only (n = 70), and no preview (n = 69). A two-way ANOVA, with test condition and listening proficiency level as independent variables and score on the multiple-choice listening test as the dependent variable, indicated that the amount of item preview affected test scores but did not affect high- and low-proficiency students’ scores differently. Item-level analysis identified items that were harder or easier than expected under one or more of the conditions, and the researchers posit three possible sources for these unexpected findings: 1) frequency of options in the input, 2) location of item focus, and 3) presence of organizational markers.
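
For readers interested in replicating this type of analysis, the sketch below shows how a two-way ANOVA with test condition and proficiency level as factors might be set up in Python using statsmodels. The file name (listening_scores.csv) and column names (score, condition, proficiency) are illustrative assumptions, not materials from the study itself.

    # Illustrative two-way ANOVA (condition x proficiency) on listening test
    # scores; data file and column names are hypothetical.
    import pandas as pd
    import statsmodels.api as sm
    from statsmodels.formula.api import ols

    # Expected columns: 'score' (test score), 'condition'
    # (full_preview / stem_only / no_preview), 'proficiency' (high / low).
    data = pd.read_csv("listening_scores.csv")

    # Fit a factorial model with both main effects and their interaction.
    model = ols("score ~ C(condition) * C(proficiency)", data=data).fit()

    # Type II sums of squares; the interaction term tests whether preview
    # affects high- and low-proficiency scores differently.
    print(sm.stats.anova_lm(model, typ=2))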