Assessing Information Literacy Skills: A Rubric Approach

Abstract

Academic librarians should explore new approaches to the assessment of information literacy skills. Satisfaction surveys and input/output measures do not provide librarians with adequate information about what students know and can do. Standardized multiple-choice tests and large-scale performance assessments also fail to provide the data librarians need to improve instruction locally. Librarians, facing accountability pressures and seeking to improve student learning, require a new approach to library instruction assessment. This study investigated the viability of a rubric approach to information literacy assessment and examined an analytic information literacy rubric designed to assess students' ability to evaluate website authority. The study addressed three questions: (1) To what degree can different groups of raters provide consistent scoring of student learning artifacts using a rubric? (2) To what degree can raters provide scores consistent with those assigned by the researcher? (3) To what degree can students use authority as a criterion to evaluate websites? The study revealed that multiple raters can use rubrics to score student learning artifacts consistently; however, the rater groups arrived at varying levels of agreement. For example, ENG 101 instructors produced significantly higher reliabilities than NCSU librarians and ENG 101 students, and NCSU librarians produced markedly higher levels of agreement than external instruction and reference librarians. In addition to these findings regarding the five original rater groups, the study documented the emergence of an "expert" rater group, identified through kappa statistics and a "gold standard" approach to the examination of validity. These raters not only approximated the researcher's scores but also achieved higher levels of agreement than any of the five original groups. This finding suggests that librarians may require substantial training to overcome barriers to expert rater status. Finally, the study found that most students can cite specific indicators of authority when evaluating a website, and nearly all can locate and identify these indicators within a website. However, many students have difficulty choosing an appropriate website for a specific assignment and providing a rationale for their choice.
