1,060 research outputs found

    Formalising information skills training within the curriculum: a research project at Southampton Solent University

    In an increasingly competitive graduate market, information literacy (IL) has gained importance as students progress through university and prepare for employment. The aim of the study was to evaluate the Information Literacy Test (ILT) developed by James Madison University (JMU). Eighty-nine level four students from the Faculty of Business, Sport and Enterprise completed the ILT, and student impressions of the test were obtained upon completion. The mean test score (mean ± SD) was 56 ± 15%. Analysis suggested that standards 2 and 5 were areas of particular concern. Student feedback indicated that the question format and layout were popular, although subject-specific questions were preferred and the number of test questions should be reduced. While the ILT was comprehensive, the format of the test and the language used were possibly not well suited to UK HE institutions. The research team therefore plans to formulate a Solent ILT based on the SCONUL seven pillars.

    LSDA responds: towards a unified e-learning strategy


    Adaptation, translation, and validation of information literacy assessment instrument

    The assessment of information literacy (IL) at the school level depends mainly on measurement tools developed in the Western world. These tools need to be carefully adapted, and in most cases translated, before they can be used in other cultures, languages, and countries. To date there have been no standard guidelines for adapting these tools; hence their results can be generalized across cultures only to a limited extent. Furthermore, most data analyses produce generic outcomes that take no account of student ability or of the difficulty of the test items. The present study proposes a systematic approach to the context adaptation and language translation of a preexisting IL assessment tool, TRAILS-9, for use in different languages and contexts, particularly a Malaysian public secondary school. The study then applies a less common psychometric approach, Rasch analysis, to validate the adapted instrument. This technique produces a hierarchy of item difficulty within the assessment domain, enabling students' ability levels to be differentiated against item difficulty. The recommended scale adaptation guidelines can reduce the misinterpretation of scores from instruments in multiple languages and contribute to the parallel development of IL assessment among secondary school students from different populations.
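    The core idea behind Rasch item calibration, that each item receives a difficulty on a shared logit scale so harder items rank above easier ones, can be illustrated with the simple PROX/logit approximation. This is a minimal sketch, not the estimation procedure used in the TRAILS-9 study; the function name and toy response matrix are invented for illustration.

```python
import math

def prox_item_difficulties(responses):
    """Rough Rasch item difficulties via the PROX/logit approximation:
    each item's difficulty is the logit of its proportion *incorrect*,
    centred so the estimates sum to zero (the usual Rasch convention)."""
    n_persons = len(responses)
    n_items = len(responses[0])
    difficulties = []
    for i in range(n_items):
        p = sum(row[i] for row in responses) / n_persons  # proportion correct
        p = min(max(p, 1e-6), 1 - 1e-6)  # guard against perfect 0/1 items
        difficulties.append(math.log((1 - p) / p))
    mean_d = sum(difficulties) / n_items
    return [d - mean_d for d in difficulties]

# Toy response matrix: 4 students x 3 items (1 = correct answer)
data = [[1, 1, 0],
        [1, 0, 0],
        [1, 1, 1],
        [0, 1, 0]]
print(prox_item_difficulties(data))  # item 3 (fewest correct) scores hardest
```

    The resulting logit hierarchy is what allows student ability and item difficulty to be compared on the same scale.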

    Initial Development of a Medical Information Literacy Questionnaire

    Originating from the field of library science, information literacy (IL) is defined as a broad set of skills and abilities necessary to locate, evaluate and use information ethically and legally. This important skill set is incorporated into general competency requirements for postgraduate residency programs; however, no standardized instrument currently exists to measure resident physician IL knowledge and skills. This study addresses that gap by developing and pilot testing an instrument aimed at measuring information literacy competence in resident physicians. The author constructed a questionnaire of sixty-nine multiple-choice items to assess skills covering five IL domains. Using the Content Validity Ratio (CVR) methodology, validity evidence for the test content was evaluated by a panel of twenty physicians and five health sciences librarians. A draft instrument was administered to a convenience sample of resident physicians at the University of New Mexico. Psychometric properties of the test scores were evaluated using item analyses, and the resulting data were used to guide the item retention process. Each item was reviewed for its corrected item-total correlation, to gauge the level of item discrimination, and its P-value, for item difficulty. Cronbach's alpha-if-item-deleted, the CVR scores established by the validity panel, and the test blueprint were also considered. Based on the analyses, 32 items (46%) were eliminated from the original pool of 69, resulting in a revised instrument containing 37 items. This study adds to the knowledge base of information literacy and graduate medical education assessment and continues the effort toward creating effective measurement tools in library science and physician education.
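    The three item-retention statistics named above (Cronbach's alpha, corrected item-total correlation, and Lawshe's CVR) have short standard formulas. The sketch below shows textbook versions of each on an invented toy score matrix; it is not the study's actual analysis code.

```python
from statistics import mean, pvariance

def cronbach_alpha(matrix):
    """Cronbach's alpha: k/(k-1) * (1 - sum of item variances / total variance)."""
    k = len(matrix[0])
    item_vars = [pvariance([row[i] for row in matrix]) for i in range(k)]
    totals = [sum(row) for row in matrix]
    return k / (k - 1) * (1 - sum(item_vars) / pvariance(totals))

def corrected_item_total(matrix, i):
    """Pearson r between item i and the total of the *other* items
    (the 'corrected' part avoids the item correlating with itself)."""
    item = [row[i] for row in matrix]
    rest = [sum(row) - row[i] for row in matrix]
    mi, mr = mean(item), mean(rest)
    cov = sum((a - mi) * (b - mr) for a, b in zip(item, rest))
    sd = (sum((a - mi) ** 2 for a in item)
          * sum((b - mr) ** 2 for b in rest)) ** 0.5
    return cov / sd

def content_validity_ratio(n_essential, n_panelists):
    """Lawshe's CVR: (n_e - N/2) / (N/2), ranging from -1 to +1."""
    half = n_panelists / 2
    return (n_essential - half) / half

# Toy data: 4 examinees x 3 dichotomous items
data = [[1, 1, 1], [1, 0, 1], [0, 1, 0], [0, 0, 0]]
print(cronbach_alpha(data))
print(corrected_item_total(data, 0))
print(content_validity_ratio(20, 25))  # e.g. 20 of 25 panelists rate an item essential
```

    In practice, items with a low corrected item-total correlation, a low CVR, or an alpha-if-item-deleted above the full-scale alpha are candidates for removal.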

    Information Literacy Assessment: A Review of Objective and Interpretive Measures

    Information literacy has been recognized as a critical skill by professional associations and regional accrediting bodies. Consequently, institutions are increasingly integrating information literacy instruction into the academic curriculum, in turn creating the need to assess instructional impact. However, information literacy is a relatively new concept, and credible assessment tools are only now forthcoming. This paper summarizes several recently released information literacy assessment tools, including three instruments that measure cognitive knowledge of information literacy skills at the general education level and a test that measures knowledge of information sources and structures pertinent to the field of education. Information literacy has roots in library instruction, and two techniques derived from bibliometrics, a library and information science research method, are also presented.

    Factors influencing the Information Literacy of Students: Preliminary Analysis

    Our changing society is forcing higher education to improve its teaching practices so as to raise students' level of information literacy (IL). IL is necessary not only for education but is also a skill needed for successful engagement in professional and private life. An IL test and a survey on information and communication technology (ICT) usage were conducted among students from seven different faculties in Slovenia. This research in progress presents a preliminary analysis of the IL testing and ICT usage among students, in order to propose a model of the factors influencing the level of students' IL skills. According to the results there are differences in IL, but they do not depend on the student's faculty of origin. Usage of ICT devices and applications could be an appropriate predictor of IL.

    The Relationship between Self-Directed Learning and Information Literacy among Adult Learners in Higher Education

    The purpose of this study was to investigate the relationship between self-directed learning and information literacy. Participants completed the Personal Responsibility Orientation to Self-Direction in Learning Scale ([PRO-SDLS], Stockdale, 2003) and the Information Literacy Test ([ILT], James Madison University, 2003). The PRO-SDLS is a self-report scale consisting of 25 statements about self-directed learning preferences in college classrooms. The ILT is a 60-item multiple-choice test that assesses the information literacy skills of college students. Correlation, ANOVA, and multiple regression were used to test relationships and differences between self-directed learning and information literacy. Despite claims that teaching information literacy creates self-directed learners, composite scores on the PRO-SDLS and the ILT indicated that no statistically significant relationship exists. Likewise, no statistically significant differences were found between the scores of bachelor's, master's, and doctoral level participants. While composite scores on the PRO-SDLS did not predict scores on the ILT, there was a negative, statistically significant relationship between the Initiative factor of the PRO-SDLS and the ACRL (2000) Information Literacy Competency Standard 5 (Ethics and Understanding) sub-scale of the ILT. Implications for practice and suggestions for further research are proposed, along with discussions and conclusions.

    Why would they try? Motivation and motivating in low-stakes information skills testing

    The University of Nevada Las Vegas (UNLV) University Libraries piloted the Educational Testing Service's standardised test of information, communication, and technology (ICT) skills (iSkills) in spring and autumn 2008. In the course of administering the test we explored motivational strategies, a critical component in low-stakes, low-personal-consequences testing. These strategies included providing feedback on test performance, highlighting the value of the test for the individual student, and appealing to the student's willingness to improve the overall performance of the institution. We addressed ways to motivate students in order to enhance their level of participation in and performance on the test. As the use of standardised testing to benchmark student information skills is increasing within the information literacy community, it is vital to address these motivational aspects to ensure the generation of reliable data. This article describes the strategies and language the University Libraries used to convey value and stimulate interest; it also provides feedback from test-takers on why they tried to do their best on the test.

    Multiple Partnerships for Student Information Literacy: Library, Writing Center, Faculty, and Administrators

    In May 2007, a University of Central Florida regional campus team composed of teaching faculty, librarians, administrators, and writing center coordinators received a three-year Quality Enhancement Plan grant to study the impact of a library/writing center partnership on student information literacy. This presentation will share the project's results and benefits. Using the ACRL Information Literacy Standards, the team developed modifications and interventions designed to improve students' ability to gather, evaluate, and use information, and to enhance their technology literacy and critical thinking. The project's development included ongoing discussions of progress, obstacles, program collaboration, and a single location for services. Targeted student interventions included group workshops and one-on-one writing center/librarian sessions. The James Madison University Information Literacy Test, a research paper evaluation, and a student perception survey were used for assessment. Benefits included enhanced academic collaboration and the establishment and expansion of a successful writing center. The results should have broad application for other institutions.