
    Evaluating Digital Libraries: A Longitudinal and Multifaceted View

    Published or submitted for publication.

    Evaluation of Evidence-Based Practices in Online Learning: A Meta-Analysis and Review of Online Learning Studies

    A systematic search of the research literature from 1996 through July 2008 identified more than a thousand empirical studies of online learning. Analysts screened these studies to find those that (a) contrasted an online to a face-to-face condition, (b) measured student learning outcomes, (c) used a rigorous research design, and (d) provided adequate information to calculate an effect size. As a result of this screening, 51 independent effects were identified that could be subjected to meta-analysis. The meta-analysis found that, on average, students in online learning conditions performed better than those receiving face-to-face instruction. The difference between student outcomes for online and face-to-face classes, measured as the difference between treatment and control means divided by the pooled standard deviation, was larger in those studies contrasting conditions that blended elements of online and face-to-face instruction with conditions taught entirely face-to-face. Analysts noted that these blended conditions often included additional learning time and instructional elements not received by students in control conditions. This finding suggests that the positive effects associated with blended learning should not be attributed to the media, per se. An unexpected finding was the small number of rigorous published studies contrasting online and face-to-face learning conditions for K–12 students. In light of this small corpus, caution is required in generalizing to the K–12 population because the results are derived for the most part from studies in other settings (e.g., medical training, higher education).
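    The effect size this abstract describes (treatment mean minus control mean, divided by the pooled standard deviation) is the standardized mean difference commonly known as Cohen's d. A minimal sketch of that computation, using entirely hypothetical scores (the function name and data are illustrative, not from the study):

```python
import math

def cohens_d(treatment, control):
    """Standardized mean difference: (mean_t - mean_c) / pooled SD,
    as defined in the abstract above."""
    n1, n2 = len(treatment), len(control)
    m1 = sum(treatment) / n1
    m2 = sum(control) / n2
    # Sample variances (n - 1 denominator)
    v1 = sum((x - m1) ** 2 for x in treatment) / (n1 - 1)
    v2 = sum((x - m2) ** 2 for x in control) / (n2 - 1)
    pooled_sd = math.sqrt(((n1 - 1) * v1 + (n2 - 1) * v2) / (n1 + n2 - 2))
    return (m1 - m2) / pooled_sd

# Hypothetical exam scores for one study's two conditions
online = [78, 85, 82, 90, 88]
face_to_face = [75, 80, 77, 83, 81]
d = cohens_d(online, face_to_face)
```

    A meta-analysis such as the one described would compute one such d per study and then combine them (typically weighting each by the inverse of its variance).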

    Guide on the Side and LibWizard Tutorials side-by-side: How do the two platforms for split-screen online tutorials compare?

    Split-screen tutorials are an appealing and effective way for libraries to create online learning objects where learners interact with real-time web content. Many libraries are using the University of Arizona’s award-winning, open source platform, Guide on the Side; in 2016, Springshare released a proprietary alternative, LibWizard Tutorials. This article reviews the advantages and limitations of this kind of tutorial. It also examines the differences between each platform’s distinctive characteristics. These platforms create similar split-screen tutorials, but have differences that affect diverse aspects of installation, administration, authoring and editing, student learning, data management, and accessibility. Libraries now have the opportunity to consider and compare alternative platforms and decide which one is best suited to their needs, priorities, and resources.

    What Do Freshmen Really Know about Research? Assess before You Teach

    The article describes an effort to assess the information literacy skills of entering first-year college students. An instrument was developed and information was gathered on students’ experience and comfort in conducting library research as well as their perceived competence with specific information literacy skills. In addition, students completed a skills test to assess specific knowledge and skills relating to information literacy. Entering first-year students generally self-reported their skills to be less than excellent. This finding was supported by the results of the skills test. Strengths and weaknesses in information literacy skills are reported, as well as implications for librarians who assess and teach these skills to students.

    Open Access Scientometrics and the UK Research Assessment Exercise

    Scientometric predictors of research performance need to be validated by showing that they have a high correlation with the external criterion they are trying to predict. The UK Research Assessment Exercise (RAE) -- together with the growing movement toward making the full-texts of research articles freely available on the web -- offers a unique opportunity to test and validate a wealth of old and new scientometric predictors through multiple regression analysis: publications, journal impact factors, citations, co-citations, citation chronometrics (age, growth, latency to peak, decay rate), hub/authority scores, h-index, prior funding, student counts, co-authorship scores, endogamy/exogamy, textual proximity, downloads/co-downloads and their chronometrics, etc. can all be tested and validated jointly, discipline by discipline, against their RAE panel rankings in the forthcoming parallel panel-based and metric RAE in 2008. The weights of each predictor can be calibrated to maximize the joint correlation with the rankings. Open Access Scientometrics will provide powerful new means of navigating, evaluating, predicting and analyzing the growing Open Access database, as well as powerful incentives for making it grow faster.
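    The calibration described above (fitting predictor weights so that the combined score maximizes its correlation with panel rankings) is, in its simplest form, ordinary least squares regression. A minimal sketch with synthetic, purely illustrative data; the predictor names, counts, and weights are assumptions for demonstration, not values from the RAE:

```python
import numpy as np

# Hypothetical data: 40 departments, 3 scientometric predictors
# (e.g. citations, h-index, downloads); y stands in for panel rankings.
rng = np.random.default_rng(0)
X = rng.normal(size=(40, 3))
true_w = np.array([0.6, 0.3, 0.1])            # assumed "true" weights
y = X @ true_w + rng.normal(scale=0.05, size=40)

# Calibrate weights by least squares (design matrix with intercept)
A = np.column_stack([np.ones(len(X)), X])
beta, *_ = np.linalg.lstsq(A, y, rcond=None)

# Joint correlation between the weighted predictor score and the rankings
pred = A @ beta
r = np.corrcoef(pred, y)[0, 1]
```

    In the validation exercise the abstract envisions, this fit would be run discipline by discipline, and the resulting correlation r would measure how well the calibrated metric battery predicts the panel's judgment.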

    Pragmatic meta analytic studies: learning the lessons from naturalistic evaluations of multiple cases

    Get PDF
    This paper explores the concept of pragmatic meta‐analytic studies in eLearning. Much educational technology literature focuses on developers and teachers describing and reflecting on their experiences. Few connections are made between these experiential ‘stories’. The data set is fragmented and offers few generalisable lessons. The field needs guidelines about what can be learnt from such single‐case reports. The pragmatic meta‐analytic studies described in this paper have two common aspects: (1) the cases are related in some way, and (2) the data are authentic, that is, the evaluations have followed a naturalistic approach. We suggest that examining a number of such cases is best done by a mixed‐methods approach with an emphasis on qualitative strategies. In the paper, we review 63 eLearning cases. Three main meta‐analytic strategies were used: (1) meta‐analysis of the perception of usefulness across all cases, (2) meta‐analysis of recorded benefits and challenges across all cases, and (3) meta‐analysis of smaller groups of cases where the learning design and/or use of technology are similar. This study indicated that in Hong Kong the basic and non‐interactive eLearning strategies are often valued by students, while their perceptions of interactive strategies that are potentially more beneficial fluctuate. One possible explanation relates to the level of risk that teachers and students are willing to take in venturing into more innovative teaching and learning strategies.