    Guide on the Side and LibWizard Tutorials side-by-side: How do the two platforms for split-screen online tutorials compare?

    Split-screen tutorials are an appealing and effective way for libraries to create online learning objects in which learners interact with real-time web content. Many libraries use the University of Arizona’s award-winning, open source platform, Guide on the Side; in 2016, Springshare released a proprietary alternative, LibWizard Tutorials. This article reviews the advantages and limitations of this kind of tutorial and examines each platform’s distinctive characteristics. The two platforms create similar split-screen tutorials but differ in ways that affect installation, administration, authoring and editing, student learning, data management, and accessibility. Libraries now have the opportunity to compare alternative platforms and decide which one is best suited to their needs, priorities, and resources.

    Usability evaluation of digital libraries: a tutorial

    This one-day tutorial is an introduction to usability evaluation for digital libraries. In particular, we will introduce claims analysis, an approach that focuses on the designers’ motivations and reasons for making particular design decisions and examines their effect on the user’s interaction with the system. The general approach, as presented by Carroll and Rosson (1992), has been tailored specifically to the design of digital libraries. Digital libraries are notoriously difficult to design well in terms of their eventual usability. In this tutorial, we will present an overview of usability issues and techniques for digital libraries, and a more detailed account of claims analysis, including two supporting techniques: simple cognitive analysis based on Norman’s ‘action cycle’, and scenarios and personas. Through a graduated series of worked examples, participants will get hands-on experience of applying this approach to developing more usable digital libraries. This tutorial assumes no prior knowledge of usability evaluation, and is aimed at all those involved in the development and deployment of digital libraries.
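
    To make the "simple cognitive analysis" concrete, the sketch below walks one digital-library task through Norman's action cycle. This is an illustration, not material from the tutorial itself: the stage names follow Norman (1988), while the task and the per-stage notes are hypothetical examples of what an analyst might record.

    // Illustrative sketch: recording a cognitive walkthrough of one
    // digital-library task against Norman's seven stages of action.
    // Task and notes are hypothetical; stage names follow Norman (1988).
    import java.util.LinkedHashMap;
    import java.util.Map;

    public class ActionCycleWalkthrough {
        // Norman's seven stages, from goal formation through evaluation.
        enum Stage { GOAL, INTENTION, ACTION_SPEC, EXECUTION,
                     PERCEPTION, INTERPRETATION, EVALUATION }

        public static void main(String[] args) {
            // One task ("find a known article by title"), with a note per
            // stage on where the digital-library UI helps or hinders.
            Map<Stage, String> notes = new LinkedHashMap<>();
            notes.put(Stage.GOAL, "Find the full text of a known article");
            notes.put(Stage.INTENTION, "Search the catalogue by title");
            notes.put(Stage.ACTION_SPEC, "Locate the search box; pick the 'title' field");
            notes.put(Stage.EXECUTION, "Type the title and submit the query");
            notes.put(Stage.PERCEPTION, "A results list appears");
            notes.put(Stage.INTERPRETATION, "Is the target article among the hits?");
            notes.put(Stage.EVALUATION, "Does clicking through yield the full text?");

            notes.forEach((stage, note) -> System.out.printf("%-14s %s%n", stage, note));
        }
    }

    A claims analysis would then ask, for each stage where the interface intervenes, what positive and negative consequences the design decision has for the user.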

    The Structured Process Modeling Method (SPMM): what is the best way for me to construct a process model?

    More and more organizations turn to the construction of process models to support strategic and operational tasks. At the same time, reports indicate quality issues for a considerable part of these models, caused by modeling errors. The research described in this paper therefore investigates the development of a practical method to determine and train an optimal process modeling strategy that aims to decrease the number of cognitive errors made during modeling. Such cognitive errors originate in inadequate cognitive processing caused by the inherent complexity of constructing process models. The method helps modelers derive their personal cognitive profile and the related optimal cognitive strategy that minimizes these cognitive failures. The contribution of the research consists of the conceptual method and an automated modeling strategy selection and training instrument. Both artefacts were positively evaluated in a laboratory experiment covering multiple modeling sessions and involving a total of 149 master’s students at Ghent University.

    Processing techniques development

    There are no author-identified significant results in this report.

    "Do screen captures in manuals make a difference?": a comparison between textual and visual manuals

    Examines the use of screen captures in manuals. Three types of manuals were compared: one textual and two visual. The two visual manuals differed in the type of screen capture used: one had screen captures that showed only the relevant part of the screen, whereas the other used captures of the full screen. All manuals contained exactly the same textual information. We examined effects on time for immediate use (use as a job aid) and on learning (use as a teacher). For job-aid purposes, there was no difference between the manuals. For learning, the visual manual with full-screen captures and the textual manual were both better than the visual manual with partial screen captures. We found no effect on user motivation. The tentative conclusion of this study is that screen captures do not seem to be vital for learning or immediate use. If one opts to include screen captures, then the conclusion is that full-screen captures are better than partial ones.

    Assessment of the minimalist approach to computer user documentation

    The minimalist approach (Carroll, 1990a) advocates the development of a radically different type of manual compared to a conventional one. For example, the manual should proceed almost directly to procedural skills development rather than building a conceptual model first. It ought to focus on authentic tasks practised in context, as opposed to mock exercises and isolated practice. In addition, it should stimulate users to exploit their knowledge and thinking, as opposed to imposing the writer’s view and discussing everything that users should see or know.

    In the first part of the paper the construction of a tutorial based on the minimalist principles is described. A parallel is drawn with constructivism, with which minimalism shares important notions of instruction. In the second part, an experiment is described in which the minimal manual was tested against a conventional one. The outcome favoured the new manual. For example, minimal manual users completed about 50% more tasks successfully on a performance test and displayed significantly more self-reliance (e.g. more self-initiated error recoveries and fewer manual consultations).

    Lucene4IR: Developing information retrieval evaluation resources using Lucene

    The workshop and hackathon on developing Information Retrieval Evaluation Resources using Lucene (L4IR) was held on the 8th and 9th of September 2016 at the University of Strathclyde in Glasgow, UK, and funded by the ESF ELIAS Network. The event featured three main elements: (i) a series of keynote and invited talks on industry, teaching and evaluation; (ii) planning, coding and hacking, where a number of groups created modules and infrastructure to use Lucene to undertake TREC-based evaluations; and (iii) a number of breakout groups discussing challenges, opportunities and problems in bridging the divide between academia and industry, and how we can use Lucene for teaching and learning Information Retrieval (IR). The event brought together a mix of academics, experts and students wanting to learn, share and create evaluation resources for the community. The hacking was intense and the discussions lively, creating the basis of many useful tools but also raising numerous issues. It was clear that, by adopting and contributing to the most widely used and supported open source IR toolkit, there were many benefits for academics, students, researchers, developers and practitioners: a basis for stronger evaluation practices, increased reproducibility, more efficient knowledge transfer, greater collaboration between academia and industry, and shared teaching and training resources.
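
    As a flavour of the kind of building block the hackathon groups wrapped into TREC evaluation modules, here is a minimal sketch of indexing and searching with Lucene. It assumes Lucene 6.x-era APIs (current at the time of the 2016 workshop); the topic number, document, and run tag are hypothetical placeholders, and a real module would iterate over a TREC collection and topic file rather than a single hard-coded document.

    // Minimal Lucene sketch: index one document, run one query, and print
    // the hits in rough TREC run-file order (topic, Q0, docno, rank, score, tag).
    import org.apache.lucene.analysis.standard.StandardAnalyzer;
    import org.apache.lucene.document.Document;
    import org.apache.lucene.document.Field;
    import org.apache.lucene.document.TextField;
    import org.apache.lucene.index.DirectoryReader;
    import org.apache.lucene.index.IndexWriter;
    import org.apache.lucene.index.IndexWriterConfig;
    import org.apache.lucene.queryparser.classic.QueryParser;
    import org.apache.lucene.search.IndexSearcher;
    import org.apache.lucene.search.Query;
    import org.apache.lucene.search.ScoreDoc;
    import org.apache.lucene.store.RAMDirectory;

    public class MinimalLuceneDemo {
        public static void main(String[] args) throws Exception {
            StandardAnalyzer analyzer = new StandardAnalyzer();
            RAMDirectory index = new RAMDirectory(); // in-memory index for the sketch

            // Index a single toy document; "docno" mirrors TREC document identifiers.
            try (IndexWriter writer = new IndexWriter(index, new IndexWriterConfig(analyzer))) {
                Document doc = new Document();
                doc.add(new TextField("docno", "FT911-1", Field.Store.YES));
                doc.add(new TextField("text",
                        "information retrieval evaluation with lucene", Field.Store.YES));
                writer.addDocument(doc);
            }

            // Parse a query against the "text" field and print the top 10 hits.
            Query q = new QueryParser("text", analyzer).parse("retrieval evaluation");
            try (DirectoryReader reader = DirectoryReader.open(index)) {
                IndexSearcher searcher = new IndexSearcher(reader);
                ScoreDoc[] hits = searcher.search(q, 10).scoreDocs;
                for (int rank = 0; rank < hits.length; rank++) {
                    Document hit = searcher.doc(hits[rank].doc);
                    // "401" and "sketch" are a placeholder topic number and run tag.
                    System.out.printf("401 Q0 %s %d %.4f sketch%n",
                            hit.get("docno"), rank + 1, hits[rank].score);
                }
            }
        }
    }

    Emitting results directly in the six-column TREC run format is what lets such a module feed standard tools like trec_eval, which is the sense in which Lucene can serve as shared evaluation infrastructure.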