244,675 research outputs found

    Orchestrating Metadata Enhancement Services: Introducing Lenny

    Harvested metadata often suffers from uneven quality to the point that utility is compromised. Although some aggregators have developed methods for evaluating and repairing specific metadata problems, it has been unclear how these methods might be scaled into services that can be used within an automated production environment. The National Science Digital Library (NSDL), as part of its work with INFOMINE, has developed a model of service interaction that enables loosely-coupled third party services to provide metadata enhancements to a central repository, with interactions orchestrated by a centralized software application.
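
    The orchestration pattern described above (a central application routing repository records through loosely coupled, independently registered enhancement services) can be sketched roughly as follows. This is a minimal illustration under assumed names and record shapes, not the NSDL's actual Lenny code; the class, function, and field names are hypothetical.

        # Hypothetical sketch of the loosely-coupled enhancement pattern described in the
        # abstract above; all names and the record shape are illustrative assumptions.
        from dataclasses import dataclass, field
        from typing import Callable, Dict, List

        @dataclass
        class MetadataRecord:
            identifier: str
            fields: Dict[str, str] = field(default_factory=dict)

        # An enhancement service is anything that takes a record and returns an
        # improved copy; third parties can add new ones without touching the core.
        EnhancementService = Callable[[MetadataRecord], MetadataRecord]

        def normalize_dates(record: MetadataRecord) -> MetadataRecord:
            """Example enhancement: tidy a date field if one is present."""
            if "date" in record.fields:
                record.fields["date"] = record.fields["date"].strip()
            return record

        class Orchestrator:
            """Central application that routes records through registered services."""

            def __init__(self) -> None:
                self.services: List[EnhancementService] = []

            def register(self, service: EnhancementService) -> None:
                self.services.append(service)

            def enhance(self, records: List[MetadataRecord]) -> List[MetadataRecord]:
                # Each record passes through every registered service in turn;
                # services remain independent of one another and of the repository.
                for record in records:
                    for service in self.services:
                        record = service(record)
                return records

        if __name__ == "__main__":
            orchestrator = Orchestrator()
            orchestrator.register(normalize_dates)
            enhanced = orchestrator.enhance([MetadataRecord("oai:example:1", {"date": " 2006 "})])
            print(enhanced[0].fields)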

    Challenges to Teaching Credibility Assessment in Contemporary Schooling

    Part of the Volume on Digital Media, Youth, and Credibility. This chapter explores several challenges to teaching credibility assessment in the school environment. Challenges range from institutional barriers such as government regulation and school policies and procedures to dynamic challenges related to young people's cognitive development and the consequent difficulties of navigating a complex web environment. The chapter includes a critique of current practices for teaching kids credibility assessment and highlights some best practices for credibility education.

    Introduction : user studies for digital library development

    Introductory chapter to the edited collection on user studies in digital library development. Contains a general introduction to the topic and biographical sketches of the contributors. Peer-reviewed.

    What is usability in the context of the digital library and how can it be measured?

    This paper reviews how usability has been defined in the context of the digital library and which methods have been applied, considers their applicability, and proposes an evaluation model and a suite of instruments for evaluating the usability of academic digital libraries. The model examines effectiveness, efficiency, satisfaction, and learnability. It finds an interlocking relationship among effectiveness, efficiency, and satisfaction, and also examines how learnability interacts with these three attributes.

    Evaluating Digital Libraries: A Longitudinal and Multifaceted View

    Published or submitted for publication.

    A global approach to digital library evaluation towards quality interoperability

    This paper describes some of the key research works related to my PhD thesis. The goal is the development of a global approach to digital library (DL) evaluation towards quality interoperability. DL evaluation has a vital role to play in building DLs, and in understanding and enhancing their role in society. Responding to two parallel research needs, the project is grouped around two tracks. Track one covers the theoretical approach and provides an integrated evaluation model which overcomes the fragmentation of quality assessments; track two covers the experimental side, which has been undertaken through a comparative analysis of different DL evaluation methodologies, relating them to the conceptual framework. After presenting the problem definition, current background, and related work, this paper enumerates a set of research questions and hypotheses that I would like to address, and outlines the research methodology, focusing on a proposed evaluation framework and on the lessons learned from the case studies.

    Digital service analysis and design : the role of process modelling

    Digital libraries are evolving from content-centric systems to person-centric systems. Emergent services are interactive and multidimensional, and the associated systems multi-tiered and distributed. A holistic perspective is essential to their effective analysis and design, for beyond technical considerations there are complex social, economic, organisational, and ergonomic requirements and relationships to consider. Such a perspective cannot be gained without direct user involvement, yet evidence suggests that development teams may be failing to engage effectively with users, relying instead on requirements derived from anecdotal evidence or prior experience. In such instances, there is a risk that services might be well designed but functionally useless. This paper highlights the role of process modelling in gaining such a perspective. Process modelling challenges, approaches, and success factors are considered and discussed with reference to a recent evaluation of the usability and usefulness of a UK National Health Service (NHS) digital library. Reflecting on lessons learnt, recommendations are made regarding the appropriate process modelling approach and its application.

    Evaluating complex digital resources

    Squires (1999) discussed the gap between the HCI (Human Computer Interaction) and educational computing communities in their very different approaches to evaluating educational software. This paper revisits that issue in the context of evaluating digital resources, focusing on two approaches to evaluation: an HCI perspective and an educational perspective. Squires and Preece's HCI evaluation model is a predictive model: it helps teachers decide whether or not to use educational software, whereas our own concern is with evaluating the use of learning technologies. It is suggested that the different approaches of the two communities relate in part to the different focus that each takes: in HCI the focus is typically on development and hence usability, whilst in education the concern is with learner and teacher use.