    Briefing Paper: Article Processing Charges


    JISC Preservation of Web Resources (PoWR) Handbook

    A Handbook of Web Preservation produced by the JISC-PoWR project, which ran from April to November 2008. The handbook specifically addresses digital preservation issues that are relevant to the UK HE/FE web management community. The project was undertaken jointly by UKOLN at the University of Bath and the ULCC Digital Archives department.

    Preservation of Web Resources: The JISC PoWR Project

    This paper describes the work of the JISC-funded PoWR (Preservation of Web Resources) project, which is developing a handbook of best practices and advice, aimed at UK higher and further education institutions, for the preservation of websites and web resources. The paper summarises the challenges institutions face in preserving web resources, describes the workshops organised by the project to identify those challenges and appropriate best practices, and outlines areas in which further work is required.

    Final Review of the Exploit Interactive Electronic Magazine


    Delivering Innovative RDM Training: The immersiveInformatics Pilot Programme

    This paper presents the findings, lessons learned and next steps associated with the implementation of the immersiveInformatics pilot: a distinctive research data management (RDM) training programme designed in collaboration between UKOLN Informatics and the Library at the University of Melbourne, Australia. The pilot aimed to equip staff across a broad range of academic and professional roles with RDM skills, as a key element of capacity and capability building within a single institution.

    Leading from the library: data management initiatives at the University of Northampton

    This paper reflects on data management initiatives at the University of Northampton and the central role of the library in pushing forward this work. Over the last two years the University has implemented several tools provided by the UK Digital Curation Centre (DCC) and is now collaborating directly with the DCC to implement its new research data policy and to train librarians to provide research data management (RDM) support. The DCC is engaging with several universities in this way, which places the example of the University of Northampton in a wider UK context.

    D2.2.2 Final Version of the LinkedUp Evaluation Framework

    This document (D2.2.2) describes the LinkedUp consortium's experience in developing and continually improving the LinkedUp Evaluation Framework throughout three open web educational data competitions: Veni, Vidi, Vici. D2.2.2 is the final report on the Evaluation Framework (EF). It synthesises the work already done in the previous WP2 deliverables (D2.1, D2.2.1, D2.3.1, D2.3.2, D2.3.3), reporting on best practices and providing suggestions for improvements and possible adjustments to additional application areas. The initial version of the EF was developed by applying the Group Concept Mapping (GCM) methodology, which used statistical techniques to objectively identify the shared vision of experts in technology-enhanced learning on the criteria and indicators of the EF. The GCM contributed to the construct and content validity of the EF. The first version of the EF was tested during the Learning Analytics and Knowledge Conference 2013 (LAK 13). After each competition round (Veni, Vidi, Vici), the usefulness and ease of use of the EF were tested with a number of experts through a questionnaire and interviews, and the analysis of the data suggested some improvements. In this final report on the EF we summarise the lessons learned and provide six main suggestions for developers of future data competitions:
    1. Designing a data competition starts with a definition of evaluation criteria.
    2. Test the understandability of your evaluation criteria before publishing them.
    3. Do not use a 'not applicable' option for evaluation indicators.
    4. Fewer indicators are preferable.
    5. Unify the scales of the evaluation indicators.
    6. Weighting the most important evaluation criteria can be very informative.
    We finally present the final version of the LinkedUp EF and refer to the LinkedUp toolbox, which provides all lessons learned and further information for organisers of future data competitions.