
    Metadata quality issues in learning repositories

    Metadata lies at the heart of every digital repository project: it defines and drives the description of the digital content stored in repositories. Metadata allows content to be successfully stored, managed and retrieved, and also preserved in the long term. Despite the widely recognized importance of metadata in digital repositories, studies indicate that metadata quality is relatively low in most digital repositories. Metadata quality is loosely defined as "fitness for purpose", meaning that low-quality metadata cannot fulfill its purpose, which is to allow the successful storage, management and retrieval of resources. In practice, low metadata quality leads to ineffective searches that recall the wrong resources or, even worse, no resources at all, making content invisible to the intended user, the "client" of each digital repository. The present dissertation approaches this problem by proposing a comprehensive metadata quality assurance method, the Metadata Quality Assurance Certification Process (MQACP). The basic idea of this dissertation is to propose a set of methods that can be deployed throughout the lifecycle of a repository to ensure that the metadata generated by content providers is of high quality. These methods have to be straightforward and simple to apply, with measurable results. They also have to be adaptable with minimum effort so that they can easily be used in different contexts. This set of methods is described analytically, taking into account the actors needed to apply them, the tools required and the anticipated outcomes. To test our proposal, we applied it to a Learning Federation of repositories, from the first day of its existence until it reached maturity and regular operation.
We supported the metadata creation process throughout the different phases of the repositories involved by setting up specific experiments using the methods and tools of the MQACP. In each phase, we measured the resulting metadata quality to certify that the anticipated improvement actually took place. We also measured the cost of applying the MQACP across these phases, to provide a comparison basis for future applications. Based on the success of this first application, we validated the MQACP approach by applying it to two further cases, a Cultural and a Research Federation of repositories. This allowed us to demonstrate the transferability of the approach to cases that share some similarities with the initial one but also differ significantly. The results showed that the MQACP was successfully adapted to the new contexts with minimal changes needed, producing similar results at comparable costs. In addition, by looking closely at the common experiments carried out in each phase of each use case, we were able to identify interesting patterns in the behavior of content providers that can be researched further. The dissertation concludes with a set of future research directions that emerged from the cases examined. These directions can be explored to support the next version of the MQACP in terms of the methods deployed, the tools used to assess metadata quality, and the cost analysis of the MQACP methods.
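The abstract above does not specify how metadata quality was scored, but a common ingredient of such assessments is a completeness metric: the fraction of expected fields a record actually fills in. The sketch below is a minimal, hypothetical illustration of that idea (the field names and weighting are assumptions, not the MQACP's actual metrics):

```python
# A minimal sketch of a metadata completeness metric, one common building
# block of metadata quality assessment. Field names here are illustrative
# assumptions, not the fields or metrics used by the MQACP itself.

EXPECTED_FIELDS = ["title", "description", "keywords", "language", "rights"]

def completeness(record: dict) -> float:
    """Return the share of expected fields that are present and non-empty."""
    filled = sum(1 for field in EXPECTED_FIELDS if record.get(field))
    return filled / len(EXPECTED_FIELDS)

record = {"title": "Photosynthesis 101", "description": "Intro lesson", "keywords": []}
print(round(completeness(record), 2))  # 2 of 5 fields filled -> 0.4
```

Real quality frameworks typically combine several such dimensions (completeness, accuracy, consistency, conformance to a vocabulary), but each reduces to a per-record score that can be tracked across repository phases, as the dissertation describes.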

    ERAWATCH Country Reports 2013: Bulgaria

    The Analytical Country Reports analyse and assess in a structured manner the evolution of national research and innovation policy in the perspective of the wider EU strategy and goals, with a particular focus on the performance of the national research and innovation (R&I) system, the broader policy mix and governance. The 2013 edition of the Country Reports highlights national policy and system developments occurring since late 2012 and assesses, through dedicated sections: national progress in addressing research and innovation system challenges; national progress in addressing the 5 ERA priorities; progress at Member State level towards achieving the Innovation Union; the status and relevant features of Regional and/or National Research and Innovation Strategies on Smart Specialisation (RIS3); and, as far as relevant, country-specific research and innovation (R&I) recommendations. Detailed annexes in tabular form provide access to country information in a concise and synthetic manner. The reports were originally produced in December 2013, focusing on policy developments over the preceding twelve months. JRC.J.2 - Knowledge for Growth

    Contexts and Contributions: Building the Distributed Library

    This report updates and expands on A Survey of Digital Library Aggregation Services, originally commissioned by the DLF as an internal report in summer 2003 and released to the public later that year. First, it highlights major developments affecting the ecosystem of scholarly communications and digital libraries since the last survey and provides an analysis of OAI implementation demographics, based on a comparative review of repository registries and cross-archive search services. Second, it reviews the state of practice for a cohort of digital library aggregation services, grouping them by the problem space to which they most closely adhere. Based in part on responses collected in fall 2005 from an online survey distributed to the original core services, the report investigates the purpose, function and challenges of next-generation aggregation services. On a case-by-case basis, the advances in each service are of interest in isolation, but the report also attempts to situate these services in a larger context and to understand how they fit into a multi-dimensional and interdependent ecosystem supporting the worldwide community of scholars. Finally, the report summarizes the contributions of these services thus far and identifies obstacles requiring further attention to realize the goal of an open, distributed digital library system.
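The aggregation services surveyed above are largely built on OAI-PMH harvesting. As background, here is a minimal sketch of how a harvester forms a ListRecords request against a repository's OAI endpoint; the base URL is hypothetical, but the `verb`, `metadataPrefix` and `set` parameters are standard OAI-PMH:

```python
# Build an OAI-PMH ListRecords request URL, the basic operation behind the
# cross-archive harvesting the report surveys. The endpoint below is a
# placeholder; oai_dc (Dublin Core) is the metadata format every OAI-PMH
# repository is required to support.
from urllib.parse import urlencode

def list_records_url(base_url, metadata_prefix="oai_dc", set_spec=None):
    """Return a ListRecords request URL, optionally scoped to one set."""
    params = {"verb": "ListRecords", "metadataPrefix": metadata_prefix}
    if set_spec:
        params["set"] = set_spec  # selective harvesting of one collection
    return f"{base_url}?{urlencode(params)}"

print(list_records_url("https://example.org/oai"))
# https://example.org/oai?verb=ListRecords&metadataPrefix=oai_dc
```

A real harvester would then page through results using the `resumptionToken` the repository returns, which is how large archives are harvested incrementally.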

    Proceedings of the 16th IFLA ILDS conference


    Open educational resources : conversations in cyberspace

    172 p. : ill. ; 25 cm. Electronic book. Education systems today face two major challenges: expanding the reach of education and improving its quality. Traditional solutions will not suffice, especially in the context of today's knowledge-intensive societies. The Open Educational Resources movement offers one solution for extending the reach of education and expanding learning opportunities. The goal of the movement is to equalize access to knowledge worldwide through openly and freely available, high-quality online content. Over the course of two years, the international community came together in a series of online discussion forums to discuss the concept of Open Educational Resources and its potential. This publication makes the background papers and reports from those discussions available in print. --Publisher's description. Contents: A first forum: presenting the open educational resources (OER) movement. Open educational resources: an introductory note / Sally Johnstone -- Providing OER and related issues: an introductory note / Anne Margulies ... [et al.] -- Using OER and related issues: an introductory note / Mohammed-Nabil Sabry ... [et al.] -- Discussion highlights / Paul Albright -- Ongoing discussion. A research agenda for OER: discussion highlights / Kim Tucker and Peter Bateman -- A 'do-it-yourself' resource for OER: discussion highlights / Boris Vukovic -- Free and open source software (FOSS) and OER -- A second forum: discussing the OECD study of OER. Mapping procedures and users / Jan Hylén -- Why individuals and institutions share and use OER / Jan Hylén -- Discussion highlights / Alexa Joyce -- Priorities for action. Open educational resources: the way forward / Susan D'Antoni

    Gamification and blended learning in vocational training and coaching in short courses

    Vocational training and coaching delivered through short courses and workshops needs to be as effective as possible for several reasons: learners (and their employers) invest their time and energy; they expect the course content to be directly useful and to provide a roadmap for applying the knowledge to real-life problems; their professional maturity calls for a proper balance of theoretical and practical aspects; and the outcomes become evident within a short time, for better or worse. These reasons motivated the trainers to use a specific instructional design that combines gamification, blended learning and coaching for vocational training in short courses, for groups of participants from Western Balkans countries with diverse backgrounds, languages, professions, education, and personal and collective goals, on the topic of preparing successful project proposals for funding, supported by Western Balkans Alumni Association projects. This instructional design differs from traditional training and coaching in several ways. One novel aspect is the combination of trainers covering the course content of project funding from various perspectives: National Agency expertise, academic, governmental and non-governmental experience, project evaluation and business practice. This training setting, facilitated by representatives of almost every stakeholder in the higher education ecosystem and articulated through the trainers' curricula, aims to enable overall knowledge apprehension, networking and immediate feedback loops. The incorporation of virtual activities before, during and after the in-person course uses a blended-learning mashup of tools: an e-learning platform for asynchronous communication, social networks for synchronous communication, a webinar tool for virtual presence, video tutorials for future reference and wider dissemination, and personal contact for coaching.
Another novelty of this approach lies in the gamification aspect: a scout game in the forest, using a broad range of symbolic matrices (compasses, maps, planned activities and unplanned events), training of skills and situational awareness, progress mechanics, and challenges that foster teamwork towards outputs and outcomes. Last but not least, the balance of theoretical and practical aspects is achieved by inviting each participant either to bring their own working example and apply the concepts while receiving immediate feedback, or to work on a collective example; follow-up coaching is provided in both cases. The 'magic' of effective project proposals is complemented with appropriate change management and tactical management. The cognitive and knowledge dimension categories and components have been addressed in their entirety through the extended Bloom taxonomy, and evaluation has been conducted in both a formative and a summative manner. In this pilot instance, the learning experience with our instructional design happens for every participant, on the side of the learners as well as the trainers and other HEI stakeholders, including the WBAA, with the specific goal of enabling emergent effects of networked learners who can put their knowledge, skills and competences in the right direction and produce primary and secondary effects for the Western Balkans region and the EU.

    The Bari Manifesto : An interoperability framework for essential biodiversity variables

    Essential Biodiversity Variables (EBV) are fundamental variables that can be used for assessing biodiversity change over time, for determining adherence to biodiversity policy, for monitoring progress towards sustainable development goals, and for tracking biodiversity responses to disturbances and management interventions. Data from observations or models that provide measured or estimated EBV values, which we refer to as EBV data products, can help to capture the above processes and trends and can serve as a coherent framework for documenting trends in biodiversity. Using primary biodiversity records and other raw data as sources to produce EBV data products depends on cooperation and interoperability among multiple stakeholders, including those collecting and mobilising data for EBVs and those producing, publishing and preserving EBV data products. Here, we encapsulate ten principles for the current best practice in EBV-focused biodiversity informatics as 'The Bari Manifesto', serving as implementation guidelines for data and research infrastructure providers to support the emerging EBV operational framework based on trans-national and cross-infrastructure scientific workflows. The principles provide guidance on how to contribute towards the production of EBV data products that are globally oriented, while remaining appropriate to the producer's own mission, vision and goals. These ten principles cover: data management planning; data structure; metadata; services; data quality; workflows; provenance; ontologies/vocabularies; data preservation; and accessibility. For each principle, desired outcomes and goals have been formulated. Some specific actions related to fulfilling the Bari Manifesto principles are highlighted in the context of each of four groups of organizations contributing to enabling data interoperability - data standards bodies, research data infrastructures, the pertinent research communities, and funders. 
The Bari Manifesto provides a roadmap enabling support for routine generation of EBV data products, and increases the likelihood of success for a global EBV framework. Peer reviewed.
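The Manifesto's principles lend themselves to simple machine checks by data infrastructures. The sketch below is a hypothetical illustration of one such check, verifying that an EBV data product's metadata covers a few of the principle areas (metadata, provenance, data quality, vocabularies, accessibility); the field names are assumptions for illustration, not part of the Manifesto:

```python
# Hypothetical check of an EBV data product record against a few of the
# Bari Manifesto's principle areas. The required field names below are
# assumed for illustration; the Manifesto defines outcomes and goals per
# principle, not a concrete schema.

REQUIRED_KEYS = {
    "metadata": "title",
    "provenance": "derived_from",        # e.g. the raw observations used
    "data quality": "quality_flags",
    "ontologies/vocabularies": "controlled_terms",
    "accessibility": "license",
}

def missing_principles(product: dict) -> list:
    """Return principle areas whose required field is absent or empty."""
    return [area for area, key in REQUIRED_KEYS.items() if not product.get(key)]

product = {"title": "Bird species richness 2000-2020", "license": "CC-BY"}
print(missing_principles(product))
# ['provenance', 'data quality', 'ontologies/vocabularies']
```

In practice such checks would be run by the research data infrastructures the Manifesto addresses, as part of the publishing workflow for EBV data products.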