
    Making sense: talking data management with researchers

    Incremental is one of eight projects in the JISC Managing Research Data (MRD) programme funded to identify institutional requirements for digital research data management and to pilot relevant infrastructure. Our findings concur with those of other MRD projects, as well as with several previous studies. We found that many researchers: (i) organise their data in an ad hoc fashion, posing difficulties for retrieval and re-use; (ii) store their data on all kinds of media without always considering security and back-up; (iii) are positive about data sharing in principle but reluctant in practice; and (iv) believe back-up is equivalent to preservation.

    The key difference between our approach and that of other MRD projects is the type of infrastructure we are piloting. While the majority of these projects focus on developing technical solutions, we are focusing on the need for ‘soft’ infrastructure, such as one-to-one tailored support, training, and easy-to-find, concise guidance that breaks down some of the barriers information professionals have unintentionally built through their use of specialist terminology.

    We are employing a bottom-up approach because we believe that, to support the step-by-step development of sound research data management practices, you must first understand researchers’ needs and perspectives. Over the life of the project, Incremental staff will act as mediators, helping researchers and local support staff to understand the data management requirements within which they are expected to work, and will determine how these can be addressed within research workflows and the existing technical infrastructure.

    Our primary goal is to build data management capacity within the Universities of Cambridge and Glasgow by raising awareness of basic principles so everyone can manage their data to a certain extent. We are achieving this by:
    - re-positioning existing guidance so researchers can locate the advice they need;
    - connecting researchers with one-to-one advice, support and partnering;
    - offering practical training and a seminar series to address key data management topics.

    We will also ensure our lessons can be picked up and used by other institutions. Our affiliation with the Digital Curation Centre and Digital Preservation Coalition will assist in this, and all outputs will be released under a Creative Commons licence.

    Digital Preservation, Archival Science and Methodological Foundations for Digital Libraries

    Digital libraries, whether commercial, public or personal, lie at the heart of the information society. Yet, research into their long‐term viability and the meaningful accessibility of their contents remains in its infancy. In general, as we have pointed out elsewhere, ‘after more than twenty years of research in digital curation and preservation the actual theories, methods and technologies that can either foster or ensure digital longevity remain startlingly limited.’ Research led by DigitalPreservationEurope (DPE) and the Digital Preservation Cluster of DELOS has allowed us to refine the key research challenges – theoretical, methodological and technological – that need attention by researchers in digital libraries during the coming five to ten years, if we are to ensure that the materials held in our emerging digital libraries are to remain sustainable, authentic, accessible and understandable over time. Building on this work and taking the theoretical framework of archival science as bedrock, this paper investigates digital preservation and its foundational role if digital libraries are to have long‐term viability at the centre of the global information society.

    Research, relativity and relevance: can universal truths answer local questions

    It is a commonplace that the internet has led to a globalisation of informatics and that this has had beneficial effects in terms of standards and interoperability. However, this necessary harmonisation has also led to a growing recognition that the positive trend carries an in-built assumption that "one size fits all". The paper explores the importance of local and national research in addressing global issues and the appropriateness of local solutions and applications. It concludes that federal and collegial solutions are to be preferred to imperial ones.

    Internet of Things data contextualisation for scalable information processing, security, and privacy

    The Internet of Things (IoT) interconnects billions of sensors and other devices (i.e., things) via the internet, enabling novel services and products that are becoming increasingly important for industry, government, education and society in general. It is estimated that by 2025, the number of IoT devices will exceed 50 billion, which is seven times the estimated human population at that time. With such a tremendous increase in the number of IoT devices, the data they generate is also increasing exponentially and needs to be analysed and secured more efficiently. This gives rise to what appears to be the most significant challenge for the IoT: novel, scalable solutions are required to analyse and secure the extraordinary amount of data generated by tens of billions of IoT devices. Currently, no solutions exist in the literature that provide scalable and secure IoT-scale data processing. In this thesis, a novel scalable approach is proposed for processing and securing IoT-scale data, which we refer to as contextualisation. The contextualisation solution aims to exclude irrelevant IoT data from processing and to address data analysis and security considerations via the use of contextual information. More specifically, contextualisation can effectively reduce the volume, velocity and variety of data that needs to be processed and secured in IoT applications. This contextualisation-based data reduction can subsequently provide IoT applications with the scalability needed for IoT-scale knowledge extraction and information security. IoT-scale applications, such as smart parking or smart healthcare systems, can benefit from the proposed method, which improves the scalability of data processing as well as the security and privacy of data.
    The main contributions of this thesis are:
    1) an introduction to context and contextualisation for IoT applications;
    2) a contextualisation methodology for IoT-based applications, modelled around observation, orientation, decision and action loops;
    3) a collection of contextualisation techniques and a corresponding software platform for IoT data processing (referred to as contextualisation-as-a-service, or ConTaaS) that enables highly scalable data analysis, security and privacy solutions; and
    4) an evaluation of ConTaaS in several IoT applications, demonstrating that our contextualisation techniques permit data analysis, security and privacy solutions to remain linear even in situations where the number of IoT data points increases exponentially.
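    The core contextualisation idea, excluding irrelevant data before any expensive analysis, can be sketched as a simple relevance filter. The snippet below is a minimal illustration only, not the thesis's ConTaaS platform; the `Reading` fields, context keys, zone names and thresholds are all hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Reading:
    """One hypothetical IoT data point (e.g. a smart-parking sensor)."""
    sensor_id: str
    value: float
    location: str

def is_relevant(reading: Reading, context: dict) -> bool:
    # A reading is kept only if it comes from a monitored zone and
    # its value lies in a plausible range defined by the context.
    return (reading.location in context["zones"]
            and context["min_value"] <= reading.value <= context["max_value"])

def contextualise(readings: list[Reading], context: dict) -> list[Reading]:
    """Reduce data volume by filtering on contextual information."""
    return [r for r in readings if is_relevant(r, context)]

context = {"zones": {"car_park_a"}, "min_value": 0.0, "max_value": 100.0}
readings = [
    Reading("s1", 42.0, "car_park_a"),   # in zone, in range: kept
    Reading("s2", 17.5, "car_park_b"),   # wrong zone: excluded
    Reading("s3", 250.0, "car_park_a"),  # out of range: excluded
]
print(len(contextualise(readings, context)))  # prints 1
```

    Because irrelevant readings are discarded up front, downstream analysis and security processing only ever see the filtered subset, which is how contextualisation keeps per-application cost from growing with the raw data volume.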

    Educating the effective digital forensics practitioner: academic, professional, graduate and student perspectives

    Over the years, digital forensics has become an important and sought-after profession, and the gateway of training and education has developed vastly over the past decade. Many UK higher education (HE) institutions now deliver courses that prepare students for careers in digital forensics and, in more recent advances, cyber security. Skills shortages and external influences within the field of cyber security, and its relationship as a discipline with digital forensics, have shifted the dynamic of UK higher education provision. As a result, the route to becoming a digital forensic practitioner, be it in law enforcement or business, has transformed from on-the-job training to university-educated, trained analysts. This thesis examined courses within HE and discovered that their delivery often overlooked areas such as mobile forensics, live data forensics, and Linux and Mac knowledge. This research also considered current standards available across HE to understand whether educational programmes are delivering what is documented as relevant curriculum. Cyber security was found to be the central focus of the standards that include digital forensics, adding further to the debate over the lack of a distinctive identity for digital forensics as its own discipline. Few standards demonstrated how the topics, knowledge, skills and competences they set out were identified as relevant and effective for producing digital forensic practitioners. Additionally, this thesis analyses and discusses results from 201 participants across five stakeholder groups: graduates, professionals, academics, students and the public. These groups were selected because they are underdeveloped in the existing literature and because of the crucial role they play in the cycle of producing effective practitioners.
    Analysis of stakeholder views, experiences and thoughts surrounding education and training offers unique insight, theoretical underpinnings and original contributions not seen in the existing literature. Examples include the challenges, costs and initial issues that employers and supervising practitioners face when introducing graduates to employment, and the lack of awareness and contextualisation on the part of students and graduates regarding the knowledge and skills they have acquired on a course and their practical application on the job, which often leads to suggestions of a lack of fundamental knowledge and skills. This is evidenced throughout the thesis; examples include graduates, for their reflections on education in light of their new on-the-job experiences and practices; professionals, for their job experiences and requirements; academics, for their educational practices and challenges; students, for their initial expectations and views; and the public, for their general understanding. This research uniquely captures these perspectives, bolstering the development of digital forensics as an academic discipline, along with the importance these diverse views play in the overall approach to delivering skilled practitioners.
    While the main contribution to knowledge within this thesis is its narrative focusing on the education of effective digital forensic practitioners and its major stakeholders, the thesis also makes additional contributions, both academic and professional, including the discussion, analysis and reflection of:
    - improvements to education and digital forensics topics for research and curriculum development;
    - where course offerings can be improved for institutions offering digital forensic degree programmes;
    - the need for further collaboration between industry and academia to give students and graduates a greater understanding of the real-life role of a digital forensic practitioner and the expectations of employment;
    - the continuous and unique challenges that digital forensics poses within both academia and industry, and the need for improved facilities and tool development to curate and share problem- and scenario-based learning studies.

    The 1935 Hsinchu-Taichung Earthquake

    The history of natural disasters in Taiwan has frequently been linked to the practice of historical preservation, archival science, oral history, and museum curatorship, all collectively hallmarks of the broad range of activities that fall under the umbrella of public history. The problem for Taiwan, however, concerns legitimacy: Taiwan does not have a single national narrative. It has been subjected to waves of colonialism since the seventeenth century and does not presently have a fully post-colonial narrative. The earthquakes discussed in this paper occurred in two different periods of colonisation. In order to situate the history of earthquakes within a public history discourse, the field of earthquake-based research in Taiwan has to incorporate different audiences and integrate into a much broader understanding. By this, I mean that the presently regimented academic disciplines in Taiwan need to become cross-disciplinary, especially since public history is by its very nature collaborative: it illuminates a shared authority over a much wider area, and it needs to. It is my argument that it is in the digital humanities that Taiwanese academics work best in collaboration. Efforts have already been made to digitise the personal experiences of those involved in typhoon reconstruction efforts. A natural synergy, therefore, for the understanding of earthquakes as public history is to emphasise access and broad participation in the creation of knowledge; digital humanities enables this. Attention to this is particularly important for the historical preservation of particular sites on an island that frequently develops and re-develops brownfield sites.

    Ecclesiastical museums and the pontifical letter on its pastoral functions

    The Catholic Church claims a long tradition of protecting and using heritage to complement its evangelisation ministry, dating from the medieval ecclesiastical treasures included in museology's proto-history. While these treasures have adopted museographic features, other typologies of ecclesiastical museums have appeared, demanding regulations that could orient their activities. After the Second Vatican Council, the Church became increasingly focused on guaranteeing a worthy destination for objects left over from worship. In 2001, the Pontifical Commission for the Cultural Heritage of the Church published the Circular Letter The pastoral function of ecclesiastical museums, establishing that the ecclesiastical museum is an adequate solution for these objects, keeping them close to their cultural group of origin and providing continuity with their original catechetical function. Two decades later, a critical analysis of the Letter is proposed within the theoretical frame of museum studies. Considering the recovery of the object's original meaning in the museum discourse, the connection to territory, and the interaction with a plural and heterogeneous audience, the conformity of the Letter with museum theory is underlined. With a focus on its general accuracy, the aim of this study is to evaluate how far the Letter remains current and adapted to contemporaneity, in addition to the challenges and transformations now faced by museums.

    Interoperability for digital repositories: towards a policy and quality framework

    Interoperability is a property referring to the ability of diverse systems and organisations to work together. Today interoperability is considered a key step in moving from isolated digital repositories towards a common information space that allows users to browse different resources within a single integrated environment. In this conference paper we describe the multi-level challenges that digital repositories face in achieving policy and quality interoperability, presenting the approaches and interim outcomes of the Policy and Quality Working Groups within the EU-funded project DL.org (http://www.dlorg.eu/).

    Findings from the Workshop on User-Centered Design of Language Archives

    This white paper describes findings from the workshop on User-Centered Design of Language Archives organized in February 2016 by Christina Wasson (University of North Texas) and Gary Holton (University of Hawai‘i at Mānoa). It reviews relevant aspects of language archiving and user-centered design to construct the rationale for the workshop, relates key insights produced during the workshop, and outlines next steps in the larger research trajectory initiated by this workshop. The purpose of this white paper is to make all of the findings from the workshop publicly available in a short time frame, and without the constraints of a journal article concerning length, audience, format, and so forth. Selections from this white paper will be used in subsequent journal articles. So much was learned during the workshop that we wanted to provide thorough documentation to ensure that none of the key insights would be lost. We consider this document a white paper because it provides the foundational insights and initial conceptual frameworks that will guide us in our further research on the user-centered design of language archives. We hope this report will be useful to members of all stakeholder groups seeking to develop user-centered designs for language archives. This work was supported by U.S. National Science Foundation Documenting Endangered Languages Program grants BCS-1543763 and BCS-1543828.
