
    The Data Audit Framework: a toolkit to identify research assets and improve data management in research led institutions

    Although vast quantities of data are being created within higher education, few institutions have formal strategies in place for curating these research outputs in the long term. Moreover, there appears to be a lack of awareness as to exactly what data are held and whether they are being managed. In response to these concerns, the Joint Information Systems Committee (JISC) issued a call for proposals to develop and implement a Data Audit Framework suited to the needs of the UK higher education research communities. The Data Audit Framework (DAF) Development project was funded to produce an audit methodology, online toolkit, and a registry. Four additional implementation projects were funded to test the toolkit and promote its uptake. This paper outlines the audit methodology, introduces the online toolkit, and provides feedback on implementing the Data Audit Framework.

    Identity in research infrastructure and scientific communication: Report from the 1st IRISC workshop, Helsinki Sep 12-13, 2011

    Motivation for the IRISC workshop came from the observation that identity and digital identification are increasingly important factors in modern scientific research, especially with the now near-ubiquitous use of the Internet as a global medium for dissemination and debate of scientific knowledge and data, and as a platform for scientific collaborations and large-scale e-science activities.

The 1 1/2 day IRISC2011 workshop sought to explore a series of interrelated topics under two main themes: i) unambiguously identifying authors/creators & attributing their scholarly works, and ii) individual identification and access management in the context of identity federations. Specific aims of the workshop included:

• Raising overall awareness of key technical and non-technical challenges, opportunities and developments.
• Facilitating a dialogue, cross-pollination of ideas, collaboration and coordination between diverse – and largely unconnected – communities.
• Identifying & discussing existing/emerging technologies, best practices and requirements for researcher identification.

This report provides background information on key identification-related concepts & projects, describes the workshop proceedings, and summarizes key workshop findings.

    DBKnot: A Transparent and Seamless, Pluggable Tamper Evident Database

    Database integrity is crucial to organizations that rely on databases of important data, yet such databases are vulnerable to internal fraud. Tampering by malicious insiders with high technical authorization to the infrastructure, or by external attackers who compromise it, is an important attack vector. This thesis addresses this challenge for a class of problems where data is append-only and immutable. Examples of settings where data does not change are a) financial institutions (banks, accounting systems, stock markets, etc.), b) registries and notary systems where important data is kept but never changed, and c) system logs that must be kept intact for performance and forensic inspection. The goal of the approach is implementation seamlessness, with little or no change required in existing systems. Transaction tracking for tamper detection is done by serially and cumulatively hashing transactions together, while an external time-stamper and signer signs these linkages. This allows transactions to be tracked without any of the organization's data leaving its premises for a third party, which also reduces the performance impact of tracking. A tracking layer is embedded inside the data workflow while being kept as non-invasive as possible. DBKnot implements these features a) natively inside databases, and b) embedded inside Object Relational Mapping (ORM) frameworks, and c) outlines a direction for implementing it as a stand-alone microservice reverse proxy. A prototype ORM and database layer has been developed and tested for seamlessness of integration and ease of use. In addition, different optimizations that introduce pipelining parallelism into the hashing/signing process were tested to measure their impact on performance. Stock-market data was used for experimentation with DBKnot, and initial results showed slightly less than a 100% increase in transaction time for the most basic, sequential, synchronous version of DBKnot. The signing and hashing overhead per record does not increase significantly with the amount of data. A number of alternative optimizations to the design were tested and yielded significant performance improvements.
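
The abstract does not give implementation details, so the following Python sketch is only an illustration of the chained-hashing idea it describes; the names chain_hash and external_sign, the record layout, and the genesis value are hypothetical, and a real deployment would delegate signing to an external time-stamping authority as the thesis outlines.

# Minimal sketch (not DBKnot's code): each appended record is hashed together
# with the previous chain value, so any later modification of a stored record
# breaks every subsequent link in the chain.
import hashlib
import json
from datetime import datetime, timezone

def chain_hash(prev_digest: str, record: dict) -> str:
    """Hash the previous chain digest together with the new record."""
    payload = prev_digest + json.dumps(record, sort_keys=True)
    return hashlib.sha256(payload.encode("utf-8")).hexdigest()

def external_sign(digest: str) -> dict:
    """Stand-in for the external time-stamper/signer mentioned in the abstract;
    a real system would call an RFC 3161 time-stamping service or similar."""
    return {"digest": digest, "signed_at": datetime.now(timezone.utc).isoformat()}

# Append-only log: only chain digests (never the data itself) leave the premises.
chain = "0" * 64  # genesis value
log = []
for record in [{"tx": 1, "amount": 100}, {"tx": 2, "amount": 250}]:
    chain = chain_hash(chain, record)
    log.append((record, external_sign(chain)))

# Verification: recompute the chain and compare against the signed digests.
check = "0" * 64
for record, signature in log:
    check = chain_hash(check, record)
    assert check == signature["digest"], "tamper detected"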

    Neurosurgery specialty training in the UK: What you need to know to be shortlisted for an interview

    Neurosurgery is one of the most competitive specialties in the UK. In 2019, securing an ST1 post in neurosurgery corresponded to a competition ratio of 6.54, whereas a CST1 post corresponded to 2.93. Further, at ST3 level, neurosurgery is the most competitive specialty. In addition, the number of neurosurgical training posts is likely to be reduced in the coming years. A number of very specific shortlisting criteria exist, aiming to filter and select the best candidates for interview. In the context of the high competition ratios and the specific shortlisting criteria, developing an interest in the neurosciences early on will allow individuals more time to meet the necessary standards for neurosurgery. Here, we aim to outline the shortlisting criteria and offer advice on how to achieve maximum scores, increasing the likelihood of being shortlisted for an interview.

    Secure big data ecosystem architecture : challenges and solutions

    Big data ecosystems are complex, data-intensive, digital–physical systems. Data-intensive ecosystems offer a number of benefits; however, they present challenges as well. One major challenge relates to privacy and security. A number of privacy and security models, techniques and algorithms have been proposed over time, but these solutions are primarily focused on an individual system or an isolated organizational context. There is a need to study and provide complete end-to-end solutions that ensure security and privacy throughout the data lifecycle across the ecosystem, beyond the boundary of an individual system or organizational context. The current study reviews the existing privacy and security challenges and solutions using the systematic literature review (SLR) approach. Based on this approach, 79 applicable articles were selected and analyzed. The information from these articles was extracted to compile a catalogue of security and privacy challenges in big data ecosystems and to highlight their interdependencies. The results were categorized from a theoretical viewpoint using adaptive enterprise architecture and from a practical viewpoint using the DAMA framework as a guiding lens. The findings of this research will help to identify research gaps and draw novel research directions in the context of privacy and security in big data-intensive ecosystems.

    A Canonical Form for PROV Documents and its Application to Equality, Signature, and Validation


    Swansea University : institutional review


    Handbook for institutional audit: England and Northern Ireland 2006
