
    Digital Preservation, Archival Science and Methodological Foundations for Digital Libraries

    Digital libraries, whether commercial, public or personal, lie at the heart of the information society. Yet, research into their long-term viability and the meaningful accessibility of their contents remains in its infancy. In general, as we have pointed out elsewhere, 'after more than twenty years of research in digital curation and preservation the actual theories, methods and technologies that can either foster or ensure digital longevity remain startlingly limited.' Research led by DigitalPreservationEurope (DPE) and the Digital Preservation Cluster of DELOS has allowed us to refine the key research challenges – theoretical, methodological and technological – that need attention by researchers in digital libraries during the coming five to ten years, if we are to ensure that the materials held in our emerging digital libraries are to remain sustainable, authentic, accessible and understandable over time. Building on this work and taking the theoretical framework of archival science as bedrock, this paper investigates digital preservation and its foundational role if digital libraries are to have long-term viability at the centre of the global information society.

    Leadership of healthcare commissioning networks in England : a mixed-methods study on clinical commissioning groups

    Objective: To explore the relational challenges for general practitioner (GP) leaders setting up new network-centric commissioning organisations in the recent health policy reform in England, we use innovation network theory to identify key network leadership practices that facilitate healthcare innovation. Design: Mixed-method, multisite and case study research. Setting: Six clinical commissioning groups and local clusters in the East of England area, covering in total 208 GPs and a population of 1,662,000. Methods: Semistructured interviews with 56 lead GPs, practice managers and staff from the local health authorities (primary care trusts, PCT) as well as various healthcare professionals; 21 observations of clinical commissioning group (CCG) board and executive meetings; electronic survey of 58 CCG board members (these included GPs, practice managers, PCT employees, nurses and patient representatives) and subsequent social network analysis. Main outcome measures: Collaborative relationships between CCG board members and stakeholders from their healthcare network; clarifying the role of GPs as network leaders; strengths and areas for development of CCGs. Results: Drawing upon innovation network theory provides unique insights into the CCG leaders' activities in establishing best practices and introducing new clinical pathways. In this context we identified three network leadership roles: managing knowledge flows, managing network coherence and managing network stability. Knowledge sharing and effective collaboration among GPs enable network stability and the alignment of CCG objectives with those of the wider health system (network coherence). Even though activities varied between commissioning groups, collaborative initiatives were common. However, there was significant variation among CCGs around the level of engagement with providers, patients and local authorities. Locality (sub)groups played an important role because they linked commissioning decisions with patient needs and brought the leaders closer to frontline stakeholders. Conclusions: With the new commissioning arrangements, the leaders should seek to move away from dyadic and transactional relationships to a network structure, thereby emphasising the emerging relational focus of their roles. Managing knowledge mobility, healthcare network coherence and network stability are the three clinical leadership processes that CCG leaders need to consider in coordinating their network and facilitating the development of good clinical commissioning decisions, best practices and innovative services. To successfully manage these processes, CCG leaders need to leverage the relational capabilities of their network as well as their clinical expertise to establish appropriate collaborations that may improve the healthcare services in England. Lack of local GP engagement adds uncertainty to the system and increases the risk of commissioning decisions being irrelevant and inefficient from patient and provider perspectives.

    Privacy and Health Information Technology

    The increased use of health information technology (health IT) is a common element of nearly every health reform proposal because it has the potential to decrease costs, improve health outcomes, coordinate care, and improve public health. However, it raises concerns about security and privacy of medical information. This paper examines some of the "gaps" in privacy protections that arise out of the current federal health privacy standard, the Health Insurance Portability and Accountability Act (HIPAA) Privacy Rule, the main federal law governing the use and disclosure of health information. Additionally, it puts forth a range of possible solutions, accompanied by arguments for and against each. The solutions provide some options for strengthening the current legal framework of privacy protections in order to build public trust in health IT and facilitate its use for health reform. The American Recovery and Reinvestment Act (ARRA) enacted in February 2009 includes a number of changes to HIPAA and its regulations, and those changes are clearly noted among the list of solutions (and ARRA is indicated in the Executive Summary and paper where the Act has a relevant provision).

    Sport Brands: Brand Relationships and Consumer Behavior


    CamFlow: Managed Data-sharing for Cloud Services

    A model of cloud services is emerging whereby a few trusted providers manage the underlying hardware and communications whereas many companies build on this infrastructure to offer higher level, cloud-hosted PaaS services and/or SaaS applications. From the start, strong isolation between cloud tenants was seen to be of paramount importance, provided first by virtual machines (VM) and later by containers, which share the operating system (OS) kernel. Increasingly it is the case that applications also require facilities to effect isolation and protection of data managed by those applications. They also require flexible data sharing with other applications, often across the traditional cloud-isolation boundaries; for example, when government provides many related services for its citizens on a common platform. Similar considerations apply to the end-users of applications. But in particular, the incorporation of cloud services within 'Internet of Things' architectures is driving the requirements for both protection and cross-application data sharing. These concerns relate to the management of data. Traditional access control is application and principal/role specific, applied at policy enforcement points, after which there is no subsequent control over where data flows; a crucial issue once data has left its owner's control by cloud-hosted applications and within cloud-services. Information Flow Control (IFC), in addition, offers system-wide, end-to-end, flow control based on the properties of the data. We discuss the potential of cloud-deployed IFC for enforcing owners' dataflow policy with regard to protection and sharing, as well as safeguarding against malicious or buggy software. In addition, the audit log associated with IFC provides transparency, giving configurable system-wide visibility over data flows. [...] Comment: 14 pages, 8 figures
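    The IFC idea the abstract sketches can be illustrated with a toy tag model: a flow from one entity to another is permitted only when the destination's secrecy labels dominate the source's, and the source meets the destination's integrity requirements. This is a minimal illustration under assumed names (`Entity`, `flow_allowed`, the "medical" tag), not CamFlow's actual model or API.

    ```python
    # Toy tag-based Information Flow Control (IFC) check. Illustrative only;
    # CamFlow's real label model, kernel enforcement and API differ.
    from dataclasses import dataclass

    @dataclass(frozen=True)
    class Entity:
        name: str
        secrecy: frozenset = frozenset()    # tags the entity's data carries
        integrity: frozenset = frozenset()  # tags inputs must have been produced under

    def flow_allowed(src: Entity, dst: Entity) -> bool:
        # Secrecy: destination must be cleared for every tag the source holds.
        # Integrity: source must satisfy every integrity tag the destination requires.
        return src.secrecy <= dst.secrecy and dst.integrity <= src.integrity

    patient_db = Entity("patient_db", secrecy=frozenset({"medical"}))
    analytics  = Entity("analytics", secrecy=frozenset({"medical", "internal"}))
    public_api = Entity("public_api")

    assert flow_allowed(patient_db, analytics)       # analytics is cleared for "medical"
    assert not flow_allowed(patient_db, public_api)  # would leak "medical" data
    ```

    The point of the check being system-wide is that it applies to every flow, not just at an application's own policy enforcement point, which is what distinguishes IFC from traditional access control in the abstract above.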

    Legal Solutions in Health Reform: Privacy and Health Information Technology

    Identifies gaps in the federal health privacy standard and proposes options for strengthening the legal framework for privacy protections in order to build public trust in health information technology. Presents arguments for and against each option.

    S-FaaS: Trustworthy and Accountable Function-as-a-Service using Intel SGX

    Function-as-a-Service (FaaS) is a recent and already very popular paradigm in cloud computing. The function provider need only specify the function to be run, usually in a high-level language like JavaScript, and the service provider orchestrates all the necessary infrastructure and software stacks. The function provider is only billed for the actual computational resources used by the function invocation. Compared to previous cloud paradigms, FaaS requires significantly more fine-grained resource measurement mechanisms, e.g. to measure compute time and memory usage of a single function invocation with sub-second accuracy. Thanks to the short duration and stateless nature of functions, and the availability of multiple open-source frameworks, FaaS enables non-traditional service providers, e.g. individuals or data centers with spare capacity. However, this exacerbates the challenge of ensuring that resource consumption is measured accurately and reported reliably. It also raises the issues of ensuring computation is done correctly and minimizing the amount of information leaked to service providers. To address these challenges, we introduce S-FaaS, the first architecture and implementation of FaaS to provide strong security and accountability guarantees backed by Intel SGX. To match the dynamic event-driven nature of FaaS, our design introduces a new key distribution enclave and a novel transitive attestation protocol. A core contribution of S-FaaS is our set of resource measurement mechanisms that securely measure compute time inside an enclave, and actual memory allocations. We have integrated S-FaaS into the popular OpenWhisk FaaS framework. We evaluate the security of our architecture, the accuracy of our resource measurement mechanisms, and the performance of our implementation, showing that our resource measurement mechanisms add less than 6.3% latency on standardized benchmarks.
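    The per-invocation granularity the abstract describes can be sketched with a simple metering wrapper: wall-clock time plus peak heap allocation for a single handler call. This only illustrates the billing granularity FaaS needs; S-FaaS itself measures these quantities securely inside an SGX enclave, and the decorator and handler names here are hypothetical.

    ```python
    # Sketch of per-invocation resource metering for a FaaS-style handler:
    # wall-clock time via time.perf_counter and peak Python heap usage via
    # tracemalloc. Illustrative only; not the S-FaaS enclave mechanism.
    import time
    import tracemalloc

    def metered(fn):
        def wrapper(*args, **kwargs):
            tracemalloc.start()
            start = time.perf_counter()
            try:
                result = fn(*args, **kwargs)
            finally:
                elapsed = time.perf_counter() - start
                _, peak = tracemalloc.get_traced_memory()
                tracemalloc.stop()
            # Usage report a billing layer could consume per invocation.
            return result, {"seconds": elapsed, "peak_bytes": peak}
        return wrapper

    @metered
    def handler(n):
        return sum(i * i for i in range(n))

    result, usage = handler(100_000)
    ```

    A trustworthy-FaaS design additionally has to make such a report tamper-evident, which is where the paper's enclave-based measurement and attestation come in.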

    Quality of Information in Mobile Crowdsensing: Survey and Research Challenges

    Smartphones have become the most pervasive devices in people's lives, and are clearly transforming the way we live and perceive technology. Today's smartphones benefit from almost ubiquitous Internet connectivity and come equipped with a plethora of inexpensive yet powerful embedded sensors, such as accelerometer, gyroscope, microphone, and camera. This unique combination has enabled revolutionary applications based on the mobile crowdsensing paradigm, such as real-time road traffic monitoring, air and noise pollution sensing, crime control, and wildlife monitoring, just to name a few. Differently from prior sensing paradigms, humans are now the primary actors of the sensing process, since they become fundamental in retrieving reliable and up-to-date information about the event being monitored. As humans may behave unreliably or maliciously, assessing and guaranteeing Quality of Information (QoI) becomes more important than ever. In this paper, we provide a new framework for defining and enforcing the QoI in mobile crowdsensing, and analyze in depth the current state-of-the-art on the topic. We also outline novel research challenges, along with possible directions of future work. Comment: To appear in ACM Transactions on Sensor Networks (TOSN).
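    One common ingredient of QoI handling that the abstract alludes to is discounting reports from unreliable or malicious contributors. A minimal sketch, assuming each user already carries an estimated reliability score (the function name and the noise-level scenario are illustrative, not the survey's framework):

    ```python
    # Toy reliability-weighted aggregation of crowdsensed reports: contributors
    # with low estimated reliability pull the aggregate less. Illustrative only.
    def weighted_estimate(reports):
        """reports: list of (value, reliability) pairs, reliability in (0, 1]."""
        total_weight = sum(r for _, r in reports)
        return sum(v * r for v, r in reports) / total_weight

    # Three users report a noise level (dB); the third is known to be unreliable.
    readings = [(70.0, 0.9), (72.0, 0.8), (40.0, 0.1)]
    estimate = weighted_estimate(readings)  # dominated by the two reliable users
    ```

    Real QoI frameworks must also *learn* those reliability scores, e.g. via truth-discovery methods that iterate between estimating the ground truth and re-scoring contributors, which is part of what makes the problem the abstract poses non-trivial.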