
    Active data-centric framework for data protection in cloud environment

    Cloud computing is an emerging evolutionary computing model that provides highly scalable services over the high-speed Internet on a pay-per-use basis. However, cloud-based solutions have not yet been widely deployed in some sensitive areas, such as banking and healthcare. This limited adoption stems from users’ concern that their confidential data or privacy could leak out in the cloud’s outsourced environment. To address this problem, we propose a novel active data-centric framework to ultimately improve the transparency and accountability of the actual usage of users’ data in the cloud. Our data-centric framework emphasizes an “active” feature that packages the raw data with active properties enforcing data usage with active defending and protection capability. To achieve this active scheme, we devise the Triggerable Data File Structure (TDFS). Moreover, we employ a zero-knowledge proof scheme to verify a requester’s identity without revealing any vital information. Our experimental outcomes demonstrate the efficiency, dependability, and scalability of our framework.
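    The zero-knowledge identity check the abstract mentions can be illustrated with a Schnorr-style identification round: the requester proves knowledge of a secret key without ever transmitting it. The sketch below uses a toy group (p = 23, q = 11, g = 4) purely for readability; a real deployment would use a standardised large group, and the variable names are illustrative assumptions, not the paper's notation.

    ```python
    import secrets

    # Toy group: p = 2q + 1 with q prime; g generates the subgroup of order q.
    # These tiny parameters are for illustration only.
    p, q, g = 23, 11, 4

    x = secrets.randbelow(q - 1) + 1   # prover's secret key, 1 <= x < q
    y = pow(g, x, p)                   # public key registered with the verifier

    # --- One identification round (honest-verifier Schnorr) ---
    r = secrets.randbelow(q - 1) + 1   # prover's ephemeral nonce
    t = pow(g, r, p)                   # commitment sent to the verifier
    c = secrets.randbelow(q)           # verifier's random challenge
    s = (r + c * x) % q                # response; x itself never leaves the prover

    # Verifier checks g^s == t * y^c (mod p) without learning x,
    # since g^s = g^(r + c*x) = g^r * (g^x)^c.
    assert pow(g, s, p) == (t * pow(y, c, p)) % p
    ```

    Repeating the round with fresh randomness drives a cheating prover's success probability down, which is what lets the framework authenticate a request without receiving any vital information.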

    Transparent Personal Data Processing: The Road Ahead

    The European General Data Protection Regulation defines a set of obligations for personal data controllers and processors. Primary obligations include: obtaining explicit consent from the data subject for the processing of personal data, providing full transparency with respect to the processing, and enabling data rectification and erasure (albeit only in certain circumstances). At the core of any transparency architecture is the logging of events in relation to the processing and sharing of personal data. The logs should enable verification that data processors abide by the access and usage control policies that have been associated with the data based on the data subject's consent and the applicable regulations. In this position paper, we: (i) identify the requirements that need to be satisfied by such a transparency architecture, (ii) examine the suitability of existing logging mechanisms in light of said requirements, and (iii) present a number of open challenges and opportunities.
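    A core requirement for such logs is tamper evidence: an event appended today must not be silently editable tomorrow. A minimal sketch of that property, assuming a simple hash chain where each entry commits to its predecessor (field names here are illustrative, not drawn from any particular system):

    ```python
    import hashlib
    import json

    def append(log, event):
        """Append an event, chaining it to the previous entry's digest."""
        prev = log[-1]["digest"] if log else "0" * 64
        body = {"event": event, "prev": prev}
        digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        log.append({**body, "digest": digest})

    def verify(log):
        """Recompute the chain; any retroactive edit breaks verification."""
        prev = "0" * 64
        for entry in log:
            body = {"event": entry["event"], "prev": entry["prev"]}
            expected = hashlib.sha256(
                json.dumps(body, sort_keys=True).encode()).hexdigest()
            if entry["prev"] != prev or entry["digest"] != expected:
                return False
            prev = entry["digest"]
        return True

    log = []
    append(log, {"actor": "processor-A", "action": "read", "data": "record-17"})
    append(log, {"actor": "processor-B", "action": "share", "data": "record-17"})
    assert verify(log)

    log[0]["event"]["action"] = "delete"   # retroactive tampering...
    assert not verify(log)                 # ...is detected
    ```

    Real transparency architectures add signatures, trusted timestamping, or ledger replication on top, but the chained-digest idea is the common foundation the paper's requirements analysis evaluates.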

    Cloud technology options towards Free Flow of Data

    This whitepaper collects the technology solutions that the projects in the Data Protection, Security and Privacy Cluster propose to address the challenges raised by the working areas of the Free Flow of Data initiative. The document describes the technologies, methodologies, models, and tools researched and developed by the clustered projects, mapped to the ten areas of work of the Free Flow of Data initiative. The aim is to facilitate identification of the state of the art in technology options for solving the data security and privacy challenges posed by the Free Flow of Data initiative in Europe. The document also references the Cluster, the individual projects, and the technologies they have produced.

    Middleware to support accountability of business to business interactions

    PhD thesis. Enabling technologies have driven standardisation efforts specifying B2B interactions between organisations, including the information to be exchanged and its associated business-level requirements. These interactions are encoded as conversations to which organisations agree and which they execute. Continued cooperation depends on support for regulating these interactions: minimally, all actions taken must be held accountable, and no participant who has remained compliant may be placed at a disadvantage. Technical protocols exist to support regulation (e.g., to provide fairness and accountability). However, such protocols impose expertise, infrastructure, and integration requirements, possibly diverting an organisation's attention from fulfilling its obligations in the interactions in which it is involved. The guarantees provided by these protocols can be paired with functional properties that declaratively describe the support they provide. By encapsulating properties and protocols in intermediaries through which messages are routed, the expertise, infrastructure, and integration requirements can be lifted from interacting organisations while their interactions are transparently provided with additional support. Previous work focused on supporting individual issues without tackling concerns of asynchronicity, transparency, and loose coupling. This thesis develops on previous work by designing a generalised intermediary middleware capable of intercepting messages and transparently satisfying supportive properties. By enforcing loose coupling and transparency, all interactions may be provided with additional support without modification, independent of the higher-level (i.e., B2B) standards in use, and existing work may be expressed as instances of the proposed generalised design. This support is provided at lower levels of the stack, a choice justified by a survey of B2B and messaging standards. Proof-of-concept implementations will demonstrate the suitability of the approach.
    The work will demonstrate that providing transparent, decoupled support at lower levels of abstraction is useful and can be applied to domains beyond B2B and message-oriented interactions.
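    The intermediary pattern the thesis describes can be sketched in a few lines: messages are routed through a middleware hop that transparently records an accountability receipt while delivering the payload unchanged, so neither endpoint needs modification. The class and field names below are illustrative assumptions, and the hash digest stands in for a real signed receipt.

    ```python
    import hashlib

    class Intermediary:
        """Routes messages and transparently records accountability receipts."""

        def __init__(self):
            self.receipts = []   # audit record of everything routed

        def route(self, sender, receiver, message, deliver):
            # Supportive property added in transit: a receipt binding the
            # parties to the message content (digest stands in for a signature).
            digest = hashlib.sha256(message.encode()).hexdigest()
            self.receipts.append({"from": sender, "to": receiver, "digest": digest})
            deliver(message)     # transparent pass-through to the receiver

    inbox = []
    mw = Intermediary()
    mw.route("org-A", "org-B", "PurchaseOrder#991", inbox.append)

    assert inbox == ["PurchaseOrder#991"]   # delivery is unchanged...
    assert len(mw.receipts) == 1            # ...but the exchange is now accountable
    ```

    Because the endpoints only see normal delivery, other supportive properties (fairness, non-repudiation) can be swapped in at the same hop without touching the interacting organisations, which is the loose coupling the thesis argues for.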

    Data trust framework using blockchain and smart contracts

    Lack of trust is the main barrier preventing more widespread data sharing; the absence of a transparent and reliable infrastructure keeps many data owners from sharing their data. Data trust is a paradigm that facilitates data sharing by requiring data controllers to be transparent about the process of sharing and reusing data. Blockchain technology has the potential to provide the essential properties for a practical and secure data trust framework by transforming current auditing practices and automatically enforcing smart-contract logic without relying on intermediaries to establish trust. Blockchain holds enormous potential to remove the barriers of traditional centralized applications and to offer distributed and transparent administration by having the involved parties maintain consensus on the ledger. Furthermore, smart contracts are a programmable component that gives blockchain more flexible and powerful capabilities. Recent advances in blockchain platforms' support for smart-contract development have revealed the possibility of implementing blockchain-based applications in various domains, such as health care, supply chains, and digital identity. This dissertation investigates blockchain's potential to provide a framework for data trust. It starts with a comprehensive study of smart contracts as the main blockchain component for developing decentralized data trust. Building on this, three decentralized applications addressing data-sharing and access-control problems in various fields, including healthcare data sharing, business processes, and physical access control, have been developed and examined. In addition, a general-purpose application based on an attribute-based access control model is proposed that can provide the trusted auditability required for data-sharing and access-control systems and, ultimately, a data trust framework.
    Besides auditing, the system offers a level of transparency that both access requesters (data users) and resource owners (data controllers) can benefit from. The proposed solutions have been validated through a use case of independent digital libraries, together with a detailed performance analysis of the system implementation. The performance results have been compared across different consensus mechanisms and databases, indicating the system's high throughput and low latency. Finally, this dissertation presents an end-to-end data trust framework based on blockchain technology. The proposed framework promotes data trustworthiness by assessing input datasets, effectively managing access control, and presenting data provenance and activity monitoring. A trust assessment model that examines the trustworthiness of input datasets and calculates a trust value is presented; the number of transaction validators is set adaptively based on the trust value. This research provides solutions for both data owners and data users by ensuring the trustworthiness and quality of the data at its origin and the transparent and secure usage of the data at the end. A comprehensive experimental study indicates that the presented system effectively handles a large number of transactions with low latency.
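    The attribute-based access-control (ABAC) decision at the heart of the dissertation's general-purpose application can be sketched as a policy of required attributes evaluated against a requester's attributes, with every decision appended to an audit trail visible to both parties. The policy shape and attribute names below are illustrative assumptions, not the dissertation's actual schema.

    ```python
    def evaluate(policy, subject_attrs):
        """Grant iff every required attribute has one of the allowed values."""
        return all(subject_attrs.get(attr) in allowed
                   for attr, allowed in policy.items())

    audit_log = []   # shared trail: data users and data controllers both see it

    def request_access(subject, subject_attrs, resource, policy):
        granted = evaluate(policy, subject_attrs)
        audit_log.append({"subject": subject, "resource": resource,
                          "granted": granted})
        return granted

    # Illustrative policy: only oncology clinicians or researchers may read.
    policy = {"role": {"researcher", "clinician"}, "department": {"oncology"}}

    assert request_access("alice", {"role": "clinician", "department": "oncology"},
                          "dataset-42", policy)
    assert not request_access("bob", {"role": "auditor", "department": "oncology"},
                              "dataset-42", policy)
    ```

    In the blockchain setting, `request_access` would run as smart-contract logic and `audit_log` would live on the ledger, which is what makes the trail trustworthy without a central auditor.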

    A Forensic Enabled Data Provenance Model for Public Cloud

    Cloud computing is a newly emerging technology where storage, computation and services are extensively shared among a large number of users through virtualization and distributed computing. This technology makes the process of detecting the physical location or ownership of a particular piece of data even more complicated. As a result, improvements in data provenance techniques became necessary. Provenance refers to the record describing the origin and other historical information about a piece of data. An advanced data provenance system will give forensic investigators a transparent idea about the data's lineage, and help to resolve disputes over controversial pieces of data by providing digital evidence. In this paper, the challenges of cloud architecture are identified, how this affects the existing forensic analysis and provenance techniques is discussed, and a model for efficient provenance collection and forensic analysis is proposed.
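    The lineage record such a provenance model collects can be pictured as entries that name the data object, the operation, the acting principal, and the records they were derived from, forming a graph an investigator can walk backwards. A minimal sketch, with field names that are illustrative assumptions rather than the paper's schema:

    ```python
    records = {}   # provenance store: record id -> entry

    def record(rec_id, data, operation, actor, parents=()):
        """Append a provenance entry; parents link to the records it derives from."""
        records[rec_id] = {"data": data, "operation": operation,
                           "actor": actor, "parents": list(parents)}

    def lineage(rec_id):
        """Walk parent links to reconstruct a data item's full history."""
        history, stack = [], [rec_id]
        while stack:
            rid = stack.pop()
            history.append(rid)
            stack.extend(records[rid]["parents"])
        return history

    record("r1", "invoice.pdf", "upload", "tenant-A")
    record("r2", "invoice.pdf", "copy-to-node-7", "cloud-scheduler", parents=["r1"])
    record("r3", "invoice-redacted.pdf", "redact", "tenant-A", parents=["r2"])

    print(lineage("r3"))   # ['r3', 'r2', 'r1']
    ```

    In a cloud setting the hard problems are exactly the ones the paper addresses: capturing entries like `r2` reliably when the platform, not the tenant, performs the operation, and protecting the store itself so the lineage is admissible as digital evidence.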

    Systematizing Decentralization and Privacy: Lessons from 15 Years of Research and Deployments

    Decentralized systems are a subset of distributed systems in which multiple authorities control different components and no authority is fully trusted by all. This implies that any component in a decentralized system is potentially adversarial. We review fifteen years of research on decentralization and privacy, and provide an overview of key systems as well as key insights for designers of future systems. We show that decentralized designs can enhance privacy, integrity, and availability, but also require careful trade-offs in terms of system complexity, properties provided, and degree of decentralization. These trade-offs need to be understood and navigated by designers. We argue that a combination of insights from cryptography, distributed systems, and mechanism design, aligned with the development of adequate incentives, is necessary to build scalable and successful privacy-preserving decentralized systems.