
    Auditable Compressed Storage

    Outsourcing data to the cloud for personal use is becoming an everyday trend rather than an extreme scenario. Frequent outsourcing of data widens the attack window because users no longer fully control their personal files. Typically, once secure channels are established between two endpoints, communication is considered secure. In the cloud model, however, the receiver (the cloud) cannot be fully trusted, either because it is under adversarial control or because it acts maliciously to increase its revenue by deleting infrequently accessed file blocks. One approach in the current literature to address these security concerns is Remote Data Integrity Checking (RDIC) protocols, whereby a data owner can challenge an untrusted cloud service provider (CSP) to prove faithful storage of its data. Current RDIC protocols assume that the original data format remains unchanged. However, users may wish to compress their data in order to reduce storage charges. In that case, current RDIC protocols become impractical because each time a file is compressed the user has to run a new RDIC protocol. In this work we initiate the study of Auditable Compressed Storage (ACS). After defining the new model, we instantiate two protocols for widely used compression techniques: run-length encoding and Huffman encoding. In contrast with conventional RDIC, our protocols allow a user to delegate the compression to the cloud in a provably secure way: the client can verify correctness of compression without having to download the entire uncompressed file and check it against the compressed one.
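    The first of the two targeted compression techniques, run-length encoding, can be sketched in a few lines (an illustrative sketch only; the paper's protocols add cryptographic proofs of correct compression on top of such an encoding):

```python
# Minimal run-length encoding/decoding sketch (illustrative only; the
# ACS protocols described above add verifiability on top of this).

def rle_encode(data: bytes) -> list[tuple[int, int]]:
    """Compress a byte string into (value, run_length) pairs."""
    runs: list[tuple[int, int]] = []
    for b in data:
        if runs and runs[-1][0] == b:
            runs[-1] = (b, runs[-1][1] + 1)
        else:
            runs.append((b, 1))
    return runs

def rle_decode(runs: list[tuple[int, int]]) -> bytes:
    """Expand (value, run_length) pairs back to the original bytes."""
    return b"".join(bytes([v]) * n for v, n in runs)

original = b"aaaabbbcccccd"
runs = rle_encode(original)
assert rle_decode(runs) == original  # lossless round trip
```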

    Efficient, Dependable Storage of Human Genome Sequencing Data

    The understanding of the human genome impacts several areas of human life. Data from human genomes is massive because there are millions of samples waiting to be sequenced, and each sequenced human genome may occupy hundreds of gigabytes of storage. Human genomes are critical because they are extremely valuable to research and may provide hints on individuals' health status, identify their donors, or reveal information about donors' relatives. Their size and criticality, plus the amount of data being produced by medical and life-sciences institutions, require systems to scale while being secure, dependable, auditable, and affordable. Current storage infrastructures are too expensive for us to ignore cost efficiency in storing human genomes, and they lack the proper knowledge and mechanisms to protect the privacy of sample donors. This thesis proposes an efficient, secure, and auditable storage system for human genomes that medical and life-sciences institutions may trust and afford. It enhances traditional storage ecosystems with privacy-aware, data-reduction, and auditability techniques to enable the efficient, dependable use of multi-tenant infrastructures to store human genomes. Contributions from this thesis include (1) a study on the privacy-sensitivity of human genomes; (2) a method to systematically detect the privacy-sensitive portions of genomes; (3) specialised data-reduction algorithms for sequencing data; (4) an independent auditability scheme for secure dispersed storage; and (5) a complete storage pipeline that obtains reasonable privacy protection, security, and dependability guarantees at modest costs (e.g., less than 1/Genome/Year) by integrating the proposed mechanisms with appropriate storage configurations.
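    As an illustration of contribution (3), one widely used data-reduction idea for sequencing data is binning per-base quality scores into a coarser alphabet so that downstream compressors shrink the data further. The sketch below shows that general technique; the thesis's own algorithms may differ, and the bin boundaries here are assumptions:

```python
# Illustrative sketch of lossy quality-score binning, a common
# data-reduction idea for sequencing data. Bin boundaries are assumed
# for illustration; real pipelines choose them empirically.

DEFAULT_BINS = ((0, 9, 6), (10, 19, 15), (20, 29, 25), (30, 50, 37))

def bin_quality(phred_scores, bins=DEFAULT_BINS):
    """Map each Phred quality score to its bin's representative value,
    shrinking the score alphabet so compressors work better."""
    out = []
    for q in phred_scores:
        for lo, hi, rep in bins:
            if lo <= q <= hi:
                out.append(rep)
                break
    return out

scores = [2, 11, 24, 35, 40]
assert bin_quality(scores) == [6, 15, 25, 37, 37]
```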

    Role Based Secure Data Access Control for Cost Optimized Cloud Storage Using Data Fragmentation While Maintaining Data Confidentiality

    The paper proposes a role-based secure data access control framework for cost-optimized cloud storage, addressing the challenge of maintaining data security, privacy, integrity, and availability at lower cost. The proposed framework incorporates a secure authenticity scheme to protect data during storage or transfer over the cloud. It optimizes storage cost by compressing high-resolution images and fragmenting them into multiple encrypted chunks using the owner's private key. The proposed approach offers two layers of security, ensuring that only authorized users can decrypt and reconstruct data into its original format. The implementation results show that the proposed scheme outperforms existing systems in various aspects, making it a reliable solution for cloud service providers to enhance data security while reducing storage costs.
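    The fragment-and-verify part of such a pipeline can be sketched as follows. This is an illustrative sketch using per-chunk SHA-256 digests for integrity only; the paper's actual scheme additionally encrypts the chunks under the owner's key, which is omitted here:

```python
import hashlib

CHUNK = 4  # toy chunk size; real systems use kilobyte/megabyte chunks

def fragment(data: bytes, chunk: int = CHUNK):
    """Split data into fixed-size chunks, each paired with a SHA-256
    digest so every fragment can later be verified independently."""
    chunks = [data[i:i + chunk] for i in range(0, len(data), chunk)]
    return [(c, hashlib.sha256(c).hexdigest()) for c in chunks]

def reassemble(fragments):
    """Verify each fragment against its digest, then restore the file."""
    for c, h in fragments:
        if hashlib.sha256(c).hexdigest() != h:
            raise ValueError("fragment tampered with")
    return b"".join(c for c, _ in fragments)

frags = fragment(b"compressed-image-bytes")
assert reassemble(frags) == b"compressed-image-bytes"
```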

    A Secure Storage Management & Auditing Scheme for Cloud Storage

    Cloud computing is an evolving domain that provides many on-demand services used by businesses on a daily basis. Massive growth in cloud storage results in new data centers hosted by large numbers of servers, and as the number of data centers grows, energy consumption increases enormously. Cloud service providers are therefore looking for environmentally friendly alternatives to reduce energy consumption. Data storage requires a huge amount of resources and management, and the increasing demand for it calls for new frameworks that store and manage data at low cost. To prevent unauthorized access, the cloud service provider must also enforce data access control, which is an effective way to ensure data storage security within the cloud. For storage cost minimization we use the DCT compression technique to compress data without compromising its quality. For data access control and security, the asymmetric cryptographic algorithm RSA is used. For data auditing we use MD5 with RSA to generate digital signatures. In the proposed work we have tried to cover all attributes of efficiency, performance, and security in cloud computing.
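    The MD5-with-RSA signing flow can be illustrated with a textbook-RSA toy. The tiny primes below are insecure and for exposition only; a real deployment would use a proper RSA library and key sizes:

```python
import hashlib

# Toy illustration of the MD5-with-RSA signing flow described above.
# Textbook RSA with tiny primes -- insecure, for exposition only.
p, q = 61, 53
n = p * q               # 3233
e, d = 17, 2753         # public / private exponents, e*d = 1 mod phi(n)

def sign(data: bytes) -> int:
    """Sign the (reduced) MD5 digest with the private exponent."""
    m = int.from_bytes(hashlib.md5(data).digest(), "big") % n
    return pow(m, d, n)

def verify(data: bytes, sig: int) -> bool:
    """Recover the digest with the public exponent and compare."""
    m = int.from_bytes(hashlib.md5(data).digest(), "big") % n
    return pow(sig, e, n) == m

blob = b"dct-compressed block"
assert verify(blob, sign(blob))
```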

    Droplet: Decentralized Authorization for IoT Data Streams

    This paper presents Droplet, a decentralized data access control service that operates without intermediate trust entities. Droplet enables data owners to securely and selectively share their encrypted data while guaranteeing data confidentiality against unauthorized parties. Droplet's contribution lies in coupling two key ideas: (i) a new cryptographically enforced access control scheme for encrypted data streams that enables users to define fine-grained, stream-specific access policies, and (ii) a decentralized authorization service that handles user-defined access policies. In this paper, we present Droplet's design, its reference implementation, and experimental results from three case-study apps built atop Droplet: the Fitbit activity tracker, the Ava health tracker, and the ECOviz smart meter dashboard.
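    The idea of fine-grained, stream-specific keys can be sketched with a simple hash chain over time epochs. This illustrates the general idea only; Droplet's actual construction differs in detail:

```python
import hashlib

# Sketch of interval-based stream keys via a hash chain: epoch i's key
# is H applied (i+1) times to a stream seed. Handing out the key for
# epoch i lets a consumer derive keys for all LATER epochs by hashing
# forward, but never earlier ones. Illustration only.

def derive_keys(seed: bytes, epochs: int) -> list[bytes]:
    keys, k = [], seed
    for _ in range(epochs):
        k = hashlib.sha256(k).digest()
        keys.append(k)
    return keys

keys = derive_keys(b"stream-secret", 4)
# A consumer holding keys[1] can recompute keys[2] locally:
assert hashlib.sha256(keys[1]).digest() == keys[2]
```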

    Auditable Augmented/Mixed/Virtual Reality : The Practicalities of Mobile System Transparency

    Funding Information: We acknowledge the financial support of the UK Engineering & Physical Sciences Research Council (EP/P024394/1, EP/R033501/1) and Microsoft via the Microsoft Cloud Computing Research Centre. Peer reviewed.

    A Simple Auditable Fingerprint Authentication Scheme Using Smart-Contracts

    Biometric authentication, notably using fingerprints, is now common. Despite its usability, biometrics have a caveat: revocation is impossible. Once the raw fingerprint is breached, and depending on the technology of the reader, it is impossible to stop an illegitimate authentication. This places a focus on auditing, both to detect fraud and to obtain clear indications that the fingerprint has been breached. In this paper we show how to take advantage of the immutability property of blockchains to design an auditable protocol based on Diffie-Hellman key exchange, with applications to fingerprint authentication.
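    The underlying Diffie-Hellman exchange, together with the kind of public transcript that could be anchored on a blockchain for auditing, can be sketched as follows (toy parameters, insecure, for illustration only):

```python
import hashlib

# Toy Diffie-Hellman exchange plus an audit record of the public
# values -- a sketch of the kind of transcript an immutable log could
# hold. Tiny group parameters, insecure, for illustration only.
p, g = 23, 5                        # small public group

a, b = 6, 15                        # private keys of the two parties
A, B = pow(g, a, p), pow(g, b, p)   # exchanged public values

shared_reader = pow(B, a, p)
shared_user = pow(A, b, p)
assert shared_reader == shared_user  # both sides derive the same secret

# Only the PUBLIC transcript is logged; its immutability lets an
# auditor later detect replayed or forged exchanges.
audit_record = hashlib.sha256(f"{A}:{B}".encode()).hexdigest()
```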

    Enabling Interactive Analytics of Secure Data using Cloud Kotta

    Research, especially in the social sciences and humanities, is increasingly reliant on the application of data science methods to analyze large amounts of (often private) data. Secure data enclaves provide a solution for managing and analyzing private data. However, such enclaves do not readily support discovery science, a form of exploratory or interactive analysis by which researchers execute a range of (sometimes large) analyses in an iterative and collaborative manner. The batch computing model offered by many data enclaves is well suited to executing large compute tasks; however, it is far from ideal for day-to-day discovery science. As researchers must submit jobs to queues and wait for results, the high latencies inherent in queue-based batch computing systems hinder interactive analysis. In this paper we describe how we have augmented the Cloud Kotta secure data enclave to support collaborative and interactive analysis of sensitive data. Our model uses Jupyter notebooks as a flexible analysis environment and Python language constructs to support the execution of arbitrary functions on private data within this secure framework. (To appear in Proceedings of the Workshop on Scientific Cloud Computing (ScienceCloud 2017), Washington, DC, USA, June 2017; 7 pages.)
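    The execution model can be sketched as follows. The helper names below are hypothetical, and a local thread pool stands in for the enclave's task queue; the point is only that submitting a function returns a future immediately, keeping the notebook session interactive:

```python
import concurrent.futures

# Hypothetical sketch: user code is a plain Python function, submitted
# for execution "inside" the enclave, and the notebook waits on a
# future. A thread pool stands in for the real enclave here.
_enclave = concurrent.futures.ThreadPoolExecutor(max_workers=2)

def run_in_enclave(fn, *args):
    """Submit an arbitrary analysis function; returns a future at once,
    so the session stays interactive instead of batch-blocking."""
    return _enclave.submit(fn, *args)

def word_count(private_text: str) -> int:
    """Example analysis over private data."""
    return len(private_text.split())

future = run_in_enclave(word_count, "some sensitive document text")
assert future.result() == 4
```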

    MOF-BC: A Memory Optimized and Flexible BlockChain for Large Scale Networks

    BlockChain (BC) immutability ensures BC resilience against modification or removal of the stored data. In large-scale networks like the Internet of Things (IoT), however, this feature significantly increases BC storage size and raises privacy challenges. In this paper, we propose a Memory Optimized and Flexible BC (MOF-BC) that enables IoT users and service providers to remove or summarize their transactions, age their data, and exercise the "right to be forgotten". To increase privacy, a user may employ multiple keys for different transactions; to allow for the removal of stored transactions, all of those keys would need to be stored, which complicates key management and storage. MOF-BC introduces the notion of a Generator Verifier (GV), a signed hash of a Generator Verifier Secret (GVS). The GV changes for each transaction to provide privacy, yet is signed by a unique key, thus minimizing the information that needs to be stored. A flexible transaction-fee model and a reward mechanism are proposed to incentivize users to participate in optimizing memory consumption. Qualitative security and privacy analysis demonstrates that MOF-BC is resilient against several security attacks. Evaluation results show that MOF-BC decreases BC memory consumption by up to 25% and user cost by more than two orders of magnitude compared with conventional BC instantiations.
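    The Generator Verifier idea can be sketched as a per-transaction value derived from a single secret (the GVS). This is an illustrative sketch only: the signature the paper applies over this hash is omitted, and the derivation below is an assumption, not MOF-BC's exact construction:

```python
import hashlib

# Sketch of a per-transaction Generator Verifier derived from one
# secret (GVS), so one secret covers many transactions without storing
# one key per transaction. The paper's signing step is omitted here.

def generator_verifier(gvs: bytes, tx_id: int) -> str:
    """A fresh GV per transaction: hash of the secret plus a
    transaction counter. Only the GVS holder can reproduce it to
    prove the right to remove or summarize the transaction."""
    return hashlib.sha256(gvs + tx_id.to_bytes(8, "big")).hexdigest()

gvs = b"owner-secret"
gv1 = generator_verifier(gvs, 1)
gv2 = generator_verifier(gvs, 2)
assert gv1 != gv2                          # unlinkable across transactions
assert gv1 == generator_verifier(gvs, 1)   # yet reproducible by the owner
```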