
    Security, Performance and Energy Trade-offs of Hardware-assisted Memory Protection Mechanisms

    The deployment of large-scale distributed systems, e.g., publish-subscribe platforms, that operate over sensitive data on the infrastructure of public cloud providers is nowadays heavily hindered by the surging lack of trust toward cloud operators. Although purely software-based solutions exist to protect the confidentiality of the data and of the processing itself, such as homomorphic encryption schemes, their performance is far from practical under real-world workloads. This practical experience report describes the performance trade-offs of two novel hardware-assisted memory protection mechanisms currently available on the market to tackle this problem, namely AMD SEV and Intel SGX. Specifically, we implement a publish/subscribe use-case and evaluate the impact of these memory protection mechanisms on the resulting performance. The paper reports on the experience gained while building this system, in particular when coping with the technical limitations imposed by SEV and SGX. By means of micro- and macro-benchmarks, we exhibit several trade-offs that provide valuable insights in terms of latency, throughput, processing time and energy requirements.
    Comment: European Commission Project: LEGaTO - Low Energy Toolset for Heterogeneous Computing (EC-H2020-780681)
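
    As a rough illustration of the kind of micro-benchmark the abstract mentions, the sketch below times a toy publish-process-deliver loop and reports throughput and median latency. It is a minimal, hypothetical harness (the `process` step, message count, and payload size are assumptions, not the paper's actual workload); the idea is that the same measurement can be run natively, inside an SGX enclave, or in an SEV-encrypted VM and the numbers compared.

    ```python
    # Minimal latency/throughput micro-benchmark sketch (illustrative only; not the
    # paper's benchmark harness). It times a toy publish -> process -> deliver loop
    # so the same measurement can be repeated under different protection mechanisms.
    import time
    import statistics

    def process(message: bytes) -> bytes:
        # Placeholder for the pub/sub processing step (e.g., filtering/matching).
        return message[::-1]

    def run_benchmark(num_messages: int = 10_000, payload_size: int = 1024) -> None:
        payload = b"x" * payload_size
        latencies = []
        start = time.perf_counter()
        for _ in range(num_messages):
            t0 = time.perf_counter()
            process(payload)  # stands in for publish + match + deliver
            latencies.append(time.perf_counter() - t0)
        elapsed = time.perf_counter() - start
        print(f"throughput: {num_messages / elapsed:,.0f} msg/s")
        print(f"median latency: {statistics.median(latencies) * 1e6:.1f} us")

    if __name__ == "__main__":
        run_benchmark()
    ```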

    Exploring Photo Privacy Protection on Smartphones

    The proliferation of smartphone camera use in the past decade has resulted in unprecedented numbers of personal photos being taken and stored on popular devices. However, it has also raised privacy concerns. These photos sometimes contain information that would be harmful if leaked, such as the personally identifiable information found on ID cards or in legal documents. With current security measures on iOS and Android phones, it is possible for third-party apps downloaded from official app stores or other locations to access the photo libraries on these devices without user knowledge or consent. Additionally, the prevalence of smartphone cameras in public has reduced personal privacy, as strangers are commonly photographed without permission. To mitigate the privacy risks posed by apps and unwanted public photos, this research project explores three main topics: developing a two-step method combining permission analysis and system call analysis to identify whether third-party applications can access sensitive photos without user knowledge; developing an automated classifier to identify and protect private photos in smartphone media storage; and creating an accurate computer vision model for identifying bystanders in photos, so that their faces can later be blurred or otherwise obfuscated to protect their privacy. The resulting data from the system call analysis should improve public awareness of the vulnerabilities created by downloading untrustworthy apps. The private photo classifier and bystander detection model achieve acceptable accuracy on the test datasets and can be used in future work to implement working systems that protect individual privacy in the aforementioned threat cases.
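
    To make the permission-analysis step more concrete, here is a minimal sketch of what the first half of the two-step method might look like on Android (an assumption for illustration; the project's actual tooling and its iOS handling are not described in the abstract): it scans a decoded AndroidManifest.xml for declared permissions that would let a third-party app reach photos or the camera.

    ```python
    # Illustrative sketch of a permission-analysis pass (assumed workflow, not the
    # project's actual tool): flag photo/camera-related permissions in a manifest.
    import xml.etree.ElementTree as ET

    ANDROID_NS = "{http://schemas.android.com/apk/res/android}"

    # Permissions commonly associated with photo/media access (illustrative list).
    PHOTO_RELATED = {
        "android.permission.READ_EXTERNAL_STORAGE",
        "android.permission.WRITE_EXTERNAL_STORAGE",
        "android.permission.READ_MEDIA_IMAGES",
        "android.permission.CAMERA",
    }

    def photo_permissions(manifest_path: str) -> set[str]:
        """Return the photo-related permissions declared in a decoded manifest."""
        root = ET.parse(manifest_path).getroot()
        declared = {
            elem.get(f"{ANDROID_NS}name")
            for elem in root.iter("uses-permission")
        }
        return declared & PHOTO_RELATED

    if __name__ == "__main__":
        # Path is hypothetical; point it at a manifest extracted from an APK.
        flagged = photo_permissions("AndroidManifest.xml")
        print("photo-related permissions:", flagged or "none")
    ```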

    Ubiquitous Social Networks: Opportunities and Challenges for Privacy-Aware User Modelling

    Privacy has long been recognized as an important topic on the Internet, and technological developments in the area of privacy tools are ongoing. However, their focus has mainly been on the individual. With the proliferation of social network sites, it has become evident that the problem of privacy is not bounded by the perimeter of the individual but extends to the privacy needs of their social networks. The objective of this paper is to contribute to the discussion about privacy in social network sites, a topic which we consider to be severely under-researched. We propose a framework for analyzing privacy requirements and privacy-related data, and we outline a combination of requirements analysis, conflict-resolution techniques, and a P3P extension that can contribute to privacy within such sites.
    Keywords: World Wide Web, privacy, social network analysis, requirements analysis, privacy negotiation, ubiquity, P3P

    Designing Incentives Enabled Decentralized User Data Sharing Framework

    Data sharing practices are needed to strike a balance between user privacy, user experience, and profit. Different parties, such as companies offering apps and social networking sites, collect user data with the primary motive of enhancing their business model while providing optimal services to end-users. However, the collection of user data is associated with serious privacy and security issues. A sharing platform also needs an effective incentive mechanism to realize transparent access to user data while distributing fair incentives. The emerging literature on the topic includes decentralized data sharing approaches. However, there was no universal method to track who shared what, to whom, when, for what purpose and under what conditions in a verifiable manner until recently, when distributed ledger technologies emerged as the most effective means of designing decentralized peer-to-peer networks. This Ph.D. research takes an engineering approach to specifying the operations of an incentive-enabled, user-controlled data-sharing platform. The thesis presents a series of empirical studies and proposes a novel blockchain- and smart-contract-based DUDS (Decentralized User Data Sharing) framework conceptualizing user-controlled data sharing practices. The DUDS framework supports immutability, authenticity, enhanced security, and trusted records, and is a promising means of sharing user data in various domains, including research data, customer data in e-commerce, and tourism applications. The DUDS framework is evaluated via performance analyses and user studies. An extended Technology Acceptance Model and a Trust-Privacy-Security Model are used to evaluate its usability. The evaluation uncovers the role of different factors affecting user intention to adopt data-sharing platforms. The results point to guidelines and methods for embedding privacy, user transparency, control, and incentives into the design of a data-sharing framework from the start, so as to provide a platform that users can trust to protect their data while allowing them to control it and share it in the ways they want.
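
    The DUDS framework itself relies on blockchains and smart contracts, which are not reproduced here; as a conceptual stand-in, the sketch below shows a hash-chained, append-only log of sharing events that makes the "who shared what, to whom, when, and for what purpose" record tamper-evident. All class and field names are illustrative assumptions, not artifacts of the thesis.

    ```python
    # Conceptual sketch only: a hash-chained, append-only log of data-sharing
    # events, illustrating the tamper-evidence idea behind verifiable sharing
    # records kept on a distributed ledger. Not the DUDS smart contracts.
    import hashlib
    import json
    import time
    from dataclasses import dataclass, field

    @dataclass
    class ShareRecord:
        owner: str
        recipient: str
        data_id: str
        purpose: str
        timestamp: float = field(default_factory=time.time)
        prev_hash: str = ""

        def digest(self) -> str:
            # Hash over all fields, including the link to the previous record.
            payload = json.dumps(self.__dict__, sort_keys=True).encode()
            return hashlib.sha256(payload).hexdigest()

    class ShareLedger:
        def __init__(self) -> None:
            self.records: list[ShareRecord] = []

        def append(self, record: ShareRecord) -> None:
            record.prev_hash = self.records[-1].digest() if self.records else "genesis"
            self.records.append(record)

        def verify(self) -> bool:
            # Recompute the chain: modifying any record breaks every later prev_hash.
            expected = "genesis"
            for rec in self.records:
                if rec.prev_hash != expected:
                    return False
                expected = rec.digest()
            return True
    ```

    A usage pattern under these assumptions would be to append a ShareRecord for each grant of access and run verify() before acting on the log, so any retroactive edit is detected.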