
    "If You Can't Beat them, Join them": A Usability Approach to Interdependent Privacy in Cloud Apps

    Cloud storage services, like Dropbox and Google Drive, have growing ecosystems of third-party apps that are designed to work with users' cloud files. Such apps often request full access to users' files, including files shared with collaborators. Hence, whenever a user grants access to a new vendor, she inflicts a privacy loss on herself and on her collaborators as well. Based on analyzing a real dataset of 183 Google Drive users and 131 third-party apps, we discover that collaborators inflict a privacy loss that is at least 39% higher than what users themselves cause. We take a step toward minimizing this loss by introducing the concept of History-based decisions. Simply put, users are informed at decision time about the vendors that have previously been granted access to their data. Thus, they can reduce their privacy loss by not installing apps from new vendors whenever possible. Next, we realize this concept by introducing a new privacy indicator, which can be integrated within the cloud apps' authorization interface. Via a web experiment with 141 participants recruited from CrowdFlower, we show that our privacy indicator can significantly increase the user's likelihood of choosing the app that minimizes her privacy loss. Finally, we explore the network effect of History-based decisions via a simulation on top of large collaboration networks. We demonstrate that adopting such a decision-making process can reduce the growth of users' privacy loss by 70% in a Google Drive-based network and by 40% in an author collaboration network. This is despite the fact that we assume neither that users cooperate nor that they exhibit altruistic behavior. To our knowledge, our work is the first to provide quantifiable evidence of the privacy risk that collaborators pose in cloud apps. We are also the first to mitigate this problem via a usable privacy approach.
    Comment: Authors' extended version of the paper published at CODASPY 201
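    The History-based decision idea lends itself to a simple illustration. The sketch below is not taken from the paper, and the app and vendor names are hypothetical; it merely ranks candidate apps so that those from vendors who already hold access come first, since installing them adds no new vendor to the user's exposure:

        # Illustrative sketch of a History-based decision aid (hypothetical names).
        def rank_by_history(candidate_apps, authorized_vendors):
            """Order candidate apps so that apps from already-authorized vendors come first."""
            def marginal_exposure(app):
                # 0 if the vendor already holds access, 1 if granting would add a new vendor
                return 0 if app["vendor"] in authorized_vendors else 1
            return sorted(candidate_apps, key=marginal_exposure)

        history = {"VendorA", "VendorB"}  # vendors previously granted access to the user's files
        apps = [
            {"name": "PDF Tool X", "vendor": "VendorC"},
            {"name": "PDF Tool Y", "vendor": "VendorA"},
        ]
        for app in rank_by_history(apps, history):
            print(app["name"], "- new vendor:", app["vendor"] not in history)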

    Investigating the tension between cloud-related actors and individual privacy rights

    Historically, little more than lip service has been paid to the rights of individuals to act to preserve their own privacy. Personal information is frequently exploited for commercial gain, often without the person’s knowledge or permission. New legislation, such as the EU General Data Protection Regulation (GDPR), has acknowledged the need for legislative protection. The GDPR places the onus on service providers to preserve the confidentiality of their users’ and customers’ personal information, on pain of punitive fines for lapses. It accords special privileges to users, such as the right to be forgotten. The regulation has global jurisdiction, covering the rights of any EU resident worldwide. Assuring this legislated privacy protection presents a serious challenge, one that is exacerbated in the cloud environment. A considerable number of actors are stakeholders in cloud ecosystems. Each has their own agenda, and these are not necessarily well aligned. Cloud service providers, especially those offering social media services, are interested in growing their businesses and maximising revenue. There is a strong incentive for them to capitalise on their users’ personal information and usage information. Privacy is often the first victim. Here, we examine the tensions between the various cloud actors and propose a framework that could be used to ensure that privacy is preserved and respected in cloud systems.

    Federated Robust Embedded Systems: Concepts and Challenges

    The development within the area of embedded systems (ESs) is moving rapidly, not least due to falling costs of computation and communication equipment. It is believed that increased communication opportunities will lead to future ESs no longer being parts of isolated products, but rather parts of larger communities or federations of ESs, within which information is exchanged for the benefit of all participants. This vision is reflected in a number of interrelated research topics, such as the internet of things, cyber-physical systems, systems of systems, and multi-agent systems. In this work, the focus is primarily on ESs, with their specific real-time and safety requirements. While the vision of interconnected ESs is quite promising, it also brings great challenges to developing future systems in an efficient, safe, and reliable way. In this work, a pre-study has been carried out in order to gain a better understanding of the common concepts and challenges that naturally arise in federations of ESs. The work was organized around a series of workshops, with contributions from both academic participants and industrial partners with strong experience in ES development. During the workshops, a portfolio of possible ES federation scenarios was collected, and a number of application examples were discussed more thoroughly at different abstraction levels, starting from screening the nature of interactions at the federation level and proceeding down to the implementation details within each ES. These discussions led to a better understanding of what can be expected of future federated ESs. In this report, the discussed applications are summarized, together with their characteristics, challenges, and necessary solution elements, providing a ground for future research within the area of communicating ESs.

    The Curious Case of the PDF Converter that Likes Mozart: Dissecting and Mitigating the Privacy Risk of Personal Cloud Apps

    Third-party apps that work on top of personal cloud services, such as Google Drive and Dropbox, require access to the user's data in order to provide their functionality. Through a detailed analysis of a hundred popular Google Drive apps from Google's Chrome store, we discover that the existing permission model is quite often misused: around two thirds of the analyzed apps are over-privileged, i.e., they access more data than they need in order to function. In this work, we analyze three different permission models that aim to discourage users from installing over-privileged apps. In experiments with 210 real users, we discover that the most successful permission model is our novel ensemble method, which we call Far-reaching Insights. Far-reaching Insights inform users about the data-driven insights that apps can make about them (e.g., their topics of interest, collaboration and activity patterns, etc.). Thus, they seek to bridge the gap between what third parties can actually know about users and users' perception of their privacy leakage. The efficacy of Far-reaching Insights in bridging this gap is demonstrated by our results, as Far-reaching Insights prove to be, on average, twice as effective as the current model in discouraging users from installing over-privileged apps. In an effort to promote general privacy awareness, we deploy a publicly available privacy-oriented app store that uses Far-reaching Insights. Based on the knowledge extracted from the data of the store's users (over 115 gigabytes of Google Drive data from 1440 users with 662 installed apps), we also delineate the ecosystem of third-party cloud apps from the standpoint of developers and cloud providers. Finally, we present several general recommendations that can guide future work in the area of privacy for the cloud.
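    As a rough illustration of what "over-privileged" means here, the sketch below compares an app's requested OAuth scope against the minimal scope its functionality would need. It is not the authors' analysis tooling: the scope URLs follow Google Drive's OAuth naming, but the example apps and their "needed" scopes are hypothetical, not measurements from the paper.

        # Illustrative over-privilege check (hypothetical apps and needed-scope sets).
        FULL_ACCESS = "https://www.googleapis.com/auth/drive"    # full account access
        PER_FILE = "https://www.googleapis.com/auth/drive.file"  # only files the app creates or opens

        def is_over_privileged(requested_scopes, needed_scopes):
            """Flag an app that requests full-account access although narrower scopes would suffice."""
            return FULL_ACCESS in requested_scopes and FULL_ACCESS not in needed_scopes

        apps = {
            "PDF Converter": {"requested": {FULL_ACCESS}, "needed": {PER_FILE}},
            "Backup Manager": {"requested": {FULL_ACCESS}, "needed": {FULL_ACCESS}},
        }
        for name, s in apps.items():
            print(name, "over-privileged:", is_over_privileged(s["requested"], s["needed"]))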

    A gap analysis of Internet-of-Things platforms

    We are experiencing an abundance of Internet-of-Things (IoT) middleware solutions that provide connectivity for sensors and actuators to the Internet. To gain widespread adoption, these middleware solutions, referred to as platforms, have to meet the expectations of different players in the IoT ecosystem, including device providers, application developers, and end-users, among others. In this article, we evaluate a representative sample of these platforms, both proprietary and open-source, on the basis of their ability to meet the expectations of different IoT users. The evaluation is thus focused more on how ready and usable these platforms are for IoT ecosystem players than on the peculiarities of the underlying technological layers. The evaluation is carried out as a gap analysis of the current IoT landscape with respect to (i) the support for heterogeneous sensing and actuating technologies, (ii) data ownership and its implications for security and privacy, (iii) data processing and data-sharing capabilities, (iv) the support offered to application developers, (v) the completeness of an IoT ecosystem, and (vi) the availability of dedicated IoT marketplaces. The gap analysis aims to highlight the deficiencies of today's solutions in order to improve their integration into tomorrow's ecosystems. To strengthen the findings of our analysis, we conducted a survey among the partners of the Finnish IoT program, counting over 350 experts, to evaluate the most critical issues for the development of future IoT platforms. Based on the results of our analysis and our survey, we conclude this article with a list of recommendations for extending these IoT platforms in order to fill in the gaps.
    Comment: 15 pages, 4 figures, 3 tables, Accepted for publication in Computer Communications, special issue on the Internet of Things: Research challenges and solution

    Personal data broker instead of blockchain for students’ data privacy assurance

    Data logs about learning activities are being recorded at a growing pace due to the adoption and evolution of educational technologies (Edtech). Data analytics has entered the field of education under the name of learning analytics. Data analytics can provide insights that can be used to enhance learning activities for educational stakeholders, as well as helping online learning application providers to enhance their services. However, despite the goodwill in the use of Edtech, some service providers use it as a means to collect private data about students for their own interests and benefit. This is showcased by recent cases, reported in the media, of misuse of students' personal information. The growth in such cases is due to the recent tightening of data privacy regulations, especially in the EU. The students, or their parents, should be the owners of the information about them and their online learning activities. Thus, they should have the right tools to control how their information is accessed and for what purposes. Currently, there is no technological solution to prevent leaks or the misuse of data about students or their activity, and it seems appropriate to try to solve this from an automation technology perspective. In this paper, we consider the use of Blockchain technologies as a possible basis for a solution to this problem. Our analysis indicates that the Blockchain is not a suitable solution. Finally, we propose a cloud-based solution with a central personal point of management that we have called the Personal Data Broker.
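    To make the Personal Data Broker idea concrete, here is a minimal sketch of the kind of central policy gate the paper argues for. The paper does not prescribe an API; all class, provider, and purpose names below are hypothetical.

        # Illustrative sketch of a central broker mediating access to a student's data
        # against a consent policy the student (or parent) controls. Hypothetical names.
        from dataclasses import dataclass, field

        @dataclass
        class ConsentPolicy:
            allowed_purposes: set = field(default_factory=set)   # e.g. {"learning_analytics"}
            blocked_providers: set = field(default_factory=set)  # providers explicitly denied

        class PersonalDataBroker:
            """Central gate: every request for a student's data must pass that student's policy."""
            def __init__(self):
                self.policies = {}  # student_id -> ConsentPolicy

            def set_policy(self, student_id, policy):
                self.policies[student_id] = policy

            def request_data(self, student_id, provider, purpose):
                policy = self.policies.get(student_id)
                if policy is None or provider in policy.blocked_providers:
                    return False
                return purpose in policy.allowed_purposes

        broker = PersonalDataBroker()
        broker.set_policy("student-42", ConsentPolicy({"learning_analytics"}, {"AdTechCo"}))
        print(broker.request_data("student-42", "EdTechApp", "learning_analytics"))  # True
        print(broker.request_data("student-42", "AdTechCo", "learning_analytics"))   # False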