    "If You Can't Beat them, Join them": A Usability Approach to Interdependent Privacy in Cloud Apps

    Cloud storage services, like Dropbox and Google Drive, have growing ecosystems of third-party apps designed to work with users' cloud files. Such apps often request full access to users' files, including files shared with collaborators. Hence, whenever a user grants access to a new vendor, she inflicts a privacy loss on herself and on her collaborators. Analyzing a real dataset of 183 Google Drive users and 131 third-party apps, we find that collaborators inflict a privacy loss at least 39% higher than what users cause themselves. We take a step toward minimizing this loss by introducing the concept of History-based decisions: users are informed at decision time about the vendors that have previously been granted access to their data, so they can reduce their privacy loss by avoiding apps from new vendors whenever possible. We realize this concept through a new privacy indicator that can be integrated into the cloud apps' authorization interface. Via a web experiment with 141 participants recruited from CrowdFlower, we show that our privacy indicator significantly increases the user's likelihood of choosing the app that minimizes her privacy loss. Finally, we explore the network effect of History-based decisions via a simulation on top of large collaboration networks. We demonstrate that adopting such a decision-making process can reduce the growth of users' privacy loss by 70% in a Google Drive-based network and by 40% in an author collaboration network, even though we assume neither that users cooperate nor that they exhibit altruistic behavior. To our knowledge, our work is the first to provide quantifiable evidence of the privacy risk that collaborators pose in cloud apps, and the first to mitigate this problem via a usable privacy approach.
    Comment: Authors' extended version of the paper published at CODASPY 201
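    The History-based decision rule described in the abstract can be sketched in a few lines. This is an illustrative toy, not the paper's implementation; all app and vendor names are hypothetical:

```python
# Illustrative sketch of a "History-based decision": among candidate apps
# offering similar functionality, prefer one whose vendor was already
# granted access, so that no new vendor gains access to the user's
# (and collaborators') files.

def choose_app(candidate_apps, granted_vendors):
    """Return (chosen_app, added_privacy_loss).

    candidate_apps: list of (app_name, vendor) tuples (hypothetical names).
    granted_vendors: set of vendors that can already read the user's files.
    added_privacy_loss counts how many NEW vendors gain access (0 or 1).
    """
    for app, vendor in candidate_apps:
        if vendor in granted_vendors:
            return app, 0  # reuse a known vendor: no additional loss
    # No candidate from a known vendor: any choice adds one new vendor.
    app, _vendor = candidate_apps[0]
    return app, 1

apps = [("PDFTool", "VendorA"), ("PDFConvert", "VendorB")]
print(choose_app(apps, {"VendorB"}))  # picks PDFConvert, zero added loss
```

    The privacy indicator in the paper surfaces exactly this information (which vendors already hold access) at authorization time, so users can apply this rule without any cooperation or altruism.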

    Valuation of Personal Data in the Age of Data Ownership

    In order to tackle uncertainties about data ownership and data misuse, more accessible and competitive data markets have been proposed, especially concerning the use of and access rights to data generated by Internet of Things (IoT) devices. Legal proposals suggest that companies and individuals become owners of their self-generated data, enabling new ways of data monetization. Still, individuals are often uncertain about the value and price of their self-generated data. This research builds on construal level theory to propose influencing factors that foster an understanding of intraindividual data value. The results of a pilot survey (n = 104), conducted during ICIS 2022, show that data proximity and data sensitivity significantly influence intraindividual data value. Our research extends knowledge on data value from the individual perspective and builds the foundation for future work on data valuation and pricing in intraindividual data trading.

    Developing a Psychometric Scale to Measure One’s Valuation of Other People’s Privacy

    Researchers have invested tremendous effort in understanding and measuring people's perceptions, concerns, attitudes, and behaviors related to privacy risks from data gathering by online platforms, mobile devices, and other technologies. However, technology users often put other people's privacy at risk by sharing their data actively (e.g., posting photos taken in public places online) or passively (e.g., granting mobile apps access to stored contacts). Moreover, technologies that continuously sense the environment and record the behaviors and activities of everyone around them (e.g., smart assistants) are becoming pervasive. Thus, an instrument that quantifies how much one values other people's privacy is essential to understand technology adoption as well as attitudes and behaviors related to collecting and sharing data about non-users, to inform the design of adaptive privacy-enhancing technologies, and to develop personalized technological or behavioral interventions that raise awareness and mitigate privacy risks. This abstract details a preliminary study toward developing such a scale. We report the methods used to generate the initial item pool and findings from a pilot survey. We hope to get feedback from the community to improve the research design during the poster presentation.

    DO THEY REALLY CARE ABOUT TARGETED POLITICAL ADS? INVESTIGATION OF USER PRIVACY CONCERNS AND PREFERENCES

    Reliance on targeted political ads has skyrocketed in recent years, leading to negative reactions in media and society. Nonetheless, only a few studies investigate user privacy concerns and their role in user acceptance decisions in the context of online political targeting. To fill this gap, we explore the magnitude of privacy concerns towards targeted political ads compared to "traditional" targeting in the product context. Surprisingly, we find no notable differences in privacy concerns between these use purposes. In the next step, user preferences over ad types are elicited with the help of a discrete choice experiment in the mobile app adoption context. Among other things, our findings from simulations based on a mixed logit model cautiously suggest that while targeted political advertising is perceived as somewhat less desirable by respondents, its presence does not consequentially deter users from choosing such an app, with user preferences being highly volatile. Together, these results contribute to a better understanding of users' privacy concerns and preferences in the context of targeted political advertising online.
    Acknowledgment: This work has been funded by the Federal Ministry of Education and Research of Germany (BMBF) under grant no. 16DII116 ("Deutsches Internet-Institut").
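    The simulation idea behind a mixed logit model can be illustrated with a toy sketch. This is not the study's actual model or data; the coefficient values, the two-app choice set, and the attribute coding are all made up for illustration. Choice probabilities follow the standard logit form, averaged over random taste draws:

```python
# Toy mixed logit simulation (assumed, not the study's specification):
# respondents' taste for targeted political ads is heterogeneous, modeled
# as a normally distributed coefficient; choice shares are averaged
# logit probabilities over taste draws.
import math
import random

random.seed(0)

def choice_shares(n_draws=10_000):
    """Simulated market shares for two hypothetical apps:
    index 0 shows targeted political ads, index 1 does not."""
    shares = [0.0, 0.0]
    for _ in range(n_draws):
        # Hypothetical taste: mildly negative on average, highly volatile.
        beta_political = random.gauss(-0.5, 1.0)
        utilities = [beta_political,  # app with political ads
                     0.0]             # app without (reference)
        denom = sum(math.exp(u) for u in utilities)
        for j, u in enumerate(utilities):
            shares[j] += math.exp(u) / denom / n_draws
    return shares

print(choice_shares())  # the political-ads app loses some share, not all
```

    With these assumed parameters, the app with political ads keeps a substantial share despite the negative mean taste, mirroring the abstract's cautious finding that such ads make an app somewhat less desirable without consequentially deterring adoption.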

    Putting a Price Tag on Personal Information - A Literature Review

    In the digital age, personal information is claimed to be the new commodity, with rising market demand and profitability for businesses. Simultaneously, people are becoming aware of the value of their personal information while being concerned about their privacy, which increases the demand for direct compensation or protection. In response to the commodification of privacy and the increased demand for compensation, a number of scholars have shed light on the value people assign to their personal information. However, these findings remain controversial, as the results differ tremendously due to different research methods and contexts. To address this gap, we conducted a systematic literature review to gain insight into the current state of research and to identify further research avenues. By synthesizing and analyzing 37 publications, we provide an integrative framework along with seven contextual factors affecting individuals' valuation of privacy.

    Your Data Is My Data: A Framework for Addressing Interdependent Privacy Infringements

    Everyone holds personal information about others. Each person's privacy thus critically depends on the interplay of multiple actors. In an age of technology integration, this interdependence of data protection is becoming a major threat to privacy. Yet current regulation focuses on the sharing of information between two parties rather than multi-actor situations. This study highlights how current policy inadequacies, illustrated by the European Union General Data Protection Regulation, can be overcome by means of a deeper understanding of the phenomenon. Specifically, the authors introduce a new phenomenological framework to explain interdependent infringements. This framework builds on parallels between property and privacy and suggests that interdependent peer protection necessitates three hierarchical steps, "the 3Rs": realize, recognize, and respect. In response to observed failures at these steps, the authors identify four classes of intervention that constitute a toolbox addressing what can be done by marketers, regulators, and privacy organizations. While the first three classes of interventions address issues arising from the corresponding 3Rs, the authors specifically advocate for a fourth class of interventions that proposes radical alternatives shifting the responsibility for privacy protection away from consumers.

    UNDERSTANDING USER PERCEPTIONS AND PREFERENCES FOR MASS-MARKET INFORMATION SYSTEMS – LEVERAGING MARKET RESEARCH TECHNIQUES AND EXAMPLES IN PRIVACY-AWARE DESIGN

    With cloud and mobile computing, a new category of software products has emerged: mass-market information systems (IS) that address distributed and heterogeneous end-users. Understanding user requirements and the factors that drive user adoption is crucial for the successful design of such systems. IS research has suggested several theories and models to explain user adoption and intention to use, among them the IS Success Model and the Technology Acceptance Model (TAM). Although these approaches contribute to a theoretical understanding of the adoption and use of IS in mass markets, they are criticized for not being able to drive actionable insights on IS design, as they treat the IT artifact as a black box (i.e., they do not sufficiently address the system's internal characteristics). We argue that IS research needs to embrace market research techniques to understand and empirically assess user preferences and perceptions in order to integrate the "voice of the customer" in a mass-market scenario. More specifically, conjoint analysis (CA), from market research, can add user preference measurements for designing high-utility IS. CA has gained popularity in IS research; however, little guidance is provided for its application in the domain. We aim to support the design of mass-market IS by establishing a reliable understanding of consumers' preferences for multiple factors combining functional, non-functional, and economic aspects. The results include a "Framework for Conjoint Analysis Studies in IS" and methodological guidance for applying CA. We apply our findings to the privacy-aware design of mass-market IS and evaluate their implications for user adoption. We contribute to both academia and practice. For academia, we contribute a more nuanced conceptualization of the IT artifact (i.e., the system) through a feature-oriented lens and a preference-based approach.
We provide methodological guidelines that support researchers in studying user perceptions and preferences for design variations and in extending that to adoption. Moreover, the empirical studies on privacy-aware design contribute to a better understanding of domain-specific applications of CA for IS design and evaluation, with a nuanced assessment of user preferences for privacy-preserving features. For practice, we propose guidelines for integrating the voice of the customer for successful IS design.

    Your data is (not) my data: The role of social value orientation in sharing data about others

    The personal data consumers share with companies on a daily basis often also involves other people. However, prior research has focused almost exclusively on how consumers make decisions about their own data. In this research, we explore how consumers' social value orientation affects their decisions regarding data about others. In contrast to the notion of proselfs as "selfish" decision-makers, across four studies we find that proselfs are less likely than prosocials to share data about others with third parties. We show that this effect arises because proselfs feel less ownership than prosocials over data they hold about others, which in turn reduces their willingness to share it. Overall, this work contributes to the literature on social value orientation as well as privacy decision-making and helps marketers and policy makers design interdependent privacy choice contexts.

    Individuals as Gatekeepers Against Data Misuse

    This article makes a case for treating individual data subjects as gatekeepers against the misuse of personal data. Imposing gatekeeper responsibility on individuals is most useful where (a) the primary wrongdoers engage in data misuse intentionally or recklessly; (b) misuse of personal data is likely to lead to serious harm; and (c) one or more individuals are able to detect and prevent data misuse at a reasonable cost. As gatekeepers, individuals should have a legal duty to take reasonable measures to prevent data misuse where they are aware of facts indicating that the person seeking personal data from them is highly likely to misuse it or to facilitate its misuse. Recognizing a legal duty to prevent data misuse provides a framework for determining the boundaries of appropriate behavior when dealing with personal data that people have legally acquired. It does not, however, abrogate the need to impose gatekeeping obligations on big technology companies. In addition, individuals should also owe a social duty to protect the personal data in their possession. Whether individuals have sufficient incentive to protect their personal data in a particular situation depends not only on the cost of the relevant security measures but also on their expectations of the security decisions made by others who also possess that data. Even a privacy-conscious individual would have little incentive to invest in privacy-protective measures if he believed that his personal data was held by a sufficiently large number of persons who do not invest in such measures. On the flip side, an individual's decision to protect his personal data generates positive externalities: it incentivizes others to invest in security measures. As such, promoting a norm of data security is likely to create a self-reinforcing virtuous cycle that improves the level of data security in a given community.

    When private information settles the bill: money and privacy in Google's market for smartphone applications

    We shed light on a money-for-privacy trade-off in the market for smartphone applications ("apps"). Developers offer their apps more cheaply in return for greater access to personal information, and consumers choose between lower prices and more privacy. We provide evidence for this pattern using data on 300,000 mobile applications obtained from the Android Market in 2012 and 2014, augmented with information from Alexa.com and Amazon Mechanical Turk. Our findings show that both the supply and demand sides of the market take into account an app's ability to collect private information, measured by its use of privacy-sensitive permissions: (1) cheaper apps use more privacy-sensitive permissions; (2) installation numbers are lower for apps with sensitive permissions; and (3) circumstantial factors, such as the reputation of app developers, mitigate the strength of this relationship. Our results emerge consistently across several robustness checks, including panel data analysis, matched "twin" pairs of apps, and various alternative measures of privacy-sensitiveness.