47 research outputs found

    A Design Space for Effective Privacy Notices.

    Get PDF
    ABSTRACT Notifying users about a system's data practices is supposed to enable users to make informed privacy decisions. Yet, current notice and choice mechanisms, such as privacy policies, are often ineffective because they are neither usable nor useful, and are therefore ignored by users. Constrained interfaces on mobile devices, wearables, and smart home devices connected in an Internet of Things exacerbate the issue. Much research has studied usability issues of privacy notices and many proposals for more usable privacy notices exist. Yet, there is little guidance for designers and developers on the design aspects that can impact the effectiveness of privacy notices. In this paper, we make multiple contributions to remedy this issue. We survey the existing literature on privacy notices and identify challenges, requirements, and best practices for privacy notice design. Further, we map out the design space for privacy notices by identifying relevant dimensions. This provides a taxonomy and consistent terminology of notice approaches to foster understanding and reasoning about notice options available in the context of specific systems. Our systemization of knowledge and the developed design space can help designers, developers, and researchers identify notice and choice requirements and develop a comprehensive notice concept for their system that addresses the needs of different audiences and considers the system's limitations and opportunities for providing notice.
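
    The abstract does not enumerate the individual design-space dimensions here; purely as an illustration, the Python sketch below assumes dimensions such as timing, channel, modality, and control, and shows how a single notice design could be described against them. All names are hypothetical and should not be read as the paper's definitive taxonomy.

    # Hypothetical sketch of a privacy-notice design space. The dimensions and
    # their values are illustrative assumptions, not the paper's taxonomy.
    from dataclasses import dataclass
    from enum import Enum

    class Timing(Enum):
        AT_SETUP = "at setup"
        JUST_IN_TIME = "just in time"
        PERIODIC = "periodic"

    class Channel(Enum):
        PRIMARY = "on the device or service itself"
        SECONDARY = "companion app or website"

    class Modality(Enum):
        VISUAL = "visual"
        AUDITORY = "auditory"
        HAPTIC = "haptic"

    class Control(Enum):
        BLOCKING = "user must respond before continuing"
        NON_BLOCKING = "can be dismissed or ignored"

    @dataclass
    class NoticeDesign:
        timing: Timing
        channel: Channel
        modality: Modality
        control: Control

    # Example: a just-in-time, on-device, visual notice that requires a decision.
    camera_notice = NoticeDesign(Timing.JUST_IN_TIME, Channel.PRIMARY,
                                 Modality.VISUAL, Control.BLOCKING)
    print(camera_notice)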

    The practical politics of sharing personal data

    Get PDF
    The focus of this paper is upon how people handle the sharing of personal data as an interactional concern. A number of ethnographic studies of domestic environments are drawn upon in order to articulate a range of circumstances under which data may be shared. In particular a distinction is made between the in situ sharing of data with others around you and the sharing of data with remote parties online. A distinction is also drawn between circumstances of purposefully sharing data in some way and circumstances where the sharing of data is incidental or even unwitting. On the basis of these studies a number of the organisational features of how people seek to manage the ways in which their data is shared are teased out. The paper then reflects upon how data sharing practices have evolved to handle the increasing presence of digital systems in people’s environments and how these relate to the ways in which people traditionally orient to the sharing of information. In conclusion a number of ways are pointed out in which the sharing of data remains problematic and there is a discussion of how systems may need to adapt to better support people’s data sharing practices in the future.

    Enhancing security behaviour by supporting the user

    Get PDF
    Although the role of users in maintaining security is regularly emphasized, this is often not matched by an accompanying level of support. Indeed, users are frequently given insufficient guidance to enable effective security choices and decisions, which can lead to perceived bad behaviour as a consequence. This paper discusses the forms of support that are possible, and seeks to investigate the effect of doing so in practice. Specifically, it presents findings from two experimental studies that investigate how variations in password meter usage and feedback can positively affect the resulting password choices. The first experiment examines the difference between passwords selected by unguided users versus those receiving guidance and alternative forms of feedback (ranging from a traditional password meter through to an emoji-based approach). The findings reveal a 30% drop in weak password choices between unguided and guided usage, with the varying meters then delivering up to 10% further improvement. The second experiment then considers variations in the form of feedback message that users may receive in addition to a meter-based rating. It is shown that by providing richer information (e.g. based upon the time required to crack a password, its relative ranking against other choices, or the probability of it being cracked), users are more motivated towards making strong choices and changing initially weak ones. While the specifics of the experimental findings were focused upon passwords, the discussion also considers the benefits that may be gained by applying the same principles of nudging and guidance to other areas of security in which users are often found to have weak behaviours.
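
    The paper itself does not publish its feedback-generation code; as a rough illustration of the kind of "time required to crack" message described above, the Python sketch below uses a naive charset-size-to-the-power-of-length guess estimate and an assumed offline guessing rate. Both the estimate and the rate are assumptions, not the estimator used in the study.

    import string

    # Naive sketch of a "time to crack" feedback message. The guess-space model
    # (charset size ** length) and the assumed guessing rate are illustrative
    # simplifications, not the study's actual model.
    GUESSES_PER_SECOND = 1e10  # assumed offline cracking rate

    def charset_size(password: str) -> int:
        """Sum the sizes of the character classes the password draws from."""
        size = 0
        if any(c in string.ascii_lowercase for c in password):
            size += 26
        if any(c in string.ascii_uppercase for c in password):
            size += 26
        if any(c in string.digits for c in password):
            size += 10
        if any(c in string.punctuation for c in password):
            size += len(string.punctuation)
        return max(size, 1)

    def crack_time_message(password: str) -> str:
        """Turn a rough guess-space estimate into user-facing feedback."""
        guesses = charset_size(password) ** len(password)
        seconds = guesses / GUESSES_PER_SECOND
        for limit, label in [(60, "under a minute"), (3600, "minutes"),
                             (86400, "hours"), (31_536_000, "days")]:
            if seconds < limit:
                return f"An attacker could guess this password in {label}."
        return "Guessing this password would likely take years."

    print(crack_time_message("monkey1"))                        # weak: guessed quickly
    print(crack_time_message("correct horse battery staple"))   # long: years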

    Simple Nudges for Better Password Creation

    Get PDF
    Recent security breaches have highlighted the consequences of reusing passwords across online accounts. Recent guidance on password policies by the UK government recommends an emphasis on password length over an extended character set for generating secure but memorable passwords without cognitive overload. This paper explores the role of three nudges in creating website-specific passwords: financial incentive (present vs absent), length instruction (long password vs no instruction) and stimulus (picture present vs not present). Mechanical Turk workers were asked to create a password in one of these conditions and the resulting passwords were evaluated based on character length, resistance to automated guessing attacks, and time taken to create the password. We found that users created longer passwords when asked to do so or when given a financial incentive and these longer passwords were harder to guess than passwords created with no instruction. Using a picture nudge to support password creation did not lead to passwords that were either longer or more resistant to attacks but did lead to account-specific passwords.
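
    As a back-of-the-envelope illustration of why the guidance cited above favours length over a larger character set, the Python sketch below compares the naive guess-space entropy (length × log2 of charset size) of a short complex password and a longer lowercase one. The formula is a standard simplification, not the guessability metric used in the study.

    import math

    # Naive entropy estimate: length * log2(charset size). A standard
    # simplification for comparing password policies, not the study's metric.
    def naive_entropy_bits(length: int, charset: int) -> float:
        return length * math.log2(charset)

    # 8 characters drawn from ~95 printable ASCII symbols vs. 16 lowercase letters.
    short_complex = naive_entropy_bits(8, 95)   # ~52.6 bits
    long_simple = naive_entropy_bits(16, 26)    # ~75.2 bits

    print(f"8-char mixed password:  {short_complex:.1f} bits")
    print(f"16-char lowercase pass: {long_simple:.1f} bits")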

    Mitigating the Risks of Smartphone Data Sharing: Identifying Opportunities and Evaluating Notice

    No full text
    As smartphones become more ubiquitous, increasing amounts of information about smartphone users are created, collected, and shared. This information may pose privacy and security risks to the smartphone user. The risks may vary from government surveillance to theft of financial information. Previous work in the area of smartphone privacy and security has both identified specific security flaws and examined users’ expectations and behaviors. However, there has not been a broad examination of the smartphone ecosystem to determine the risks to users from smartphone data sharing and the possible mitigations. Two of the five studies in this work examine the smartphone data sharing ecosystem to identify risks and mitigations. The first study uses multi-stakeholder expert interviews to identify risks to users and the mitigations. A second study examines app developers in order to quantify the risky behaviors and identify opportunities to improve security and privacy. In the remaining three of five studies discussed in this work, we examine one specific risk mitigation that has been popular with policy-makers: privacy notices for consumers. If done well, privacy notices should inform smartphone users about the risks and allow them to make informed decisions about data collection. Unfortunately, previous research has found that existing privacy notices do not help smartphone users, as they are neither noticed nor understood. Through user studies, we evaluate options to improve notices. We identify opportunities to capture the attention of users and improve understanding by examining the timing and content of notices. Overall, this work attempts to inform public policy around smartphone privacy and security. We find novel opportunities to mitigate risks by understanding app developers’ work and behaviors. Also, recognizing the current focus on privacy notices, we attempt to frame the debate by examining how users’ attention to and comprehension of notices can be improved through content and timing.

    Designing Effective Privacy Notices and Controls

    No full text

    Is Your Inseam a Biometric? Evaluating the Understandability of Mobile Privacy Notice Categories (CMU-CyLab-13-011)

    No full text
    The National Telecommunications and Information Administration (NTIA) has proposed a set of categories and definitions to create a United States national standard for short-form privacy notices on mobile devices. These notices are intended to facilitate user decision-making by categorizing both smartphone data to be shared and the entities with which that data is shared. In order to determine whether users consistently understand these proposed categories and their definitions, we conducted an online study with 791 participants. We found that participants had low agreement on how different data and entities should be categorized. We also compared our online results with those provided by four anonymous NTIA stakeholders, finding that even the stakeholders did not consistently categorize data or entities. Our work highlights areas of confusion for both survey participants and experts in the proposed scheme, and we offer suggestions for addressing these issues.
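
    The abstract reports "low agreement" but does not name the agreement measure used; one common choice when many raters assign items to categories is Fleiss' kappa, sketched below in Python purely as an illustration. The toy counts are made up, not data from the study.

    # Illustrative Fleiss' kappa computation for inter-rater agreement on
    # categorizing data types. The measure and the toy counts are assumptions;
    # the abstract does not state which agreement statistic the study used.
    def fleiss_kappa(counts: list[list[int]]) -> float:
        """counts[i][j] = number of raters assigning item i to category j;
        every item must be rated by the same number of raters."""
        n_items = len(counts)
        n_raters = sum(counts[0])
        n_ratings = n_items * n_raters

        # Per-item observed agreement.
        p_items = [(sum(c * c for c in row) - n_raters) / (n_raters * (n_raters - 1))
                   for row in counts]
        p_observed = sum(p_items) / n_items

        # Chance agreement from the marginal category proportions.
        n_categories = len(counts[0])
        p_categories = [sum(row[j] for row in counts) / n_ratings
                        for j in range(n_categories)]
        p_chance = sum(p * p for p in p_categories)

        return (p_observed - p_chance) / (1 - p_chance)

    # Toy example: 4 data types, 10 raters each, 3 notice categories.
    ratings = [
        [7, 2, 1],
        [3, 4, 3],
        [2, 5, 3],
        [6, 3, 1],
    ]
    print(f"Fleiss' kappa = {fleiss_kappa(ratings):.2f}")  # near zero -> weak agreement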