4 research outputs found

    Mitigating the Risks of Smartphone Data Sharing: Identifying Opportunities and Evaluating Notice

    As smartphones become more ubiquitous, increasing amounts of information about smartphone users are created, collected, and shared. This information may pose privacy and security risks to the smartphone user, ranging from government surveillance to theft of financial information. Previous work in the area of smartphone privacy and security has both identified specific security flaws and examined users’ expectations and behaviors. However, there has not been a broad examination of the smartphone ecosystem to determine the risks users face from smartphone data sharing and the possible mitigations. Two of the five studies in this work examine the smartphone data sharing ecosystem to identify risks and mitigations. The first study uses multi-stakeholder expert interviews to identify risks to users and possible mitigations. The second study examines app developers to quantify risky behaviors and identify opportunities to improve security and privacy. In the remaining three studies, we examine one specific risk mitigation that has been popular with policy-makers: privacy notices for consumers. If done well, privacy notices should inform smartphone users about the risks and allow them to make informed decisions about data collection. Unfortunately, previous research has found that existing privacy notices do not help smartphone users, as they are neither noticed nor understood. Through user studies, we evaluate options to improve notices. We identify opportunities to capture the attention of users and improve understanding by examining the timing and content of notices. Overall, this work aims to inform public policy around smartphone privacy and security. We find novel opportunities to mitigate risks by understanding app developers’ work and behaviors. Also, recognizing the current focus on privacy notices, we attempt to frame the debate by examining how users’ attention to and comprehension of notices can be improved through content and timing.

    Is It the Typeset or the Type of Statistics? Disfluent Font and Self-Disclosure

    Background. The security and privacy communities have become increasingly interested in results from behavioral economics and psychology to help frame decisions so that users can make better privacy and security choices. One such result in the literature suggests that cognitive disfluency (presenting questions in a hard-to-read font) reduces self-disclosure (A. L. Alter and D. M. Oppenheimer. Suppressing secrecy through metacognitive ease: Cognitive fluency encourages self-disclosure. Psychological Science, 20(11):1414-1420, 2009).

    Aim. To examine the replicability and reliability of the effect of disfluency on self-disclosure, in order to test whether such approaches might be used to promote safer security and privacy behaviors.

    Method. We conducted a series of survey studies on human subjects with two conditions: disfluent and fluent font. The surveys were completed online (390 participants throughout the United States), on tablets (93 students), and with pen and paper (three studies with 89, 61, and 59 students). The pen-and-paper studies replicated the original study exactly. We ran an independent-samples t-test to check for significant differences between the averages of desirable responses across the two conditions.

    Results. In all but one case, participants did not show lower self-disclosure rates under disfluent conditions using an independent-samples t-test. We re-analyzed the original data and our data using the same statistical test (paired t-test) as used in the original paper, and only the data from the original published studies supported the hypothesis.
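    To make the contrast between the two analyses concrete, here is a minimal sketch in Python using SciPy; the arrays of per-question disclosure rates are invented for illustration and are not the study's data.

        # Hypothetical per-question rates of desirable (self-disclosing)
        # responses in each font condition; NOT the study's actual data.
        from scipy import stats

        fluent    = [0.62, 0.55, 0.71, 0.48, 0.66, 0.59]
        disfluent = [0.60, 0.57, 0.69, 0.50, 0.64, 0.58]

        # Independent-samples t-test (used in the replications): treats
        # the two conditions as unrelated groups.
        t_ind, p_ind = stats.ttest_ind(fluent, disfluent)

        # Paired t-test (used in the original paper's analysis): pairs
        # the observations item by item, which assumes each entry in one
        # list corresponds to the same question in the other.
        t_rel, p_rel = stats.ttest_rel(fluent, disfluent)

        print(f"independent: t={t_ind:.3f}, p={p_ind:.3f}")
        print(f"paired:      t={t_rel:.3f}, p={p_rel:.3f}")

    Because the paired test removes per-question variance, it can reach significance where the independent-samples test does not, which is one way the choice of test can change the conclusion.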

    The Privacy and Security Behaviors of Smartphone App Developers

    Smartphone app developers have to make many privacy-related decisions about what data to collect about end users and how that data is used. We explore how app developers make decisions about privacy and security. Additionally, we examine whether any privacy and security behaviors are related to characteristics of the app development companies. We conduct a series of interviews with 13 app developers to obtain rich qualitative information about privacy and security decision-making. We use an online survey of 228 app developers to quantify behaviors and test our hypotheses about the relationship between privacy and security behaviors and company characteristics. We find that smaller companies are less likely to demonstrate positive privacy and security behaviors. Additionally, although third-party tools for ads and analytics are pervasive, developers are not aware of the data these tools collect. We suggest tools and opportunities to reduce the barriers for app developers to implement privacy and security best practices.
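    As one hypothetical illustration of the kind of hypothesis test this survey design implies (the abstract does not name the specific test used), a chi-square test of independence can relate a company characteristic to a binary privacy behavior; the counts below are invented, not survey data.

        # Invented contingency counts relating company size to a binary
        # privacy behavior (e.g., "has a privacy policy"); illustrative only.
        from scipy.stats import chi2_contingency

        #                  has policy  no policy
        small_companies  = [       40,        80]
        larger_companies = [       70,        38]

        chi2, p, dof, expected = chi2_contingency([small_companies, larger_companies])
        print(f"chi2={chi2:.2f}, p={p:.4f}, dof={dof}")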

    Why Johnny Can’t Opt Out: A Usability Evaluation of Tools to Limit Online Behavioral Advertising (CMU-CyLab-11-017)

    We present results of a 45-participant laboratory study investigating the usability of tools to limit online behavioral advertising (OBA). We tested nine tools, including tools that block access to advertising websites, tools that set cookies indicating a user’s preference to opt out of OBA, and privacy tools that are built directly into web browsers. We interviewed participants about OBA, observed their behavior as they installed and used a privacy tool, and recorded their perceptions and attitudes about that tool. We found serious usability flaws in all nine tools we examined. The online opt-out tools were challenging for users to understand and configure. Users tend to be unfamiliar with most advertising companies, and are therefore unable to make meaningful choices. Users liked the fact that the browsers we tested had built-in Do Not Track features, but were wary of whether advertising companies would respect this preference. Users struggled to install and configure blocking lists to make effective use of blocking tools. They often erroneously concluded that the tool they were using was blocking OBA when they had not properly configured it to do so.
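    For context on the Do Not Track feature participants encountered: browsers with DNT enabled send the request header "DNT: 1", and honoring it is voluntary on the server side, which is exactly what participants were wary of. Below is a minimal sketch of a server checking the header, using Flask; the endpoint and responses are hypothetical.

        # Minimal Flask sketch: read the browser's Do Not Track header.
        # The /ad endpoint and the response strings are hypothetical.
        from flask import Flask, request

        app = Flask(__name__)

        @app.route("/ad")
        def serve_ad():
            # Browsers with DNT enabled send the header "DNT: 1".
            if request.headers.get("DNT") == "1":
                return "untargeted ad"  # voluntarily honor the preference
            return "behaviorally targeted ad"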