4 research outputs found

    What do they know about me? Contents and Concerns of Online Behavioral Profiles (CMU-CyLab-14-011)

    No full text
    Data aggregators collect large amounts of information about individual users from multiple sources and create detailed online behavioral profiles of individuals. Behavioral profiles benefit users by improving products and services. However, they have also raised privacy concerns. To increase transparency, some companies are allowing users to access their behavioral profiles. In this work, we investigated users' behavioral profiles by utilizing these access mechanisms. Using in-person interviews (n=8), we analyzed the data shown in the profiles and compared it with the data that companies have about users. We elicited surprises and concerns from users about the data in their profiles, and estimated the accuracy of the profiles. We conducted an online survey (n=100) to confirm the observed surprises and concerns. Our results show a large gap between the data shown in profiles and the data possessed by companies. We also find that a large number of profiles contain inaccuracies, with levels as high as 80%. Participants expressed several concerns, including the collection of sensitive data such as credit and health information, the extent of data collection, and how their data may be used.

    Expecting the Unexpected: Understanding Mismatched Privacy Expectations Online

    No full text
    Online privacy policies are the primary mechanism for informing users about the data practices of online services. In practice, users ignore privacy policies because they are long and complex to read. Since users do not read privacy policies, their expectations regarding the data practices of online services may not match a service's actual data practices. Mismatches may result in users exposing themselves to unanticipated privacy risks, such as unknowingly sharing personal information with online services. One approach to mitigating privacy risks is to provide simplified privacy notices, in addition to privacy policies, that highlight unexpected data practices. However, identifying mismatches between user expectations and services' practices is challenging. We propose and validate a practical approach for studying Web users' privacy expectations and identifying mismatches with the practices stated in privacy policies. We conducted a user study with 240 participants and 16 websites, and identified mismatches in collection, sharing, and deletion data practices. We discuss the implications of our results for the design of usable privacy notices, for service providers, and for public policy.

    Your Location has been Shared 5,398 Times! A Field Study on Mobile App Privacy Nudging (CMU-ISR-14-116)

    No full text
    Smartphone users are often unaware of the data collected by apps running on their devices. We report on a study that evaluates the benefits of giving users an app permission manager and of sending them nudges intended to raise their awareness of the data collected by their apps. Our study provides both qualitative and quantitative evidence that these approaches are complementary and can each play a significant role in empowering users to more effectively control their privacy. For instance, even after a week with access to the permission manager, participants benefited from nudges showing them how often some of their sensitive data was being accessed by apps, with 95% of participants reassessing their permissions and 58% of them further restricting some of their permissions. We discuss how participants interacted with both the permission manager and the privacy nudges, analyze the effectiveness of both solutions, and derive some recommendations.

    Follow My Recommendations: A Personalized Privacy Assistant for Mobile App Permissions

    No full text
    Modern smartphone platforms have millions of apps, many of which request permissions to access private data and resources, such as user accounts or location. While these platforms provide varying degrees of control over permissions, the sheer number of decisions that users are expected to manage has been shown to be unrealistically high. Prior research has shown that users are often unaware of, if not uncomfortable with, many of their permission settings. Prior work also suggests that it is theoretically possible to predict many of the privacy settings a user would want by asking the user a small number of questions. However, this approach had neither been operationalized nor evaluated with actual users before. We report on a field study (n=72) in which we implemented and evaluated a Personalized Privacy Assistant (PPA) with participants using their own Android devices. The results of our study are encouraging. We find that 78.7% of the recommendations made by the PPA were adopted by users. Following the initial recommendations on permission settings, participants were motivated to further review and modify their settings with daily “privacy nudges.” Despite showing substantial engagement with these nudges, participants changed only 5.1% of the settings previously adopted based on the PPA's recommendations. The PPA and its recommendations were perceived as useful and usable. We discuss the implications of our results for mobile permission management and the design of personalized privacy assistant solutions.

    Presented at the 12th Symposium on Usable Privacy and Security (SOUPS) 2016, June 22–24, 2016, Denver, Colorado.