6 research outputs found

    Token Attempt: The Misrepresentation of Website Privacy Policies through the Misuse of P3P Compact Policy Tokens (CMU-CyLab-10-014)

    No full text
    Platform for Privacy Preferences (P3P) compact policies (CPs) are a collection of three-character and four-character tokens that summarize a website's privacy policy pertaining to cookies. User agents, including Microsoft's Internet Explorer (IE) web browser, use CPs to evaluate websites' data collection practices and allow, reject, or modify cookies based on sites' privacy practices. CPs can provide a technical means to enforce users' privacy preferences if CPs accurately reflect websites' practices. Through automated analysis we can identify CPs that are erroneous due to syntax errors or semantic conflicts. We collected CPs from 33,139 websites and detected errors in 11,176 of them, including 134 TRUSTe-certified websites and 21 of the top 100 most-visited sites. Our work identifies potentially misleading practices by web administrators, as well as common accidental mistakes. We found thousands of sites using identical invalid CPs that had been recommended as workarounds for IE cookie blocking. Other sites had CPs with typos in their tokens or other errors. 98% of invalid CPs resulted in cookies remaining unblocked by IE under its default cookie settings. It appears that large numbers of websites that use CPs are misrepresenting their privacy practices, thus misleading users and rendering privacy protection tools ineffective. Unless regulators use their authority to take action against companies that provide erroneous machine-readable policies, users will be unable to rely on these policies.
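    The syntax checking described above can be illustrated with a short script. The sketch below is a simplified, hypothetical validator: it assumes an abbreviated subset of P3P 1.0 compact-policy base tokens and treats an optional "a"/"i"/"o" suffix as allowed on any token, whereas the full specification defines more tokens and restricts which ones take suffixes. It does not attempt the semantic-conflict checks from the report, and the example CP strings are illustrative, not drawn from the study's data.

        import re

        # Abbreviated, illustrative subset of P3P 1.0 compact-policy base tokens;
        # the full specification defines additional tokens and restricts which
        # ones may carry an "a"/"i"/"o" (always/opt-in/opt-out) suffix.
        BASE_TOKENS = {
            "NOI", "ALL", "CAO", "NON",                       # access
            "DSP", "COR", "NID",                              # disputes / remedies / non-identifiable
            "CUR", "ADM", "DEV", "TAI", "PSA", "PSD",
            "IVA", "IVD", "CON", "HIS", "TEL", "OTP",         # purpose
            "OUR", "DEL", "SAM", "UNR", "PUB", "OTR",         # recipient
            "NOR", "STP", "LEG", "BUS", "IND",                # retention
            "COM", "NAV", "DEM", "INT", "STA", "PRE", "OTC",  # categories (partial)
            "TST",                                            # test policy
        }
        TOKEN_RE = re.compile(r"^([A-Z]{3})([aio]?)$")

        def invalid_tokens(cp_value: str) -> list[str]:
            """Return the tokens in a compact policy string that fail this basic syntax check."""
            return [t for t in cp_value.split()
                    if not (m := TOKEN_RE.match(t)) or m.group(1) not in BASE_TOKENS]

        # Hypothetical examples: a syntactically valid CP, and one containing a typo ("CURs").
        print(invalid_tokens("NOI ADM DEV PSAi COM NAV OUR OTRo STP IND DEM"))  # -> []
        print(invalid_tokens("CAO PSA OUR CURs"))                               # -> ['CURs']

    A CP flagged by a check like this is syntactically invalid; detecting semantic conflicts between otherwise well-formed tokens requires additional rules beyond this sketch.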

    Smart, Useful, Scary, Creepy: Perceptions of Online Behavioral Advertising (CMU-CyLab-12-007)

    No full text
    We report results of 48 semi-structured interviews about online behavioral advertising (OBA). We investigate non-technical users' attitudes about OBA, then explain these attitudes by delving into users' understanding of its practice. Participants were surprised that their browsing history is currently used to tailor advertisements. They were unable to determine accurately what information is collected during OBA, assuming that advertisers collect more information than they actually do. Participants also misunderstood the role of advertising networks, basing their opinions of an advertising company on that company’s non-advertising activities. Furthermore, participants were unfamiliar with advertising industry icons intended to notify them when ads are behaviorally targeted, often believing that these icons were intended for advertisers, not for users. While many participants felt tailored advertising could benefit them, existing notice and choice mechanisms are not effectively reaching users. Our results suggest new directions both for providing users with effective notice about OBA and for the design of usable privacy tools that help consumers express their preferences about online behavioral advertising.

    What Do Online Behavioral Advertising Disclosures Communicate to Users? (CMU-CyLab-12-008)

    No full text
    Online Behavioral Advertising (OBA) is the practice of tailoring ads based on an individual's online activities. We conducted a 1,505-participant online study to investigate Internet users' perceptions of OBA disclosures while performing an online task. We tested icons, accompanying taglines, and landing pages intended to inform users about OBA and provide opt-out options; these were based on prior research or drawn from those currently in use. The icons, taglines, and landing pages fell short both in terms of notifying participants about OBA and clearly informing participants about their choices. Half of the participants remembered the ads they saw, but only 12% correctly remembered the disclosure taglines attached to ads. The majority of participants mistakenly believed that ads would pop up if they clicked on disclosure icons and taglines, and more participants incorrectly thought that clicking the disclosures would let them purchase their own advertisements than correctly understood that they could then opt out of OBA. "AdChoices," the tagline most commonly used by online advertisers, was particularly ineffective at communicating notice and choice. 45% of participants who saw "AdChoices" believed that it was intended to sell advertising space, while only 27% believed it was an avenue to stop tailored ads. A majority of participants mistakenly believed that opting out would stop all online tracking, not just tailored ads. We discuss challenges in crafting disclosures, and we provide suggestions for improvement.

    Why Johnny Can’t Opt Out: A Usability Evaluation of Tools to Limit Online Behavioral Advertising (CMU-CyLab-11-017)

    No full text
    We present results of a 45-participant laboratory study investigating the usability of tools to limit online behavioral advertising (OBA). We tested nine tools, including tools that block access to advertising websites, tools that set cookies indicating a user’s preference to opt out of OBA, and privacy tools that are built directly into web browsers. We interviewed participants about OBA, observed their behavior as they installed and used a privacy tool, and recorded their perceptions and attitudes about that tool. We found serious usability flaws in all nine tools we examined. The online opt-out tools were challenging for users to understand and configure. Users tend to be unfamiliar with most advertising companies, and therefore are unable to make meaningful choices. Users liked the fact that the browsers we tested had built-in Do Not Track features, but were wary of whether advertising companies would respect this preference. Users struggled to install and configure blocking lists to make effective use of blocking tools. They often erroneously concluded the tool they were using was blocking OBA when they had not properly configured it to do so.
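    As background to the two opt-out mechanisms evaluated above, the sketch below shows, under stated assumptions, how an ad-serving back end might interpret the signals they produce: the Do Not Track request header (DNT: 1) and an opt-out cookie. The cookie name "optout" and the decision function are hypothetical; each advertising network defines its own cookie names and logic, and honoring either signal is voluntary on the network's side.

        # Minimal sketch: decide whether a request may receive a behaviorally
        # targeted ad, given the two signals discussed above. The cookie name
        # "optout" is hypothetical; real networks use their own names.
        def allow_behavioral_targeting(headers: dict[str, str], cookies: dict[str, str]) -> bool:
            dnt_enabled = headers.get("DNT") == "1"      # browser-level Do Not Track preference
            opted_out = cookies.get("optout") == "1"     # network-specific opt-out cookie
            return not (dnt_enabled or opted_out)

        # Examples: DNT enabled, opt-out cookie set, and no signal at all.
        print(allow_behavioral_targeting({"DNT": "1"}, {}))     # False -> serve an untargeted ad
        print(allow_behavioral_targeting({}, {"optout": "1"}))  # False
        print(allow_behavioral_targeting({}, {}))               # True  -> targeting permitted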

    A Field Trial of Privacy Nudges for Facebook

    No full text
    Anecdotal evidence and scholarly research have shown that Internet users may regret some of their online disclosures. To help individuals avoid such regrets, we designed two modifications to the Facebook web interface that nudge users to consider the content and audience of their online disclosures more carefully. We implemented and evaluated these two nudges in a 6-week field trial with 28 Facebook users. We analyzed participants' interactions with the nudges, the content of their posts, and opinions collected through surveys. We found that reminders about the audience of posts can prevent unintended disclosures without major burden; however, introducing a time delay before publishing users' posts can be perceived as both beneficial and annoying. On balance, some participants found the nudges helpful while others found them unnecessary or overly intrusive. We discuss implications and challenges for designing and evaluating systems to assist users with online disclosures.

    From Facebook Regrets to Facebook Privacy Nudges

    No full text
    As social networking sites (SNSs) gain in popularity, instances of regrets following online (over)sharing continue to be reported. In June 2010, a pierogi mascot for the Pittsburgh Pirates was fired because he posted disparaging comments about the team on his Facebook page. More recently, a high school teacher was forced to resign because she posted a picture on Facebook in which she was holding a glass of wine and a mug of beer. These incidents illustrate how, in addition to fostering socialization and interaction between friends and strangers, the ease and immediacy of communication that SNSs make possible can sometimes also negatively impact their users.

    In this Article, we summarize empirical research that our team has conducted in the past few years, aimed at understanding what actions people regret having conducted in SNSs, and whether it is possible to help them avoid those regrets without diminishing the value users can extract from participating in these online communities. In particular, this Article is based on qualitative and quantitative studies investigating instances of regret on Facebook and alternatives to prevent it.