Notice and Choice Must Go: The Collective Control Alternative
Over twenty years of criticism conclusively confirm that Notice and Choice results in, as the law professor Fred Cate puts it, “the worst of all worlds: privacy protection is not enhanced, individuals and businesses pay the cost of bureaucratic laws.” So why is it still the dominant legislative and regulatory approach to ensuring adequate informational privacy online? Recent implementations of Notice and Choice include the European Union’s General Data Protection Regulation and California’s Consumer Privacy Act. There is a well-known alternative (advanced by Helen Nissenbaum and others) that sees informational privacy as arising from social norms that require conformity to shared expectations about selective information flows.
So why have twenty years of criticism been so ineffective in turning the tide from Notice and Choice to the social-norm alternative? One plausible factor is that the criticisms of Notice and Choice detail its flaws but do not adequately motivate the turn to social norms. A motivationally compelling critique would show how and why the failure of Notice and Choice, properly understood, reveals the undeniable need for the collective control alternative provided by social norms. No such critique yet exists in the Notice and Choice literature. Notice and Choice Must Go: The Collective Control Alternative remedies that lack.
Experts-in-the-Loop: Establishing an Effective Workflow in Crafting Privacy Q&A
Privacy policies play a vital role in safeguarding user privacy as legal
jurisdictions worldwide emphasize the need for transparent data processing.
While the suitability of privacy policies to enhance transparency has been
critically discussed, employing conversational AI systems presents unique
challenges in informing users effectively. In this position paper, we propose a
dynamic workflow for transforming privacy policies into privacy
question-and-answer (Q&A) pairs to make privacy policies easily accessible
through conversational AI. Thereby, we facilitate interdisciplinary
collaboration among legal experts and conversation designers, while also
considering the utilization of large language models' generative capabilities
and addressing associated challenges. Our proposed workflow underscores
continuous improvement and monitoring throughout the construction of privacy
Q&As, advocating for comprehensive review and refinement through an
experts-in-the-loop approach.
Comment: Position paper presented at CONVERSATIONS 2023 - the 7th International Workshop on Chatbot Research and Design, hosted by the University of Oslo, Norway, November 22-23, 2023
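The experts-in-the-loop workflow described above can be pictured as a simple data structure in which each generated Q&A pair must collect expert sign-offs before release. This is a minimal, hypothetical sketch: the class, field, and role names are invented here for illustration and are not taken from the paper.

```python
from dataclasses import dataclass, field

@dataclass
class PrivacyQA:
    """One privacy Q&A pair awaiting expert review (illustrative only)."""
    question: str
    answer: str
    source_clause: str                 # policy passage grounding the answer
    reviewed_by: list = field(default_factory=list)

    def approve(self, expert_role: str) -> None:
        # Record each expert sign-off exactly once.
        if expert_role not in self.reviewed_by:
            self.reviewed_by.append(expert_role)

    @property
    def publishable(self) -> bool:
        # A pair is releasable only after both legal review and
        # conversation-design review, mirroring the interdisciplinary
        # collaboration the workflow calls for.
        return {"legal", "conversation_design"} <= set(self.reviewed_by)

qa = PrivacyQA(
    question="How long is my data stored?",
    answer="Usage data is retained for 12 months.",
    source_clause="Retention section of the privacy policy",
)
qa.approve("legal")
print(qa.publishable)   # False: still awaiting conversation-design review
qa.approve("conversation_design")
print(qa.publishable)   # True
```

The point of the sketch is the gate, not the schema: LLM-generated drafts enter the pipeline, but nothing reaches the conversational AI without the full set of expert approvals.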
OPPO: An Ontology for Describing Fine-Grained Data Practices in Privacy Policies of Online Social Networks
Privacy policies outline the data practices of Online Social Networks (OSN)
to comply with privacy regulations such as the EU-GDPR and CCPA. Several
ontologies for modeling privacy regulations, policies, and compliance have
emerged in recent years. However, they are limited in various ways: (1) they
specifically model what is required of privacy policies according to one
specific privacy regulation such as GDPR; (2) they provide taxonomies of
concepts but are not sufficiently axiomatized to afford automated reasoning
with them; and (3) they do not model data practices of privacy policies in
sufficient detail to allow assessing the transparency of policies. This paper
presents an OWL Ontology for Privacy Policies of OSNs, OPPO, that aims to fill
these gaps by formalizing detailed data practices from OSNs' privacy policies.
OPPO is grounded in BFO, IAO, OMRSE, and OBI, and its design is guided by the
use case of representing and reasoning over the content of OSNs' privacy
policies and evaluating policies' transparency in greater detail.Comment: 14 Pages, 6 figures, Ontology Showcase and Demonstrations Track, 9th
Joint Ontology Workshops (JOWO 2023), co-located with FOIS 2023, 19-20 July,
2023, Sherbrooke, Quebec, Canad
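OPPO itself is an OWL ontology with automated reasoning support; as a rough, tooling-free illustration of the underlying idea, a fine-grained data practice can be pictured as a small set of subject-predicate-object triples, with a transparency check expressed as a query over them. All class and predicate names below are invented for this sketch and do not reproduce OPPO's actual vocabulary.

```python
# Toy triple store: each entry is (subject, predicate, object).
triples = {
    ("practice:1", "rdf:type", "ex:DataSharingPractice"),
    ("practice:1", "ex:sharesDataType", "ex:LocationData"),
    ("practice:1", "ex:withRecipient", "ex:AdvertisingPartner"),
    ("practice:1", "ex:statedPurpose", "personalized advertising"),
    ("practice:2", "rdf:type", "ex:DataSharingPractice"),
    ("practice:2", "ex:sharesDataType", "ex:ContactData"),
    # practice:2 states no purpose -> a transparency gap
}

def opaque_practices(ts):
    """Data-sharing practices that omit a stated purpose."""
    practices = {s for s, p, o in ts
                 if p == "rdf:type" and o == "ex:DataSharingPractice"}
    with_purpose = {s for s, p, o in ts if p == "ex:statedPurpose"}
    return practices - with_purpose

print(opaque_practices(triples))  # {'practice:2'}
```

In the real ontology this kind of check would be a SPARQL query or a reasoner classification over axiomatized classes; the sketch only shows why sufficiently detailed modeling of each practice is what makes transparency assessment mechanical.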
A Look into User's Privacy Perceptions and Data Practices of IoT Devices
Purpose: With the rapid deployment of Internet of Things (IoT) technologies, it has been essential to address security and privacy issues by maintaining transparency in data practices. Prior research focused on identifying people's privacy preferences in different contexts of IoT usage and their mental models of security threats. However, the existing literature offers little understanding of the mismatch between users' perceptions and the actual data practices of IoT devices. Such mismatches could lead users to unknowingly share their private information, exposing themselves to unanticipated privacy risks. We aim to identify these mismatched privacy perceptions in our work.
Methodology: We conducted a lab study with 42 participants, where we compared participants' perceptions with the data practices stated in the privacy policies of 28 IoT devices from different categories, including health & exercise, entertainment, smart homes, toys & games, and pets.
Findings: We identified the mismatched privacy perceptions of users in terms of data collection, sharing, protection, and storage period. Our findings revealed mismatches between users' perceptions and the data practices of IoT devices for various types of information, including personal, contact, financial, health, location, media, connected-device, online social media, and IoT device usage information.
Value: The findings from this study lead to our recommendations on designing simplified privacy notices that highlight unexpected data practices, which in turn would contribute to the secure and privacy-preserving use of IoT devices.
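The core mismatch analysis can be pictured as a set comparison between the data types a user believes a device handles and the types its privacy policy actually states. The category labels and function name below are illustrative assumptions, not the study's actual coding scheme.

```python
def perception_mismatches(perceived: set, policy_stated: set) -> dict:
    """Compare a user's beliefs with policy-stated data practices."""
    return {
        # Collected per the policy but unknown to the user:
        # the risky direction, since data is shared unknowingly.
        "unexpected_collection": policy_stated - perceived,
        # Believed to be collected but not stated in the policy.
        "assumed_but_absent": perceived - policy_stated,
    }

perceived = {"health", "device_usage"}
policy_stated = {"health", "device_usage", "location", "contacts"}

m = perception_mismatches(perceived, policy_stated)
print(sorted(m["unexpected_collection"]))  # ['contacts', 'location']
print(m["assumed_but_absent"])             # set()
```

Under this framing, the "unexpected collection" set is exactly what a simplified privacy notice should surface first.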
Dark Patterns after the GDPR: Scraping Consent Pop-ups and Demonstrating their Influence
New consent management platforms (CMPs) have been introduced to the web to
conform with the EU's General Data Protection Regulation, particularly its
requirements for consent when companies collect and process users' personal
data. This work analyses how the most prevalent CMP designs affect people's
consent choices. We scraped the designs of the five most popular CMPs on the
top 10,000 websites in the UK (n=680). We found that dark patterns and implied
consent are ubiquitous; only 11.8% meet the minimal requirements that we set
based on European law. Second, we conducted a field experiment with 40
participants to investigate how the eight most common designs affect consent
choices. We found that notification style (banner or barrier) has no effect;
removing the opt-out button from the first page increases consent by 22–23
percentage points; and providing more granular controls on the first page
decreases consent by 8–20 percentage points. This study provides an empirical
basis for the necessary regulatory action to enforce the GDPR, in particular
the possibility of focusing on the centralised, third-party CMP services as an
effective way to increase compliance.Comment: 13 pages, 3 figures. To appear in the Proceedings of CHI '20 CHI
Conference on Human Factors in Computing Systems, April 25--30, 2020,
Honolulu, HI, US
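The compliance figure above rests on a rule-based check of each scraped CMP design against minimal legal requirements. A hedged sketch of what such a check could look like (explicit rather than implied consent, no pre-ticked options, rejection as easy as acceptance) is below; the field names are assumptions for illustration, not the study's actual scraper schema.

```python
def meets_minimal_requirements(cmp: dict) -> bool:
    """Rule-based check of one scraped CMP design (illustrative fields)."""
    return (
        not cmp.get("implied_consent", False)        # consent must be explicit
        and not cmp.get("preticked_boxes", False)    # no pre-selected purposes
        and cmp.get("reject_on_first_page", False)   # reject as easy as accept
    )

scraped = [
    {"implied_consent": True,  "preticked_boxes": False, "reject_on_first_page": False},
    {"implied_consent": False, "preticked_boxes": False, "reject_on_first_page": True},
]
compliant = sum(meets_minimal_requirements(c) for c in scraped)
print(f"{100 * compliant / len(scraped):.1f}% compliant")  # 50.0% compliant
```

Applied across a large scrape, a check like this yields a single headline compliance rate, which is how a figure such as the 11.8% reported above can be produced mechanically.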
“It becomes more of an abstract idea, this privacy”: Informing the design for communal privacy experiences in smart homes
In spite of research recognizing the home as a shared space and privacy as inherently social, privacy in smart homes has mainly been researched from an individual angle. Sometimes contrasting and comparing perspectives of multiple individuals, research has rarely focused on how household members might use devices communally to achieve common privacy goals. An investigation of communal use of smart home devices and its relationship with privacy in the home is lacking. The paper presents a grounded analysis based on a synergistic relationship between an ethnomethodologically-informed (EM-informed) study and a grounded theory (GT) approach. The study focuses on household members' interactions to show that household members' ability to coordinate the everyday use of their devices depends on appropriate conceptualizations of roles, rules, and privacy that are fundamentally different from those embodied by off-the-shelf products. Privacy is rarely an explicit, actionable, and practical consideration among household members, but rather a consideration wrapped up in everyday concerns. Roles and rules are not used to create social order, but to account for it. To sensitize to this everyday perspective and to reconcile privacy as wrapped up in everyday concerns with the design of smart home systems, the paper presents the social organization of communal use as a descriptive framework. The framework is descriptive in capturing how households navigate the “murky waters” of communal use in practice, where prior research highlighted seemingly irreconcilable differences in interest, attitude, and aptitude between multiple individuals and with other stakeholders. Discussing how households' use of roles, rules, and privacy in practice differed from what off-the-shelf products afforded, the framework highlights critical challenges and opportunities for the design of communal privacy experiences.