Segmenting travelers based on responses to nudging for information disclosure
Digital technologies shape travel environments. Aware of online privacy issues, consumers hold distinct attitudes towards disclosing personal information to service providers. We conducted a panel survey to gauge travelers’ willingness to share personal information with service providers when presented with different types of nudges. Clustering analysis identified two segments: travelers who are reasonably willing to share (Privacy Rationalists) and those who are reluctant to share (Privacy Pessimists). This study provides empirical evidence of privacy segmentation in the travel context, which has not been reported before and thus deserves attention from both researchers and practitioners.
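The abstract does not specify the clustering algorithm or survey items, but the two-segment result can be illustrated with a minimal k-means sketch over synthetic Likert-scale responses (all data and the choice of k-means here are assumptions, not the paper's actual method):

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic Likert-scale (1-7) willingness-to-share responses on 4 items:
# one simulated group mostly willing, one mostly reluctant.
willing = rng.integers(5, 8, size=(50, 4)).astype(float)
reluctant = rng.integers(1, 4, size=(50, 4)).astype(float)
responses = np.vstack([willing, reluctant])

# Minimal k-means with k=2: assign each respondent to the nearest
# centroid, then recompute centroids until assignments stabilize.
centroids = responses[[0, 50]]  # one seed drawn from each simulated group
for _ in range(20):
    dists = np.linalg.norm(responses[:, None] - centroids[None], axis=2)
    labels = dists.argmin(axis=1)
    updated = np.array([responses[labels == k].mean(axis=0) for k in (0, 1)])
    if np.allclose(updated, centroids):
        break
    centroids = updated

print(np.bincount(labels))  # sizes of the two recovered segments
```

With such clearly separated responses the two recovered clusters coincide with the simulated "willing" and "reluctant" groups, mirroring the Rationalist/Pessimist split described above.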
Would a privacy fundamentalist sell their DNA for $1000... if nothing bad happened as a result? The Westin categories, behavioral intentions, and consequences
Westin’s Privacy Segmentation Index has been widely used to measure privacy attitudes and categorize individuals into three privacy groups: fundamentalists, pragmatists, and unconcerned. Previous research has failed to establish a robust correlation between the Westin categories and actual or intended behaviors. Unexplored, however, is the connection between the Westin categories and individuals’ responses to the consequences of privacy behaviors. We use a survey of 884 Amazon Mechanical Turk participants to investigate the relationship between the Westin Privacy Segmentation Index and attitudes and behavioral intentions for both privacy-sensitive scenarios and privacy-sensitive consequences. Our results indicate a lack of correlation between the Westin categories and behavioral intent, as well as a lack of correlation between the Westin categories and consequences. We discuss potential implications of this attitude-consequence gap.
Consumer-Oriented Data Protection: Identifying Consumer Archetypes for Effective Communication of Privacy Practices
Privacy communication will only work if it addresses the information needs of consumers, which are neither static nor uniform. A promising and practically feasible approach is to tailor communication to consumer archetypes. This study identifies these archetypes based on a web survey. The identified archetypes provide a solid foundation for realizing effective privacy communication.
User Archetypes for Effective Information Privacy Communication
In an information systems context, information privacy communication works only if it meets the information needs of users. Because these needs are neither static nor uniform, targeting communication at user archetypes is a promising approach: it avoids the inadequacies of ignoring differences in users’ information needs, yet is more practical than dedicated attention to each individual user. To identify such archetypes, we conduct a survey eliciting users’ information needs and apply hierarchical clustering to derive a hierarchical model of user archetypes with respect to their information privacy needs. We identify a total of 13 archetypes on two hierarchy levels. In contrast to extant research on information privacy user archetypes, which focuses on privacy attitudes, the identified archetypes are based on information system characteristics desired by users, as elicited through our survey. They thus yield clear input for enhancing information system design with respect to information privacy. Our research highlights differences and similarities between archetypes and enriches them with an interpretively derived characterization. The resulting archetype hierarchy serves as a foundation for future research aiming to improve communication of information privacy practices.
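The two-level archetype hierarchy described above can be sketched with agglomerative clustering: build a dendrogram over survey responses, then cut it at two depths to obtain coarse and fine archetypes. The feature matrix, metric, and linkage method below are illustrative assumptions; the abstract does not disclose the paper's actual parameters.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(1)
# Synthetic data: 60 respondents x 6 binary "information need" indicators
# standing in for the survey-elicited information needs.
needs = rng.integers(0, 2, size=(60, 6)).astype(float)

# Ward linkage builds the full dendrogram over respondents.
tree = linkage(needs, method="ward")

# Cutting the dendrogram at two granularities yields archetypes on two
# hierarchy levels, analogous to the paper's hierarchical archetype model.
coarse = fcluster(tree, t=3, criterion="maxclust")  # top-level archetypes
fine = fcluster(tree, t=8, criterion="maxclust")    # finer-grained archetypes
print(len(set(coarse)), len(set(fine)))
```

Each respondent receives both a coarse and a fine archetype label, and every fine cluster nests inside a coarse one by construction of the dendrogram cuts.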
Android Permissions Remystified: A Field Study on Contextual Integrity
Due to the amount of data that smartphone applications can potentially access, platforms enforce permission systems that allow users to regulate how applications access protected resources. If users are asked to make security decisions too frequently and in benign situations, they may become habituated and approve all future requests without regard for the consequences. If they are asked to make too few security decisions, they may become concerned that the platform is revealing too much sensitive information. To explore this tradeoff, we instrumented the Android platform to collect data on how often and under what circumstances smartphone applications access protected resources regulated by permissions. We performed a 36-person field study to explore the notion of "contextual integrity": how often do applications access protected resources when users are not expecting it? Based on our collection of 27 million data points and exit interviews with participants, we examine the situations in which users would like the ability to deny applications access to protected resources. We found that at least 80% of our participants would have preferred to prevent at least one permission request, and overall, they thought that over a third of requests were invasive and desired a mechanism to block them.