7,275 research outputs found
EXPLORING THE IMPACT OF READABILITY OF PRIVACY POLICIES ON USERS’ TRUST
Empirical studies have repeatedly pointed out that the readability of a privacy policy is a potential source of online users’ trust. Nevertheless, many online companies still keep the readability of their privacy policies at a low level. This may coincide with low compliance of their privacy policies with the guidelines of fair information practices and thus with users’ privacy expectations. Against this background, this study seeks to clarify the role of perceived and actual readability of user-friendly and user-unfriendly privacy policies in shaping users’ trust in a mobile service provider. Tested for two mobile service scenarios that differ in the sensitivity of user data (an educational entertainment app vs. a health app), our hypotheses are verified based on the responses of 539 online users. Our findings reveal that in the case of a user-unfriendly data-handling policy, the effect of actual readability of a privacy policy outweighs the effect of its perceived readability in forming users’ trust. At the same time, for a user-friendly privacy policy, only perceived readability plays a significant role in promoting users’ trust in the provider of an educational entertainment app. In a sensitive healthcare context, however, perceived and actual readability of privacy policies are almost equally important.
Privacy Policies and Users’ Trust: Does Readability Matter?
Over the years, a drastic increase in online information disclosure has spurred a wave of concerns among multiple stakeholders. Among others, users resent the “behind closed doors” processing of their personal data by companies. Privacy policies are supposed to inform users how a website handles their personal information. However, several studies have shown that users rarely read privacy policies for various reasons, not least because poorly readable policy texts are difficult to understand. Based on our online survey with over 440 responses, we examine the objective and subjective readability of privacy policies and investigate their impact on users’ trust in five big Internet services. Our findings show that the stronger users believe they have understood a privacy policy, the more they trust the website, across all companies we studied. Our results call for making privacy policies more readable for the average reader.
How to make privacy policies both GDPR-compliant and usable
It is important for organisations to ensure that their privacy policies are General Data Protection Regulation (GDPR) compliant, and this has to be done by the May 2018 deadline. However, it is also important for these policies to be designed with the needs of the human recipient in mind. We carried out an investigation to find out how best to achieve this. We commenced by synthesising the GDPR requirements into a checklist-type format. We then derived a list of usability design guidelines for privacy notifications from the research literature and augmented these with other findings reported in the literature in order to confirm the guidelines. We conclude by providing a usable and GDPR-compliant privacy policy template for the benefit of policy writers.
Do Mobile App Providers Try Enough to Protect Users’ Privacy? – a Content Analysis of Mobile App Privacy Policies
Privacy policies are widely used to draw a clear picture of the risks to users’ personal information in different contexts, such as mobile apps. Nonetheless, many believe privacy policies are ineffective tools for notifying users of possible risks to information privacy, largely because most users have a very low tendency to read and comprehend them. Given the intimate nature of mobile apps, much of the personal information disclosed to them is at risk. Especially when users share sensitive personal information with apps, the chance of privacy violations and the consequent risks are higher. It is not only important to understand how mobile developers practically implement a contract to protect users’ privacy based on users’ preferences, but also crucial to examine the role of information sensitivity in developers’ emphasis on different aspects of privacy.
This research focuses on two aspects to understand the circumstances users experience when privacy policies are presented: the effort users have to make to read and understand privacy policies, in terms of readability and length of statements, and developers’ emphasis on aspects of information privacy with respect to the sensitivity of information. To assess the ease of reading privacy policy statements, readability and length are calculated. Through the lens of the framing concept of prospect theory, this study investigates the effect of information sensitivity level on developers’ emphasis on privacy dimensions. Three mobile app categories dealing with different levels of sensitive data are examined: health, navigation, and game apps. To differentiate emphasis on privacy dimensions when information sensitivity differs, a text mining method is developed in R to analyze the weights of four key privacy dimensions (collection, secondary use, improper access, and error).
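A dictionary-based weighting of the four privacy dimensions, as the abstract describes for its R text-mining method, can be sketched in Python as follows. The keyword lists below are illustrative assumptions, not the study’s actual term dictionaries:

```python
import re

# Hypothetical keyword dictionaries for the four privacy dimensions;
# the study's actual term lists are not given in the abstract.
DIMENSION_TERMS = {
    "collection": ["collect", "gather", "obtain", "record"],
    "secondary_use": ["share", "third party", "advertis", "sell"],
    "improper_access": ["security", "encrypt", "safeguard", "access control"],
    "error": ["accurate", "correct", "update", "rectif"],
}

def dimension_weights(policy_text: str) -> dict:
    """Relative emphasis of each dimension as its share of all keyword hits."""
    text = policy_text.lower()
    counts = {
        dim: sum(len(re.findall(re.escape(term), text)) for term in terms)
        for dim, terms in DIMENSION_TERMS.items()
    }
    total = sum(counts.values()) or 1  # avoid division by zero
    return {dim: n / total for dim, n in counts.items()}
```

With weights normalised to sum to one, emphasis can be compared across policies of very different lengths, which is presumably what the ANOVA in the study operates on.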
We downloaded 90 unique mobile app privacy policies. Readability calculations reveal that users need a minimum of 12 years of secondary education to easily understand privacy policies. The average length of privacy policies is at least 1,900 words, which hinders a thorough reading. ANOVA results show a significant difference in the secondary-use dimension for app privacy policies dealing with more sensitive data. In addition, the findings demonstrate that collection is more emphasized in health than in game app privacy policies, but they do not find any significant difference in the improper-access dimension. This study makes two key contributions. First, by building upon the framing concept of prospect theory, this research provides an effective framework for understanding the organizational perspective on privacy concerns. Second, the results demonstrate that information sensitivity level is important for measuring privacy concerns.
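The “12 years of secondary education” threshold corresponds to a readability grade level. A minimal sketch of the Flesch–Kincaid grade formula might look as follows; the abstract does not state which readability measure the study used, and the vowel-group syllable counter is a crude heuristic:

```python
import re

def count_syllables(word: str) -> int:
    # Rough heuristic: count groups of consecutive vowels (incl. y).
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_kincaid_grade(text: str) -> float:
    """Flesch-Kincaid grade level: years of schooling needed to read the text."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return (0.39 * (len(words) / len(sentences))
            + 11.8 * (syllables / len(words))
            - 15.59)
```

Typical privacy-policy prose, with long sentences and polysyllabic legal vocabulary, pushes this score well past grade 12, consistent with the study’s finding.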
An Automated Approach to Auditing Disclosure of Third-Party Data Collection in Website Privacy Policies
A dominant regulatory model for web privacy is "notice and choice". In this
model, users are notified of data collection and provided with options to
control it. To examine the efficacy of this approach, this study presents the
first large-scale audit of disclosure of third-party data collection in website
privacy policies. Data flows on one million websites are analyzed and over
200,000 websites' privacy policies are audited to determine if users are
notified of the names of the companies which collect their data. Policies from
25 prominent third-party data collectors are also examined to provide deeper
insights into the totality of the policy environment. Policies are additionally
audited to determine if the choice expressed by the "Do Not Track" browser
setting is respected.
Third-party data collection is widespread, but fewer than 15% of attributed
data flows are disclosed. The third parties most likely to be disclosed are
those with consumer services users may be aware of; those without consumer
services are less likely to be mentioned. Policies are difficult to understand,
and the average time required to read both a given site's policy and the
associated third-party policies exceeds 84 minutes. Only 7% of
first-party site policies mention the Do Not Track signal, and the majority of
such mentions are to specify that the signal is ignored. Among third-party
policies examined, none offer unqualified support for the Do Not Track signal.
Findings indicate that current implementations of "notice and choice" fail to
provide notice or respect choice.
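The 84-minute figure follows from a simple words-per-minute estimate over a site’s own policy plus the policies of its third-party data collectors. A sketch, where the word counts and the 250 wpm reading rate are illustrative assumptions rather than the study’s data:

```python
def total_reading_minutes(first_party_words, third_party_word_counts, wpm=250):
    """Estimated minutes to read a site's policy plus its third parties' policies.

    first_party_words: word count of the site's own privacy policy.
    third_party_word_counts: word counts of each embedded third party's policy.
    wpm: assumed reading speed in words per minute.
    """
    total_words = first_party_words + sum(third_party_word_counts)
    return total_words / wpm

# Hypothetical example: a 2,500-word policy plus seven third-party
# policies of ~2,600 words each already approaches the 84-minute mark.
minutes = total_reading_minutes(2500, [2600] * 7)
```

The point of the calculation is that the burden scales with the number of third parties embedded on a page, not with the first-party policy alone.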
Privacy Issues of the W3C Geolocation API
The W3C's Geolocation API may rapidly standardize the transmission of
location information on the Web, but, in dealing with such sensitive
information, it also raises serious privacy concerns. We analyze the manner and
extent to which the current W3C Geolocation API provides mechanisms to support
privacy. We propose a privacy framework for the consideration of location
information and use it to evaluate the W3C Geolocation API, both the
specification and its use in the wild, and recommend some modifications to the
API as a result of our analysis.
Annotating Privacy Policies in the Sharing Economy
Applications (apps) of the Digital Sharing Economy (DSE), such as Uber,
Airbnb, and TaskRabbit, have become a main enabler of economic growth and
shared prosperity in modern-day societies. However, the complex exchange of
goods, services, and data that takes place over these apps frequently puts
their end-users' privacy at risk. Privacy policies of DSE apps are provided to
disclose how private user data is being collected and handled. However, in
reality, such policies are verbose and difficult to understand, leaving DSE
users vulnerable to privacy-intrusive practices. To address these concerns, in
this paper, we propose an automated approach for annotating privacy policies in
the DSE market. Our approach identifies data collection claims in these
policies and maps them to the quality features of their apps. Visual and
textual annotations are then used to further explain and justify these claims.
The proposed approach is evaluated with 18 DSE app users. The results show that
annotating privacy policies can significantly enhance their comprehensibility
to the average DSE user. Our findings are intended to help DSE app developers
to draft more comprehensible privacy policies as well as help their end-users
to make more informed decisions in one of the fastest growing software
ecosystems in the world.
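Identifying data-collection claims, the first step of the proposed annotation approach, could be sketched with simple pattern matching. The patterns below are hypothetical stand-ins for whatever classifier the paper actually uses:

```python
import re

# Hypothetical phrase patterns suggesting a data-collection claim;
# the paper's actual identification method is not specified here.
COLLECTION_PATTERNS = [
    r"\bwe (?:collect|gather|obtain)\b",
    r"\binformation (?:we|that we) collect\b",
    r"\bmay (?:collect|receive)\b",
]

def find_collection_claims(policy_text: str) -> list:
    """Return sentences that look like data-collection claims."""
    sentences = re.split(r"(?<=[.!?])\s+", policy_text)
    pattern = re.compile("|".join(COLLECTION_PATTERNS), re.IGNORECASE)
    return [s for s in sentences if pattern.search(s)]
```

Each matched sentence could then be mapped to an app quality feature and given a visual or textual annotation, as the abstract describes.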
Efficacy of Privacy Assurance Mechanisms in the Context of Disclosing Health Information Online
Privacy policy statements and privacy-assurance cues are among the most important website features that online providers can use to alleviate web customers’ privacy concerns. This study examines the moderating role of privacy concern in how privacy-assurance cues and argument quality contribute to increased trust and the subsequent decision to disclose health information online. The study makes both theoretical and managerial contributions. The results provide insight into the dual roles of privacy policy statements and of privacy-assurance and trust cues. The study highlights the differential impacts such mechanisms have on highly and less privacy-concerned web users in the context of disclosing health information online.
Big Data Privacy Context: Literature Effects On Secure Informational Assets
This article's objective is the identification of research opportunities in
the current big data privacy domain, evaluating literature effects on secure
informational assets. Until now, no study has analyzed such a relation. Its
results can benefit science, technology, and business. To achieve these
objectives, a big data privacy Systematic Literature Review (SLR) is performed
on the main scientific peer reviewed journals in Scopus database. Bibliometrics
and text mining analysis complement the SLR. This study provides support to big
data privacy researchers on: most and least researched themes, research
novelty, most cited works and authors, themes evolution through time and many
others. In addition, TOPSIS and VIKOR ranks were developed to evaluate
literature effects versus informational assets indicators. Secure Internet
Servers (SIS) was chosen as decision criteria. Results show that big data
privacy literature is strongly focused on computational aspects. However,
individuals, societies, organizations and governments face a technological
change that has just started to be investigated, with growing concerns on law
and regulation aspects. TOPSIS and VIKOR ranks differed in several positions,
and the only country consistent between the literature and SIS adoption is the
United States. Countries in the lowest ranking positions represent future
research opportunities.