4 research outputs found
Is Privacy Controllable?
One of the major views of privacy associates privacy with control over information. This raises the question of how controllable privacy actually is. In this paper, we adapt certain formal methods of control theory and investigate the implications of a control-theoretic analysis of privacy. We review how control and feedback mechanisms have been studied in the privacy literature. Relying on the control-theoretic framework, we develop a simple conceptual control model of privacy, formulate privacy controllability issues,
and suggest directions for possible research.
Comment: The final publication will be available at Springer via http://dx.doi.org/ [in press]
The privacy paradox applies to IoT devices too: a Saudi Arabian study
The “privacy paradox” is the term used to describe the disconnect between self-reported privacy value attributions and actions actually taken to protect and preserve personal privacy. This phenomenon has been investigated in a number of domains, and we extend the body of research with an investigation in the IoT domain. We presented participants with evidence of a specific IoT device’s (smart plug) privacy violations and then measured changes in privacy concerns and trust, as well as uptake of a range of behavioural responses. Our Saudi Arabian participants, despite expressing high levels of privacy concerns, generally chose not to respond to this evidence with preventative action. Most preferred to retain the functionality the smart device offered, effectively choosing to tolerate likely privacy violations. Moreover, while the improved awareness increased privacy concerns and reduced trust in the device straight after the experiment, these had regressed to pre-awareness levels a month later. Our study confirms the existence of the privacy paradox in the Saudi Arabian IoT domain, and also reveals the limited influence awareness raising exerts on long-term privacy concern and trust levels.
Is It Harmful? Re-examining Privacy Concerns
Part 3: Privacy in the Era of the Smart Revolution
The increased popularity of interconnected devices, which we rely on when performing day-to-day activities, exposes people to various privacy harms. This paper presents findings from an empirical investigation of privacy concerns. The study revealed that people, regardless of their diversity, perceive privacy harms as generic and simplified models, not individually as suggested in Solove’s framework. Additionally, the results identified differences in privacy concerns related to information disclosure, protection behavior, and demographics. The findings may benefit privacy and system designers, ensuring that policies and digital systems match people’s privacy expectations, decreasing risks and harms.
The dynamics of data donation : privacy risk, mobility data, and the smart city
With the development of new technologies and their increased application in the context of local government, cities have started to claim that they are smart. Smart cities make use of Information and Communication Technologies (ICTs) to support planning and policy making. For the appropriate and sustainable functioning of these smart cities, collecting data about the different aspects of their territory and operations, including their citizens, is a crucial activity. Currently, there are two main avenues through which smart cities can collect data about their citizens: either through sensors and cameras strategically placed throughout the city, or by asking citizens to voluntarily donate their personal data to the local government (i.e., citizen engagement or ‘e-participation’). Despite the growth and increasing prevalence of the latter practice, little attention has been given to how individuals experience the risks of data donation. Studies often consider data donation as an aspect of the phenomenon of surveillance, or as a type of data sharing. This study theorises and empirically examines data donation and its risks as a phenomenon distinct from both surveillance and data sharing.
Focusing on mobility data, this study draws on two established donation and privacy risk frameworks to investigate how the risks of donating personal data to a smart city are experienced and socially constructed. The thematic analysis of ten focus groups showed that, in the context of this empirical examination, privacy-specific risks alone do not constitute constructed risks. Instead, they combine in various ways with perceived donation risks to form more nuanced and embedded risk constructions. Donation risks are seen as potential consequences of privacy risks, and together they constitute the risks of donating data. This thesis underlines the importance of the context in which data donation takes place, as well as privacy’s value in a free and democratic society.
Funding: “This work was supported by the University of St Andrews and the Social Sciences and Humanities Research Council of Canada (SSHRC) under the ‘Big Data Surveillance’ partnership grant. The grant’s reference is: SSHRC 895-2015-1003”