Personalization-Privacy Paradox in Using Mobile Health Services
The rapid growth of the population and the increase in life expectancy put intense pressure on healthcare systems worldwide. Mobile health applications (m-health apps) have the potential to help ease the situation by offering highly personalized services that empower individuals to take better care of their health. To reach their full potential, m-health apps must continuously gather personal health data from users, which leads to privacy concerns. We study the influence of users' privacy concerns on their intention to disclose personal health data to m-health apps. Using an online survey and PLS-SEM analysis, we show that a personalization-privacy paradox is present in the context of m-health apps. While respondents claim to have privacy concerns about using m-health apps, these concerns did not negatively affect their self-disclosure intentions or their behavioral intention. Our results show that the magnitude of the personalization-privacy paradox is influenced by demographic factors, especially gender and age.
Disposition toward privacy and information disclosure in the context of emerging health technologies.
Objective: We sought to present a model of privacy disposition and its development based on qualitative research on privacy considerations in the context of emerging health technologies. Materials and methods: We spoke to 108 participants across 44 interviews and 9 focus groups to understand the range of ways in which individuals value (or do not value) control over their health information. Transcripts of interviews and focus groups were systematically coded and analyzed in ATLAS.ti for privacy considerations expressed by respondents. Results: Three key findings from the qualitative data suggest a model of privacy disposition. First, participants described privacy-related behavior as both contextual and habitual. Second, there are motivations for and deterrents to sharing personal information that do not fit into the analytical categories of risks and benefits. Third, philosophies of privacy, often described as attitudes toward privacy, should be classified as a subtype of motivation or deterrent. Discussion: This qualitative analysis suggests a simple but potentially powerful conceptual model of privacy disposition, or what makes a person more or less private. Components of privacy disposition are identifiable and measurable through self-report and therefore amenable to operationalization and further quantitative inquiry. Conclusions: We propose this model as the basis for a psychometric instrument that can be used to identify types of privacy dispositions, with potential applications in research, clinical practice, system design, and policy.
The Impacts of Privacy Rules on Users' Perception on Internet of Things (IoT) Applications: Focusing on Smart Home Security Service
Department of Management Engineering
As communication and information technologies advance, the Internet of Things (IoT) has changed the way people live. In particular, as smart home security services have been widely commercialized, it is necessary to examine consumer perception. However, there is little research that explains the general perception of IoT and smart home services. This article utilizes communication privacy management theory and privacy calculus theory to investigate how options to protect privacy affect how users perceive benefits and costs, and how those perceptions affect individuals' intention to use smart home services. Scenario-based experiments were conducted, and perceived benefits and costs were treated as formative second-order constructs. The results of the PLS analysis showed that smart home options to protect privacy decreased perceived benefits and increased perceived costs. In addition, the perceived benefits and perceived costs significantly affected the intention to use smart home security services. This research contributes to the field of IoT and smart home research and gives practitioners notable guidelines.
The Importance of Transparency and Willingness to Share Personal Information
This study investigates the extent to which individuals are willing to share their sensitive personal information with companies. The study examines whether skepticism can influence willingness to share information. Additionally, it seeks to determine whether transparency can moderate the relationship between skepticism and willingness to share, and whether 1) companies' perceived motives, 2) individuals' prior privacy violations, 3) individuals' propensity to take risks, and 4) individuals' self-efficacy act as antecedents of skepticism. Partial Least Squares (PLS) regression is used to examine the relationships between all the factors. The findings indicate that skepticism does have a negative impact on willingness to share personal information and that transparency can reduce skepticism.
The control over personal data: True remedy or fairy tale?
This research report undertakes an interdisciplinary review of the concept of "control" (i.e. the idea that people should have greater "control" over their data), proposing an analysis of this concept in the fields of law and computer science. Despite the omnipresence of the notion of control in EU policy documents, scholarly literature, and the press, the very meaning of this concept remains surprisingly vague and under-studied in the face of contemporary socio-technical environments and practices. Beyond the current fashionable rhetoric of empowerment of the data subject, this report attempts to reorient the scholarly debates towards a more comprehensive and refined understanding of the concept of control by questioning its legal and technical implications for the data subject's agency.
In/Visible Bodies. On patients and privacy in a networked world
In the networked world, privacy and visibility become entangled in new and unexpected ways. This article uses the concept of networked visibility to explore the entanglement of technology and the visibility of patient bodies. Based on semi-structured interviews with patients active in social media, this paper describes how multiple patient bodies are produced in the negotiations between the need for privacy and the need for social interaction. Information technology is actively involved in these negotiations: patients use technology to make their bodies both visible and invisible. At the same time, technology collects data on these patients, which can be used for undesired commercial and surveillance purposes. The notion of visibility by design may infuse design efforts that enable online privacy, supporting patients in the multiple ways they want to be visible and invisible online.
Response to Privacy as a Public Good
In the spirit of moving forward the theoretical and empirical scholarship on privacy as a public good, this response addresses four issues raised by Professors Fairfield and Engel's article: first, their depiction of individuals in groups; second, suggestions for clarifying the concept of group; third, an explanation of why the platforms on which groups exist and interact need more analysis; and finally, the question of what kind of government intervention might be necessary to protect privacy as a public good.
Explaining the Privacy Paradox through Identifying Boundary Conditions of the Relationship between Privacy Concerns and Disclosure Behaviors
The privacy paradox phenomenon suggests that individuals tend to make privacy decisions (i.e., disclosure of personal information) that contradict their dispositional privacy concerns. Despite the emerging research attempting to explain this phenomenon, it remains unclear why the privacy paradox exists. In order to explain why it exists and to be able to predict occurrences of privacy-paradoxical decisions, this dissertation emphasizes the need to identify boundary conditions of the relationship between privacy concerns and disclosure behaviors. Across three empirical research studies varying in their contexts, this dissertation presents a total of seven boundary conditions (i.e., cognitive absorption, cognitive resource depletion, positive mood state, privacy control, convenience, empathic concern, and social nudging) that can explain why privacy concerns sometimes do not predict disclosure behaviors (i.e., the privacy paradox). The approach of identifying the boundary conditions advances privacy theories by establishing a theoretically sounder causal link between privacy concerns and disclosure behaviors while contributing to enhancing privacy policies, organizational privacy practices, and individuals' privacy decisions.
UNCOVERING THE PRIVACY PARADOX: THE INFLUENCE OF DISTRACTION ON DATA DISCLOSURE DECISIONS
The discrepancy between individuals' intention to disclose data and their actual disclosure behaviour is called the privacy paradox. Although a wide range of research has investigated the privacy paradox, it remains insufficiently understood because the role of mental processes in decision-making has been mostly neglected. This research-in-progress provides a theoretical concept that examines the cognitive processes underlying data disclosure decisions to provide a better understanding of the privacy paradox. We apply the Elaboration Likelihood Model (ELM), which suggests that the mental shortcuts individuals take when making their actual data disclosure decision, which differs from their self-reported data disclosure intention, cause the privacy paradox. We propose a two-step, mixed-method approach comprising a survey and an online experiment to empirically explore intended and actual data disclosure. The study takes theoretical and methodological issues in prior literature into account and enhances our understanding of individuals' paradoxical data disclosure behaviour from a psychological point of view.
- …