
    CO-CREATING VALUE FROM ELECTRIC VEHICLE DIGITAL SERVICES: EFFECT OF PERCEIVED ENVIRONMENTAL PERFORMANCE ON PERSONAL DATA SHARING

    Electric vehicles (EVs) are often treated as a product rather than a mobility service. Expanding EV digital services requires extensive personal information. Due to privacy concerns, EV users may deactivate these services, leading to the discontinuance of EV services and value destruction. Service-dominant logic offers insight into the roles of resources (e.g., personal data) and institutions (e.g., privacy concerns) in shaping individual behaviours in value co-creation (e.g., disclosing personal data). As a metaperspective, this logic needs to be integrated with mid-range theories (e.g., the enhanced antecedents–privacy concerns–outcomes framework) to investigate how resources, institutions and value co-creation are integrated and how this integration can be quantified. This study conducted an online survey of EV users. The findings show that privacy awareness, environmental concerns and privacy concerns influence the privacy calculus, and that perceived environmental performance has more influence than privacy risks on intentions to continue disclosing personal data for EV services.

    A UX Approach to Privacy and Security: the Impact of User, Contextual and System-Related Factors

    This position paper lays out current and planned studies on the UX aspects of security and privacy; our goal is to understand which factors influence privacy-related decision-making. We advocate using UX design methods to study inter-individual differences and the system-related and contextual factors involved in privacy and security attitudes and behaviors. These results will contribute to user-tailored and personalized privacy initiatives and guide the design of future technologies.

    Digital Health Innovation: Exploring Adoption of COVID-19 Digital Contact Tracing Apps

    With the outbreak of COVID-19, contact tracing became a widely used intervention to control the spread of this highly infectious disease. This article explores individuals' intention to adopt COVID-19 digital contact tracing (DCT) apps. The conceptual framework developed for this article combines procedural fairness theory, dual calculus theory, protection motivation theory, the theory of planned behavior, and Hofstede's cultural dimensions theory. The study adopts a quantitative approach, collecting data from 714 respondents using a random sampling technique. The proposed model is tested using structural equation modeling. Empirical results show that the perceived effectiveness of the privacy policy negatively influenced privacy concerns, whereas perceived vulnerability had a positive influence. Expected personal and community-related outcomes of sharing information positively influenced attitudes toward DCT apps, while privacy concerns had a negative effect. The intention to adopt DCT apps was positively influenced by attitude, subjective norms, and privacy self-efficacy. This article is the first to empirically test the adoption of DCT apps during the COVID-19 pandemic and contributes both theoretically and practically toward understanding the factors influencing their widespread adoption.

    The Users' Perspective on the Privacy-Utility Trade-offs in Health Recommender Systems

    Privacy is a major good for users of personalized services such as recommender systems. When such services are applied in health informatics, users' privacy concerns may be amplified, but the potential utility is also high. Despite the availability of technologies such as k-anonymity, differential privacy, privacy-aware recommendation, and personalized privacy trade-offs, little research has been conducted on users' willingness to share health data for use in such systems. In two conjoint-decision studies (sample size n=521), we investigate the importance and utility of privacy-preserving techniques for sharing personal health data under k-anonymity and differential privacy. Users were asked to pick a preferred sharing scenario depending on the recipient of the data, the benefit of sharing, the type of data, and the parameterized privacy level. Users disagreed with sharing data about mental illnesses for commercial purposes and under high de-anonymization risks, but showed little concern when data is used for scientific purposes and relates to physical illnesses. Suggestions for health recommender system development are derived from the findings.
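    One of the two techniques the studies parameterize, differential privacy, can be sketched briefly. The following is a minimal illustration (not the authors' implementation): a Laplace-mechanism count query, where the records, predicate, and epsilon value are all hypothetical.

    ```python
    import math
    import random

    def laplace_noise(scale):
        # Sample Laplace(0, scale) noise via inverse-CDF sampling.
        u = random.random() - 0.5
        return -scale * (1 if u >= 0 else -1) * math.log(1 - 2 * abs(u))

    def dp_count(records, predicate, epsilon):
        # A count query has sensitivity 1 (adding or removing one record
        # changes the count by at most 1), so Laplace noise with scale
        # 1/epsilon yields epsilon-differential privacy.
        true_count = sum(1 for r in records if predicate(r))
        return true_count + laplace_noise(1.0 / epsilon)

    # Hypothetical health records: (age, condition category)
    records = [(34, "physical"), (29, "mental"), (47, "physical"), (51, "physical")]
    noisy = dp_count(records, lambda r: r[1] == "physical", epsilon=1.0)
    ```

    A smaller epsilon means more noise and stronger privacy, which mirrors the "parameterized privacy" attribute varied across the conjoint sharing scenarios.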

    Privacy as a Cultural Phenomenon

    Privacy remains both contentious and ever more pertinent in contemporary society. Yet it persists as an ill-defined term, not only within specific fields but across its various uses and implications in technical, legal and political contexts. This article offers a new critical review of the history of privacy in terms of two dominant strands of thinking: freedom and property. These two conceptions of privacy can be seen as successive historical epochs brought together under digital technologies, yielding increasingly complex socio-technical dilemmas. By simplifying the taxonomy to its socio-cultural function, the article provides a generalisable, interdisciplinary approach to privacy. Drawing on new technologies, historical trends, sociological studies and political philosophy, the article discusses the value of privacy as a term before proposing a defense of the term cyber security as a mode of scalable cognitive privacy that integrates the relative needs of individuals, governments and corporations.

    Recommending Privacy Settings for Internet-of-Things

    Privacy concerns have been identified as an important barrier to the growth of the IoT. These concerns are exacerbated by the complexity of manually setting privacy preferences for numerous different IoT devices. Hence, there is an urgent research question: how can we help users simplify the task of managing privacy settings for IoT devices in a user-friendly manner so that they can make good privacy decisions? Solving this problem requires a more fundamental understanding of the logic behind IoT users' privacy decisions in different IoT contexts. We therefore conducted a series of studies to characterize IoT users' decision-making and designed a set of privacy-setting interfaces that help them manage their settings in various IoT contexts, based on this deeper understanding of users' privacy decision behaviors. In this dissertation, we first present three studies on recommending privacy settings for different IoT environments: general/public IoT, household IoT, and fitness IoT. We developed and applied a "data-driven" approach in these three studies: we first use statistical analysis and machine learning techniques on the collected user data to gain insight into IoT users' privacy decision behavior, and then create a set of "smart" privacy defaults/profiles based on these insights. Finally, we design a set of interfaces that incorporate these defaults/profiles; users can apply them either with a single click or by answering a few related questions. The biggest limitation of these three studies is that the proposed interfaces had not been tested, so we did not know what level of complexity (both in terms of the user interface and in terms of the profiles) is most suitable.
    Thus, in the last study, we address this limitation by conducting a user study to evaluate new interfaces for recommending privacy settings to household IoT users. The results show that our proposed user interfaces for household IoT privacy settings can improve users' satisfaction. Our research can benefit IoT users, manufacturers, researchers, privacy-setting interface designers, and anyone who wants to adopt IoT devices by providing interfaces that put their most prominent concerns in the forefront and make it easier to choose settings that match their preferences.
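    The "smart defaults" idea described above can be sketched, under assumed data, as a majority vote over users' recorded allow/deny decisions per IoT context. The contexts and decisions below are hypothetical, and the dissertation's actual pipeline uses statistical analysis and machine-learning techniques rather than this simple vote.

    ```python
    from collections import Counter

    def smart_defaults(user_decisions):
        # user_decisions: one dict per user, mapping an IoT context
        # (device, data_type) to an allow (True) or deny (False) decision.
        # Returns a single "smart default" profile: the majority decision
        # per context across all users.
        votes = {}
        for prefs in user_decisions:
            for context, allow in prefs.items():
                votes.setdefault(context, Counter())[allow] += 1
        return {ctx: counts.most_common(1)[0][0] for ctx, counts in votes.items()}

    # Hypothetical decisions from three household-IoT users
    users = [
        {("camera", "video"): False, ("thermostat", "temperature"): True},
        {("camera", "video"): False, ("thermostat", "temperature"): True},
        {("camera", "video"): True,  ("thermostat", "temperature"): True},
    ]
    profile = smart_defaults(users)
    # profile denies camera video (2 of 3 users) and allows thermostat data
    ```

    A one-click default like this trades precision for simplicity; the dissertation's multi-profile variants instead cluster users and let each user pick, or answer questions to select, the profile closest to their own preferences.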

    Investigating Obfuscation as a Tool to Enhance Photo Privacy on Social Networks Sites

    Photos contain rich visual information and can therefore be a source of privacy issues, including identification of people, inference attacks, location disclosure, and sensitive information leakage. Photo privacy is often hard to achieve because the content that makes photos valuable to viewers is also what causes privacy concerns. Photo sharing often occurs via Social Network Sites (SNSs), where photo privacy is difficult to achieve for two main reasons: first, SNSs seldom notify users of sensitive content in their photos that might cause privacy leakage; second, the recipient-control tools available on SNSs are not effective. The only solution that existing SNSs (e.g., Facebook, Flickr) provide is control over who receives a photo: users can withhold the entire photo from certain viewers while sharing it with others. The idea is that if viewers cannot see a photo, privacy risk is minimized. However, withholding or self-censoring photos is not always the solution people want; in some cases, people want to share photos, or parts of photos, even when they have privacy concerns about them. To provide better online photo privacy protection options for users, we leverage a behavioral theory of privacy that identifies and focuses on two key elements that influence privacy: information content and information recipient. This theory provides a vocabulary for discussing key aspects of privacy and helps organize our research around these two parameters through a series of studies. In my thesis, I describe five studies I have conducted. First, I focus on the content parameter to identify which portions of an image are considered sensitive and are therefore candidates to be obscured to increase privacy.
    I provide a taxonomy of content sensitivity that can help designers of photo-privacy mechanisms understand what categories of content users consider sensitive. Then, focusing on the recipient parameter, I describe how elements of the taxonomy are associated with users' sharing preferences for different categories of recipients (e.g., colleagues vs. family members). Second, focusing on controlling photo content disclosure, I invented privacy-enhancing obfuscations, evaluated their effectiveness against human recognition, and studied how they affect the viewing experience. Third, after discovering that avatar replacement and inpainting are two promising obfuscation methods, I studied whether they remain robust when de-identifying both familiar and unfamiliar people, since viewers are likely to know the people in OSN photos. Additionally, I quantified the prevalence of self-reported photo self-censorship and found that privacy-preserving obfuscations might be useful for combating it. Building on these studies, I proposed a privacy-enhanced photo-sharing interface that helps users identify potentially sensitive content and provides obfuscation options. To evaluate the interface, I compared the proposed obfuscation approach with two other approaches: a control condition that mimics the current Facebook photo-sharing interface, and an interface that provides a privacy warning about potentially sensitive content. The results show that our proposed system outperforms the other two in reducing perceived privacy risks, increasing willingness to share, and enhancing usability. Overall, our research will benefit privacy researchers, online social network designers, policymakers, computer vision researchers, and anyone who shares or wants to share photos online.
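    As a toy illustration of the obfuscation idea (not the avatar-replacement or inpainting methods the thesis actually evaluates), the sketch below pixelates a rectangular region of a grayscale image by replacing each tile with its mean intensity; the image and region coordinates are hypothetical.

    ```python
    def pixelate_region(image, top, left, height, width, block=2):
        # image: grayscale image as a list of rows of integer intensities.
        # Replace each block x block tile inside the region with the tile's
        # mean, destroying fine detail while keeping coarse structure.
        for by in range(top, top + height, block):
            for bx in range(left, left + width, block):
                ys = range(by, min(by + block, top + height))
                xs = range(bx, min(bx + block, left + width))
                tile = [image[y][x] for y in ys for x in xs]
                mean = sum(tile) // len(tile)
                for y in ys:
                    for x in xs:
                        image[y][x] = mean
        return image

    # Hypothetical 4x4 image; pixelate the top-left 2x2 region as one tile
    img = [[10, 20, 30, 40],
           [50, 60, 70, 80],
           [90, 100, 110, 120],
           [130, 140, 150, 160]]
    pixelate_region(img, top=0, left=0, height=2, width=2, block=2)
    # top-left 2x2 becomes uniform: mean of 10, 20, 50, 60 = 35
    ```

    Unlike withholding the whole photo, a region-level transform like this lets users share the rest of the image while obscuring only the sensitive content, which is the trade-off the thesis's interface exposes to users.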