692 research outputs found

    Risk compensation behaviors on cascaded security choices

    Organizations are interested in improving information security and make use of a range of technical, organizational, and behavioral measures. These different approaches must not be viewed in isolation; instead, the measures may influence one another. Security efforts fail when technical measures alter users’ security perceptions and behaviors to the detriment of the security outcome. Such unintended consequences of information security practices can be classified as risk compensation behaviors, which describe how users become more careless once they perceive some level of protection. This research-in-progress seeks to understand risk compensation behaviors for cascaded security choices made by different actors (e.g., security decisions made by organizations vs. decisions made by individuals) and presents a lab experiment to test this issue.

    The Effects of Physiological Arousal and Message Framing on Fitness App Users’ Privacy Decisions

    Privacy issues are becoming prevalent in users’ fitness app usage and are hence gaining great attention from users and policymakers. A typical example is the inappropriate authorization of access to app data. Yet it is not clear what factors influence users’ third-party authorization decisions; in particular, users’ situational states are rarely considered. This study therefore investigates how an important situational state, physiological arousal, affects users’ decisions to authorize the sharing of private fitness app data with social networking sites (SNS). We concurrently examine a factor of the decision context, message framing, a design heuristic used to nudge people’s privacy decisions. We hypothesize that both high physiological arousal and a loss-framed message increase users’ likelihood of granting third-party authorization, and that there is a positive interaction between the two factors. We plan to conduct an experiment to test these hypotheses.

    Framing as an App-Design Measure to Nudge Users Toward Infection Disclosure in Contact-Tracing Applications

    Contact-tracing applications are only effective in countering current and future pandemics when (a) they are widely adopted and (b) users voluntarily disclose their infection to warn others. While much research has investigated how contact-tracing applications should be designed and promoted to motivate adoption, little is known about how to increase voluntary infection disclosures. To this end, our joint research project with the core development team of a contact-tracing application relied on the theory of message framing to investigate how to nudge users toward infection disclosure. Based on a mixed-method research design, consisting of 15 workshops with the core development team and a conjoint study among 139 users of a European contact-tracing application, we show that message framing can be a useful approach to increasing voluntary infection disclosure in contact-tracing applications.

    Will Security and Privacy Updates Affect Users’ Privacy Choices of Mobile Apps

    Users place growing emphasis on safeguarding their personal privacy and on the permissions they grant to applications. To address this, Security and Privacy Updates (SPU) are employed to bolster app security, alleviate users’ security apprehensions, and encourage users to share data and permissions with greater confidence. Based on Protection Motivation Theory (PMT), we propose that SPU, itself an IT technology, has a dual effect on users’ privacy choices; that security threat susceptibility and security response efficacy are the two key mediators explaining this phenomenon; and that this process is moderated by users’ privacy trade-offs. We will investigate this process through a set of online experiments.

    Security by behavioural design: a rapid review

    Security and Global Affairs, Cybersecurity and Cybergovernance.

    Privacy Risks in Digital Markets: The Impact of Ambiguity Attitudes on Transparency Choices

    Transparency is viewed as an essential prerequisite for consumers to make informed privacy decisions in digital markets. However, it remains an open research question whether and when individuals actually prefer transparency about privacy risks when given a chance to avoid it. We investigate this question with a randomized controlled online experiment based on an Ellsberg-type design, in which subjects repeatedly choose between risk and ambiguity while facing the threat of an actual disclosure of their personal data. We find empirical support for ambiguity attitudes as a novel behavioral mechanism underlying people’s transparency choices in privacy contexts. In particular, most individuals avoid ambiguity and prefer transparency for low-likelihood privacy losses. However, this pattern reverses for high-likelihood losses and when subjects perceive data disclosure as a gain. Most notably, a significant share of people seek ambiguity, and thus prefer to avoid transparency, when facing high-likelihood privacy risks.
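    To make the Ellsberg-type choice concrete, the sketch below simulates one round of such a task in Python. The urn metaphor, probabilities, and variable names are our illustrative assumptions, not the study’s actual parameters.

        import random

        # One round of an Ellsberg-type task: a "risk" option with a known
        # probability of privacy loss versus an "ambiguity" option whose
        # loss probability is hidden from the subject. All numbers are
        # illustrative assumptions, not the experiment's real parameters.

        def draw_risk(loss_prob: float) -> bool:
            """Known, transparent probability that personal data is disclosed."""
            return random.random() < loss_prob

        def draw_ambiguity() -> bool:
            """The true loss probability is hidden from the subject;
            here it is simply drawn uniformly at random."""
            hidden_prob = random.random()
            return random.random() < hidden_prob

        # Choosing "risk" means preferring transparency about the odds;
        # choosing "ambiguity" means avoiding that information.
        choice = "risk"
        disclosed = draw_risk(0.10) if choice == "risk" else draw_ambiguity()
        print(f"choice={choice}, data disclosed={disclosed}")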

    THE EFFECT OF REGULATORY MEASURES ON INDIVIDUAL DATA DISCLOSURE: A COUNTRY COMPARISON

    Data protection regulations vary widely. While research has found regulations to be an effective means of protecting privacy in some data disclosure contexts, the ineffectiveness of most cookie banners is a counterexample. We therefore investigate which contextual factors cause these differing results. In a series of workshops with legal experts, we identified two main types of regulatory measures: measures with and without user action. Based on dual-process theories, we propose in this research-in-progress that regulations should involve users in privacy decisions only in high-effort contexts and should mandate privacy in low-effort contexts. To investigate this proposition, we plan to conduct a scenario-based survey in which participants from six countries disclose data in a high- versus a low-effort context. By applying a hierarchical linear model, we aim to group individual differences by the regulatory differences between countries. The expected results will show whether the effects of different regulatory measures on individual behavior are context-dependent.
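    A minimal sketch of such a hierarchical linear model, assuming individual disclosure responses nested within countries; the file name, column names, and the statsmodels-based analysis are our assumptions, not the authors’ specified pipeline.

        import pandas as pd
        import statsmodels.formula.api as smf

        # Hypothetical survey export: one row per participant with a
        # disclosure score, the decision context (high/low effort), the
        # regulatory measure, and the participant's country.
        df = pd.read_csv("survey_responses.csv")

        # Random intercepts per country capture country-level regulatory
        # differences; the measure-x-effort interaction tests whether
        # regulatory effects on disclosure are context-dependent.
        model = smf.mixedlm(
            "disclosure ~ measure * effort",
            data=df,
            groups=df["country"],
        )
        print(model.fit().summary())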

    Using Active Privacy Transparency to Mitigate the Tension Between Data Access and Consumer Privacy

    Recently, news coverage of companies’ privacy practices has brought substantial damage to their reputation and trust, which, in essence, reflects the escalating tension between data access and privacy protection that companies currently face. Accordingly, we design an active privacy transparency measure and implement it in a self-developed app. Through a two-task experiment, we simultaneously explore the profound and the immediate effects of privacy transparency on firms, as well as the underlying mechanisms. Results from our analyses show that active privacy transparency significantly mitigates users’ perceived psychological contract violations, which in turn helps companies prevent negative word-of-mouth and loss of trust. Moreover, it preserves companies’ immediate access to user data; the moderating role of privacy literacy provides an explanation for this null effect and for previously inconsistent findings. More interestingly, we find that active privacy transparency may better elicit users’ actual privacy preferences and help companies identify their target users.

    Analysing the Influence of Loss-Gain Framing on Data Disclosure Behaviour: A Study on the Use Case of App Permission Requests

    This paper examines the effect of the dark pattern strategy “loss-gain framing” on users’ data disclosure behaviour in mobile settings. Understanding whether framing influences users’ willingness to disclose personal information is important to (i) determine if and how this technique can subvert consent and other privacy decisions, (ii) prevent abuse with appropriate policies and sanctions, and (iii) provide clear evidence-based guidelines for app privacy engineering. We conducted an online user study (N=848) in which we varied the framing of app permission requests (i.e., positive, negative, or neutral framing) and examined its impact on participants’ willingness to accept the permission, their evaluation of the trustworthiness of the request, and their perception of being informed by it. Our findings reveal effects on disclosure behaviour for request types that users cannot easily understand. In this case, negative framing makes users more likely to disclose personal information. Contrary to our expectations, positive framing reduces disclosure rates, possibly because it raises users’ suspicion. We discuss implications for the design of interfaces that aim to facilitate informed, privacy-enhancing decision-making.
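    A rough sketch of how such a three-condition framing effect could be estimated, assuming a binary acceptance outcome and a logistic regression in Python; the file and column names are illustrative, and this is not the paper’s actual analysis code.

        import pandas as pd
        import statsmodels.formula.api as smf

        # Hypothetical data: one row per participant (N=848) with the
        # assigned framing condition (positive/negative/neutral) and a
        # binary "accepted" flag for the permission request.
        df = pd.read_csv("framing_study.csv")

        # Neutral framing as the reference level: the coefficients estimate
        # how negative and positive framing shift the odds of disclosure.
        model = smf.logit(
            "accepted ~ C(framing, Treatment(reference='neutral'))",
            data=df,
        )
        print(model.fit().summary())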

    Return on Data: Personalizing Consumer Guidance in Data Exchanges

    Consumers routinely supply personal data to technology companies in exchange for services. Yet the relationship between the utility (U) consumers gain and the data (D) they supply, the “return on data” (ROD), remains largely unexplored. Expressed as a ratio, ROD = U / D. While lawmakers strongly advocate protecting consumer privacy, they tend to overlook ROD. Are the benefits of the services enjoyed by consumers, such as social networking and predictive search, commensurate with the value of the data extracted from them? How can consumers compare competing data-for-services deals? Currently, the legal frameworks regulating these transactions, including privacy law, aim primarily to protect personal data.
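    As a toy illustration of the ROD ratio, the snippet below compares two hypothetical data-for-services deals; the numeric utility and data values are invented, since the article does not prescribe how U and D are quantified.

        # Toy illustration of the "return on data" ratio ROD = U / D.
        # The utility and data quantities below are invented placeholders.

        def return_on_data(utility: float, data_supplied: float) -> float:
            """ROD for a single data-for-services deal."""
            if data_supplied <= 0:
                raise ValueError("data supplied must be positive")
            return utility / data_supplied

        # A deal that yields more utility per unit of personal data
        # supplied has the higher ROD and is the better exchange.
        deal_a = return_on_data(utility=80.0, data_supplied=40.0)  # 2.0
        deal_b = return_on_data(utility=60.0, data_supplied=50.0)  # 1.2
        print(f"deal A: {deal_a:.1f}, deal B: {deal_b:.1f}")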