
    Empowering Older Adults With Their Information Privacy Management

    The literature depicts a deficit-based narrative around older adults and their technology use, suggesting that older adults cannot keep up with their younger counterparts in adopting new technologies. In this dissertation, I argue that this view is not necessarily accurate or productive. Instead, I argue that the deficit lies in technology design, which is not inclusive and often caters to the needs of younger adults. I study older and younger adults' privacy decision-making as a showcase. To study the privacy decision-making process with more granularity, I used a dual-route approach (decision heuristics and privacy calculus) to disentangle different aspects of the decision, which helps to better identify differences between older and younger adults. My results rebut the deficit-based narrative and show that older adults are motivated and able to manage their privacy; however, they use a different decision-making mechanism than younger adults. For example, older adults are more likely than younger adults to make a rational decision by weighing a more thorough risk/benefit trade-off. I furthermore show that age (i.e., being an older or younger adult) is only a proxy for other parameters: the different decision-making mechanisms can be explained by parameters that vary across age groups (e.g., levels of privacy literacy and privacy self-efficacy). My work introduces a new perspective on technology design and has practical implications for designing for the elderly, a population with different wants and needs.
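
    To make the dual-route framing concrete, the sketch below models a single disclosure decision as either a heuristic shortcut or a deliberate privacy-calculus trade-off. It is a minimal, hypothetical illustration only: the class, weights, and thresholds are assumptions for exposition, not the dissertation's actual model.

        # Hypothetical sketch of a dual-route privacy decision model.
        # Weights and thresholds are illustrative assumptions, not fitted values.
        from dataclasses import dataclass

        @dataclass
        class DecisionContext:
            perceived_benefit: float      # 0..1, expected value of disclosing
            perceived_risk: float         # 0..1, expected harm of disclosing
            trusts_requester: bool        # cue used by the heuristic route
            privacy_literacy: float       # 0..1, proxy that varies across age groups
            privacy_self_efficacy: float  # 0..1

        def heuristic_route(ctx: DecisionContext) -> bool:
            """Fast route: rely on a simple trust cue instead of weighing trade-offs."""
            return ctx.trusts_requester

        def calculus_route(ctx: DecisionContext) -> bool:
            """Deliberate route: disclose only if perceived benefits outweigh risks."""
            return ctx.perceived_benefit - ctx.perceived_risk > 0

        def decide_disclosure(ctx: DecisionContext) -> bool:
            # Higher literacy and self-efficacy make the deliberate route more likely;
            # age itself is treated only as a proxy for these parameters.
            uses_calculus = (ctx.privacy_literacy + ctx.privacy_self_efficacy) / 2 > 0.5
            return calculus_route(ctx) if uses_calculus else heuristic_route(ctx)

        if __name__ == "__main__":
            older_adult = DecisionContext(0.4, 0.6, True, 0.8, 0.7)
            print(decide_disclosure(older_adult))  # False: risks outweigh benefits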

    Privacy, security, and trust issues in smart environments

    Recent advances in networking, handheld computing and sensor technologies have driven forward research towards the realisation of Mark Weiser's dream of calm and ubiquitous computing (variously called pervasive computing, ambient computing, active spaces, the disappearing computer or context-aware computing). In turn, this has led to the emergence of smart environments as one significant facet of research in this domain. A smart environment, or space, is a region of the real world that is extensively equipped with sensors, actuators and computing components [1]. In effect, the smart space becomes part of a larger information system: all actions within the space potentially affect the underlying computer applications, which may themselves affect the space through the actuators. Such smart environments have tremendous potential within many application areas to improve the utility of a space. Consider, for example, a smart environment that prolongs the time an elderly or infirm person can live an independent life, or one that supports vicarious learning.
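
    The sense-compute-actuate loop described above can be summarised in a minimal sketch, assuming a generic event loop; the class and function names below are illustrative and do not correspond to any middleware from the cited work.

        # Minimal sketch of the sense -> compute -> actuate loop of a smart space.
        # All names are illustrative; no specific platform or API is implied.
        from typing import Callable, Dict, List

        class SmartSpace:
            def __init__(self) -> None:
                self.sensors: List[Callable[[], Dict[str, float]]] = []
                self.actuators: List[Callable[[Dict[str, float]], None]] = []

            def add_sensor(self, read: Callable[[], Dict[str, float]]) -> None:
                self.sensors.append(read)

            def add_actuator(self, act: Callable[[Dict[str, float]], None]) -> None:
                self.actuators.append(act)

            def step(self, application: Callable[[Dict[str, float]], Dict[str, float]]) -> None:
                # Sensor readings feed the underlying information system ...
                readings: Dict[str, float] = {}
                for read in self.sensors:
                    readings.update(read())
                # ... and the application's decisions flow back into the space.
                commands = application(readings)
                for act in self.actuators:
                    act(commands)

        # Example: keep a room warm enough for an elderly resident living independently.
        space = SmartSpace()
        space.add_sensor(lambda: {"temperature_c": 17.5})
        space.add_actuator(lambda cmd: print("heater power:", cmd.get("heater", 0.0)))
        space.step(lambda r: {"heater": 1.0 if r["temperature_c"] < 20.0 else 0.0})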

    Transparent government, not transparent citizens: a report on privacy and transparency for the Cabinet Office

    1. Privacy is extremely important to transparency. The political legitimacy of a transparency programme will depend crucially on its ability to retain public confidence. Privacy protection should therefore be embedded in any transparency programme, rather than bolted on as an afterthought. 2. Privacy and transparency are compatible, as long as the former is carefully protected and considered at every stage. 3. Under the current transparency regime, in which public data is specifically understood not to include personal data, most data releases will not raise privacy concerns. However, some will, especially as we move toward a more demand-driven scheme. 4. Discussion about deanonymisation has been driven largely by legal considerations, with a consequent neglect of the input of the technical community. 5. There are no complete legal or technical fixes to the deanonymisation problem. We should continue to anonymise sensitive data, being initially cautious about releasing such data under the Open Government Licence while we continue to take steps to manage and research the risks of deanonymisation. Further investigation to determine the level of risk would be very welcome. 6. There should be a focus on procedures to output an auditable debate trail. Transparency about transparency (metatransparency) is essential for preserving trust and confidence. Fourteen recommendations are made to address these conclusions.

    Evaluating Privacy Adaptation Presentation Methods to support Social Media Users in their Privacy-Related Decision-Making Process

    Several privacy scholars have advocated for user-tailored privacy (UTP): a privacy-enhancing adaptive approach that aims to reconcile users' lack of awareness, privacy management skills and motivation to use available platform privacy features with their need for personalized privacy support aligned with their privacy preferences. The idea behind UTP is to measure users' privacy characteristics and behaviors, use these measurements to create a personalized model of the user's privacy preferences, and then provide adaptive support to the user in navigating and engaging with the available privacy settings, or even implement certain settings automatically on the user's behalf. Most existing work on UTP has focused on the "measurement" and "algorithmic modeling" aspects of UTP, with less emphasis on the "adaptation" aspect. More specifically, limited research effort has been devoted to exploring the presentation of privacy adaptations that align with user privacy preferences. The concept of "presentation" goes beyond the visual characteristics of the adaptation: it can profoundly impact the required level of engagement with the system and the user's tendency to follow the suggested privacy adaptation. This dissertation evaluates the potential of three adaptation presentation methods to support social media users in making "better" privacy protection decisions: 1) automation, which applies privacy settings automatically without user input to relieve users from having to make frequent privacy decisions; 2) highlights, which emphasize certain privacy features to guide users to apply the settings themselves in a subtle but useful manner; and 3) suggestions, which explicitly inform users about the availability of certain settings that they can apply directly. The first study focuses on understanding user perspectives on the different configurations of autonomy and control offered by these three privacy adaptation presentation methods. A second, follow-up study examines the effectiveness of these adaptation presentation methods in improving user awareness of, and engagement with, available privacy features. Taking into account social media users' privacy decision-making process (i.e., that they often make privacy-related decisions), the final study assesses the impact of privacy-related affect and message framing (i.e., tone style) on users' privacy decisions in adaptation-supported social media environments. We offer insights and practical considerations for selecting and using "optimal" privacy adaptation methods to provide user-tailored privacy decision support.
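
    As a concrete illustration of the three presentation methods, the sketch below shows one hypothetical way a UTP component might route a recommended setting to automation, a suggestion, or a highlight. The routing logic, thresholds, and names are assumptions made for exposition; the dissertation does not prescribe this implementation.

        # Hypothetical sketch of routing a privacy adaptation to a presentation method.
        # Thresholds and names are illustrative assumptions, not the studied system.
        from dataclasses import dataclass

        @dataclass
        class PrivacyAdaptation:
            setting: str             # e.g. "limit post audience to friends"
            preference_match: float  # 0..1, confidence the user wants this setting

        def present(adaptation: PrivacyAdaptation, wants_control: bool) -> str:
            if adaptation.preference_match > 0.9 and not wants_control:
                # Automation: apply the setting on the user's behalf.
                return f"auto-apply: {adaptation.setting}"
            if adaptation.preference_match > 0.6:
                # Suggestion: explicitly inform the user and let them apply it.
                return f"suggest: {adaptation.setting}"
            # Highlight: subtly emphasise the feature in the settings interface.
            return f"highlight: {adaptation.setting}"

        print(present(PrivacyAdaptation("limit post audience to friends", 0.95),
                      wants_control=True))
        # -> "suggest: limit post audience to friends"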

    The Intricate Effects of Complexity and Personalization on Investment Intention in Robo-Advisory

    Amid the tremendous transformation of the financial services industry in recent years, robo-advisory has emerged as a new technology and proven its potential to digitalize this industry. Robo-advisors grant their users access to wealth management services that were historically performed manually. In doing so, robo-advisors allow personalization of investment portfolios on an unprecedented scale. At the same time, investment decisions are inherently complex for average users. Understanding how personalization and complexity affect users is therefore crucial for robo-advisors. We examine these effects in an online experiment with a fictitious robo-advisor and 169 participants. Our results show that personalization lowers users’ intention to invest, while complexity has a significant positive effect on users’ investment intentions and attenuates the negative impact of personalization. We contribute to IS research by uncovering the intricate effects of combining complexity and personalization in digital environments, which will gain importance as users face increasingly complex digital products.
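
    The reported attenuation effect is an interaction between personalization and complexity. The snippet below shows, as a generic illustration only, how such an interaction term is typically specified in a regression on factorial experiment data; the synthetic data, column names, and coefficients are hypothetical and do not reproduce the authors' dataset or analysis.

        # Generic illustration of testing a personalization x complexity interaction.
        # Synthetic data; not the authors' dataset, measures, or effect sizes.
        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(0)
        n = 169  # sample size mirrors the abstract; everything else is made up
        df = pd.DataFrame({
            "personalization": rng.integers(0, 2, n),  # 0 = low, 1 = high
            "complexity": rng.integers(0, 2, n),       # 0 = low, 1 = high
        })
        # Hypothetical data-generating process: personalization lowers intention,
        # complexity raises it and weakens the negative personalization effect.
        df["intention"] = (4.0 - 0.8 * df.personalization + 0.5 * df.complexity
                           + 0.6 * df.personalization * df.complexity
                           + rng.normal(0, 1, n))

        # "*" expands to both main effects plus the interaction term.
        model = smf.ols("intention ~ personalization * complexity", data=df).fit()
        print(model.summary())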

    Rethinking Digital Nudging: A Taxonomical Approach to Defining and Identifying Characteristics of Digital Nudging Interventions

    Digital nudging interventions have emerged as soft-paternalistic mechanisms for reducing heuristic limitations and biases in decision-making environments. Prior research has conceptualized digital nudging interventions as subtle modifications to the decision-making environment that nudge a decision maker towards better choices without limiting the available alternatives. The approach has received criticism, as researchers have achieved limited consensus on its definition, categorized diverse behavior-modulation methodologies as digital nudging, and raised multiple ethical concerns about it. Such ambiguity reduces fidelity while challenging synthesis, application, and replication. In this paper, we posit the need to broaden the definition of digital nudging interventions, reconcile the inconsistencies, and present a coherent frame. Based on a systematic review of the extant literature, we propose an extended definition that is coherent with the libertarian-paternalistic principle, clarifies the intent of digital nudging interventions, and delineates the nature of the digital artifacts involved. We further present a taxonomy with a standard vernacular that labels the complex underlying principles and components of digital nudging interventions, which can guide practitioners and researchers.