10 research outputs found

    Mobile recommender apps with privacy management for accessible and usable technologies

    The paper presents preliminary results of an ongoing survey of English- and Italian-speaking disabled people, covering their use of computers and mobile devices, their interest in recommender apps, and their knowledge of and concerns about privacy issues. Participants were found to be regular users of computers and mobile devices for a range of applications. They were interested in recommender apps for household items, computer software and apps that met their accessibility and other requirements. They showed greater concern about controlling access to personal data of different types than about this data being retained by the computer or mobile device. They were also willing to make tradeoffs to improve device performance.

    Security by behavioural design: a rapid review

    Security and Global Affairs; Cybersecurity and Cyber Governance

    After Over-Privileged Permissions: Using Technology and Design to Create Legal Compliance

    Consumers in the mobile ecosystem can putatively protect their privacy through application permissions. However, this requires mobile device owners to understand permissions and their privacy implications. Yet few consumers appreciate the nature of permissions within the mobile ecosystem, often failing to notice the privacy permissions that are altered when updating an app. Even more concerning is the lack of understanding of the wide use of third-party libraries, most of which are installed with automatic permissions, that is, permissions that must be granted for the application to function appropriately. Unsurprisingly, many of these third-party permissions violate consumers’ privacy expectations and thereby become “over-privileged” from the user’s perspective. The result is a gap between the privacy practices of the private sector and the expectations deemed appropriate by the public sector. Despite the growing attention given to privacy in the mobile ecosystem, legal literature has largely ignored the implications of mobile permissions. This article seeks to address this omission by analyzing the impacts of mobile permissions and the privacy harms experienced by consumers of mobile applications. The authors call for a review of industry self-regulation and the overreliance on simple notice and consent. Instead, they set out a plan for greater attention to socio-technical solutions, focusing on better privacy protections and technology embedded within the automatic permission-based application ecosystem.

    Bolder is better: raising user awareness through salient and concise privacy notices

    This paper addresses the question of whether the recently proposed approach of concise privacy notices in apps and on websites is effective in raising user awareness. To assess effectiveness in a realistic setting, we included concise notices in a fictitious but realistic fitness tracking app and, as a cover story, asked participants recruited from an online panel to provide feedback on the usability of the app. Importantly, after giving feedback, users were also asked to recall the data practices described in the notices. The experimental setup varied the saliency and riskiness of the privacy notices. Based on a total sample of 2,274 participants, our findings indicate that concise privacy notices are indeed a promising approach to raising user awareness of privacy information when displayed in a salient way, especially when the notices describe risky data practices. Our results may be helpful for regulators, user advocates and transparency-oriented companies in creating or enforcing better privacy transparency towards average users who do not read traditional privacy policies.

    Return on Data: Personalizing Consumer Guidance in Data Exchanges

    Consumers routinely supply personal data to technology companies in exchange for services. Yet the relationship between the utility (U) consumers gain and the data (D) they supply — “return on data” (ROD) — remains largely unexplored. Expressed as a ratio, ROD = U / D. While lawmakers strongly advocate protecting consumer privacy, they tend to overlook ROD. Are the benefits of the services enjoyed by consumers, such as social networking and predictive search, commensurate with the value of the data extracted from them? How can consumers compare competing data-for-services deals? Currently, the legal frameworks regulating these transactions, including privacy law, aim primarily to protect personal data.
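    The ratio in the abstract can be sketched as a toy comparison. Note that the paper does not specify how U or D would be measured, so the function name and all figures below are hypothetical illustrations, not the authors' method:

    ```python
    # Minimal sketch of "return on data" (ROD = U / D), assuming U and D
    # can be expressed on comparable numeric scales. The units and the
    # measurement method are not specified in the source; all numbers
    # here are invented for illustration.

    def return_on_data(utility: float, data_supplied: float) -> float:
        """Utility gained per unit of personal data supplied."""
        if data_supplied <= 0:
            raise ValueError("data supplied must be positive")
        return utility / data_supplied

    # Comparing two hypothetical data-for-services deals:
    deal_a = return_on_data(utility=80.0, data_supplied=40.0)  # ROD = 2.0
    deal_b = return_on_data(utility=60.0, data_supplied=20.0)  # ROD = 3.0
    # Deal B yields more utility per unit of data, i.e. a higher ROD,
    # even though its absolute utility is lower.
    ```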

    Evaluating Privacy Adaptation Presentation Methods to support Social Media Users in their Privacy-Related Decision-Making Process

    Several privacy scholars have advocated for user-tailored privacy (UTP): a privacy-enhancing adaptive approach to help reconcile users' lack of awareness, privacy management skills and motivation to use available platform privacy features with their need for personalized privacy support in alignment with their privacy preferences. The idea behind UTP is to measure users' privacy characteristics and behaviors, use these measurements to create a personalized model of the user's privacy preferences, and then provide adaptive support to the user in navigating and engaging with the available privacy settings, or even implement certain settings automatically on the user's behalf. To this end, most existing work on UTP has focused on the "measurement" and "algorithmic modeling" aspects of UTP, with less emphasis on the "adaptation" aspect. More specifically, limited research effort has been devoted to exploring the presentation of privacy adaptations that align with user privacy preferences. The concept of "presentation" goes beyond the visual characteristics of the adaptation: it can profoundly impact the required level of engagement with the system and the user's tendency to follow the suggested privacy adaptation. This dissertation evaluates the potential of three adaptation presentation methods to support social media users in making "better" privacy protection decisions. These three adaptation presentation methods are: 1) automation, the automatic application of privacy settings by the system without user input, relieving users from having to make frequent privacy decisions; 2) highlights, which emphasize certain privacy features to guide users to apply the settings themselves in a subtle but useful manner; and 3) suggestions, which explicitly inform users about the availability of certain settings that can be applied directly by the user.
The first study focuses on understanding user perspectives on the different configurations of autonomy and control across the three examined privacy adaptation presentation methods. A second, follow-up study examines the effectiveness of these adaptation presentation methods in improving user awareness of, and engagement with, available privacy features. Taking into account social media users' privacy decision-making process, the final study assesses the impact of privacy-related affect and message framing (i.e., tone style) on users' privacy decisions in adaptation-supported social media environments. We offer insights and practical considerations for selecting and using "optimal" privacy adaptation methods to provide user-tailored privacy decision support.