
    Third Party Tracking in the Mobile Ecosystem

    Third-party tracking allows companies to identify users and track their behaviour across multiple digital services. This paper presents an empirical study of the prevalence of third-party trackers in 959,000 apps from the US and UK Google Play stores. We find that most apps contain third-party tracking, and that the distribution of trackers is long-tailed, with several highly dominant trackers accounting for a large portion of the coverage. The extent of tracking also differs between categories of apps; in particular, news apps and apps targeted at children appear to be among the worst in terms of the number of third-party trackers associated with them. Third-party tracking is also revealed to be a highly trans-national phenomenon, with many trackers operating in jurisdictions outside the EU. Based on these findings, we draw out some significant legal compliance challenges facing the tracking industry.
    Comment: Corrected missing company info (LinkedIn owned by Microsoft). Figures for Microsoft and LinkedIn re-calculated and added to Table
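
The long-tailed coverage described in this abstract can be illustrated with a small sketch; the app-to-tracker mapping below is a hypothetical toy dataset, not figures from the paper.

```python
from collections import Counter

# Hypothetical app -> trackers mapping (illustrative only, not the paper's data).
apps = {
    "app_a": {"TrackerX", "TrackerY"},
    "app_b": {"TrackerX"},
    "app_c": {"TrackerX", "TrackerZ"},
    "app_d": {"TrackerY"},
    "app_e": set(),  # an app with no third-party trackers
}

# Coverage: fraction of apps that embed each tracker.
counts = Counter(t for trackers in apps.values() for t in trackers)
coverage = {t: n / len(apps) for t, n in counts.most_common()}

# A few dominant trackers cover most apps, while most trackers
# appear only rarely -- the long-tailed distribution.
print(coverage)  # TrackerX appears in 3 of 5 apps (coverage 0.6)
```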

    Return on Data: Personalizing Consumer Guidance in Data Exchanges

    Consumers routinely supply personal data to technology companies in exchange for services. Yet, the relationship between the utility (U) consumers gain and the data (D) they supply — “return on data” (ROD) — remains largely unexplored. Expressed as a ratio, ROD = U / D. While lawmakers strongly advocate protecting consumer privacy, they tend to overlook ROD. Are the benefits of the services enjoyed by consumers, such as social networking and predictive search, commensurate with the value of the data extracted from them? How can consumers compare competing data-for-services deals? Currently, the legal frameworks regulating these transactions, including privacy law, aim primarily to protect personal data
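
The ROD = U / D ratio can be made concrete with a toy comparison; the utility and data valuations below are hypothetical stand-ins, since the abstract does not prescribe units or valuation methods.

```python
def return_on_data(utility: float, data_value: float) -> float:
    """ROD = U / D: utility gained per unit of personal data supplied."""
    if data_value <= 0:
        raise ValueError("data_value must be positive")
    return utility / data_value

# Hypothetical deals: (utility to the consumer, value of data extracted).
deal_social = return_on_data(utility=40.0, data_value=80.0)  # ROD = 0.5
deal_search = return_on_data(utility=60.0, data_value=30.0)  # ROD = 2.0

# Under this metric, the deal with the higher ROD offers the consumer
# more utility per unit of personal data handed over.
best = max([("social", deal_social), ("search", deal_search)], key=lambda d: d[1])
```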

    PARROT: Interactive privacy-aware internet of things application design tool

    Internet of Things (IoT) applications typically collect and analyse personal data that is categorised as sensitive or as special categories of personal data. These data are subject to a higher degree of protection under data privacy laws. Despite legal requirements to support privacy practices, such as Privacy by Design (PbD) schemes, these practices are not yet commonly followed by software developers. The difficulty of developing privacy-preserving applications underlines the importance of exploring the problems developers face when embedding privacy techniques, and suggests the need for a supporting tool. An interactive IoT application design tool - PARROT (PrivAcy by design tool foR inteRnet Of Things) - is presented. This tool helps developers design privacy-aware IoT applications, accounting for privacy compliance during the design process and providing real-time feedback on potential privacy violations. A user study with 18 developers was conducted, comprising a semi-structured interview and a design exercise, to understand how developers typically handle privacy within the design process. A privacy lawyer reviewed the designs produced by the developers to uncover privacy limitations that could be addressed by a software tool. Based on the findings, a proof-of-concept prototype of PARROT was implemented and evaluated in two controlled lab studies. The outcome indicates that IoT applications designed with PARROT addressed privacy concerns better and reduced several of the limitations identified. From a privacy compliance perspective, PARROT helps developers address compliance requirements throughout the design and testing process. This is achieved by incorporating privacy-specific design features into the IoT application from the beginning rather than retrospectively.
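
The real-time feedback loop described above can be pictured as a rule check over a design model; the data-flow representation and rules here are a hypothetical illustration of the idea, not PARROT's actual implementation.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class DataFlow:
    source: str                # e.g. a sensor on the IoT device
    data_type: str             # e.g. "location", "health"
    purpose: Optional[str]     # declared purpose, if any
    encrypted: bool

# Data types treated as special categories in this sketch.
SPECIAL_CATEGORIES = {"health", "biometric", "location"}

def check_flow(flow: DataFlow) -> list:
    """Return human-readable warnings for a single data flow in a design."""
    warnings = []
    if flow.data_type in SPECIAL_CATEGORIES and not flow.encrypted:
        warnings.append(f"{flow.data_type} data sent unencrypted from {flow.source}")
    if flow.purpose is None:
        warnings.append(f"no declared purpose for {flow.data_type} collection")
    return warnings

# A flow that trips both rules would surface two warnings to the developer
# while the design is being edited.
issues = check_flow(DataFlow("gps_sensor", "location", None, encrypted=False))
```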

    Integrating TrustZone Protection with Communication Paths for Mobile Operating System

    Nowadays, users perform various essential activities through their smartphones, including mobile payments and financial transactions. Users' sensitive data processed by smartphones are therefore at risk if the underlying mobile OS is compromised. Trusted Execution Environment (TEE) technology has been introduced to protect sensitive data even when the OS and hypervisor are compromised. This dissertation points out the limitations of the current design model of mobile TEEs, which suffers from a low adoption rate among application developers and a large Trusted Computing Base (TCB). It proposes a new design model for mobile TEEs that increases the TEE adoption rate and decreases the size of the TCB. The dissertation applies the new model to protect mobile communication paths on the Android platform, and evaluations demonstrate the effectiveness of the proposed design model.

    Contextual Integrity Violations in Personal Data Collection by Social Media Applications on Android

    Social media applications multiply the volume of contextual data collected, correlating the digital and physical environments in real time. This has many consequences, still poorly understood, that can harm the privacy of their users. We explore the notion of "contextual integrity" in the data collection of ten dominant social media applications on the Android mobile platform. Specifically, we assess the gap between a user's expectations and the applications' actual access to the phone's resources and personal data. This thesis presents three complementary studies: 1. an a priori study that identifies the announced collection practices (privacy policies, authorisations, and permissions); 2. a practical analysis that instruments the phone to record the frequency and circumstances of access to location and text-messaging resources, which are regulated by permissions; 3. a study of the viability of a solution that lets the user configure the permissions granted to applications according to the context of use. We highlight contextual integrity violations both in the a priori study (unclear policies, inconsistencies, structural flaws) and in the practical study (location captured every second by some applications). The proposed solution mitigates these problems and has little impact on application functionality.
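
The context-dependent permission solution described above can be sketched as a simple decision rule; the rule table and context names below are hypothetical illustrations, not the thesis's actual design.

```python
# Hypothetical sketch of context-dependent permission decisions:
# a permission is granted only in the usage contexts the user approved.
RULES = {
    # (app, permission) -> set of usage contexts where access is allowed
    ("social_app", "location"): {"navigating", "checking_in"},
    ("social_app", "sms"): set(),  # never allowed for this app
}

def allow(app: str, permission: str, context: str) -> bool:
    """Grant a permission only in the contexts the user has approved."""
    return context in RULES.get((app, permission), set())

# Location is available while checking in, but not while browsing the feed,
# which blocks the every-second background capture observed in the study.
allow("social_app", "location", "checking_in")    # True
allow("social_app", "location", "browsing_feed")  # False
```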