
    Characterizing Location-based Mobile Tracking in Mobile Ad Networks

    Mobile apps nowadays are often packaged with third-party ad libraries that monetize user data.

    User Privacy Leakage in Location-based Mobile Ad Services

    The online advertising ecosystem leverages its massive data collection capability to learn the properties of users for targeted ad delivery. Many Android app developers include ad libraries in their apps as a means of monetization. These ad libraries serve advertisements from sell-side platforms, which collect an extensive set of sensitive information to provide more relevant advertisements to their customers. Existing efforts have investigated the increasingly pervasive private data collection of mobile ad networks over time. However, no measurement study has evaluated the scale of privacy leakage by ad networks across different geographical areas. In this work, we present a measurement study of the potential privacy leakage of mobile advertising services across different locations. We develop an automated measurement system that intercepts mobile traffic at different locations and performs data analysis to pinpoint the data collection behaviors of ad networks at both the app level and the organization level. With 1,100 popular apps running across 10 different locations, we perform extensive threat assessments of different ad networks. We also explore the behavior of ad-blockers in the ad network ecosystem, including whether those ad-blockers themselves capture users' private data while blocking ads. We find that the number of location-based ads tends to be positively related to the population density of a location, that ad networks collect different types of data across different locations, and that ad-blockers can block this private data leakage.
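    The core traffic-analysis step described above can be illustrated with a minimal sketch: given an intercepted request URL, decide whether it targets an ad network and carries location-like parameters. The host list and parameter names here are purely hypothetical placeholders; the study's actual ad-network inventory and measurement pipeline are not reproduced here.

```python
from urllib.parse import urlparse, parse_qs

# Hypothetical examples; the study's real ad-network host list is far larger.
AD_NETWORK_HOSTS = {"ads.example-net.com", "track.example-ssp.com"}
LOCATION_KEYS = {"lat", "latitude", "lon", "lng", "longitude", "geo"}

def flag_location_leak(url: str) -> bool:
    """Return True if the request targets a known ad-network host
    and carries location-like query parameters."""
    parsed = urlparse(url)
    if parsed.hostname not in AD_NETWORK_HOSTS:
        return False
    param_names = {k.lower() for k in parse_qs(parsed.query)}
    return bool(param_names & LOCATION_KEYS)
```

    In a real deployment such a predicate would run inside an intercepting proxy and feed per-app, per-organization aggregation, e.g. `flag_location_leak("https://ads.example-net.com/bid?lat=45.5&lon=-73.6")` flags the request.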

    The COVID-19 Pandemic and the Technology Trust Gap

    Industry and government tried to use information technologies to respond to the COVID-19 pandemic, but using the internet as a tool for disease surveillance, public health messaging, and testing logistics turned out to be a disappointment. Why weren’t these efforts more effective? This Essay argues that industry and government efforts to leverage technology were doomed to fail because tech platforms have failed over the past few decades to make their tools trustworthy, and lawmakers have done little to hold these companies accountable. People cannot trust the interfaces they interact with, the devices they use, and the systems that power tech companies’ services. This Essay explores the pre-existing privacy ills that contributed to these problems, including manipulative user interfaces, consent regimes that burden people with all the risks of using technology, and devices that collect far more data than they should. A pandemic response is only as good as its adoption, but pre-existing privacy and technology concerns make it difficult for people seeking lifelines to have confidence in the technologies designed to protect them. We argue that a good way to help close the technology trust gap is through relational duties of loyalty and care, better frameworks regulating the design of information technologies, and substantive rules limiting data collection and use instead of procedural “consent and control” rules. We conclude that the pandemic could prove to be an opportunity to leverage motivated lawmakers to improve our privacy frameworks and make information technologies worthy of our trust.

    On Understanding Permission Usage Contextuality of Android Apps

    In the runtime permission model, the context in which a permission is requested/used for the first time may change later without the user's knowledge. Prior research identified user dissatisfaction with varying contexts of permission use in the install-time permission model. However, the contextual use of permissions by apps developed or adapted for the runtime permission model has not been studied. Our goal is to understand how permissions are requested and used in different contexts under the runtime permission model, and to compare them to identify potential abuse. We present ContextDroid, a static analysis tool that identifies the contexts of permission requests and uses. Using this tool, we analyze 38,838 apps (from a set of 62,340 apps) from the Google Play Store. We devise a mechanism, following Google's best practices and permission policy enforcement, to flag apps that use permissions in potentially unexpected contexts. We flag 30.20% of the 38,838 apps for using permissions in multiple and dissimilar contexts. Comparison with VirusTotal shows that non-contextual use of permissions can be linked to unwanted/malicious behaviour: 34.72% of the 11,728 flagged apps are also detected by VirusTotal (i.e., 64.70% of the 6,295 VirusTotal-detected apps in our dataset). We find that most apps do not show any rationale if the user previously denied a permission. Furthermore, 13% of the 22,567 apps with identified request contexts show behaviour similar to the install-time permission model by requesting all dangerous permissions when the app is first launched. We hope this thesis will bring attention to non-contextual permission usage in the runtime model, and may spur research into finer-grained permission control.
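    The "multiple and dissimilar contexts" criterion can be sketched abstractly: represent each observed use of a permission as a set of context features and flag the permission when some pair of contexts falls below a similarity threshold. This is only an illustrative stand-in; ContextDroid's actual context extraction and comparison are more involved, and the feature names and threshold below are assumptions.

```python
# Hypothetical sketch: flag a permission whose uses occur in
# dissimilar contexts, using Jaccard similarity over context features.

def jaccard(a: set, b: set) -> float:
    """Jaccard similarity of two feature sets (1.0 for two empty sets)."""
    return len(a & b) / len(a | b) if a | b else 1.0

def flag_permission(contexts: list, threshold: float = 0.5) -> bool:
    """contexts: one feature set per observed use of a permission.
    Flag if any pair of contexts is less similar than the threshold."""
    for i in range(len(contexts)):
        for j in range(i + 1, len(contexts)):
            if jaccard(contexts[i], contexts[j]) < threshold:
                return True
    return False
```

    For example, a camera permission used both from a user-visible camera screen and from a background service would yield two disjoint feature sets and be flagged.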

    Silver Surfers on The Tech Wave: Privacy Analysis of Android Apps for The Elderly

    Like other segments of the population, elderly people are rapidly adopting various mobile apps, and numerous apps are being developed exclusively for their specific needs. These apps help the elderly improve their daily lives and connectivity, and help their caregivers and family members monitor their loved ones' well-being and health-related activities. While very useful, these apps also handle a great deal of sensitive private data, such as healthcare reports, live location, and Personally Identifiable Information (PII) of the elderly and their caregivers. While the privacy and security issues in mobile applications for the general population have been widely analyzed, there is limited work focusing on elderly apps. We shed light on the privacy and security issues in mobile apps intended for elderly users, using a combination of dynamic and static analysis of 146 popular Android apps from the Google Play Store. To better understand some of these apps, we also test their corresponding IoT devices. Our analysis uncovers numerous security and privacy issues that lead to the leakage of private information and allow adversaries to access user data. We find that 95 of the 146 apps fail to adequately preserve the security and privacy of their users in one or more ways; specifically, 15 apps allow full account takeover, and 9 apps have improper input validation, in some cases allowing an attacker to dump the database containing the elderly's and caregivers' sensitive information. We hope our study will raise awareness about the security and privacy risks introduced by these apps, and direct the attention of developers to strengthening their defensive measures.
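    One building block of such a dynamic analysis is scanning intercepted payloads for PII leaked in cleartext. A minimal sketch follows; the patterns are illustrative assumptions, whereas a real study would match app-specific PII (names, health records, device IDs) seeded into the app during testing.

```python
import re

# Hypothetical patterns; real analyses track seeded, app-specific values.
PII_PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "ssn_like": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def find_pii(payload: str) -> set:
    """Return the PII categories detected in an intercepted payload."""
    return {name for name, pat in PII_PATTERNS.items() if pat.search(payload)}
```

    Matching is category-based so findings can be aggregated per app, e.g. `find_pii("user=alice@example.com")` reports an email leak.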

    Improving Android app security and privacy with developers

    Existing research has uncovered many security vulnerabilities in Android applications (apps) caused by inexperienced and unmotivated developers. In particular, the lack of tool support makes it hard for developers to avoid common security and privacy problems in Android apps. As a result, apps ship with security vulnerabilities that expose end users to a multitude of attacks. This thesis presents a line of work that studies and supports Android developers in writing more secure code. We first studied to what extent tool support can help developers create more secure applications. To this end, we developed and evaluated an Android Studio extension that identifies common security problems in Android apps and suggests more secure alternatives to developers. Subsequently, we focused on the issue of outdated third-party libraries in apps, which is also the root cause of a variety of security vulnerabilities. We analyzed all popular third-party libraries in the Android ecosystem and provided developers with feedback and guidance, in the form of tool support in their development environment, to fix such security problems. In the second part of this thesis, we empirically studied and measured the impact of user reviews on the evolution of app security and privacy. We built a review classifier to identify security- and privacy-related reviews and performed regression analysis to measure their impact on the evolution of security and privacy in Android apps. Based on our results, we propose several suggestions to improve the security and privacy of Android apps by leveraging user feedback to create incentives for developers to improve their apps.
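    The review-classification step can be illustrated with a trivial keyword baseline; the thesis builds a trained classifier, for which this sketch, with an assumed seed-term list, is only an illustrative stand-in.

```python
# Hypothetical seed terms; a trained classifier replaces this in practice.
SECURITY_TERMS = {"password", "hack", "leak", "permission", "privacy",
                  "tracking", "secure", "vulnerability"}

def is_security_privacy_review(text: str) -> bool:
    """Flag a user review as security/privacy related if any
    whitespace-delimited token (lowercased) is a seed term."""
    tokens = set(text.lower().split())
    return bool(tokens & SECURITY_TERMS)
```

    Flagged reviews per app version could then serve as the independent variable in the regression analysis described above.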

    Understanding and supporting app developers towards designing privacy-friendly apps for children

    The integration of digital technology into contemporary society has led to children being exposed to and using mobile devices at ever younger ages. These devices have become an integral part of their daily routines and experiences, playing a crucial role in their socialisation and development. However, the use of these devices is not without drawbacks. The underlying infrastructure of many of the apps available on such devices relies on a vast and intricate data-driven ecosystem, in which mobile app developers and numerous third- and fourth-party entities depend heavily on the collection, sharing, transmission, and analysis of personal data, including that of children. The privacy breaches resulting from this extensive data tracking are prevalent and have detrimental effects on children, including the loss of autonomy and trust. In this thesis, we investigate this problem from the perspective of app developers. We begin by conducting a critical examination of the privacy landscape of popular children's apps in the UK market. In conjunction with a systematic literature review, we develop a research-driven method for evaluating privacy practices in mobile applications. Applying this methodology to a dataset of 137 'expert-approved' children's apps, we reveal that these apps extensively track children's data while providing insufficient user-facing support for children to manage and negotiate these privacy behaviours. This finding raises the crucial question of what barriers stand in the way of designing privacy-friendly mobile apps for children. To explore this issue, we first conduct a mixed-method study with developers of children's apps, comprising 134 surveys and 20 interviews.
    Our findings show that while developers are invested in the best interests of children, they encounter difficulties in navigating the complex data-driven ecosystem, understanding the behaviour of third-party libraries and trackers, and withstanding the pressure to monetise their apps in the absence of privacy-friendly alternatives. In light of these findings, we adopt a Research through Design approach to elicit latent needs from children's app developers, using a set of 12 ideas, generated through a workshop with design experts, aimed at addressing the identified challenges. These ideas are evaluated with a sample of 20 children's app developers to uncover a set of latent requirements for support, including a demand for increased transparency regarding third-party libraries and easy-to-adopt compliance checking against regulatory guidelines. Using the requirements gathered from the developers, we build a web-based application that provides app developers with transparency about the privacy behaviours of commonly used SDKs and third-party libraries. We ask a sample of 12 children's app developers to evaluate how features of our application may incentivise developers to consider privacy-friendly alternatives to commonly used SDKs, how they plan to use it in their development practices, and how it may be improved in the future. The research in this thesis casts a crucial new perspective on the current state of privacy in the mobile ecosystem, through carefully designed observations and attempts to disrupt the existing practices of developers of children's apps. Through this journey, we contribute to the HCI research community, designers, and regulatory bodies fresh and original insights into the design and development of privacy-friendly mobile applications for children.
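    A transparency tool of the kind described might detect embedded SDKs by matching an app's class list against known tracker package-name prefixes. The prefix-to-SDK mapping below is a hypothetical placeholder; real tools maintain much larger curated mappings.

```python
# Hypothetical prefix list; real transparency tools map many more SDKs.
TRACKER_PREFIXES = {
    "com.exampleads.sdk": "ExampleAds",
    "io.exampleanalytics": "ExampleAnalytics",
}

def detect_sdks(class_names: list) -> set:
    """Map an app's class list to the tracker SDKs it embeds,
    by matching known package-name prefixes."""
    found = set()
    for cls in class_names:
        for prefix, sdk in TRACKER_PREFIXES.items():
            if cls.startswith(prefix):
                found.add(sdk)
    return found
```

    The detected SDK names could then be joined against a catalogue of per-SDK privacy behaviours to surface privacy-friendly alternatives to developers.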