
    Privacy and Security Analysis of mHealth Apps

    The widespread availability of Mobile Health (mHealth) applications has been significantly accelerated by the outbreak of the COVID-19 pandemic. While bringing many benefits, from self-monitoring to medical consultations, mHealth apps process a large amount of sensitive health-related user data. They are therefore subject to privacy regulations set by governments, such as the General Data Protection Regulation (GDPR) in the EU and the Health Insurance Portability and Accountability Act (HIPAA) in the USA, as well as to the privacy guidelines of app stores (e.g., Google Android). In this work, we analyze the privacy, compliance, and security of 232 mHealth apps in the Android ecosystem, focusing mainly on the most popular free apps (199), but also considering a sample of paid apps (25) and healthcare provider/clinician apps published on the US Centers for Disease Control and Prevention (CDC)'s website (8). For our analysis, we leverage both static approaches, such as privacy policy and APK analysis, and dynamic approaches, such as network traffic inspection and analysis of in-app consent acquisition. Our findings reveal that 85.4% of the free mHealth apps do not properly inform users about all the aspects of data processing required by the regulations. In addition, they often contain conflicting or incomplete information: only 2.51% of them are completely consistent. Moreover, 55.8% of these apps process user data without explicit consent. Our analysis shows that, compared to free apps, paid ones are less careful in writing their privacy policies, while containing fewer trackers and dangerous permissions on average. We found that 76% of these apps fail to obtain explicit consent and 84% of them process some types of data without informing the user.
    Concerning the CDC-endorsed apps, while we did not detect a pervasive presence of trackers, dangerous permissions, or sensitive data in the network traffic, our results show that all of them have incomplete privacy policies and fail to ask for explicit consent before users access their services. As we consider apps with a mean of 8 million downloads each, our study affects a large number of end users and helps raise awareness of the privacy implications of mHealth apps among both users and developers.

    FAIR: Fuzzy Alarming Index Rule for Privacy Analysis in Smartphone Apps

    In this paper, we introduce an approach that aims at increasing individuals' privacy awareness. We perform a privacy risk assessment of the smartphone applications (apps) installed on a user's device. We implemented an app-behaviour monitoring tool that collects information about access to sensitive resources by each installed app. We then calculate a privacy risk score using a fuzzy-logic-based approach that considers the type, number, and frequency of accesses to resources. The combination of these two concepts provides the user with information about the privacy invasiveness level of the monitored apps. Our approach enables users to make informed privacy decisions, i.e., to restrict permissions or report an app based on resource access events. We evaluate our approach by analysing the behaviour of selected apps and calculating their associated privacy scores. Initial results demonstrate the applicability of our approach, which allows the comparison of apps by reporting to the user the detected events and the resulting privacy risk score.
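The abstract does not give the paper's actual membership functions or weights, but the idea of scoring an app from the type and frequency of its resource accesses can be sketched as follows. All sensitivity weights, thresholds, and the aggregation rule below are illustrative assumptions, not the published FAIR rule:

```python
# Illustrative fuzzy-style privacy risk score: combines resource
# sensitivity (type) with access frequency. Weights and thresholds
# are hypothetical, chosen only to demonstrate the mechanism.

SENSITIVITY = {"location": 1.0, "contacts": 0.9, "camera": 0.8, "storage": 0.5}

def membership_high(x, low, high):
    """Piecewise-linear membership degree in the 'high frequency' fuzzy set."""
    if x <= low:
        return 0.0
    if x >= high:
        return 1.0
    return (x - low) / (high - low)

def privacy_risk(events):
    """events: list of (resource, accesses_per_day). Returns a score in [0, 1]."""
    if not events:
        return 0.0
    scores = []
    for resource, per_day in events:
        sens = SENSITIVITY.get(resource, 0.3)          # unknown types get a low weight
        freq = membership_high(per_day, low=1, high=50)
        scores.append(sens * freq)
    return max(scores)  # the worst offender dominates, as in many alarming rules

score = privacy_risk([("location", 120), ("contacts", 2)])
```

A monitored app that reads location 120 times a day would score 1.0 here, while a rarely used contacts access contributes almost nothing; the paper's actual rule may aggregate differently.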

    “And all the pieces matter...” Hybrid Testing Methods for Android App's Privacy Analysis

    Smartphones have become inherent to the everyday life of billions of people worldwide, and they are used to perform activities such as gaming, interacting with our peers, or working. While extremely useful, smartphone apps also have drawbacks, as they can affect the security and privacy of users. Android devices hold a lot of personal data about users, including their social circles (e.g., contacts), usage patterns (e.g., app usage and visited websites), and their physical location. As in most software products, Android apps often include third-party code (Software Development Kits, or SDKs) to add functionality to the app without the need to develop it in-house. Android apps and the third-party components embedded in them are often interested in accessing such data, as the online ecosystem is dominated by data-driven business models and revenue streams like advertising. The research community has developed many methods and techniques for analyzing the privacy and security risks of mobile apps, mostly relying on two techniques: static code analysis and dynamic runtime analysis. Static analysis examines the code and other resources of an app to detect potential app behaviors. While this makes static analysis easier to scale, it has drawbacks, such as missing app behaviors when developers obfuscate the app's code to avoid scrutiny. Furthermore, since static analysis only shows potential app behavior, its findings need to be confirmed, as it can also report false positives due to dead or legacy code. Dynamic analysis examines apps at runtime to provide actual evidence of their behavior. However, these techniques are harder to scale, as they need to be run on an instrumented device to collect runtime data. Similarly, the app needs to be stimulated with realistic inputs to examine as many code paths as possible. While there are some automatic techniques to generate synthetic inputs, they have been shown to be insufficient.
    In this thesis, we explore the benefits of combining static and dynamic analysis techniques to complement each other and reduce their limitations. While most previous work has relied on using these techniques in isolation, we combine their strengths in different and novel ways that allow us to further study privacy issues in the Android ecosystem. Namely, we demonstrate the potential of combining these complementary methods to study three inter-related issues:
    • A regulatory analysis of parental control apps. We use a novel methodology that relies on easy-to-scale static analysis techniques to pinpoint potential privacy issues and violations of current legislation by Android apps and their embedded SDKs. We rely on the results of our static analysis to inform the way in which we manually exercise the apps, maximizing our ability to obtain real evidence of these misbehaviors. We study 46 publicly available apps and find instances of data collection and sharing without consent and insecure network transmissions containing personal data. We also see that these apps fail to properly disclose these practices in their privacy policies.
    • A security analysis of unauthorized access to permission-protected data without user consent. We use a novel technique that combines the strengths of static and dynamic analysis by first comparing the data sent by applications at runtime with the permissions granted to each app, in order to find instances of potential unauthorized access to permission-protected data. Once we have discovered the apps that access personal data without permission, we statically analyze their code to discover the covert and side channels used by apps and SDKs to circumvent the permission system. This methodology allows us to discover apps using the MAC address as a surrogate for location data, two SDKs using external storage as a covert channel to share unique identifiers, and an app using picture metadata to gain unauthorized access to location data.
    • A novel SDK detection methodology that relies on signals observed both in the app's code and static resources and in its runtime behavior. We then rely on a tree structure together with a confidence-based system to accurately detect SDK presence without the need for any a priori knowledge, and with the ability to discern whether a given SDK is part of legacy or dead code. We show that this novel methodology can discover third-party SDKs with more accuracy than state-of-the-art tools, both on a set of purpose-built ground-truth apps and on a dataset of 5k publicly available apps.
    With these three case studies, we highlight the benefits of combining static and dynamic analysis techniques for studying the privacy and security guarantees and risks of Android apps and third-party SDKs. Using these techniques in isolation would not have allowed us to investigate these privacy issues in depth: we would lack the ability to provide real evidence of potential breaches of legislation, to pinpoint the specific ways in which apps leverage covert and side channels to break Android's permission system, or to adapt to an ever-changing ecosystem of Android third-party companies.
    The works presented in this thesis were partially funded within the framework of the following projects and grants:
    • European Union's Horizon 2020 Innovation Action program (Grant Agreement No. 786741, SMOOTH Project, and Grant Agreement No. 101021377, TRUST AWARE Project).
    • Spanish Government ODIO NºPID2019-111429RB-C21/PID2019-111429RB-C22.
    • The Spanish Data Protection Agency (AEPD).
    • AppCensus Inc.
    This work has been supported by IMDEA Networks Institute. Programa de Doctorado en Ingeniería Telemática por la Universidad Carlos III de Madrid. Presidente: Srdjan Matic. Secretario: Guillermo Suárez-Tangil. Vocal: Ben Stoc
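The second case study's core check, comparing the data an app transmits at runtime against the permissions it was granted, can be sketched as a simple set difference. The mapping of data types to guarding permissions below is a simplified, hypothetical fragment of Android's permission model, not the thesis's actual tooling:

```python
# Illustrative sketch: flag apps whose observed network transmissions
# include permission-protected data types the app was never granted,
# a signal of potential covert- or side-channel access.

# Which Android permission normally guards each data type (simplified;
# the real mapping is larger and version-dependent).
PERMISSION_FOR = {
    "location": "ACCESS_FINE_LOCATION",
    "mac_address": "ACCESS_WIFI_STATE",
    "imei": "READ_PHONE_STATE",
}

def suspicious_accesses(granted, observed):
    """granted: set of permission names from the app's manifest/runtime grants.
    observed: data types detected in the app's network traffic.
    Returns the data types transmitted without the guarding permission."""
    return {d for d in observed if PERMISSION_FOR.get(d) not in granted}

flags = suspicious_accesses(
    granted={"INTERNET", "READ_PHONE_STATE"},
    observed={"imei", "location"},
)
# 'location' is flagged: it was transmitted without ACCESS_FINE_LOCATION
```

In the thesis's workflow, each flagged app would then be analyzed statically to explain how the data was obtained (e.g., the MAC-address-as-location and picture-metadata channels mentioned above).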

    Modeling security and privacy requirements: A use case-driven approach

    Context: Modern internet-based services, ranging from food delivery to home care, leverage the availability of multiple programmable devices to provide handy services tailored to end-user needs. These services are delivered through an ecosystem of device-specific software components and interfaces (e.g., mobile and wearable device applications). Since they often handle private information (e.g., location and health status), their security and privacy requirements are of crucial importance. Defining and analyzing those requirements is a significant challenge due to the multiple types of software components and devices integrated into software ecosystems. Each software component presents peculiarities that often depend on the context and the devices the component interacts with, and that must be considered when dealing with security and privacy requirements. Objective: In this paper, we propose, apply, and assess a modeling method that supports the specification of security and privacy requirements in a structured and analyzable form. Our motivation is that, in many contexts, use cases are common practice for the elicitation of functional requirements and should also be adapted for describing security requirements. Method: We integrate an existing approach for modeling security and privacy requirements in terms of security threats, their mitigations, and their relations to use cases in a misuse case diagram. We introduce new security-related templates, i.e., a mitigation template and a misuse case template, for specifying mitigation schemes and misuse case specifications in a structured and analyzable manner. Natural language processing can then be used to automatically report inconsistencies among artifacts and between the templates and specifications. Results: We successfully applied our approach to an industrial healthcare project and report lessons learned and results from structured interviews with engineers.
    Conclusion: Since our approach supports the precise specification and analysis of security threats, threat scenarios, and their mitigations, it also supports decision making and the analysis of compliance with standards.
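The paper's automated inconsistency reporting relies on natural language processing, but the simplest class of inconsistency, a misuse case referencing a mitigation that was never specified, is purely structural and can be illustrated directly. The data layout and IDs below are hypothetical, not the paper's template format:

```python
# Hypothetical sketch of a structural consistency check between misuse
# case specifications and mitigation templates: report any mitigation ID
# that a misuse case references but that has no corresponding template.
# (The paper's actual tooling also checks wording via NLP.)

misuse_cases = {
    "MC1": {"threat": "interception of health data", "mitigations": ["M1", "M2"]},
    "MC2": {"threat": "unauthorized account access", "mitigations": ["M1"]},
}
mitigation_templates = {
    "M1": "encrypt all traffic between device and backend with TLS",
}

def missing_mitigations(misuse_cases, mitigation_templates):
    """Return {misuse_case_id: [dangling mitigation IDs]} for unresolved references."""
    report = {}
    for mc_id, spec in misuse_cases.items():
        dangling = [m for m in spec["mitigations"] if m not in mitigation_templates]
        if dangling:
            report[mc_id] = dangling
    return report

report = missing_mitigations(misuse_cases, mitigation_templates)
# MC1 references M2, which has no mitigation template
```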

    Privacy Assurance and Network Effects in the Adoption of Location-Based Services: An iPhone Experiment

    The use of geospatially aware mobile devices and applications is increasing, along with the potential for the unethical use of personal location information. For example, iPhone apps often ask users if they can collect location data in order to make the program more useful. The purpose of this research is to empirically examine the significance of this new and increasingly relevant privacy dimension. Through a simulation experiment, we examine how the assurance of location information privacy (as well as mobile app quality and network size) influences users' perceptions of location privacy risk and the utility associated with the app, which, in turn, affect their adoption intentions and willingness to pay for the app. The results indicate that location privacy assurance is of great concern and that assurance is particularly important when the app's network size is low or its quality cannot be verified.

    Unfolding Concerns about Augmented Reality Technologies: A Qualitative Analysis of User Perceptions

    Augmented reality (AR) has greatly diffused into the public consciousness in recent years, especially due to the success of mobile applications like Pokémon Go. However, only a few people have experienced other forms of augmented reality, such as head-mounted displays (HMDs). Thus, people have only limited actual experience with AR and form attitudes and perceptions towards this technology only partially based on actual use experiences, but mainly based on hearsay and the narratives of others, such as the media or friends. This makes it highly difficult for developers and product managers of AR solutions to address the needs of potential users. Therefore, we disentangle the perceptions of individuals with a focus on their concerns about AR. Perceived concerns are an important factor in the acceptance of new technologies. We address this research topic based on twelve in-depth interviews with laymen as well as AR experts and analyze them with a qualitative research method.

    Engineering Privacy in Smartphone Apps: A Technical Guideline Catalog for App Developers

    With the rapid growth of technology in recent years, we are surrounded by, or even dependent on, the use of technological devices such as smartphones, which are now an indispensable part of our lives. Smartphone applications (apps) provide a wide range of utilities such as navigation, entertainment, and fitness. To provide such context-sensitive services to users, apps need to access user data, including sensitive data, which in turn can potentially lead to privacy invasions. To protect users against potential privacy invasions in such a vulnerable ecosystem, legislation such as the European Union General Data Protection Regulation (EU GDPR) demands best privacy practices. App developers are therefore required to make their apps compatible with the legal privacy principles enforced by law. However, it is not easy for app developers to comprehend purely legal principles and understand what needs to be implemented. Similarly, bridging the gap between legal principles and technical implementations, to understand how legal principles need to be implemented, is another barrier to developing privacy-friendly apps. To this end, this paper proposes a privacy and security design guideline catalog for app developers to assist them in understanding and adopting the most relevant privacy and security principles in the context of smartphone apps. The presented catalog maps the identified legal principles to practical privacy and security solutions that developers can implement to ensure enhanced privacy aligned with existing legislation. Through a case study, we confirm that there is a significant gap between what developers do in reality and what they promise to do. This paper provides researchers and developers of privacy-related technicalities with an overview of the characteristics of the existing privacy requirements that need to be implemented in smartphone ecosystems, on which they can base their work.
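The catalog's central idea, mapping legal principles to implementable technical measures, can be illustrated with a small lookup structure. The principles are taken from the GDPR, but the measures listed and the lookup helper are illustrative examples, not the paper's actual catalog entries:

```python
# Hypothetical fragment of a principle-to-measure catalog in the spirit
# of the paper: each legal privacy principle maps to concrete technical
# measures an app developer can implement. Entries are examples only.

CATALOG = {
    "data minimization": [
        "request only the permissions the feature actually needs",
        "reduce location precision before storing or transmitting it",
    ],
    "storage limitation": [
        "define retention periods and auto-delete expired records",
    ],
    "security of processing": [
        "use TLS for all backend connections",
        "encrypt local databases holding personal data",
    ],
}

def guidelines_for(principle):
    """Look up implementable measures for a legal principle (case-insensitive)."""
    return CATALOG.get(principle.strip().lower(), [])

measures = guidelines_for("Data Minimization")
```

A developer (or an audit tool) could walk such a catalog per feature, recording which measures are implemented, which is essentially the gap the paper's case study measures between stated policies and actual practice.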
