19 research outputs found

    Use of apps in the COVID-19 response and the loss of privacy protection

    Mobile apps provide a convenient means of tracking and data collection in the fight against the spread of COVID-19. We report our analysis of 50 COVID-19-related apps, including their use and their access to personally identifiable information, to ensure that the right to privacy and civil liberties are protected.

    Preserving Privacy in Cyber-Physical-Social Systems: An Anonymity and Access Control Approach

    With the significant development of mobile commerce, the integration of physical, social, and cyber worlds is increasingly common. The term Cyber-Physical-Social Systems (CPSS) is used to capture technology's human-centric role. With the rapid evolution of CPSS, privacy protection has become a major concern for both customers and enterprises. Although data generalization through obfuscation and anonymity can protect an individual's privacy, overgeneralization may lead to less-valuable data. In this paper, we apply generalization boundary techniques (k-anonymity) to maximize data usability while minimizing disclosure through a privacy access control mechanism. The paper proposes a combination of purpose-based access control models with an anonymity technique in distributed computing environments for privacy-preserving policies and mechanisms, and demonstrates the policy-conflict problems that arise. This combined approach protects individual personal information and makes data shareable with authorized parties for proper purposes. We examine data with k-anonymity to create a specific level of obfuscation that maintains the usefulness of the data, and we use a heuristic approach in a privacy access control framework in which the privacy requirement is to satisfy k-anonymity. Extensive experiments on both real-world and synthetic data sets show that the proposed privacy-aware access control model with k-anonymity is practical and effective. It generates an anonymized data set in accordance with the privacy clearance of a given request and allows users access at different privacy levels, fulfilling a set of obligations and addressing privacy and utility requirements, flexible access control, and improved data availability, while guaranteeing a certain level of privacy.
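    As a minimal sketch of the k-anonymity requirement that the proposed access control framework enforces (the function and data below are our own Python illustration, not the paper's implementation), the helper checks whether every combination of quasi-identifier values in a released table appears in at least k records:

    from collections import Counter

    def satisfies_k_anonymity(rows, quasi_identifiers, k):
        """Return True if every combination of quasi-identifier values
        appears in at least k rows, i.e. the table is k-anonymous."""
        groups = Counter(tuple(row[qi] for qi in quasi_identifiers) for row in rows)
        return all(count >= k for count in groups.values())

    # Toy records with generalized (obfuscated) ZIP code and age band.
    records = [
        {"zip": "021**", "age": "20-30", "diagnosis": "flu"},
        {"zip": "021**", "age": "20-30", "diagnosis": "cold"},
        {"zip": "021**", "age": "20-30", "diagnosis": "asthma"},
        {"zip": "148**", "age": "30-40", "diagnosis": "flu"},
        {"zip": "148**", "age": "30-40", "diagnosis": "cold"},
    ]
    print(satisfies_k_anonymity(records, ["zip", "age"], k=2))  # True
    print(satisfies_k_anonymity(records, ["zip", "age"], k=3))  # False

    A privacy-aware access control layer of the kind described above would release a view of the data only once a check like this passes for the k level tied to the requester's clearance and purpose.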

    Towards a Comprehensive Set of PII for Ensuring Privacy Protections

    Personally Identifiable Information (PII) refers to any information that can be used to trace or identify an individual. With increasing online communication and a remote workforce, sharing PII online has become mainstream. In turn, this allows adversaries to attack users' accounts and systems, harm users financially and economically, and damage their reputations. As the Internet, innovation, and industrialization have become integral parts of our social and economic structure, each individual's development depends on reliable and resilient infrastructure. Since the Internet is an unavoidable resource in everyday life, it has become necessary to ensure safe and secure communication among different parties and to enhance the technological capabilities of industrial sectors all over the world. Industries are responsible for keeping people safe in online environments, which makes this a good time to consider a sustainable development plan that ensures security and privacy in online communication for individuals. Different mechanisms exist to provide users with a certain level of privacy and safety. With ongoing technological development, however, it has become complicated to measure and handle PII (direct or indirect) given the current piecewise protection applied to different data types. In our study, we detail how organizations protect the different data types that make up PII. In addition, we conducted a short study that analyzes online social data privacy on Facebook and Reddit with regard to how they handle collected data. Finally, we offer several paths for future research that must be considered for a comprehensive privacy protection program for users' PII when developing resilient infrastructure, both regional and transborder.
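    As a small illustration of what measuring and handling directly identifying PII can involve in practice (the patterns and example text below are our own illustrative assumptions, not part of the study), the Python snippet scans free text for a few direct identifiers with regular expressions; a comprehensive program would also need to account for indirect identifiers and many more data types:

    import re

    # Deliberately small, illustrative set of patterns for direct identifiers.
    PII_PATTERNS = {
        "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.-]+\b"),
        "us_phone": re.compile(r"\b\d{3}[-.\s]\d{3}[-.\s]\d{4}\b"),
        "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    }

    def find_pii(text):
        """Return a mapping of PII type -> matched substrings found in text."""
        hits = {name: pattern.findall(text) for name, pattern in PII_PATTERNS.items()}
        return {name: matches for name, matches in hits.items() if matches}

    sample = "Contact jane.doe@example.com or 555-123-4567; SSN 123-45-6789."
    print(find_pii(sample))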

    What is in Your App? Uncovering Privacy Risks of Female Health Applications

    FemTech, or Female Technology, is an expanding field dedicated to providing affordable and accessible healthcare solutions for women, prominently through Female Health Applications that monitor health and reproductive data. With the leading app exceeding 1 billion downloads, these applications are gaining widespread popularity. However, amidst contemporary challenges to women's reproductive rights and privacy, there is a noticeable lack of comprehensive studies on the security and privacy aspects of these applications. This exploratory study delves into the privacy risks associated with seven popular applications. Our initial quantitative static analysis reveals varied and potentially risky permissions and numerous third-party trackers. Additionally, a preliminary examination of privacy policies indicates non-compliance with fundamental data privacy principles. These early findings highlight a critical gap in establishing robust privacy and security safeguards for FemTech apps, especially significant in a climate where women's reproductive rights face escalating threats.
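    A minimal sketch of the kind of static permission check described above (our own illustration, not the authors' analysis pipeline): it assumes the app's AndroidManifest.xml has already been decoded to plain XML, for example with apktool, and flags requested permissions that fall into a small, hypothetical set treated as privacy-sensitive:

    import xml.etree.ElementTree as ET

    ANDROID_NS = "{http://schemas.android.com/apk/res/android}"

    # Hypothetical subset of permissions treated as privacy-sensitive for this sketch.
    RISKY_PERMISSIONS = {
        "android.permission.ACCESS_FINE_LOCATION",
        "android.permission.READ_CONTACTS",
        "android.permission.CAMERA",
        "android.permission.RECORD_AUDIO",
    }

    def requested_permissions(manifest_path):
        """Collect all <uses-permission> names from a decoded AndroidManifest.xml."""
        root = ET.parse(manifest_path).getroot()
        return {elem.get(ANDROID_NS + "name") for elem in root.iter("uses-permission")}

    def risky_permissions(manifest_path):
        """Return the requested permissions that fall into the sensitive set."""
        return requested_permissions(manifest_path) & RISKY_PERMISSIONS

    # Example call (the path is illustrative):
    # print(risky_permissions("decoded_apk/AndroidManifest.xml"))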

    Information Privacy: Current and future research directions

    As ever more data is collected, concerns about information privacy become increasingly salient. Information privacy is an inter- and multi-disciplinary subject relevant to researchers throughout the iSchools community. This full-day workshop will bring researchers together to discuss current and future research directions in information privacy and how iSchools can respond to the forthcoming National Privacy Research Strategy. Through a keynote presentation, plenary speakers, position papers, and group discussion, participants will explore current privacy research issues and their relevance to information research conducted by the iSchools community. Privacy scholars may submit position papers on their research projects and future directions to the workshop website for selection for presentation during the afternoon of the workshop. In addition to discussions of the presentations, workshop participants will seek during breakout sessions to define new information privacy research questions for future work by iSchools scholars.

    Information Privacy and Data Control in Cloud Computing: Consumers, Privacy Preferences, and Market Efficiency

    So many of our daily activities now take place “in the cloud,” where we use our devices to tap into massive networks that span the globe. Virtually every time that we plug into a new service, the service requires us to click the seemingly ubiquitous box indicating that we have read and agreed to the provider’s terms of service (TOS) and privacy policy. If a user does not click on this box, he is denied access to the service, but agreeing to these terms without reading them can negatively impact the user’s legal rights. As part of this work, we analyzed and categorized the terms of TOS agreements and privacy policies of several major cloud services to aid in our assessment of the state of user privacy in the cloud. Our empirical analysis showed that providers take similar approaches to user privacy and were consistently more detailed when describing the user’s obligations to the provider than when describing the provider’s obligations to the user. This asymmetry, combined with these terms’ nonnegotiable nature, led us to conclude that the current approach to user privacy in the cloud is in need of serious revision. In this Article, we suggest adopting a legal regime that requires companies to provide baseline protections for personal information and also to take steps to enhance the parties’ control over their own data. We emphasize the need for a regime that allows for “data control” in the cloud, which we define as consisting of two parts: (1) the ability to withdraw data and require a service provider to stop using or storing the user’s information (data withdrawal); and (2) the ability to move data to a new location without being locked into a particular provider (data mobility). Ultimately, our goal with this piece is to apply established law and privacy theories to services in the cloud and set forth a model for the protection of information privacy that recognizes the importance of informed and empowered users.

    Mapping Risk Assessment Strategy for COVID-19 Mobile Apps’ Vulnerabilities

    Recent innovations in mobile technologies are playing a vital role in combating the COVID-19 pandemic. While mobile apps' functionality plays a crucial role in tackling the spread of COVID-19, it also raises concerns about the associated privacy risks that users may face. Recent research studies have shown that various technological measures in mobile applications lack consideration of privacy risks in their data practices. For example, security vulnerabilities in COVID-19 apps can be exploited and can therefore also lead to privacy violations. In this paper, we focus on recent and newly developed COVID-19 apps and consider their threat landscape. Our objective was to identify security vulnerabilities that can lead to user-level privacy risks. We also formalize our approach by measuring the level of risk associated with the assets and services that attackers may target during exploitation. We utilized baseline risk assessment criteria within the scope of three specific security vulnerabilities that often exist in COVID-19 applications, namely credential leaks, insecure communication, and HTTP request libraries. We present a proof-of-concept implementation for risk assessment of COVID-19 apps that can be utilized to evaluate privacy risk based on asset impact and threat likelihood.
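    A minimal sketch of the baseline risk scoring the abstract describes (the scales, weights, and example values here are illustrative assumptions, not the paper's actual criteria): the risk for each vulnerability class is computed as asset impact multiplied by threat likelihood, a common risk-matrix convention:

    from dataclasses import dataclass

    @dataclass
    class Vulnerability:
        name: str
        impact: int       # asset impact on an assumed 1-5 scale
        likelihood: int   # threat likelihood on an assumed 1-5 scale

        @property
        def risk(self) -> int:
            # Baseline risk = impact x likelihood.
            return self.impact * self.likelihood

    # Illustrative findings for the three vulnerability classes named above.
    findings = [
        Vulnerability("credential leak", impact=5, likelihood=3),
        Vulnerability("insecure communication", impact=4, likelihood=4),
        Vulnerability("HTTP request library issue", impact=3, likelihood=2),
    ]

    for v in sorted(findings, key=lambda v: v.risk, reverse=True):
        print(f"{v.name}: risk={v.risk}")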

    Stance classification of Twitter debates: The encryption debate as a use case

    Social media have enabled a revolution in user-generated content. They allow users to connect, build community, produce and share content, and publish opinions. To better understand online users’ attitudes and opinions, we use stance classification. Stance classification is a relatively new and challenging approach that deepens opinion mining by classifying a user's stance in a debate. Our stance classification use case is tweets related to the spring 2016 debate over the FBI’s request that Apple decrypt a user’s iPhone. In this “encryption debate,” public opinion was polarized between advocates for individual privacy and advocates for national security. We propose a machine learning approach to classify stance in the debate, together with a topic classification, using lexical, syntactic, Twitter-specific, and argumentative features as predictors. Models trained on these feature sets showed significant increases in accuracy relative to the unigram baseline.
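    For reference, the unigram baseline mentioned above typically looks like the following bag-of-words pipeline (a minimal sketch assuming scikit-learn is available; the tweets and labels are invented), against which richer lexical, syntactic, Twitter-specific, and argumentative feature sets would then be compared:

    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline

    # Invented toy data: 1 = pro-privacy stance, 0 = pro-national-security stance.
    tweets = [
        "Apple is right, encryption protects everyone",
        "The FBI needs access to stop terrorists",
        "Backdoors weaken security for all of us",
        "National security comes before one phone",
    ]
    labels = [1, 0, 1, 0]

    # Unigram baseline: bag-of-words features plus a logistic regression classifier.
    baseline = make_pipeline(CountVectorizer(ngram_range=(1, 1)), LogisticRegression())
    baseline.fit(tweets, labels)
    print(baseline.predict(["Strong encryption keeps users safe"]))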

    Eight months into the COVID-19 Pandemic: Do Users Expect Less Privacy?

    While mobile apps provide immense benefits for containing the spread of COVID-19, the privacy and security of these digital tracing apps are at the center of public debate. To understand users’ concerns and preferences regarding COVID-19 apps, we conducted two surveys three months apart to determine what kind of privacy protections users seek in these apps and whether those expectations change over time. Our survey results from participants (N1 = 2294 and N2 = 2140) indicate that trust plays a vital role in their decision to adopt such apps. Additionally, users’ preferences for certain privacy protections and for transparency reveal a disconnect between technologists’ and users’ expectations. In this paper, we present our survey results to formalize design guidelines for app designers, providers, and relevant authorities. We recommend three important mechanisms, namely trust, preferences, and transparency, that could greatly influence users’ adoption of such apps and provide critical design components to satisfy users’ expected privacy protections.