Novel approaches to anonymity and privacy in decentralized, open settings
The Internet has undergone dramatic changes in the last two decades, evolving from a mere communication network into a global multimedia platform on which billions of users actively exchange information. While this transformation has brought tremendous benefits to society, it has also created new threats to online privacy with which existing technology is failing to keep pace. In this dissertation, we present the results of two lines of research that developed two novel approaches to anonymity and privacy in decentralized, open settings. First, we examine the issue of attribute and identity disclosure in open settings and develop the novel notion of (k,d)-anonymity for open settings, which we extensively study and validate experimentally. Furthermore, we investigate the relationship between anonymity and linkability using the notion of (k,d)-anonymity and show that, in contrast to the traditional closed setting, anonymity within one online community does not necessarily imply unlinkability across different online communities in the decentralized, open setting. Second, we consider the transitive diffusion of information that is shared in social networks and spreads through pairwise interactions of users connected in this social network. We develop the novel approach of exposure minimization to control the diffusion of information within an open network, allowing the owner to minimize its exposure by suitably choosing whom they share their information with. We implement our algorithms and investigate the practical limitations of user-side exposure minimization in large social networks.
At their core, both of these approaches present a departure from the provable privacy guarantees that we can achieve in closed settings and a step towards sound assessments of privacy risks in decentralized, open settings, where such provable guarantees are not possible.
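The exposure-minimization idea from the abstract above can be illustrated with a minimal sketch. This is not the dissertation's actual algorithm: the deterministic re-sharing model, the toy graph, and the greedy selection strategy are all illustrative assumptions.

```python
from collections import deque

def reachable(graph, start):
    """All users eventually exposed if `start` re-shares transitively
    (deterministic diffusion: every neighbor always re-shares)."""
    seen = {start}
    queue = deque([start])
    while queue:
        u = queue.popleft()
        for v in graph.get(u, ()):
            if v not in seen:
                seen.add(v)
                queue.append(v)
    return seen

def min_exposure_share(graph, friends, k):
    """Greedily pick k friends whose combined transitive reach
    (the owner's exposure) is smallest."""
    chosen, exposed = [], set()
    reach = {f: reachable(graph, f) for f in friends}
    for _ in range(k):
        best = min((f for f in friends if f not in chosen),
                   key=lambda f: len(exposed | reach[f]))
        chosen.append(best)
        exposed |= reach[best]
    return chosen, exposed
```

Under these assumptions, sharing with a friend in a small, isolated cluster exposes the information to fewer users than sharing with a well-connected one, which is the intuition behind choosing the initial audience carefully.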
The access control double bind: how everyday interfaces regulate access and privacy, enable surveillance, and enforce identity.
Access controls are an inescapable and deceptively mundane requirement for accessing digital applications and platforms. These systems enable and enforce practices related to access, ownership, privacy, and surveillance. Companies use access controls to dictate and enforce terms of use for digital media, platforms, and technologies. The technical implementation of these systems is well understood. This paper instead uses digital game software and platforms as a case study to analyze the broader socio-technical, and often inequitable, interactions these systems regulate across software systems. Our sample includes 200 digital games and seven major digital gaming platforms. We combine close reading and content analysis to examine the processes of authentication and authorization within our samples. While the ubiquity of these systems is taken as a given in much academic and popular discourse, our data help empirically ground this understanding and examine how these systems support user legibility and surveillance, and police identities in under-examined ways. We suggest changes to the policies and practices that shape these systems to drive more transparent and equitable design.
A Privacy by Design Methodology Application in Telecom Domain
Telecommunication has developed considerably over the last decades, notably through optical fiber submarine cables and wireless connections offering a wide range of voice and data services. Telecommunication infrastructures are the necessary backbone that makes any voice and data exchange possible. Unfortunately, these infrastructures still suffer from various vulnerabilities and continue to be the target of specific cyber-attacks. Some of these attacks can lead to denial of service, loss of integrity, and loss of privacy. It is also established that telecom service providers, as the owners of this infrastructure, have access to huge amounts of data, including personal data related to their customers and employees. This personal data directly concerns the customers' and employees' identity, geolocation, areas of interest, contact circle, etc. When it comes to the use of this personal data, privacy concerns become a major challenge for telecom service providers due to the heavy impact a breach can induce. Given the criticality of personal data protection in the telecom domain, privacy by design (PbD) should be incorporated. This article therefore aims to apply to telecom service providers the ISPM ("Information System Privacy Methodology") methodology, which focuses on PbD implementation in enterprise architectures, specifically in information systems, taking into account all technical and organizational aspects.
Addressing the Failure of Anonymization: Guidance from the European Union’s General Data Protection Regulation
It is common practice for companies to “anonymize” the consumer data that they collect. In fact, U.S. data protection laws and Federal Trade Commission guidelines encourage the practice of anonymization by exempting anonymized data from the privacy and data security requirements they impose. Anonymization involves removing personally identifiable information (“PII”) from a dataset so that, in theory, the data cannot be traced back to its data subjects. In practice, however, anonymization fails to irrevocably protect consumer privacy due to the potential for deanonymization—the linking of anonymized data to auxiliary information to re-identify data subjects. Because U.S. data protection laws provide safe harbors for anonymized data, re-identified data subjects receive no statutory privacy protections at all—a fact that is particularly troublesome given consumers’ dependence on technology and today’s climate of ubiquitous data collection.
By adopting an all-or-nothing approach to anonymization, the United States has created no means of incentivizing the practice of anonymization while still providing data subjects statutory protections. This Note argues that the United States should look to the risk-based approach taken by the European Union under the General Data Protection Regulation. The EU's data protection laws utilize multiple tiers of anonymization, which vary in their potential for deanonymization. Under this approach, pseudonymized data—i.e., certain data that has had PII removed but can still be linked to auxiliary information to re-identify data subjects—falls within the scope of the governing law, but receives relaxed requirements designed to incentivize pseudonymization and thereby reduce the risk of data subject identification. This approach both strikes a balance between data privacy and data utility, and affords data subjects the benefit of anonymity in addition to statutory protections ranging from choice to transparency.
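The deanonymization risk described above—linking "anonymized" records to auxiliary data via shared quasi-identifiers—can be sketched in a few lines. The datasets, field names, and quasi-identifiers below are invented for illustration; they are not drawn from the Note.

```python
def reidentify(pseudonymized, auxiliary, quasi_ids):
    """Linkage attack sketch: join records that agree on every
    quasi-identifier; a unique match re-identifies the data subject."""
    matches = {}
    for rec in pseudonymized:
        key = tuple(rec[q] for q in quasi_ids)
        hits = [aux for aux in auxiliary
                if tuple(aux[q] for q in quasi_ids) == key]
        if len(hits) == 1:  # unique match -> re-identified
            matches[rec["pseudonym"]] = hits[0]["name"]
    return matches
```

Even with names stripped, a record that is unique on fields like ZIP code and birth year can be joined against a public roll that carries the same fields, which is why removing PII alone does not irrevocably protect privacy.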
Deanonymizing tor hidden service users through bitcoin transactions analysis
With the rapid increase of threats on the Internet, people are continuously seeking privacy and anonymity. Services such as Bitcoin and Tor were introduced to provide anonymity for online transactions and Web browsing. Due to its pseudonymity model, Bitcoin lacks retroactive operational security, which means historical pieces of information could be used to identify a certain user. We investigate the feasibility of deanonymizing users of Tor hidden services who rely on Bitcoin as a method of payment. In particular, we correlate the public Bitcoin addresses of users and services with their corresponding transactions in the Blockchain. In other words, we establish a provable link between a Tor hidden service and its user by simply showing a transaction between their two corresponding addresses. This subtle information leakage breaks the anonymity of users and may have serious privacy consequences, depending on the sensitivity of the use case.
To demonstrate how an adversary can deanonymize hidden service users by exploiting leaked information from Bitcoin over Tor, we carried out a real-world experiment as a proof-of-concept. First, we collected public Bitcoin addresses of Tor hidden services from their .onion landing pages. Out of 1.5K hidden services we crawled, we found 88 unique Bitcoin addresses that have a healthy economic activity in 2017. Next, we collected public Bitcoin addresses from two channels of online social networks, namely, Twitter and the BitcoinTalk forum. Out of 5B tweets and 1M forum pages, we found 4.2K and 41K unique online identities, respectively, along with their public personal information and Bitcoin addresses. We then expanded the lists of Bitcoin addresses using closure analysis, where a Bitcoin address is used to identify a set of other addresses that are highly likely to be controlled by the same user. This allowed us to collect thousands more Bitcoin addresses for the users. By analyzing the transactions in the Blockchain, we were able to link up to 125 unique users to various hidden services, including sensitive ones, such as The Pirate Bay, Silk Road, and WikiLeaks. Finally, we traced concrete case studies to demonstrate the privacy implications of information leakage and user deanonymization. In particular, we show that Bitcoin addresses should always be assumed compromised and can be used to deanonymize users.
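The closure analysis mentioned in the abstract is commonly implemented as the multi-input heuristic: all input addresses of a single transaction are assumed to belong to the same wallet, and clusters are grown with a union-find structure. The sketch below shows that common heuristic under invented transaction data; the paper's exact clustering may differ.

```python
class DisjointSet:
    """Minimal union-find with path halving."""
    def __init__(self):
        self.parent = {}

    def find(self, x):
        self.parent.setdefault(x, x)
        while self.parent[x] != x:
            self.parent[x] = self.parent[self.parent[x]]
            x = self.parent[x]
        return x

    def union(self, a, b):
        self.parent[self.find(a)] = self.find(b)

def closure_clusters(transactions):
    """Multi-input heuristic: every input address of one transaction
    is assumed to be controlled by the same user."""
    ds = DisjointSet()
    for tx_inputs in transactions:
        ds.find(tx_inputs[0])  # register single-input transactions too
        for addr in tx_inputs[1:]:
            ds.union(tx_inputs[0], addr)
    clusters = {}
    for addr in ds.parent:
        clusters.setdefault(ds.find(addr), set()).add(addr)
    return list(clusters.values())
```

Starting from one known address (say, scraped from a forum signature), the whole cluster it falls into becomes attributable to that identity, which is how a single leak expands into thousands of linked addresses.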
A review of privacy and usability issues in mobile health systems: Role of external factors
The increased penetration of mobile devices has created opportunities in the health sector and led to the emergence of mobile health systems. As much as mobile health systems have registered tremendous progress, they have been faced with privacy and usability issues. Due to the sensitivity of health information, there is an ethical need to equip mobile health systems with adequate privacy measures. However, these systems should also be usable by the intended users. Even though many researchers are working on solutions, the issues still persist. External factors such as cultural differences have also contributed to the issues, yet they have been under-researched. In this article, we conduct a systematic literature review of 22 articles, then categorize and present privacy and usability issues and possible solutions. We then discuss the relevance and implications of external factors for the findings on privacy and usability. We end with recommendations to address these external factors.
Trajectory-Based Spatiotemporal Entity Linking
Trajectory-based spatiotemporal entity linking is to match the same moving object in different datasets based on their movement traces. It is a fundamental step to support spatiotemporal data integration and analysis. In this paper, we study the problem of spatiotemporal entity linking using effective and concise signatures extracted from their trajectories. This linking problem is formalized as a k-nearest neighbor (k-NN) query on the signatures. Four representation strategies (sequential, temporal, spatial, and spatiotemporal) and two quantitative criteria (commonality and unicity) are investigated for signature construction. A simple yet effective dimension reduction strategy is developed together with a novel indexing structure called the WR-tree to speed up the search. A number of optimization methods are proposed to improve the accuracy and robustness of the linking. Our extensive experiments on real-world datasets verify the superiority of our approach over the state-of-the-art solutions in terms of both accuracy and efficiency.
Comment: 15 pages, 3 figures, 15 tables
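The signature-plus-k-NN formulation above can be sketched minimally: build a spatial signature per trajectory (here, a normalized histogram over grid cells, one plausible reading of the paper's "spatial" representation strategy) and answer the linking query by brute-force nearest-neighbor search. The grid-cell signature, the Euclidean distance, and all names below are illustrative assumptions; the paper's four strategies and WR-tree index are not reproduced.

```python
import math

def spatial_signature(trajectory, cell=1.0):
    """Spatial signature: normalized visit counts over grid cells."""
    counts = {}
    for x, y in trajectory:
        key = (int(x // cell), int(y // cell))
        counts[key] = counts.get(key, 0) + 1
    total = sum(counts.values())
    return {k: v / total for k, v in counts.items()}

def distance(sig_a, sig_b):
    """Euclidean distance between two sparse signatures."""
    keys = set(sig_a) | set(sig_b)
    return math.sqrt(sum((sig_a.get(k, 0) - sig_b.get(k, 0)) ** 2
                         for k in keys))

def link_entity(query_traj, candidates, k=1):
    """k-NN query: the k candidates whose signatures are closest
    to the query trajectory's signature are the proposed links."""
    q = spatial_signature(query_traj)
    ranked = sorted(candidates,
                    key=lambda item: distance(q, spatial_signature(item[1])))
    return [name for name, _ in ranked[:k]]
```

A brute-force scan like this is quadratic in the number of trajectories, which is precisely the cost that an index such as the paper's WR-tree is designed to avoid.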