
    Shall I post this now? Optimized, delay-based privacy protection in social networks

    The final publication is available at Springer via http://dx.doi.org/10.1007/s10115-016-1010-4. Despite the several advantages commonly attributed to social networks, such as the ease and immediacy of communicating with acquaintances and friends, significant privacy threats also arise from inexperienced or even irresponsible users recklessly publishing sensitive material. Yet a different, but equally significant, privacy risk stems from social networks profiling the online activity of their users based on the timestamps of the interactions between the two. To thwart this commonly neglected class of attacks, this paper proposes an optimized deferral mechanism for messages in online social networks. The solution intelligently delays certain messages posted by end users so that the activity profile observed by the attacker reveals no time-based sensitive information, while the usability of the system is preserved. Experimental results, together with a proposed architecture implementing this approach, demonstrate the suitability and feasibility of our mechanism. Peer reviewed. Postprint (author's final draft).
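    The deferral idea can be illustrated with a minimal sketch (my own toy example, not the paper's optimized algorithm): posts scheduled during hours the user considers sensitive are delayed to the next non-sensitive hour, so an observer profiling post timestamps sees no activity in those hours.

```python
# Toy sketch of delay-based timestamp protection. The hour values and the
# `sensitive_hours` set are invented for illustration; the paper's mechanism
# optimizes delays, whereas this simply pushes posts to the next safe hour.

def deferred_hour(hour, sensitive_hours):
    """Return the first hour >= `hour` (mod 24) outside `sensitive_hours`."""
    for offset in range(24):
        candidate = (hour + offset) % 24
        if candidate not in sensitive_hours:
            return candidate
    raise ValueError("every hour is marked sensitive")

sensitive = {2, 3, 4}            # e.g. hide late-night activity
posts = [1, 2, 3, 10, 23]        # planned posting hours
observed = [deferred_hour(h, sensitive) for h in posts]
# posts at hours 2 and 3 are deferred to hour 5; the rest are unchanged
```

The trade-off the paper studies is visible even here: the larger the sensitive set, the longer the delays, and hence the greater the usability cost.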

    On the Measurement of Privacy as an Attacker's Estimation Error

    A wide variety of privacy metrics have been proposed in the literature to evaluate the level of protection offered by privacy-enhancing technologies. Most of these metrics are specific to concrete systems and adversarial models, and are difficult to generalize or translate to other contexts. Furthermore, a better understanding of the relationships between the different privacy metrics is needed to enable a more grounded and systematic approach to measuring privacy, as well as to assist system designers in selecting the most appropriate metric for a given application. In this work we propose a theoretical framework for privacy-preserving systems, endowed with a general definition of privacy in terms of the estimation error incurred by an attacker who aims to disclose the private information that the system is designed to conceal. We show that our framework permits interpreting and comparing a number of well-known metrics under a common perspective. The arguments behind these interpretations are based on fundamental results from information theory, probability theory, and Bayes decision theory. Comment: 18 pages, 17 figures.
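    The estimation-error view can be made concrete with a small sketch (my own illustration of the general idea, not the paper's framework): under a 0/1 loss, a Bayes-optimal attacker guesses the most probable secret value given each observation, and privacy is the probability that this guess is wrong.

```python
# Hedged illustration: privacy as the Bayes error of an attacker's guess.
# `posteriors` and `obs_probs` below are invented example numbers.

def bayes_error(posteriors, obs_probs):
    """Expected 0/1 estimation error of a Bayes-optimal attacker.

    posteriors[o] -- distribution over secret values given observation o
    obs_probs[o]  -- probability of observation o
    """
    return sum(p_o * (1.0 - max(post))
               for post, p_o in zip(posteriors, obs_probs))

# One revealing observation (attacker 90% sure) and one uninformative one:
posteriors = [[0.9, 0.1], [0.5, 0.5]]
obs_probs = [0.5, 0.5]
risk = bayes_error(posteriors, obs_probs)   # attacker wrong 30% of the time
```

Other losses (e.g. squared error, or log loss, which recovers entropy-based metrics) plug into the same template, which is what lets a single framework subsume many existing metrics.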

    A Survey on Routing in Anonymous Communication Protocols

    The Internet has undergone dramatic changes in the past 15 years, and now forms a global communication platform that billions of users rely on for their daily activities. While this transformation has brought tremendous benefits to society, it has also created new threats to online privacy, ranging from profiling of users for monetizing personal information to nearly omnipotent governmental surveillance. As a result, public interest in systems for anonymous communication has drastically increased. Several such systems have been proposed in the literature, each of which offers anonymity guarantees in different scenarios and under different assumptions, reflecting the plurality of approaches for how messages can be anonymously routed to their destination. Understanding this space of competing approaches, with their different guarantees and assumptions, is vital if users are to grasp the consequences of different design options. In this work, we survey previous research on designing, developing, and deploying systems for anonymous communication. To this end, we provide a taxonomy for clustering all prevalently considered approaches (including Mixnets, DC-nets, onion routing, and DHT-based protocols) with respect to their unique routing characteristics, deployability, and performance. This, in particular, encompasses the topological structure of the underlying network; the routing information that has to be made available to the initiator of the conversation; the underlying communication model; and performance-related indicators such as latency and communication layer. Our taxonomy and comparative assessment provide important insights about the differences between the existing classes of anonymous communication protocols, and they also help to clarify the relationship between the routing characteristics of these protocols and their performance and scalability.

    Optimal forgery and suppression of ratings for privacy enhancement in recommendation systems

    Recommendation systems are information-filtering systems that tailor information to users on the basis of knowledge about their preferences. The ability of these systems to profile users is what enables such intelligent functionality, but at the same time, it is the source of serious privacy concerns. In this paper we investigate a privacy-enhancing technology that aims at hindering an attacker in its efforts to accurately profile users based on the items they rate. Our approach capitalizes on the combination of two perturbative mechanisms: the forgery and the suppression of ratings. While this technique enhances user privacy to a certain extent, it inevitably comes at the cost of a loss in data utility, namely a degradation of the recommendation accuracy. In short, it poses a trade-off between privacy and utility. The theoretical analysis of this trade-off is the object of this work. We measure privacy as the Kullback-Leibler divergence between the user's and the population's item distributions, and quantify utility loss as the proportion of ratings users consent to forge and eliminate. Equipped with these quantitative measures, we find a closed-form solution to the problem of optimal forgery and suppression of ratings, an optimization problem that includes, as a particular case, the maximization of the entropy of the perturbed profile. We characterize the optimal trade-off surface among privacy, forgery rate, and suppression rate, and experimentally evaluate how our approach could contribute to privacy protection in a real-world recommendation system. Peer reviewed. Postprint (published version).
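    The privacy measure above is easy to sketch (a minimal implementation of the stated KL-divergence criterion; the profile numbers are invented): the further a user's item-category distribution sits from the population's, the more identifying it is, and forging or suppressing ratings moves the perturbed profile toward the population's.

```python
# Sketch of the KL-divergence privacy measure described in the abstract.
# The `population`, `user`, and `perturbed` distributions are made-up
# four-category item profiles, not data from the paper.

import math

def kl_divergence(p, q):
    """D(p || q) in bits; assumes q[i] > 0 wherever p[i] > 0."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

population = [0.25, 0.25, 0.25, 0.25]
user       = [0.70, 0.10, 0.10, 0.10]   # a revealing, skewed profile
perturbed  = [0.40, 0.20, 0.20, 0.20]   # after some forgery/suppression

# Perturbation reduces the divergence, i.e. the privacy risk:
risk_before = kl_divergence(user, population)
risk_after = kl_divergence(perturbed, population)
```

With a uniform population profile, minimizing this divergence coincides with maximizing the entropy of the perturbed profile, which is the particular case the abstract mentions.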

    Datenschutzfördernde Techniken für private Dienste (Privacy-Enhancing Techniques for Private Services)

    Privacy on the Internet is becoming more and more important, as an increasing part of everyday life takes place over the Internet. Internet users lose the ability to control which information they give away about themselves, or are not even aware that they do so. Privacy-enhancing technologies help control private information on the Internet, for example by anonymizing Internet communication. Up to now, work on privacy-enhancing technologies has mainly focused on the privacy of users requesting public services. This thesis introduces a new privacy risk that occurs when private persons run their own services. One example is instant messaging systems, which allow users to exchange presence information and text messages in real time. These systems usually do not protect the presence information that is stored on central servers. As an alternative, decentralized instant messaging designs mitigate this problem by having private persons provide instant messaging services to each other. However, providing a service as a private person causes new security problems compared to providing a service as an organization or enterprise: First, the presence of such a service reveals information about the availability of the service provider. Second, the server location needs to be concealed in order to hide the whereabouts of a person. Third, the server needs to be specifically protected from unauthorized access attempts. This thesis proposes to use pseudonymous services as a building block for private services. Pseudonymous services conceal the location of a server that provides a specific service. The contribution made here is to analyze which parts of pseudonymous services, in particular Tor hidden services, are missing before they can be applied to private services.
    This analysis leads to three main problems, for which solutions are proposed: First, known pseudonymous service designs do not scale to the number of private services expected in the future. This thesis proposes a new approach that stores hidden service descriptors in a distributed data structure rather than on central servers, with a particular focus on supporting the private entries that private services require. Second, pseudonymous services leak too much information about service identity while being advertised in the network and during connection establishment by clients. The approach taken in this thesis is to reduce the information that a service publishes in the network to a minimum and to prevent unauthorized clients from accessing a service as early as connection establishment. These changes protect service activity and usage patterns from unauthorized entities. Third, pseudonymous services exhibit worse performance than direct service access. The contribution of this thesis is to measure this performance, identify possible problems, and propose improvements.

    Hang With Your Buddies to Resist Intersection Attacks

    Some anonymity schemes might in principle protect users from pervasive network surveillance - but only if all messages are independent and unlinkable. Users in practice often need pseudonymity - sending messages intentionally linkable to each other but not to the sender - but pseudonymity in dynamic networks exposes users to intersection attacks. We present Buddies, the first systematic design for intersection attack resistance in practical anonymity systems. Buddies groups users dynamically into buddy sets, controlling message transmission to make buddies within a set behaviorally indistinguishable under traffic analysis. To manage the inevitable tradeoffs between anonymity guarantees and communication responsiveness, Buddies enables users to select independent attack mitigation policies for each pseudonym. Using trace-based simulations and a working prototype, we find that Buddies can guarantee non-trivial anonymity set sizes in realistic chat/microblogging scenarios, for both short-lived and long-lived pseudonyms. Comment: 15 pages, 8 figures.
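    The attack Buddies defends against can be illustrated with a toy sketch (the user sets below are invented): each time a pseudonym posts, the adversary intersects the set of users who were online at that moment, and the candidate set only shrinks over rounds unless users in it are kept behaviorally indistinguishable.

```python
# Toy intersection attack: the adversary narrows down who owns a pseudonym
# by intersecting online-user sets across the rounds in which it posted.
# Buddies resists this by making members of a buddy set transmit (or stay
# silent) indistinguishably, so the intersection cannot shrink below the set.

def intersect_rounds(online_sets):
    """Candidate owners of a pseudonym after observing all posting rounds."""
    candidates = set(online_sets[0])
    for online in online_sets[1:]:
        candidates &= set(online)
    return candidates

rounds = [{"alice", "bob", "carol", "dave"},   # users online at post 1
          {"alice", "bob", "erin"},            # users online at post 2
          {"alice", "carol", "bob"}]           # users online at post 3
suspects = intersect_rounds(rounds)            # anonymity set after 3 posts
```

Three posts already cut a four-user anonymity set in half here, which is why long-lived pseudonyms are especially exposed without a defense like buddy sets.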

    Blindspot: Indistinguishable Anonymous Communications

    Communication anonymity is a key requirement for individuals under targeted surveillance. Practical anonymous communications also require indistinguishability - an adversary should be unable to distinguish between anonymised and non-anonymised traffic for a given user. We propose Blindspot, a design for high-latency anonymous communications that offers indistinguishability and unobservability under a (qualified) global active adversary. Blindspot creates anonymous routes between sender-receiver pairs by subliminally encoding messages within the pre-existing communication behaviour of users in a social network - specifically, their organic image-sharing behaviour. Channel bandwidth thus depends on the intensity of image sharing along a route. A major challenge we successfully overcome is that routing must be accomplished in the face of a significant restriction: channel bandwidth is stochastic. We show that conventional social network routing strategies do not work, and propose a novel routing algorithm to solve this problem. We evaluate Blindspot using a real-world dataset and find that it delivers reasonable results for applications requiring low-volume unobservable communication. Comment: 13 pages.