
    Hang With Your Buddies to Resist Intersection Attacks

    Some anonymity schemes might in principle protect users from pervasive network surveillance - but only if all messages are independent and unlinkable. Users in practice often need pseudonymity - sending messages intentionally linkable to each other but not to the sender - but pseudonymity in dynamic networks exposes users to intersection attacks. We present Buddies, the first systematic design for intersection attack resistance in practical anonymity systems. Buddies groups users dynamically into buddy sets, controlling message transmission to make buddies within a set behaviorally indistinguishable under traffic analysis. To manage the inevitable tradeoffs between anonymity guarantees and communication responsiveness, Buddies enables users to select independent attack mitigation policies for each pseudonym. Using trace-based simulations and a working prototype, we find that Buddies can guarantee non-trivial anonymity set sizes in realistic chat/microblogging scenarios, for both short-lived and long-lived pseudonyms. Comment: 15 pages, 8 figures.
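
    As a rough illustration of the buddy-set idea described above (a sketch under assumed names, not the paper's actual design or API), the following Python snippet gates a pseudonym's queued messages so they are released only when every member of its buddy set is online, keeping the members behaviorally indistinguishable to a traffic observer:

    from dataclasses import dataclass

    @dataclass
    class BuddySet:
        members: set        # user IDs grouped behind one pseudonym
        min_size: int = 2   # per-pseudonym policy: required anonymity set size

        def can_transmit(self, online_users: set) -> bool:
            # Release only when every buddy is present, so any of them
            # could plausibly be the sender.
            present = self.members & online_users
            return present == self.members and len(present) >= self.min_size

    def release_round(pseudonym_queues, buddy_sets, online_users):
        # Decide, per pseudonym, whether queued messages go out this round;
        # otherwise they stay buffered, trading latency for anonymity.
        released = {}
        for pseudonym, queue in pseudonym_queues.items():
            if queue and buddy_sets[pseudonym].can_transmit(online_users):
                released[pseudonym] = list(queue)
                queue.clear()
        return released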

    Conscript Your Friends into Larger Anonymity Sets with JavaScript

    We present the design and prototype implementation of ConScript, a framework for using JavaScript to allow casual Web users to participate in an anonymous communication system. When a Web user visits a cooperative Web site, the site serves a JavaScript application that instructs the browser to create and submit "dummy" messages into the anonymity system. Users who want to send non-dummy messages through the anonymity system use a browser plug-in to replace these dummy messages with real messages. Creating such conscripted anonymity sets can increase the anonymity set size available to users of remailer, e-voting, and verifiable shuffle-style anonymity systems. We outline ConScript's architecture, we address a number of potential attacks against ConScript, and we discuss the ethical issues related to deploying such a system. Our implementation results demonstrate the practicality of ConScript: a workstation running our ConScript prototype JavaScript client generates a dummy message for a mix-net in 81 milliseconds and a dummy message for a DoS-resistant DC-net in 156 milliseconds. Comment: An abbreviated version of this paper will appear at the WPES 2013 workshop.
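
    To make the conscription mechanism concrete, here is a hedged sketch (names, message size, and endpoint are assumptions, not ConScript's API) of a client that submits fixed-length dummy messages which a plug-in could transparently replace with real ciphertexts of the same length:

    import secrets
    import urllib.request

    MESSAGE_BYTES = 1024  # every submission has the same length, so dummies
                          # and real ciphertexts look identical on the wire

    def make_dummy() -> bytes:
        # Random bytes stand in for a ciphertext of the agreed length.
        return secrets.token_bytes(MESSAGE_BYTES)

    def submit(message: bytes, relay_url: str = "https://relay.example/submit"):
        assert len(message) == MESSAGE_BYTES
        req = urllib.request.Request(
            relay_url, data=message,
            headers={"Content-Type": "application/octet-stream"})
        return urllib.request.urlopen(req)

    # A casual visitor's browser would call submit(make_dummy()); a user with
    # the plug-in calls submit(real_ciphertext) instead, enlarging the
    # anonymity set without changing anything the relay can observe.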

    Privacy, Visibility, Transparency, and Exposure

    This essay considers the relationship between privacy and visibility in the networked information age. Visibility is an important determinant of harm to privacy, but a persistent tendency to conceptualize privacy harms and expectations in terms of visibility has created two problems. First, focusing on visibility diminishes the salience and obscures the operation of nonvisual mechanisms designed to render individual identity, behavior, and preferences transparent to third parties. The metaphoric mapping to visibility suggests that surveillance is simply passive observation, rather than the active production of categories, narratives, and norms. Second, even a broader conception of privacy harms as a function of informational transparency is incomplete. Privacy has a spatial dimension as well as an informational dimension. The spatial dimension of the privacy interest, which the author characterizes as an interest in avoiding or selectively limiting exposure, concerns the structure of experienced space. It is not negated by the fact that people in public spaces expect to be visible to others present in those spaces, and it encompasses both the arrangement of physical spaces and the design of networked communications technologies. U.S. privacy law and theory currently do not recognize this interest at all. This essay argues that they should.

    TARANET: Traffic-Analysis Resistant Anonymity at the NETwork layer

    Modern low-latency anonymity systems, no matter whether constructed as an overlay or implemented at the network layer, offer limited security guarantees against traffic analysis. On the other hand, high-latency anonymity systems offer strong security guarantees at the cost of computational overhead and long delays, which are excessive for interactive applications. We propose TARANET, an anonymity system that implements protection against traffic analysis at the network layer and limits the incurred latency and overhead. In TARANET's setup phase, traffic analysis is thwarted by mixing. In the data transmission phase, end hosts and ASes coordinate to shape traffic into constant-rate transmission using packet splitting. Our prototype implementation shows that TARANET can forward anonymous traffic at over 50 Gbps using commodity hardware.
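
    The constant-rate shaping with packet splitting can be pictured with a small Python sketch (cell size and class names are assumptions for illustration, not taken from the paper): payloads are split into fixed-size cells, one cell is emitted per tick, and chaff fills the gaps when no real data is queued:

    import os
    from collections import deque

    CELL_SIZE = 512  # bytes per on-wire cell (assumed constant)

    class ConstantRateShaper:
        def __init__(self):
            self.cells = deque()

        def enqueue(self, payload: bytes):
            # Split the payload into equal-size cells, padding the last one.
            for i in range(0, len(payload), CELL_SIZE):
                self.cells.append(payload[i:i + CELL_SIZE].ljust(CELL_SIZE, b"\x00"))

        def next_cell(self) -> bytes:
            # Called once per tick: always emit exactly one cell, so an
            # observer sees the same rate whether or not real traffic flows.
            return self.cells.popleft() if self.cells else os.urandom(CELL_SIZE)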

    Missing Privacy Through Individuation: the Treatment of Privacy Law in the Canadian Case Law on Hate, Obscenity, and Child Pornography

    Privacy is approached differently in the Canadian case law on child pornography than in hate propaganda and obscenity cases. Privacy analyses in all three contexts focus considerable attention on the interests of the individuals accused, particularly in relation to minimizing state intrusion on private spheres of activity. However, the privacy interests of the equality-seeking communities targeted by these forms of communication are more directly addressed in child pornography cases than in hate propaganda and obscenity cases. One possible explanation for this difference is that hate propaganda and obscenity simply do not affect the privacy interests of targeted groups and their members. In contrast, this paper suggests that the difference in approach reflects the adoption of an individualistic approach to privacy that may unnecessarily place it in tension with equality. In so doing, it sets the stage for an exploration of more social approaches to privacy that may better enable exploration of privacy's intersections with equality and its collective value to the community as a whole.

    The fair information principles: a comparison of U.S. and Canadian Privacy Policy as applied to the private sector

    U.S. consumers are worried about their privacy and their personal information. High-profile cases of identity theft involving companies losing the private information of hundreds of thousands of customers have only served to elevate the mistrust consumers have for companies that collect and share their personal information. The Federal Trade Commission (FTC) is charged with protecting U.S. consumers from fraud, deception, and unfair business practices in the marketplace; a task made difficult by an overarching need to balance the rights of individuals against the security needs of the country and the free flow of information required by a free market economy. The FTC has asked U.S. companies to follow the Fair Information Practices developed by the U.S. government in 1973, but does not require adherence to those standards. In Canada, the Personal Information Protection and Electronic Documents Act (PIPEDA) was passed in 2000 to address the similar privacy concerns of Canadian consumers. PIPEDA is based on the Fair Information Principles and requires that companies implement those principles. The Privacy Policy Rating System (PPRS) has been developed for this thesis as a method of rating company privacy policies against the Fair Information Principles. Using both the PPRS content analysis technique and a standard stakeholder analysis technique, company privacy policies in both countries are examined to address the question of which government's privacy policy is doing a better job of achieving the Fair Information Principles. The lessons learned in this comparison are used to formulate policy recommendations to improve U.S. privacy policy for better adherence among U.S. companies to the Fair Information Principles.

    Anonymous reputation based reservations in e-commerce (AMNESIC)

    Online reservation systems have grown over recent years to facilitate the purchase of goods and services. Generally, reservation systems require that customers provide some personal data to make a reservation effective. With this data, service providers can check a consumer's history and decide whether the user is trustworthy enough to obtain the reservation. Although a user's reputation is a good metric for implementing the system's access control, providing personal and sensitive data to the system presents high privacy risks, since a user's interests become fully known to and tracked by an external entity. In this paper we design an anonymous reservation protocol that uses reputations to profile users and control their access to the offered services, while at the same time preserving their privacy not only from the seller but also from the service provider.
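
    As a highly simplified, non-cryptographic stand-in for the idea of reputation-gated but identity-free reservations (the paper's actual protocol relies on anonymity techniques this sketch does not implement; all names here are assumed), a provider could admit requests based only on a certified reputation value and hand back an unlinkable token:

    import secrets

    class ReservationService:
        def __init__(self, min_reputation: int):
            self.min_reputation = min_reputation
            self.reservations = {}

        def reserve(self, certified_reputation: int, item: str):
            # Access control sees only the reputation value, never an identity.
            if certified_reputation < self.min_reputation:
                return None
            token = secrets.token_hex(16)  # one-time, unlinkable handle
            self.reservations[token] = item
            return token

        def redeem(self, token: str):
            # The customer later proves knowledge of the token, not who they are.
            return self.reservations.pop(token, None)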

    The Morality of Crisis Pregnancy Counseling
