433 research outputs found

    The Anonymous Poster: How to Protect Internet Users’ Privacy and Prevent Abuse

    The threat that anonymous Internet posting poses to individual privacy has been met with congressional and judicial indecisiveness. Part of the problem stems from the inherent conflict between punishing those who disrespect one's privacy, which places a burden on individual websites, and continuing to support the Internet's development. Additionally, assigning traditional tort liability is problematic because the defendant enjoys an expectation of privacy as well, making it difficult to secure the information needed to proceed with legal action. One solution to resolving invasion-of-privacy disputes involves a uniform identification verification program that ensures user confidentiality while promoting accountability for malicious behavior.

    Preventing Location-Based Identity Inference in Anonymous Spatial Queries

    The increasing trend of embedding positioning capabilities (for example, GPS) in mobile devices facilitates the widespread use of Location-Based Services. For such applications to succeed, privacy and confidentiality are essential. Existing privacy-enhancing techniques rely on encryption to safeguard communication channels, and on pseudonyms to protect user identities. Nevertheless, the query contents may disclose the physical location of the user. In this paper, we present a framework for preventing location-based identity inference of users who issue spatial queries to Location-Based Services. We propose transformations based on the well-established K-anonymity concept to compute exact answers for range and nearest neighbor search, without revealing the query source. Our methods optimize the entire process of anonymizing the requests and processing the transformed spatial queries. Extensive experimental studies suggest that the proposed techniques are applicable to real-life scenarios with numerous mobile users.
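    A minimal sketch of the spatial-cloaking idea behind K-anonymity in this setting: instead of sending the exact position, the client issues the query over a region that covers the querying user and at least K-1 nearby users, so the Location-Based Service cannot single out the query source. The User records, the nearest-neighbor grouping, and the bounding-box cloak below are illustrative assumptions, not the paper's actual transformations.

```python
# Illustrative sketch of spatial cloaking for K-anonymity (not the paper's exact algorithm).
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class User:
    uid: int
    x: float
    y: float

def cloaking_region(users: List[User], querier_id: int, k: int) -> Tuple[float, float, float, float]:
    """Return a bounding box covering the querier and at least k-1 other users.

    The k users nearest to the querier (including the querier) are grouped, and
    their minimum bounding rectangle is sent to the Location-Based Service
    instead of the querier's exact coordinates.
    """
    querier = next(u for u in users if u.uid == querier_id)
    # Sort all users by squared distance to the querier and keep the closest k.
    by_distance = sorted(users, key=lambda u: (u.x - querier.x) ** 2 + (u.y - querier.y) ** 2)
    group = by_distance[:k]
    xs = [u.x for u in group]
    ys = [u.y for u in group]
    return (min(xs), min(ys), max(xs), max(ys))

# Example: the LBS receives only the rectangle, answers the range or
# nearest-neighbor query for the whole region, and the client filters the result.
users = [User(1, 0.0, 0.0), User(2, 1.0, 0.2), User(3, 0.5, 1.1), User(4, 3.0, 3.0)]
print(cloaking_region(users, querier_id=1, k=3))
```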

    Using Metrics Suites to Improve the Measurement of Privacy in Graphs

    Social graphs are widely used in research (e.g., epidemiology) and business (e.g., recommender systems). However, sharing these graphs poses privacy risks because they contain sensitive information about individuals. Graph anonymization techniques aim to protect individual users in a graph, while graph de-anonymization aims to re-identify users. The effectiveness of anonymization and de-anonymization algorithms is usually evaluated with privacy metrics. However, it is unclear how strong existing privacy metrics are when they are used in graph privacy. In this paper, we study 26 privacy metrics for graph anonymization and de-anonymization and evaluate their strength in terms of three criteria: monotonicity indicates whether the metric reports lower privacy for stronger adversaries; evenness, used for within-scenario comparisons, indicates whether metric values are spread evenly; and shared value range, used for between-scenario comparisons, indicates whether metrics use a consistent value range across scenarios. Our extensive experiments indicate that no single metric fulfills all three criteria perfectly. We therefore use methods from multi-criteria decision analysis to aggregate multiple metrics in a metrics suite, and we show that these metrics suites improve monotonicity compared to the best individual metric. This important result enables more monotonic, and thus more accurate, evaluations of new graph anonymization and de-anonymization algorithms.
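    As a rough illustration of how a metrics suite could be assembled, the sketch below min-max normalizes several per-scenario metric values onto a shared range, aggregates them with a weighted sum, and checks monotonicity across scenarios ordered by adversary strength. The metric names, weights, and the weighted-sum aggregation are simplifying assumptions; the paper itself relies on multi-criteria decision analysis methods.

```python
# Hypothetical sketch: aggregating normalized privacy metrics into a suite
# and checking monotonicity across adversary strengths.
from typing import Dict, List

def normalize(values: List[float]) -> List[float]:
    """Min-max normalize metric values onto [0, 1] so metrics share a value range."""
    lo, hi = min(values), max(values)
    if hi == lo:
        return [0.0 for _ in values]
    return [(v - lo) / (hi - lo) for v in values]

def metrics_suite(per_metric: Dict[str, List[float]], weights: Dict[str, float]) -> List[float]:
    """Weighted sum of normalized metrics, producing one aggregate score per scenario."""
    normalized = {name: normalize(values) for name, values in per_metric.items()}
    n_scenarios = len(next(iter(per_metric.values())))
    return [sum(weights[name] * normalized[name][i] for name in per_metric)
            for i in range(n_scenarios)]

def is_monotonic(scores: List[float]) -> bool:
    """Privacy should not increase as the adversary gets stronger (scenarios ordered weak -> strong)."""
    return all(scores[i] >= scores[i + 1] for i in range(len(scores) - 1))

# Scenarios ordered by increasing adversary strength; higher value = more privacy.
per_metric = {"uncertainty": [0.9, 0.6, 0.4], "entropy": [3.1, 2.0, 2.2]}
suite = metrics_suite(per_metric, weights={"uncertainty": 0.5, "entropy": 0.5})
print(suite, is_monotonic(suite))
```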

    Contributions to Lifelogging Protection In Streaming Environments

    Every day, more than five billion people generate some kind of data over the Internet. To access that information, we rely on search services, either Web Search Engines or Personal Assistants. On each interaction with them, our record of actions (logs) is used to offer a more useful experience. For companies, logs are also very valuable because they offer a way to monetize the service by selling data to third parties. However, query logs can expose sensitive user information (identifiers, diseases, sexual tendencies, religious beliefs) or be used for what is called "life-logging": a continuous record of one's daily activities. Current regulations oblige companies to protect this personal information. Protection systems for closed data sets have previously been proposed, most of them working with atomic files or structured data. Unfortunately, those systems do not fit the growing real-time, unstructured data environment posed by Internet services. This thesis aims to design techniques that protect the user's sensitive information in an unstructured, real-time streaming environment while guaranteeing a trade-off between data utility and protection. Three proposals are made for efficient log protection. The first is a new method to anonymize query logs, based on probabilistic k-anonymity, together with de-anonymization tools to determine possible data leaks. The second extends this method with a configurable trade-off between privacy and usability, achieving a substantial improvement in data utility. The final contribution concerns Internet-based Personal Assistants: the information generated by these devices is likely to be considered life-logging and can increase the user's privacy risks, so a protection scheme is proposed that combines log anonymization and sanitizable signatures.
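    A toy sketch of k-anonymity-style filtering on a streaming query log: a query is released only once at least k distinct users have issued it, and it is released without any user identifier. This is a deliberate simplification for illustration; the thesis's probabilistic k-anonymity method and its sanitizable-signature scheme are not reproduced here.

```python
# Illustrative k-anonymity-style filter for a streaming query log
# (a simplification, not the thesis's probabilistic k-anonymity scheme).
from collections import defaultdict
from typing import Iterable, Iterator, Tuple

def k_anonymous_stream(log: Iterable[Tuple[str, str]], k: int = 3) -> Iterator[str]:
    """Consume (user_id, query) pairs and yield a query once k distinct users have issued it."""
    users_per_query = defaultdict(set)
    released = set()
    for user_id, query in log:
        users_per_query[query].add(user_id)
        if query not in released and len(users_per_query[query]) >= k:
            released.add(query)
            yield query  # released without any user identifier attached

stream = [("u1", "flu symptoms"), ("u2", "flu symptoms"), ("u1", "rare disease X"),
          ("u3", "flu symptoms"), ("u4", "rare disease X")]
print(list(k_anonymous_stream(stream, k=3)))  # -> ['flu symptoms']
```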

    Uncontrollable Privacy - The right that every attacker desires

    Internet users' demand for privacy during their online activities has pushed the Internet community to develop techniques that provide it, known as Privacy Enhancing Technologies (PETs). Among those users, however, are attackers who want nothing more than privacy while carrying out malicious actions, and a PET gives them exactly that. Thus, although a PET should offer privacy to Internet users, it should also incorporate mechanisms that help victims during an investigation and unveil the identity of the attackers. This paper summarizes the current design issues of PETs and introduces additional requirements needed to support forensic investigation. To the best of our knowledge, this is the first work to show (the obvious) that existing PETs do not meet accountability requirements. Understanding exactly why PETs fall short on accountability is the most direct way to make them offer a higher level of accountability without lowering the level of privacy they provide.

    The legality of online Privacy-Enhancing Technologies

    The use of the Internet has spread widely in the past few years and commerce on the World Wide Web has boomed. We can now easily buy products from home over the Internet and have access to all kinds of information sources. The well-known concern is that browsing the Internet creates detailed databases describing each user's browsing patterns, and that third parties are now able to assemble comprehensive profiles of online users. Information about the user is gathered by collecting transactional data, through Internet tracking, and by recording IP addresses. To solve these privacy problems and ensure that companies comply with privacy laws, or more specifically with the standards established by the European Commission, many companies, such as Zero-knowledge Systems Inc. and Anonymizer.com, are marketing privacy-enhancing technologies (PETs) to protect and assure the privacy of the individual in the digital world. These privacy-enhancing technologies use encryption, which scrambles data so that it is illegible to everyone except the intended recipient. The goal has been to create mathematically rigorous systems that prevent even the most determined attackers from discovering the user's identity, thereby significantly reducing the risk of data theft or accidental leaks of sensitive information from the Internet user's computer.
    While this online privacy software does help protect the privacy of Internet users by ensuring that data collectors comply with the European Privacy Directives, a further analysis may determine that it is illegal under Canadian, American, or French encryption control laws and regulations.
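    As a generic illustration of the encryption principle described above (only the intended recipient can read the data), the sketch below uses RSA-OAEP from the Python cryptography package. It is not the implementation used by Zero-knowledge Systems Inc., Anonymizer.com, or any other specific PET product.

```python
# Generic illustration of public-key encryption, not any particular PET's implementation.
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes

# The recipient generates a key pair and publishes the public key.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

# Anyone can encrypt with the public key, but only the private-key holder can decrypt.
ciphertext = public_key.encrypt(b"browsing data meant only for the recipient", oaep)
print(private_key.decrypt(ciphertext, oaep))
```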