1,644 research outputs found

    Oblivion: Mitigating Privacy Leaks by Controlling the Discoverability of Online Information

    Search engines are the most widely used tools for collecting information about individuals on the Internet. Search results typically draw on a variety of sources that contain personal information, either intentionally released by the person herself or unintentionally leaked or published by third parties, often with detrimental effects on the individual's privacy. To give individuals the ability to regain control over their disseminated personal information, the European Court of Justice recently ruled that EU citizens have a right to be forgotten, in the sense that indexing systems must offer them technical means to request the removal of links from search results that point to sources violating their data protection rights. As of now, these technical means consist of a web form that requires a user to manually identify all relevant links upfront and insert them into the form, followed by a manual evaluation by employees of the indexing system to assess whether the request is eligible and lawful. We propose Oblivion, a universal framework to support the automation of the right to be forgotten in a scalable, provable and privacy-preserving manner. First, Oblivion enables a user to automatically find and tag her disseminated personal information using natural language processing and image recognition techniques, and to file a request in a privacy-preserving manner. Second, Oblivion provides indexing systems with an automated and provable eligibility mechanism, asserting that the author of a request is indeed affected by an online resource. The automated eligibility proof ensures censorship resistance, so that only legitimately affected individuals can request the removal of the corresponding links from search results. We have conducted comprehensive evaluations showing that Oblivion can handle 278 removal requests per second and is hence suitable for large-scale deployment.
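
    The eligibility step is the most mechanical part of this pipeline and can be illustrated with a small sketch. The Python below is a hedged toy, not Oblivion's actual API: names such as IdentityToken and verify_removal_request are invented, an HMAC under a shared key stands in for the identity provider's signatures, and plain substring matching stands in for the NLP and image recognition components. It only shows the shape of the check: a request is accepted when the requester holds a certified attribute and that attribute actually occurs in the disputed resource.

        # Hedged sketch of an automated eligibility check in the spirit of Oblivion.
        # All names (IdentityToken, verify_removal_request, ...) are illustrative
        # assumptions; an HMAC stands in for the identity provider's signature and
        # substring matching stands in for the NLP/image-recognition pipeline.
        import hmac, hashlib
        from dataclasses import dataclass

        IDP_SECRET = b"identity-provider-key"   # toy stand-in for the IdP's signing key

        @dataclass
        class IdentityToken:
            attribute: str   # e.g. the user's verified name
            tag: bytes       # MAC issued by the identity provider

        def idp_issue(attribute: str) -> IdentityToken:
            """Identity provider certifies that the requester owns this attribute."""
            tag = hmac.new(IDP_SECRET, attribute.encode(), hashlib.sha256).digest()
            return IdentityToken(attribute, tag)

        def verify_removal_request(token: IdentityToken, page_text: str) -> bool:
            """Indexing-system side: accept only if (1) the attribute is certified and
            (2) the disputed page actually refers to that attribute."""
            expected = hmac.new(IDP_SECRET, token.attribute.encode(), hashlib.sha256).digest()
            certified = hmac.compare_digest(expected, token.tag)
            affected = token.attribute.lower() in page_text.lower()
            return certified and affected

        # A certified "Alice Example" may request removal of a page about her,
        # but not of an unrelated page.
        token = idp_issue("Alice Example")
        print(verify_removal_request(token, "Article about Alice Example and her employer"))  # True
        print(verify_removal_request(token, "Article about someone else entirely"))           # False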

    The subgraph homeomorphism problem

    We investigate the problem of finding a homeomorphic image of a “pattern” graph H in a larger input graph G. We view this problem as finding specified sets of edge-disjoint or node-disjoint paths in G. Our main result is a linear-time algorithm to determine whether there exists a simple cycle containing three given nodes in G (here H is a triangle). No polynomial-time algorithm for this problem was previously known. We also discuss a variety of reductions between related versions of this problem and a number of open problems.
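
    For intuition only, the following brute-force check (Python; exponential time in the worst case, nothing like the paper's linear-time algorithm) makes the problem concrete: it DFS-enumerates simple cycles through one of the three nodes and reports whether any such cycle also passes through the other two.

        # Brute-force check for a simple cycle containing three given nodes a, b, c in
        # an undirected graph. Exponential in the worst case; intended only to make
        # the problem statement concrete, not the paper's linear-time algorithm.
        def has_cycle_through(adj, a, b, c):
            """adj: dict mapping each node to a set of neighbours."""
            def dfs(node, path, on_path):
                for nxt in adj[node]:
                    if nxt == a and len(path) >= 3:
                        if b in on_path and c in on_path:
                            return True        # closed a simple cycle through a, b and c
                    elif nxt not in on_path:
                        if dfs(nxt, path + [nxt], on_path | {nxt}):
                            return True
                return False
            return dfs(a, [a], {a})

        # A 5-cycle 0-1-2-3-4-0 contains a simple cycle through 0, 2 and 3 ...
        cycle5 = {0: {1, 4}, 1: {0, 2}, 2: {1, 3}, 3: {2, 4}, 4: {3, 0}}
        print(has_cycle_through(cycle5, 0, 2, 3))   # True
        # ... but a plain path 0-1-2-3-4 (no cycle at all) does not.
        path5 = {0: {1}, 1: {0, 2}, 2: {1, 3}, 3: {2, 4}, 4: {3}}
        print(has_cycle_through(path5, 0, 2, 3))    # False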

    List Processing in Real Time on a Serial Computer

    Key Words and Phrases: real-time, compacting, garbage collection, list processing, virtual memory, file or database management, storage management, storage allocation, LISP, CDR-coding, reference counting. CR Categories: 3.50, 3.60, 3.73, 3.80, 4.13, 4.22, 4.32, 4.33, 4.35, 4.49.

    This report describes research done at the Artificial Intelligence Laboratory of the Massachusetts Institute of Technology. Support for the laboratory's artificial intelligence research is provided in part by the Advanced Research Projects Agency of the Department of Defense under Office of Naval Research contract N00014-75-C-0522.

    A real-time list processing system is one in which the time required by each elementary list operation (CONS, CAR, CDR, RPLACA, RPLACD, EQ, and ATOM in LISP) is bounded by a (small) constant. Classical list processing systems such as LISP do not have this property because a call to CONS may invoke the garbage collector, which requires time proportional to the number of accessible cells to finish. The space requirement of a classical LISP system with N accessible cells under equilibrium conditions is (1.5+μ)N or (1+μ)N, depending upon whether a stack is required for the garbage collector, where μ>0 is typically less than 2. A list processing system is presented which: 1) is real-time, i.e. T(CONS) is bounded by a constant independent of the number of cells in use; 2) requires space (2+2μ)N, i.e. not more than twice that of a classical system; 3) runs on a serial computer without a time-sharing clock; 4) handles directed cycles in the data structures; 5) is fast, with an average time per operation about the same as with normal garbage collection; 6) compacts, minimizing the working set; 7) keeps the free pool in one contiguous block, so objects of nonuniform size pose no problem; 8) uses one-phase incremental collection, with no separate mark, sweep and relocate phases; 9) requires no garbage collector stack; 10) requires no "mark bits", per se; 11) is simple, and suitable for microcoded implementation. Extensions of the system to handle a user program stack, compact list representation ("CDR-coding"), arrays of non-uniform size, and hash linking are discussed. CDR-coding is shown to reduce memory requirements for N LISP cells to ≈(1+μ)N. Our system is also compared with another approach to the real-time storage management problem, reference counting; reference counting is shown to be neither competitive with our system when speed of allocation is critical, nor compatible, in the sense that a system with both forms of garbage collection is worse than our pure one.

    MIT Artificial Intelligence Laboratory; Department of Defense Advanced Research Projects Agency.
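
    The key idea, that allocation itself pays for collection, can be sketched in a few lines. The Python below is a heavily simplified, illustrative reconstruction rather than the paper's system: the class and method names are invented, and the semispace "flip" plus many other details are omitted. It shows only the two load-bearing pieces: a bounded amount of copying work inside CONS, and a read barrier in CAR and CDR so the mutator never observes an unforwarded (fromspace) pointer.

        # Schematic sketch of an incremental copying ("Baker-style") collector.
        # Names and structure are assumptions; the flip that repopulates fromspace
        # when tospace fills up is omitted for brevity.
        K = 4  # copying steps performed per CONS; this bounds T(CONS) by a constant

        class Cell:
            __slots__ = ("car", "cdr", "forward")
            def __init__(self, car, cdr):
                self.car, self.cdr, self.forward = car, cdr, None

        class Heap:
            def __init__(self):
                self.tospace = []       # cells already copied or freshly allocated
                self.scan = 0           # next tospace cell whose fields must be scanned
                self.fromspace = set()  # cells awaiting evacuation (filled at the omitted flip)

            def _evacuate(self, obj):
                """Return a tospace version of obj, copying it if necessary."""
                if not isinstance(obj, Cell):
                    return obj                      # immediate value (number, symbol, ...)
                if obj.forward is not None:
                    return obj.forward              # already copied: follow forwarding pointer
                if obj not in self.fromspace:
                    return obj                      # already a tospace cell
                copy = Cell(obj.car, obj.cdr)       # shallow-copy into tospace
                obj.forward = copy
                self.fromspace.discard(obj)
                self.tospace.append(copy)           # its fields get fixed when scanned
                return copy

            def _scavenge_step(self):
                """Scan one tospace cell, evacuating whatever its fields point at."""
                if self.scan < len(self.tospace):
                    cell = self.tospace[self.scan]
                    cell.car = self._evacuate(cell.car)
                    cell.cdr = self._evacuate(cell.cdr)
                    self.scan += 1

            def cons(self, a, d):
                for _ in range(K):                  # bounded, incremental GC work
                    self._scavenge_step()
                cell = Cell(self._evacuate(a), self._evacuate(d))
                self.tospace.append(cell)
                return cell

            # Read barrier: the mutator never sees a fromspace pointer.
            def car(self, cell):
                cell.car = self._evacuate(cell.car)
                return cell.car

            def cdr(self, cell):
                cell.cdr = self._evacuate(cell.cdr)
                return cell.cdr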

    Multiple‐systems analysis for the quantification of modern slavery: classical and Bayesian approaches

    Multiple systems estimation is a key approach for quantifying hidden populations such as the number of victims of modern slavery. The UK Government published an estimate of 10,000 to 13,000 victims, constructed by the present author, as part of the strategy leading to the Modern Slavery Act 2015. This estimate was obtained by a stepwise multiple systems method based on six lists. Further investigation shows that a small proportion of the possible models give rather different answers, and that other model-fitting approaches may choose one of these. Three data sets collected in the field of modern slavery, together with a data set about the death toll in the Kosovo conflict, are used to investigate the stability and robustness of various multiple systems estimation methods. The crucial aspect is the way that interactions between lists are modelled, because these can substantially affect the results. Model selection and Bayesian approaches are considered in detail, in particular to assess their stability and robustness when applied to real modern slavery data. A new Markov chain Monte Carlo Bayesian approach is developed; overall, this gives robust and stable results, at least for the examples considered. The software and data sets are freely and publicly available to facilitate wider implementation and further research.
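
    For orientation, the simplest special case of multiple systems estimation, two lists and no modelled interactions, fits in a few lines. The Python below uses made-up counts and the classical Lincoln-Petersen and Chapman estimators; the paper's setting, six lists with interaction terms chosen by model selection or Bayesian methods, is considerably richer.

        # Two-list capture-recapture: the simplest special case of multiple systems
        # estimation. The counts below are invented, purely for illustration.
        n1 = 120    # cases appearing on list 1
        n2 = 90     # cases appearing on list 2
        m  = 20     # cases appearing on both lists

        # Lincoln-Petersen estimator of the total (observed + hidden) population.
        lincoln_petersen = n1 * n2 / m

        # Chapman's bias-corrected variant, better behaved when the overlap m is small.
        chapman = (n1 + 1) * (n2 + 1) / (m + 1) - 1

        print(f"Lincoln-Petersen: {lincoln_petersen:.0f}")   # 540
        print(f"Chapman:          {chapman:.0f}")            # about 523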

    Analysis of common attacks in LDPCC-based public-key cryptosystems

    We analyze the security and reliability of a recently proposed class of public-key cryptosystems against attacks by unauthorized parties who have acquired partial knowledge of one or more of the private key components and/or of the plaintext. Phase diagrams are presented, showing the critical partial-knowledge levels required for unauthorized decryption.
    Comment: 14 pages, 6 figures

    Quantum-noise--randomized data-encryption for WDM fiber-optic networks

    We demonstrate high-rate randomized data encryption through optical fibers using the inherent quantum-measurement noise of coherent states of light. Specifically, we demonstrate 650 Mbps data encryption through a 10 Gbps data-bearing, in-line amplified, 200 km-long line. In our protocol, legitimate users (who share a short secret key) communicate using an M-ary signal set, while an attacker (who does not share the secret key) is forced to contend with the fundamental and irreducible quantum-measurement noise of coherent states. Implementations of our protocol using both polarization-encoded signal sets and polarization-insensitive phase-keyed signal sets are experimentally and theoretically evaluated. In contrast to the performance criteria for the cryptographic objective of key generation (quantum key generation), one possible set of performance criteria for the cryptographic objective of data encryption is established and carefully considered.
    Comment: Version 2: Some errors have been corrected and arguments refined. To appear in Physical Review A. Version 3: Minor corrections to version 2.
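
    The asymmetry that such a protocol exploits can be caricatured with a purely classical toy simulation (Python/NumPy, invented parameters; quantum-measurement noise is modelled as additive phase noise, so this is not the experiment): the shared key tells the legitimate receiver which antipodal pair of states to distinguish, so the same noise that barely affects her decision leaves an eavesdropper guessing among many densely spaced states that alternate in bit value.

        # Toy classical illustration of keyed M-ary phase encryption. Parameters are
        # invented; measurement noise is modelled as Gaussian phase noise shared by
        # everyone. The key-holder makes a coarse antipodal decision; the
        # eavesdropper must resolve one of 2*M closely spaced states.
        import numpy as np

        rng = np.random.default_rng(1)
        M = 128                               # number of bases; 2*M phase states in total
        trials = 20_000
        spacing = np.pi / M                   # angular distance between adjacent states
        noise_std = 4 * spacing               # noise spans several state spacings

        basis = rng.integers(0, M, trials)    # selected by the shared short key
        bits = rng.integers(0, 2, trials)

        # Basis b uses the antipodal pair {b, b+M}; the bit-to-state mapping flips
        # with the parity of b, so neighbouring states carry opposite bit values.
        state = basis + M * (bits != (basis % 2))
        phase = state * spacing
        measured = phase + rng.normal(0.0, noise_std, trials)   # same noise for everyone

        # Legitimate receiver (knows the basis): a coarse antipodal decision.
        rel = np.angle(np.exp(1j * (measured - basis * spacing)))
        got_far_state = np.abs(rel) > np.pi / 2
        bob_bits = np.where(got_far_state, 1 - basis % 2, basis % 2)

        # Eavesdropper (no key): must first guess the exact state, then decode it.
        eve_state = np.round(measured / spacing).astype(int) % (2 * M)
        eve_bits = np.where(eve_state < M, eve_state % 2, 1 - eve_state % 2)

        print("legitimate bit error rate  :", np.mean(bob_bits != bits))   # ~0
        print("eavesdropper bit error rate:", np.mean(eve_bits != bits))   # ~0.5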

    Picture-Hanging Puzzles

    We show how to hang a picture by wrapping rope around n nails, making a polynomial number of twists, such that the picture falls whenever any k out of the n nails get removed, and the picture remains hanging when fewer than k nails get removed. This construction makes for some fun mathematical magic performances. More generally, we characterize the Boolean functions that can describe, in terms of which nails get removed, when the picture falls: they are exactly the monotone Boolean functions. This construction requires an exponential number of twists in the worst case, but exponential complexity is almost always necessary for general functions.
    Comment: 18 pages, 8 figures, 11 puzzles. Journal version of FUN 2012 paper
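
    The two-nail case already shows the group-theoretic idea behind such constructions: a wrapping is a word in the free group on the nails, removing a nail erases its letters, and the picture falls exactly when what remains reduces to the identity. A short illustrative check (Python, names invented) of the classic commutator solution:

        # The two-nail picture-hanging trick as a word in the free group on the nails.
        # Lower case = wrap one way, upper case = wrap the other way (the inverse).
        # Removing a nail erases its letters; the picture falls iff the remaining
        # word freely reduces to the empty word.
        def free_reduce(word):
            out = []
            for ch in word:
                if out and out[-1] != ch and out[-1].lower() == ch.lower():
                    out.pop()              # cancel x against its inverse X
                else:
                    out.append(ch)
            return "".join(out)

        def falls_after_removing(word, nails_removed):
            remaining = "".join(ch for ch in word if ch.lower() not in nails_removed)
            return free_reduce(remaining) == ""

        commutator = "abAB"                # a b a^-1 b^-1: the classic 1-out-of-2 hanging
        print(free_reduce(commutator))     # "abAB": nontrivial, so it hangs on both nails
        print(falls_after_removing(commutator, {"a"}))   # True: falls if nail a is pulled
        print(falls_after_removing(commutator, {"b"}))   # True: falls if nail b is pulled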

    Quantum Physics and Computers

    Recent theoretical results confirm that quantum theory provides the possibility of new ways of performing efficient calculations. The most striking example is the factoring problem. It has recently been shown that computers exploiting quantum features could factor large composite integers, a task believed to be out of reach of classical computers as soon as the number of digits in the number to be factored exceeds a certain limit. The additional power of quantum computers comes from the possibility of employing a superposition of states, of following many distinct computation paths, and of producing a final output that depends on the interference of all of them. This "quantum parallelism" outstrips by far any parallelism that can be thought of in classical computation and is responsible for the "exponential" speed-up of computation. This is a non-technical (or at least not too technical) introduction to the field of quantum computation. It does not cover very recent topics such as error correction.
    Comment: 27 pages, LaTeX, 8 PostScript figures embedded. A bug in one of the PostScript files has been fixed. Reprints available from the author. The files are also available from http://eve.physics.ox.ac.uk/Articles/QC.Articles.htm
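
    The smallest possible illustration of computation paths interfering is two Hadamard gates applied to |0>: each of the four paths contributes an amplitude of magnitude 1/2, the two paths ending in |1> cancel, and the two ending in |0> reinforce. A few lines of NumPy (illustrative only, not taken from the article) make this explicit:

        # Smallest example of quantum interference: two Hadamards applied to |0>.
        # The paths leading to |1> cancel, the paths leading to |0> reinforce,
        # so the final measurement yields |0> with certainty.
        import numpy as np

        H = np.array([[1, 1],
                      [1, -1]]) / np.sqrt(2)
        ket0 = np.array([1, 0])

        after_one = H @ ket0          # equal superposition: amplitudes [0.707, 0.707]
        after_two = H @ after_one     # interference: amplitudes back to [1, 0]

        print("after one Hadamard :", np.round(after_one, 3))
        print("after two Hadamards:", np.round(after_two, 3))
        print("measurement probabilities:", np.round(np.abs(after_two) ** 2, 3))  # [1, 0]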

    Public Evidence from Secret Ballots

    Elections seem simple: aren't they just counting? But they have a unique, challenging combination of security and privacy requirements. The stakes are high; the context is adversarial; the electorate needs to be convinced that the results are correct; and the secrecy of the ballot must be ensured. They also have practical constraints: time is of the essence, and voting systems need to be affordable, maintainable, and usable by voters, election officials, and poll workers. It is thus not surprising that voting is a rich research area spanning theory, applied cryptography, practical systems analysis, usable security, and statistics. Election integrity involves two key concepts: convincing evidence that outcomes are correct, and privacy, which amounts to convincing assurance that there is no evidence about how any given person voted. These are obviously in tension. We examine how current systems walk this tightrope.
    Comment: To appear in E-Vote-ID '17
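
    One concrete way of producing convincing public evidence without compromising ballot secrecy is a risk-limiting audit. The sketch below is a simplified, BRAVO-style ballot-polling test for a single two-candidate contest, written in Python with invented names and parameters; it is one technique from this research area, not necessarily the paper's own construction. Ballots are sampled until the reported winner is either statistically confirmed at the chosen risk limit or a full hand count is triggered.

        # Simplified BRAVO-style ballot-polling risk-limiting audit for a
        # two-candidate contest. Illustrative only; real audits handle multiple
        # contests, invalid ballots, sampling without replacement, and more.
        import random

        def ballot_polling_audit(ballots, reported_winner_share, risk_limit=0.05, seed=42):
            """ballots: list of True (vote for the reported winner) / False (for the loser)."""
            assert reported_winner_share > 0.5
            rng = random.Random(seed)
            t = 1.0                                   # sequential likelihood ratio (Wald SPRT)
            for _ in range(len(ballots)):             # cap the sample at one full pass
                if rng.choice(ballots):               # draw a ballot (with replacement)
                    t *= 2 * reported_winner_share    # evidence for the reported outcome
                else:
                    t *= 2 * (1 - reported_winner_share)
                if t >= 1 / risk_limit:
                    return "confirmed"
            return "escalate to full hand count"

        # A reported 60/40 outcome that is in fact correct: the audit typically
        # confirms it after examining a modest sample rather than recounting everything.
        ballots = [True] * 600 + [False] * 400
        print(ballot_polling_audit(ballots, reported_winner_share=0.6))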
