
    A Cryptographic Escrow for Treaty Declarations and Step-by-Step Verification

    The verification of arms-control and disarmament agreements requires states to provide declarations, including information on sensitive military sites and assets. There are important cases, however, where negotiations of these agreements are impeded because states are reluctant to provide any such data, owing to concerns about prematurely handing over militarily significant information. To address this challenge, we present a cryptographic escrow that allows a state to make a complete declaration of sites and assets at the outset and commit to its content, but to reveal the sensitive information therein only sequentially. Combined with an inspection regime, our escrow allows for step-by-step verification of the correctness and completeness of the initial declaration, so that the information release and inspections keep pace with parallel diplomatic and political processes. We apply this approach to the possible denuclearization of North Korea; it can be applied, however, to any agreement requiring the sharing of sensitive information. Comment: 14 pages, 4 figures
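
    A minimal sketch of the commit-then-reveal idea behind such an escrow, assuming a plain salted-hash commitment rather than the paper's exact construction: the declaring state publishes digests of every entry up front and later reveals entries one at a time for inspection.

```python
# Minimal commit-then-reveal sketch (not the paper's exact construction):
# a state hashes each salted declaration entry up front, publishes the
# digests, and later reveals entries one at a time for verification.
import hashlib
import os

def commit(entries):
    """Return (public digests, private salts) for a full declaration."""
    salts = [os.urandom(16) for _ in entries]
    digests = [hashlib.sha256(salt + entry.encode()).hexdigest()
               for salt, entry in zip(salts, entries)]
    return digests, salts

def verify_reveal(digest, entry, salt):
    """Inspector checks a revealed entry against the escrowed digest."""
    return hashlib.sha256(salt + entry.encode()).hexdigest() == digest

# Example: declare three (hypothetical) sites at the outset, reveal the first later.
declaration = ["site A: delivery systems", "site B: enrichment", "site C: storage"]
digests, salts = commit(declaration)                      # published escrow
assert verify_reveal(digests[0], declaration[0], salts[0])  # step-by-step reveal
```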

    Summary of Expert Testimony


    Government Data and the Invisible Hand

    If President Barack Obama's new administration really wants to embrace the potential of Internet-enabled government transparency, it should follow a counter-intuitive but ultimately compelling strategy: reduce the federal role in presenting important government information to citizens. Today, government bodies consider their own Web sites to be a higher priority than technical infrastructures that open up their data for others to use. We argue that this understanding is a mistake. It would be preferable for government to understand providing reusable data, rather than providing Web sites, as the core of its online publishing responsibility. During the presidential campaign, all three major candidates indicated that they thought the federal government could make better use of the Internet. Barack Obama's platform went the furthest and explicitly endorsed making government data available online in universally accessible formats. Hillary Clinton, meanwhile, remarked that she wanted to see much more government information online. John McCain's platform called for a new Office of Electronic Government. But the situation to which these candidates were responding, the wide gap between the exciting uses of Internet technology by private parties, on the one hand, and the government's lagging technical infrastructure, on the other, is not new. A minefield of federal rules and a range of other factors prevent government webmasters from keeping pace with the ever-growing potential of the Internet.

    Buying Time: Latency Racing vs. Bidding in Fair Transaction Ordering

    We design a practical algorithm for transaction ordering that takes into account both transaction timestamps and bids. The algorithm guarantees that users get their transactions published with bounded delay against a bid, while it extracts a fair value from sophisticated users that have an edge in latency by shifting expenditure from investment in latency-improvement technology to bidding. The algorithm creates a score from timestamps and bids and orders transactions based on the score. We first show that a scoring rule is the only type of rule that satisfies independence of latency races. We provide an economic analysis of the protocol in an environment of private information, where investment in latency is made at the ex-ante or interim stage, while bidding happens at the interim stage, after private signals have been observed. The algorithm is useful for transaction sequencing in rollups or in other environments where the sequencer has privileged access to order flows.
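
    A minimal sketch of score-based ordering under assumed parameters (the discount function, cap, and field names below are illustrative, not the paper's scoring rule): each transaction's score is its arrival timestamp minus a bounded, bid-dependent discount, and transactions are published in score order.

```python
# Illustrative sketch of score-based ordering: each transaction gets a score
# from its arrival timestamp minus a bid-dependent discount (capped so that
# delay stays bounded); transactions are then published in score order.
# The discount function and cap here are assumptions, not the paper's rule.
from dataclasses import dataclass

@dataclass
class Tx:
    sender: str
    arrival_time: float   # sequencer-observed timestamp (seconds)
    bid: float            # fee offered for priority

MAX_DISCOUNT = 0.5        # bound on how far a bid can move a transaction up

def score(tx: Tx, discount_per_unit_bid: float = 0.01) -> float:
    discount = min(tx.bid * discount_per_unit_bid, MAX_DISCOUNT)
    return tx.arrival_time - discount

def order(txs):
    return sorted(txs, key=score)

mempool = [Tx("alice", 10.00, bid=0.0),
           Tx("bob",   10.02, bid=5.0),    # later arrival, higher bid
           Tx("carol", 10.60, bid=50.0)]   # huge bid, but the discount is capped
print([t.sender for t in order(mempool)])  # bob passes alice; carol stays last
```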

    Mixcoin: Anonymity for Bitcoin with Accountable Mixes (Full version)

    We propose Mixcoin, a protocol to facilitate anonymous payments in Bitcoin and similar cryptocurrencies. We build on the emergent phenomenon of currency mixes, adding an accountability mechanism to expose theft. We demonstrate that incentives of mixes and clients can be aligned to ensure that rational mixes will not steal. Our scheme is efficient and fully compatible with Bitcoin. Against a passive attacker, our scheme provides an anonymity set of all other users mixing coins contemporaneously. This is an interesting new property with no clear analog in better-studied communication mixes. Against active attackers our scheme offers similar anonymity to traditional communication mixes.
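
    A minimal sketch of the accountability idea, assuming an Ed25519 signature from the third-party `cryptography` package and illustrative field names rather than Mixcoin's actual warranty format: the mix signs the mixing terms before receiving coins, so a cheated user can publish the signed warranty as evidence of theft.

```python
# Minimal sketch of Mixcoin-style accountability: before accepting coins, the
# mix signs a "warranty" over the mixing terms; if it later steals, the user
# can publish the warranty as cryptographic evidence of theft. Uses the
# third-party `cryptography` package (assumed installed); the field names and
# encoding below are illustrative, not the protocol's exact wire format.
import json
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

mix_key = Ed25519PrivateKey.generate()            # mix's long-term key

terms = {"value": 1.0,                            # amount to be mixed
         "escrow_addr": "addr_escrow",            # where the user sends coins
         "return_addr": "addr_out",               # fresh address for the payout
         "deadline_block": 500_000}               # mix must pay by this block
warranty = json.dumps(terms, sort_keys=True).encode()
signature = mix_key.sign(warranty)                # signed warranty given to the user

# Later: anyone can verify the warranty, so publishing it after a theft
# destroys the mix's reputation.
mix_key.public_key().verify(signature, warranty)  # raises if the signature is invalid
```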

    Accountable Algorithms

    Many important decisions historically made by people are now made by computers. Algorithms count votes, approve loan and credit card applications, target citizens or neighborhoods for police scrutiny, select taxpayers for IRS audit, grant or deny immigration visas, and more. The accountability mechanisms and legal standards that govern such decision processes have not kept pace with technology. The tools currently available to policymakers, legislators, and courts were developed to oversee human decisionmakers and often fail when applied to computers instead. For example, how do you judge the intent of a piece of software? Because automated decision systems can return potentially incorrect, unjustified, or unfair results, additional approaches are needed to make such systems accountable and governable. This Article reveals a new technological toolkit to verify that automated decisions comply with key standards of legal fairness. We challenge the dominant position in the legal literature that transparency will solve these problems. Disclosure of source code is often neither necessary (because of alternative techniques from computer science) nor sufficient (because of the difficulty of analyzing code) to demonstrate the fairness of a process. Furthermore, transparency may be undesirable, such as when it discloses private information or permits tax cheats or terrorists to game the systems determining audits or security screening. The central issue is how to assure the interests of citizens, and society as a whole, in making these processes more accountable. This Article argues that technology is creating new opportunities, subtler and more flexible than total transparency, to design decisionmaking algorithms so that they better align with legal and policy objectives. Doing so will improve not only the current governance of automated decisions, but also, in certain cases, the governance of decisionmaking in general. The implicit (or explicit) biases of human decisionmakers can be difficult to find and root out, but we can peer into the “brain” of an algorithm: computational processes and purpose specifications can be declared prior to use and verified afterward. The technological tools introduced in this Article apply widely. They can be used in designing decisionmaking processes from both the private and public sectors, and they can be tailored to verify different characteristics as desired by decisionmakers, regulators, or the public. By forcing a more careful consideration of the effects of decision rules, they also engender policy discussions and closer looks at legal standards. As such, these tools have far-reaching implications throughout law and society.

    Part I of this Article provides an accessible and concise introduction to foundational computer science techniques that can be used to verify and demonstrate compliance with key standards of legal fairness for automated decisions without revealing key attributes of the decisions or the processes by which the decisions were reached. Part II then describes how these techniques can assure that decisions are made with the key governance attribute of procedural regularity, meaning that decisions are made under an announced set of rules consistently applied in each case. We demonstrate how this approach could be used to redesign and resolve issues with the State Department’s diversity visa lottery.
    In Part III, we go further and explore how other computational techniques can assure that automated decisions preserve fidelity to substantive legal and policy choices. We show how these tools may be used to assure that certain kinds of unjust discrimination are avoided and that automated decision processes behave in ways that comport with the social or legal standards that govern the decision. We also show how automated decisionmaking may even complicate existing doctrines of disparate treatment and disparate impact, and we discuss some recent computer science work on detecting and removing discrimination in algorithms, especially in the context of big data and machine learning. And lastly, in Part IV, we propose an agenda to further synergistic collaboration between computer science, law, and policy to advance the design of automated decision processes for accountability.
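
    A minimal sketch of the procedural-regularity idea applied to a lottery, assuming a simple hash commitment and a keyed-hash ranking rule (the Article's full toolkit also involves zero-knowledge proofs, which are omitted here): the agency commits to its random seed before the drawing and reveals it afterward, so anyone can reproduce the selection.

```python
# Simplified sketch of procedural regularity for a lottery: the agency commits
# to its random seed before applications are ranked, then reveals the seed so
# anyone can re-run the same deterministic selection. This omits the
# zero-knowledge machinery and illustrates only the commit-before-use /
# verify-after idea.
import hashlib, hmac

def commit_seed(seed: bytes) -> str:
    return hashlib.sha256(seed).hexdigest()      # published before the drawing

def select(applicants, seed: bytes, winners: int):
    # Rank applicants by a keyed hash of their ID; deterministic given the seed.
    ranked = sorted(applicants,
                    key=lambda a: hmac.new(seed, a.encode(), "sha256").digest())
    return ranked[:winners]

seed = b"agency-secret-randomness"
commitment = commit_seed(seed)                   # announced in advance

applicants = ["A-1001", "A-1002", "A-1003", "A-1004"]
chosen = select(applicants, seed, winners=2)     # official drawing

# After the drawing the seed is revealed; any applicant can check both the
# commitment and that re-running the announced rule reproduces the outcome.
assert commit_seed(seed) == commitment
assert select(applicants, seed, winners=2) == chosen
```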

    CONIKS: Bringing Key Transparency to End Users

    We present CONIKS, an end-user key verification service capable of integration into end-to-end encrypted communication systems. CONIKS builds on transparency log proposals for web server certificates but solves several new challenges specific to key verification for end users. CONIKS obviates the need for global third-party monitors and enables users to efficiently monitor their own key bindings for consistency, downloading less than 20 kB per day to do so even for a provider with billions of users. CONIKS users and providers can collectively audit providers for non-equivocation, which requires downloading a constant 2.5 kB per provider per day. Additionally, CONIKS preserves the level of privacy offered by today's major communication services, hiding the list of usernames present and even allowing providers to conceal the total number of users in the system.
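
    A minimal sketch of the per-user monitoring step, assuming a plain binary Merkle tree rather than CONIKS's privacy-preserving prefix tree with VRFs: a client recomputes the path from its own name-to-key binding up to the root the provider published for the epoch.

```python
# Simplified sketch of the monitoring step: a client checks that its
# username-to-key binding is included under the root the provider published
# for the current epoch. Real CONIKS uses an authenticated prefix tree with
# VRFs and commitments for privacy; this shows only Merkle inclusion checking.
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def leaf_hash(username: str, public_key: bytes) -> bytes:
    return h(b"leaf|" + username.encode() + b"|" + public_key)

def verify_inclusion(leaf: bytes, proof, root: bytes) -> bool:
    """proof is a list of (sibling_hash, sibling_is_left) pairs up to the root."""
    node = leaf
    for sibling, sibling_is_left in proof:
        node = h(sibling + node) if sibling_is_left else h(node + sibling)
    return node == root

# Example with a two-leaf tree: alice checks her own binding each epoch.
alice = leaf_hash("alice@example.com", b"alice-public-key")
bob = leaf_hash("bob@example.com", b"bob-public-key")
signed_tree_root = h(alice + bob)       # published (and signed) by the provider

assert verify_inclusion(alice, [(bob, False)], signed_tree_root)
```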

    Research Perspectives and Challenges for Bitcoin and Cryptocurrencies

    Bitcoin has emerged as the most successful cryptographic currency in history. Within two years of its quiet launch in 2009, Bitcoin grew to comprise billions of dollars of economic value, even while the body of published research and security analysis justifying the system's design was negligible. In the ensuing years, a growing literature has identified hidden-but-important properties of the system, discovered attacks, proposed promising alternatives, and singled out difficult future challenges. This interest has been complemented by a large and vibrant community of open-source developers who steward the system, while proposing and deploying numerous modifications and extensions. We provide the first systematic exposition of the second generation of cryptocurrencies, including Bitcoin and the many alternatives that have been implemented as alternate protocols or "altcoins." Drawing from a scattered body of knowledge, we put forward three key components of Bitcoin's design that can be decoupled, enabling a more insightful analysis of Bitcoin's properties and its proposed modifications and extensions. We contextualize the literature into five central properties capturing blockchain stability. We map the design space for numerous proposed modifications, providing comparative analyses for alternative consensus mechanisms, currency allocation mechanisms, computational puzzles, and key management tools. We focus on anonymity issues in Bitcoin and provide an evaluation framework for analyzing a variety of proposals for enhancing unlinkability. Finally, we provide new insights on what we term disintermediation protocols, which remove the need for trusted intermediaries in an interesting set of applications. We identify three general disintermediation strategies and provide a detailed comparative cost analysis.
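
    A minimal sketch of the computational-puzzle component that the survey treats as a decoupled design choice, assuming a simplified hash puzzle rather than Bitcoin's actual 80-byte header and compact target encoding: a nonce is valid when the double-SHA-256 digest falls below a difficulty target.

```python
# Illustrative sketch of the "computational puzzle" component: a hash-based
# proof of work where a valid nonce makes the digest fall below a difficulty
# target. Bitcoin's real puzzle hashes an 80-byte block header against a
# compact-encoded target; this version is simplified for clarity.
import hashlib

def pow_hash(header: bytes, nonce: int) -> int:
    digest = hashlib.sha256(
        hashlib.sha256(header + nonce.to_bytes(8, "big")).digest()).digest()
    return int.from_bytes(digest, "big")

def solve(header: bytes, target: int) -> int:
    nonce = 0
    while pow_hash(header, nonce) >= target:   # search until below the target
        nonce += 1
    return nonce

def verify(header: bytes, nonce: int, target: int) -> bool:
    return pow_hash(header, nonce) < target    # verification costs one hash pair

target = 1 << 240                              # easy difficulty for the demo
header = b"prev-block-hash|merkle-root|timestamp"
nonce = solve(header, target)
assert verify(header, nonce, target)
```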

    Detection of missense mutations by single-strand conformational polymorphism (SSCP) analysis in five dysfunctional variants of coagulation factor VII

    Five unrelated subjects with dysfunctional coagulation factor VII (FVII) were studied in order to identify missense mutations affecting function. Exons 2 to 8 and the intron-exon junctions of their FVII genes were amplified from peripheral white blood cell DNA by PCR and screened by SSCP analysis. DNA fragments showing aberrant mobility were sequenced. The following mutations were identified: in case 1 (FVII:C <1%, FVII:Ag 18%), a heterozygous A to G transition at nucleotide 8915 in exon 6 results in the amino acid substitution Lys-137 to Glu near the C-terminus of the FVIIa light chain; in case 2 (FVII:C 7%, FVII:Ag 47%), a heterozygous A to G transition at nucleotide 7834 in exon 5 results in the substitution of Gln-100 by Arg in the second EGF-like domain; in case 3 (FVII:C 20%, FVII:Ag 76%), a homozygous G to A transition at nucleotide position 6055 in exon 4 was detected, resulting in the substitution of Arg-79 by Gln in the first EGF-like domain; in case 5 (FVII:C 10%, FVII:Ag 52%), a heterozygous C to T transition at nucleotide position 6054 in exon 4 also results in the substitution of Arg-79, but in this case it is replaced by Trp; case 4 (FVII:C <1%, FVII:Ag 100%) was homozygous for a previously reported mutation (G to A) at nucleotide position 10715 in exon 8, substituting Gln for Arg at position 304 in the protease domain. Cases 1, 2 and 5 evidently have additional undetected mutations.
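
    A small worked illustration of how a single-nucleotide transition produces one of the reported substitutions, assuming a specific lysine codon (the abstract does not give the codon; both Lys codons, AAA and AAG, yield Glu under an A to G change at the first position):

```python
# Worked illustration of the case 1 substitution: an A-to-G transition at the
# first position of a lysine codon yields glutamic acid (Lys-137 -> Glu).
# The specific codon shown is an assumption; only the relevant entries of the
# standard genetic code are included.
CODON_TABLE = {"AAA": "Lys", "AAG": "Lys", "GAA": "Glu", "GAG": "Glu"}

def apply_transition(codon: str, position: int, new_base: str) -> str:
    bases = list(codon)
    bases[position] = new_base
    return "".join(bases)

wild_type = "AAG"                                         # assumed Lys codon
mutant = apply_transition(wild_type, 0, "G")              # A -> G transition
print(CODON_TABLE[wild_type], "->", CODON_TABLE[mutant])  # Lys -> Glu
```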