677 research outputs found

    A Cryptographic Escrow for Treaty Declarations and Step-by-Step Verification

    The verification of arms-control and disarmament agreements requires states to provide declarations, including information on sensitive military sites and assets. In important cases, however, negotiations of these agreements are impeded because states are reluctant to provide any such data, out of concern about prematurely handing over militarily significant information. To address this challenge, we present a cryptographic escrow that allows a state to make a complete declaration of sites and assets at the outset and commit to its content, but reveal the sensitive information therein only sequentially. Combined with an inspection regime, our escrow allows for step-by-step verification of the correctness and completeness of the initial declaration, so that information release and inspections keep pace with parallel diplomatic and political processes. We apply this approach to the possible denuclearization of North Korea, but it can be applied to any agreement requiring the sharing of sensitive information. Comment: 14 pages, 4 figures
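    The abstract does not describe the construction, so the following is only a minimal sketch, assuming a salted hash commitment per declaration entry: the state publishes all commitments at the outset and later opens them one at a time, letting inspectors check each reveal against the committed declaration. A real escrow would also need to address completeness and hide the structure of the declaration.

```python
import hashlib
import os

def commit(entries):
    """Commit to each declaration entry with a salted SHA-256 hash.
    Returns the public commitments and the private openings (entry, salt)."""
    openings = [(e, os.urandom(16)) for e in entries]
    commitments = [hashlib.sha256(salt + e.encode()).hexdigest() for e, salt in openings]
    return commitments, openings

def verify(commitment, entry, salt):
    """Check that a revealed (entry, salt) pair matches the earlier commitment."""
    return hashlib.sha256(salt + entry.encode()).hexdigest() == commitment

# The declaring state commits to the full declaration at the outset ...
sites = ["site A: centrifuge hall", "site B: storage bunker"]   # illustrative entries
commitments, openings = commit(sites)

# ... and reveals entries one by one as the diplomatic process advances.
entry, salt = openings[0]
assert verify(commitments[0], entry, salt)
```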

    Summary of Expert Testimony


    Government Data and the Invisible Hand

    If President Barack Obama's new administration really wants to embrace the potential of Internet-enabled government transparency, it should follow a counter-intuitive but ultimately compelling strategy: reduce the federal role in presenting important government information to citizens. Today, government bodies consider their own Web sites to be a higher priority than technical infrastructures that open up their data for others to use. We argue that this understanding is a mistake. It would be preferable for government to understand providing reusable data, rather than providing Web sites, as the core of its online publishing responsibility. During the presidential campaign, all three major candidates indicated that they thought the federal government could make better use of the Internet. Barack Obama's platform went the furthest and explicitly endorsed making government data available online in universally accessible formats. Hillary Clinton, meanwhile, remarked that she wanted to see much more government information online. John McCain's platform called for a new Office of Electronic Government. But the situation to which these candidates were responding, the wide gap between the exciting uses of Internet technology by private parties, on the one hand, and the government's lagging technical infrastructure, on the other, is not new. A minefield of federal rules and a range of other factors prevent government webmasters from keeping pace with the ever-growing potential of the Internet.

    Buying Time: Latency Racing vs. Bidding in Fair Transaction Ordering

    We design a practical algorithm for transaction ordering that takes into account both transaction timestamps and bids. The algorithm guarantees that users get their transactions published with bounded delay against a bid, while it extracts fair value from sophisticated users who have an edge in latency by shifting their expenditure from investment in latency-improvement technology to bidding. The algorithm creates a score from timestamps and bids and orders transactions based on the score. We first show that a scoring rule is the only type of rule that satisfies independence of latency races. We provide an economic analysis of the protocol in an environment of private information, where investment in latency is made at the ex-ante or interim stage, while bidding happens at the interim stage, after private signals have been observed. The algorithm is useful for transaction sequencing in rollups or in other environments where the sequencer has privileged access to order flows.
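    As a rough illustration only, since the abstract does not give the actual scoring rule, a score-based sequencer might combine each transaction's timestamp with a bounded, bid-dependent time discount and publish transactions in score order. All names below are invented for illustration, and the discount function g is a placeholder assumption, not the rule derived in the paper.

```python
from dataclasses import dataclass

@dataclass
class Tx:
    tx_id: str
    timestamp: float  # arrival time at the sequencer, in seconds
    bid: float        # amount offered for priority

def score(tx: Tx, g) -> float:
    """Placeholder scoring rule: a bid buys a bounded time discount g(bid).
    This only shows the shape of 'order by a score computed from timestamp and bid'."""
    return tx.timestamp - g(tx.bid)

def sequence(txs, g):
    """Order transactions by ascending score (lowest score is published first)."""
    return sorted(txs, key=lambda tx: score(tx, g))

# Example: a concave, capped discount, so a latency edge cannot be bought outright.
g = lambda bid: min(0.5, 0.1 * bid)
batch = [Tx("a", 10.00, 0.0), Tx("b", 10.30, 5.0), Tx("c", 10.05, 1.0)]
print([tx.tx_id for tx in sequence(batch, g)])   # -> ['b', 'c', 'a']
```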

    Mixcoin: Anonymity for Bitcoin with Accountable Mixes (Full version)

    We propose Mixcoin, a protocol to facilitate anonymous payments in Bitcoin and similar cryptocurrencies. We build on the emergent phenomenon of currency mixes, adding an accountability mechanism to expose theft. We demonstrate that the incentives of mixes and clients can be aligned to ensure that rational mixes will not steal. Our scheme is efficient and fully compatible with Bitcoin. Against a passive attacker, our scheme provides an anonymity set of all other users mixing coins contemporaneously. This is an interesting new property with no clear analog in better-studied communication mixes. Against active attackers, our scheme offers similar anonymity to traditional communication mixes.
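    One natural shape for an "accountability mechanism to expose theft" is a warranty signed by the mix over the terms of the exchange before it receives funds; a cheated client can then publish the signed terms as evidence. The sketch below is only an assumption along those lines, with invented field names and an arbitrary signature scheme, not the paper's exact warranty format. It requires the 'cryptography' package.

```python
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature
import json

mix_key = Ed25519PrivateKey.generate()
mix_pub = mix_key.public_key()

# Illustrative terms of the exchange; the real warranty contents would differ.
terms = json.dumps({
    "value": 0.1,                 # amount to be mixed
    "escrow_addr": "mix-in",      # where the client must send coins
    "payout_addr": "client-out",  # where the mix must return coins
    "deadline_block": 350000,     # block height by which the mix must pay out
    "fee_rate": 0.01,
}, sort_keys=True).encode()

warranty = mix_key.sign(terms)    # the mix commits to the terms before receiving funds

# If the mix never pays out, the client publishes (terms, warranty), and anyone
# holding the mix's public key can check that the mix really promised these terms.
try:
    mix_pub.verify(warranty, terms)
    print("warranty valid: the mix signed these terms")
except InvalidSignature:
    print("invalid warranty")
```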

    The Seyfert Population in the Local Universe

    The magnitude-limited catalog of the Southern Sky Redshift Survey (SSRS2) is used to characterize the properties of galaxies hosting Active Galactic Nuclei. Using emission-line ratios, we identify a total of 162 (3%) Seyfert galaxies out of the parent sample of 5399 galaxies. The sample contains 121 Seyfert 2 galaxies and 41 Seyfert 1 galaxies. The SSRS2 Seyfert galaxies are predominantly in spirals of type Sb and earlier, or in galaxies with perturbed appearance as the result of strong interactions or mergers. Seyfert galaxies in this sample are twice as common in barred hosts as the non-Seyferts. By assigning galaxies to groups using a percolation algorithm, we find that the Seyfert galaxies in the SSRS2 are more likely to be found in binary systems than are galaxies in the SSRS2 parent sample. However, there is no statistically significant difference between the Seyfert and SSRS2 parent samples when systems with more than two galaxies are considered. The analysis of the present sample suggests that the presence of the AGN phenomenon is more strongly correlated with internal properties of galaxies (morphology, presence of a bar, luminosity) than with environmental effects (local galaxy density, group velocity dispersion, nearest-neighbor distance). Comment: 35 pages, 13 figures. Accepted for publication in the Astronomical Journal
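    The grouping step mentioned above is a percolation (friends-of-friends) assignment. The toy sketch below shows the general technique in two dimensions with an arbitrary linking length; the SSRS2 analysis works in redshift space with survey-specific linking parameters, and none of the numbers here come from the paper.

```python
import numpy as np
from scipy.spatial import cKDTree

def friends_of_friends(positions, linking_length):
    """Return a group label for each point: points closer than the linking
    length are chained into the same group (percolation)."""
    tree = cKDTree(positions)
    pairs = tree.query_pairs(r=linking_length)
    labels = np.arange(len(positions))      # each point starts in its own group

    def find(i):
        while labels[i] != i:
            labels[i] = labels[labels[i]]   # path compression
            i = labels[i]
        return i

    for i, j in pairs:                      # merge the groups of linked pairs
        labels[find(i)] = find(j)
    return np.array([find(i) for i in range(len(positions))])

positions = np.random.rand(500, 2)          # toy galaxy positions
groups = friends_of_friends(positions, linking_length=0.05)
print("number of groups:", len(set(groups)))
```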

    Accountable Algorithms

    Many important decisions historically made by people are now made by computers. Algorithms count votes, approve loan and credit card applications, target citizens or neighborhoods for police scrutiny, select taxpayers for IRS audit, grant or deny immigration visas, and more. The accountability mechanisms and legal standards that govern such decision processes have not kept pace with technology. The tools currently available to policymakers, legislators, and courts were developed to oversee human decisionmakers and often fail when applied to computers instead. For example, how do you judge the intent of a piece of software? Because automated decision systems can return potentially incorrect, unjustified, or unfair results, additional approaches are needed to make such systems accountable and governable. This Article reveals a new technological toolkit to verify that automated decisions comply with key standards of legal fairness. We challenge the dominant position in the legal literature that transparency will solve these problems. Disclosure of source code is often neither necessary (because of alternative techniques from computer science) nor sufficient (because of the issues analyzing code) to demonstrate the fairness of a process. Furthermore, transparency may be undesirable, such as when it discloses private information or permits tax cheats or terrorists to game the systems determining audits or security screening. The central issue is how to assure the interests of citizens, and society as a whole, in making these processes more accountable. This Article argues that technology is creating new opportunities—subtler and more flexible than total transparency—to design decisionmaking algorithms so that they better align with legal and policy objectives. Doing so will improve not only the current governance of automated decisions, but also—in certain cases—the governance of decisionmaking in general. The implicit (or explicit) biases of human decisionmakers can be difficult to find and root out, but we can peer into the “brain” of an algorithm: computational processes and purpose specifications can be declared prior to use and verified afterward. The technological tools introduced in this Article apply widely. They can be used in designing decisionmaking processes from both the private and public sectors, and they can be tailored to verify different characteristics as desired by decisionmakers, regulators, or the public. By forcing a more careful consideration of the effects of decision rules, they also engender policy discussions and closer looks at legal standards. As such, these tools have far-reaching implications throughout law and society. Part I of this Article provides an accessible and concise introduction to foundational computer science techniques that can be used to verify and demonstrate compliance with key standards of legal fairness for automated decisions without revealing key attributes of the decisions or the processes by which the decisions were reached. Part II then describes how these techniques can assure that decisions are made with the key governance attribute of procedural regularity, meaning that decisions are made under an announced set of rules consistently applied in each case. We demonstrate how this approach could be used to redesign and resolve issues with the State Department’s diversity visa lottery. 
In Part III, we go further and explore how other computational techniques can assure that automated decisions preserve fidelity to substantive legal and policy choices. We show how these tools may be used to assure that certain kinds of unjust discrimination are avoided and that automated decision processes behave in ways that comport with the social or legal standards that govern the decision. We also show how automated decisionmaking may even complicate existing doctrines of disparate treatment and disparate impact, and we discuss some recent computer science work on detecting and removing discrimination in algorithms, especially in the context of big data and machine learning. And lastly, in Part IV, we propose an agenda to further synergistic collaboration between computer science, law, and policy to advance the design of automated decision processes for accountability.
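    As a concrete illustration of the procedural-regularity idea, consider a decision process that commits to its rule and its randomness before any decisions are made, then reveals them afterward for public re-checking. The sketch below is a minimal, assumption-laden example (the rule, the seed handling, and all names are invented); the Article's actual toolkit goes further, including techniques that verify such properties without revealing the committed values at all.

```python
# Minimal sketch of checking "procedural regularity": commit to the decision
# rule and random seed before decisions are made, reveal them afterward, and
# let anyone recompute the decisions. All names and values are illustrative.
import hashlib
import hmac

def commit(data: bytes) -> str:
    """Publish a SHA-256 commitment to a private value."""
    return hashlib.sha256(data).hexdigest()

# Before any decisions: the agency publishes commitments to its rule and seed.
rule_source = b"select applicant if draw < 0.05"   # stand-in for the real rule
seed = b"high-entropy-secret-seed"                  # illustrative only
published = {"rule": commit(rule_source), "seed": commit(seed)}

def pseudorandom_draw(seed: bytes, applicant_id: str) -> float:
    """Deterministic per-applicant draw in [0, 1), derived from the seed."""
    digest = hmac.new(seed, applicant_id.encode(), hashlib.sha256).digest()
    return int.from_bytes(digest[:8], "big") / 2**64

# After the fact: the agency reveals rule_source and seed; an auditor checks
# the reveals against the commitments and reproduces each applicant's draw.
assert commit(rule_source) == published["rule"]
assert commit(seed) == published["seed"]
print("applicant-42 draw:", pseudorandom_draw(seed, "applicant-42"))
```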

    FORS spectroscopy of galaxies in the Hubble Deep Field South

    We present low-resolution multi-object spectroscopy of an I-band magnitude-limited (I_{AB} ~ 23--23.5) sample of galaxies located in an area centered on the Hubble Deep Field-South (HDFS). The observations were obtained using the Focal Reducer low dispersion Spectrograph (FORS) on the ESO Very Large Telescope. Thirty-two primary spectroscopic targets in the HST-WFPC2 HDFS were supplemented with galaxies detected in the Infrared Space Observatory's survey of the HDFS and in the ESO Imaging Deep Survey, to form a sample of 100 galaxies for spectroscopic observations. Based on detections of several emission lines, such as [OII]3727, H_beta and [OIII]5007, or of other spectroscopic features, we have measured accurate redshifts for 50 objects in the central HDFS and flanking fields. The redshift range of the current sample of galaxies is 0.6--1.2, with a median redshift of 1.13 (at I ~ 23.5, not corrected for completeness). The sample is dominated by starburst galaxies, with only a small fraction of ellipticals (~10%). For the emission-line objects, the extinction-corrected [OII]3727 line strengths yield estimates of star formation rates in the range 0.5--30 M_solar/yr. We have used the present data to derive the [OII]3727 luminosity function up to a redshift of 1.2. When combined with [OII]3727 luminosity densities for the local and high-redshift Universe, our results confirm the steep rise in the star formation rate (SFR) out to z ~ 1.3. Comment: Tables 2 and 3 provided as separate files. Accepted for publication by Astronomy and Astrophysics
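    As a worked illustration of the step from an extinction-corrected [OII]3727 luminosity to a star formation rate: the abstract does not say which calibration is used, so the sketch below assumes the commonly used Kennicutt (1998) relation SFR ~ 1.4e-41 L([OII]) M_solar/yr, with L in erg/s.

```python
# Illustrative only: SFR from an extinction-corrected [OII]3727 luminosity,
# assuming the Kennicutt (1998) calibration; the paper may use a different one.
def sfr_from_oii(luminosity_erg_s: float) -> float:
    """Star formation rate in solar masses per year."""
    return 1.4e-41 * luminosity_erg_s

# A galaxy with L([OII]) = 1e42 erg/s corresponds to roughly 14 Msun/yr,
# within the 0.5--30 Msun/yr range quoted in the abstract.
print(sfr_from_oii(1e42))
```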

    The Age-Redshift Relation for Standard Cosmology

    We present compact, analytic expressions for the age-redshift relation τ(z) for standard Friedmann-Lemaître-Robertson-Walker (FLRW) cosmology. The new expressions are given in terms of incomplete Legendre elliptic integrals and evaluate much faster than by direct numerical integration. Comment: 13 pages, 3 figures
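    For reference, the quantity the closed forms replace is the standard FLRW age integral τ(z) = ∫_z^∞ dz' / [(1+z') H(z')]. The sketch below evaluates it by direct numerical integration for a flat ΛCDM model; the density parameters are illustrative, not values taken from the paper.

```python
# Direct numerical integration of the age-redshift relation for flat LambdaCDM;
# this is the slow baseline that analytic expressions would replace.
import numpy as np
from scipy.integrate import quad

H0 = 70.0                       # Hubble constant in km/s/Mpc (illustrative)
omega_m, omega_lambda = 0.3, 0.7
KM_PER_MPC = 3.0857e19          # kilometres in a megaparsec
GYR_IN_S = 3.1557e16            # seconds in a gigayear

def hubble(z):
    """H(z) in km/s/Mpc for a flat FLRW model with matter and a cosmological constant."""
    return H0 * np.sqrt(omega_m * (1 + z) ** 3 + omega_lambda)

def age(z):
    """Age of the universe at redshift z, in Gyr: integral of dz' / [(1+z') H(z')]."""
    integrand = lambda zp: 1.0 / ((1 + zp) * hubble(zp) / KM_PER_MPC)
    t_seconds, _ = quad(integrand, z, np.inf)
    return t_seconds / GYR_IN_S

print(round(age(0.0), 2))       # about 13.5 Gyr for these parameters
```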