
    The Precautionary Principle (with Application to the Genetic Modification of Organisms)

    We present a non-naive version of the Precautionary Principle (PP) that allows us to avoid paranoia and paralysis by confining precaution to specific domains and problems. PP is intended to deal with uncertainty and risk in cases where the absence of evidence and the incompleteness of scientific knowledge carry profound implications, and in the presence of risks of "black swans": unforeseen and unforeseeable events of extreme consequence. We formalize PP, placing it within the statistical and probabilistic structure of ruin problems, in which a system is at risk of total failure, and in place of risk we use a formal, fragility-based approach. We make a central distinction between (1) thin and fat tails and (2) local and systemic risks, and place PP in the joint fat-tailed and systemic case. We discuss the implications for GMOs (compared to nuclear energy) and show that GMOs represent a public risk of global harm, while harm from nuclear energy is comparatively limited and better characterized. PP should be used to prescribe severe limits on GMOs.
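    The core of the argument is the gap between thin-tailed and fat-tailed risk. The sketch below (Python; the Gaussian/Pareto pairing, the tail index alpha = 1.5, and the thresholds are our illustrative assumptions, not the paper's) shows how quickly fat-tailed survival probabilities come to dominate thin-tailed ones, which is the regime in which the authors argue PP applies.

```python
# Illustrative sketch (hypothetical parameters): compare the probability of a
# large loss under a thin-tailed (Gaussian) and a fat-tailed (Pareto) model.
import numpy as np
from scipy import stats

thresholds = np.array([2.0, 5.0, 10.0, 20.0])   # loss sizes, in "typical loss" units

# Thin tail: standard normal; survival probability decays like exp(-x^2 / 2).
p_thin = stats.norm.sf(thresholds)

# Fat tail: Pareto with tail index alpha = 1.5 (infinite variance);
# survival probability decays only like x^(-alpha).
alpha = 1.5
p_fat = stats.pareto.sf(thresholds, b=alpha)

for x, pt, pf in zip(thresholds, p_thin, p_fat):
    print(f"P(loss > {x:>4}):  thin {pt:.2e}   fat {pf:.2e}   ratio {pf / pt:.1e}")
```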

    Design of Geometric Molecular Bonds

    An example of a nonspecific molecular bond is the affinity of any positive charge for any negative charge (like-unlike), or of nonpolar material for itself when in aqueous solution (like-like). This contrasts with specific bonds, such as the affinity of the DNA base A for T, but not for C, G, or another A. Recent experimental breakthroughs in DNA nanotechnology demonstrate that a particular nonspecific like-like bond ("blunt-end DNA stacking", which occurs between the ends of any pair of DNA double helices) can be used to create specific "macrobonds" by careful geometric arrangement of many nonspecific blunt ends, motivating the need for sets of macrobonds that are orthogonal: two macrobonds not intended to bind should have relatively low binding strength, even when misaligned. To address this need, we introduce geometric orthogonal codes that abstractly model the engineered DNA macrobonds as two-dimensional binary codewords. While motivated by completely different applications, geometric orthogonal codes share features similar to the optical orthogonal codes studied by Chung, Salehi, and Wei. The main technical difference is the importance of 2D geometry in defining codeword orthogonality. Comment: Accepted to appear in IEEE Transactions on Molecular, Biological, and Multi-Scale Communications
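    As a rough illustration of what "low binding strength even when misaligned" means in this 2D setting (our sketch, not the paper's formal definition: the point sets are hypothetical and only translations are considered, not rotations), a macrobond can be modeled as a set of blunt-end positions, and the worst-case binding between two macrobonds can be bounded by the largest overlap achievable over all relative shifts.

```python
# Illustrative sketch (hypothetical codewords): treat each macrobond as a set of
# 2D integer positions and score a pair by the maximum number of coinciding
# positions over all relative translations of one codeword against the other.
from itertools import product

def max_overlap(a: set, b: set) -> int:
    """Largest number of coincident points of a and b over all 2D shifts of b."""
    best = 0
    for (ax, ay), (bx, by) in product(a, b):
        dx, dy = ax - bx, ay - by                    # shift aligning this pair of points
        shifted = {(x + dx, y + dy) for (x, y) in b}
        best = max(best, len(a & shifted))
    return best

A = {(0, 0), (1, 2), (3, 1), (4, 4)}                 # hypothetical 4-point macrobond
B = {(0, 0), (2, 1), (1, 3), (4, 2)}                 # another hypothetical macrobond

print("A against itself (intended bond):", max_overlap(A, A))   # full strength: 4
print("A against B (should stay low):  ", max_overlap(A, B))    # cross-talk under misalignment
```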

    Research Priorities for Robust and Beneficial Artificial Intelligence

    Success in the quest for artificial intelligence has the potential to bring unprecedented benefits to humanity, and it is therefore worthwhile to investigate how to maximize these benefits while avoiding potential pitfalls. This article gives numerous examples (which should by no means be construed as an exhaustive list) of such worthwhile research aimed at ensuring that AI remains robust and beneficial. Comment: This article gives examples of the type of research advocated by the open letter for robust & beneficial AI at http://futureoflife.org/ai-open-lette

    Quasimarket failure

    The efficiency of “quasimarkets”—decentralized public goods provision subjected to Tiebout competition—is a staple of public choice conventional wisdom. Yet in the 1990s a countermovement in political economy called “neoconsolidationism” began to challenge this wisdom. The neoconsolidationists use the logic of government failure central to public choice economics to argue that quasimarkets fail and that jurisdictional consolidation is a superior way to supply public goods and services in metropolitan areas. Public choice scholars have largely ignored the neoconsolidationists’ challenge. This paper brings that challenge to public choice scholars’ attention with the hope of encouraging responses. It also offers some preliminary thoughts about the directions such responses might take. Keywords: Public Goods; Quasimarkets

    A Unifying Model for External Noise Sources and ISI in Diffusive Molecular Communication

    This paper considers the impact of external noise sources, including interfering transmitters, on a diffusive molecular communication system, where the impact is measured as the number of noise molecules expected to be observed at a passive receiver. A unifying model for noise, multiuser interference, and intersymbol interference is presented, where, under certain circumstances, interference can be approximated as a noise source that is emitting continuously. The model includes the presence of advection and molecule degradation. The time-varying and asymptotic impact is derived for a series of special cases, some of which facilitate closed-form solutions. Simulation results show the accuracy of the expressions derived for the impact of a continuously-emitting noise source, and show how approximating intersymbol interference as a noise source can simplify the calculation of the expected bit error probability of a weighted sum detector. Comment: 14 pages, 7 figures, 4 tables, 1 appendix. To appear in IEEE Journal on Selected Areas in Communications (JSAC). Submitted October 21, 2013, revised April 21, 2014, accepted June 3, 2014
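    For a sense of how such expected-impact quantities are typically computed (a minimal sketch assuming an impulsive point release, a passive spherical receiver, and the uniform-concentration approximation; the function and all numeric values below are ours, not the paper's), the expected count at the receiver follows from the Green's function of the diffusion equation with advection and first-order degradation.

```python
# Illustrative sketch (hypothetical values): expected number of molecules inside
# a passive spherical receiver at time t after an impulsive point release,
# using the Green's function of the advection-diffusion equation with
# first-order degradation and the uniform-concentration assumption
# (receiver small compared to its distance from the source).
import numpy as np

def expected_observation(t, n_tx, d, r_rx, D, k=0.0, v=0.0):
    """
    t    : observation time(s) [s]
    n_tx : number of molecules released impulsively at t = 0
    d    : transmitter-receiver distance along the flow axis [m]
    r_rx : receiver radius [m]
    D    : diffusion coefficient [m^2/s]
    k    : first-order degradation rate [1/s]
    v    : flow velocity from transmitter toward receiver [m/s]
    """
    t = np.asarray(t, dtype=float)
    v_rx = 4.0 / 3.0 * np.pi * r_rx**3               # receiver volume
    d_eff_sq = (d - v * t) ** 2                      # advection drifts the molecule cloud
    conc = n_tx * np.exp(-d_eff_sq / (4.0 * D * t) - k * t) / (4.0 * np.pi * D * t) ** 1.5
    return v_rx * conc                               # expected count under uniform concentration

# Hypothetical example: 10^4 molecules released 500 nm from a 50 nm receiver.
times = np.array([1e-5, 1e-4, 1e-3, 1e-2])
print(expected_observation(times, n_tx=1e4, d=500e-9, r_rx=50e-9, D=1e-9, k=10.0))
```

    A continuously emitting noise source, as in the paper's unified model, then corresponds (by superposition, since diffusion is linear) to integrating this impulse response over the emission time.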

    Contrasting Views of Complexity and Their Implications For Network-Centric Infrastructures

    There exists a widely recognized need to better understand and manage complex “systems of systems,” ranging from biology, ecology, and medicine to network-centric technologies. This is motivating the search for universal laws of highly evolved systems and driving demand for new mathematics and methods that are consistent, integrative, and predictive. However, the theoretical frameworks available today are not merely fragmented but sometimes contradictory and incompatible. We argue that complexity arises in highly evolved biological and technological systems primarily to provide mechanisms that create robustness. However, this complexity itself can be a source of new fragility, leading to “robust yet fragile” tradeoffs in system design. We focus on the role of robustness and architecture in networked infrastructures, and we highlight recent advances in the theory of distributed control driven by network technologies. This view of complexity in highly organized technological and biological systems is fundamentally different from the dominant perspective in the mainstream sciences, which downplays function, constraints, and tradeoffs, and tends to minimize the role of organization and design.