    Gaussian Quantum Information

    The science of quantum information has arisen over the last two decades, centered on the manipulation of individual quanta of information, known as quantum bits or qubits. Quantum computers, quantum cryptography, and quantum teleportation are among the most celebrated ideas that have emerged from this new field. It was later realized that using continuous-variable quantum information carriers, instead of qubits, constitutes an extremely powerful alternative approach to quantum information processing. This review focuses on continuous-variable quantum information processes that rely on any combination of Gaussian states, Gaussian operations, and Gaussian measurements. Interestingly, this restriction to the Gaussian realm comes with various benefits: on the theoretical side, simple analytical tools are available, and on the experimental side, optical components effecting Gaussian processes are readily available in the laboratory. Yet Gaussian quantum information processing opens the way to a wide variety of tasks and applications, including quantum communication, quantum cryptography, quantum computation, quantum teleportation, and quantum state and channel discrimination. This review reports on the state of the art in this field, ranging from the basic theoretical tools and landmark experimental realizations to the most recent successful developments.
    Comment: 51 pages, 7 figures, submitted to Reviews of Modern Physics
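    The Gaussian formalism described above is compact enough to illustrate numerically. As a minimal sketch (not taken from the review, and with a convention-dependent choice of units): a Gaussian state is fully characterized by its mean vector and covariance matrix V, and a Gaussian unitary acts as V → S V Sᵀ for a symplectic matrix S.

```python
import numpy as np

def beamsplitter(theta):
    """Symplectic matrix of a two-mode beamsplitter in (x1, p1, x2, p2) ordering."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[ c, 0, s, 0],
                     [ 0, c, 0, s],
                     [-s, 0, c, 0],
                     [ 0,-s, 0, c]])

# Symplectic form Omega for two modes
J = np.array([[0, 1], [-1, 0]])
Omega = np.block([[J, np.zeros((2, 2))],
                  [np.zeros((2, 2)), J]])

S = beamsplitter(np.pi / 4)                 # 50:50 beamsplitter
assert np.allclose(S @ Omega @ S.T, Omega)  # S preserves the symplectic form

# Two-mode vacuum (covariance = identity in these units)
V = np.eye(4)
V_out = S @ V @ S.T
print(np.allclose(V_out, np.eye(4)))  # True: vacuum is invariant under passive ops
```

    The symplectic check `S Ω Sᵀ = Ω` is exactly the condition that distinguishes valid Gaussian unitaries from arbitrary linear maps on the covariance matrix.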

    Cryptography for Bitcoin and friends

    Numerous cryptographic extensions to Bitcoin have been proposed since Satoshi Nakamoto introduced its revolutionary design in 2008. However, only a few proposals have been adopted in Bitcoin and other prevalent cryptocurrencies, whose resistance to fundamental changes has proven to grow with their success. In this dissertation, we introduce four cryptographic techniques that advance the functionality and privacy provided by Bitcoin and similar cryptocurrencies without requiring fundamental changes to their design. First, we realize smart contracts that disincentivize parties in distributed systems from making contradictory statements by penalizing such behavior with the loss of funds in a cryptocurrency. Second, we propose CoinShuffle++, a coin mixing protocol that improves the anonymity of cryptocurrency users by combining their transactions, thereby making it harder for observers to trace those transactions. The core of CoinShuffle++ is DiceMix, a novel and efficient protocol for broadcasting messages anonymously, without the help of any trusted third-party anonymity proxies and in the presence of malicious participants. Third, we combine coin mixing with the existing idea of hiding payment values in homomorphic commitments to obtain the ValueShuffle protocol, which enables us to overcome major obstacles to the practical deployment of coin mixing protocols. Fourth, we show how to prepare the aforementioned homomorphic commitments for a safe transition to post-quantum cryptography.
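    The value-hiding commitments mentioned in the third contribution are additively homomorphic: the product of two commitments is a commitment to the sum of the committed values, which is what lets mixed transactions be checked for balance without revealing amounts. A toy Pedersen-style sketch (the tiny parameters are insecure and purely illustrative, not the dissertation's construction):

```python
# Toy Pedersen-style commitment: C = g^v * h^r mod p, hiding value v with
# randomness r. INSECURE parameters, chosen only to show the homomorphism.
P, G, H = 1019, 2, 3

def commit(v, r):
    """Commit to value v with blinding factor r."""
    return (pow(G, v, P) * pow(H, r, P)) % P

c1 = commit(5, 11)
c2 = commit(7, 13)

# Multiplying commitments commits to the sum of values (and blinding factors):
print((c1 * c2) % P == commit(5 + 7, 11 + 13))  # True
```

    The identity holds because g^v1·h^r1 · g^v2·h^r2 = g^(v1+v2)·h^(r1+r2) mod p; a verifier can thus check that inputs and outputs of a transaction sum correctly while seeing only the commitments.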

    A Neoclassical Realist’s Analysis Of Sino-U.S. Space Policy

    During the Cold War, the United States focused its collective policy acumen on forming a competitive, actor-specific strategy to gain advantage over the Soviet Union. The fragmentation of the Soviet Union resulted in a multi-polar geopolitical environment lacking a near-peer rival for the United States. Overwhelming soft and hard power advantages allowed American policy makers to pursue a general, non-actor-specific strategy to maintain its hegemonic position. However, the meteoric rise of China as a near-peer competitor in East Asia has challenged this paradigm. In order to maintain its competitive advantage, or at the very least ensure the safety of its geopolitical objectives through encouraging benign competition, U.S. strategy needs to evolve in both focus and complexity. It is essential for spacepower, as a key element of national power, to be included in this evolution. To that end, this analysis will examine Sino-U.S. space relations using neoclassical realism as a baseline methodology. First, structural elements of the Sino-U.S. relationship will be modeled in a semi-quantitative game-theoretical framework, using relative economic and military capabilities as primary independent variables. Second, key assumptions will be tested to ensure that this model accurately represents the current geopolitical environment. Third, the decision-making apparatuses of the United States and China will be examined as intervening variables; this will account for imperfect rationality and how it modifies the game-theoretical framework. Fourth, this framework will be used to present actionable space policy recommendations for the United States so that space can be incorporated into a competitive strategy for East Asia
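    The abstract does not specify the payoff structure of its game-theoretical framework, but the general shape of such a model can be sketched generically. The payoffs below are hypothetical, purely illustrative of a stylized compete-vs-cooperate posture game between two space powers:

```python
from itertools import product

# Hypothetical payoff matrices (rows = player A's strategies, cols = player B's).
# Strategy 0 = "compete", strategy 1 = "cooperate"; values are illustrative only.
A = [[3, 0],
     [1, 2]]
B = [[3, 1],
     [0, 2]]

def nash_equilibria(A, B):
    """Return all pure-strategy Nash equilibria of a two-player game."""
    n, m = len(A), len(A[0])
    eqs = []
    for i, j in product(range(n), range(m)):
        # (i, j) is an equilibrium if i is A's best response to j
        # and j is B's best response to i.
        if A[i][j] == max(A[k][j] for k in range(n)) and \
           B[i][j] == max(B[i][l] for l in range(m)):
            eqs.append((i, j))
    return eqs

print(nash_equilibria(A, B))  # [(0, 0), (1, 1)]
```

    With these stag-hunt-like payoffs, both mutual competition and mutual cooperation are stable equilibria, which is the sort of structure a policy analysis of "benign competition" turns on: which equilibrium the players coordinate into.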

    On Information-centric Resiliency and System-level Security in Constrained, Wireless Communication

    The Internet of Things (IoT) interconnects many heterogeneous embedded devices, either locally with each other or globally with the Internet. These things are resource-constrained, e.g., powered by batteries, and typically communicate via low-power and lossy wireless links. Communication needs to be secured and relies on crypto-operations that are often resource-intensive and in conflict with the device constraints. Operation on the cheapest possible hardware, unreliable wireless transmission, and the need for protection against common Internet threats together impose severe challenges on IoT networks. In this thesis, we advance the current state of the art in two dimensions. Part I assesses information-centric networking (ICN) for the IoT, a network paradigm that promises enhanced reliability for data retrieval in constrained edge networks. ICN lacks a lower-layer definition, which, however, is the key to enabling device sleep cycles and exclusive wireless media access. This part of the thesis designs and evaluates an effective media access strategy for ICN that reduces energy consumption and wireless interference on constrained IoT nodes. Part II examines the performance of hardware and software crypto-operations executed on off-the-shelf IoT platforms. A novel system design enables the accessibility and auto-configuration of crypto-hardware through an operating system. One main focus is the generation of random numbers in the IoT. This part of the thesis further designs and evaluates Physical Unclonable Functions (PUFs) to provide novel randomness sources that generate highly unpredictable secrets on low-cost devices that lack hardware-based security features. This thesis takes a practical view of the constrained IoT and is accompanied by real-world implementations and measurements.
    We contribute open-source software, automation tools, a simulator, and reproducible measurement results from real IoT deployments using off-the-shelf hardware. The large-scale experiments in an open-access testbed provide a direct starting point for future research
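    As an illustration of the PUF idea (a toy simulation, not the thesis implementation or its hardware): an SRAM-style PUF gives each device a stable per-bit bias, individual readouts are noisy, and a usable device fingerprint is recovered by majority voting over repeated reads.

```python
import random

random.seed(1)
N_BITS, NOISE, READS = 64, 0.05, 15

# The device's stable per-bit bias, fixed by manufacturing variation.
true_bits = [random.randint(0, 1) for _ in range(N_BITS)]

def read_puf():
    """One noisy readout: each bit flips with probability NOISE."""
    return [b ^ (random.random() < NOISE) for b in true_bits]

def enroll(n_reads=READS):
    """Majority vote over n_reads readouts to suppress per-read noise."""
    counts = [sum(bits) for bits in zip(*(read_puf() for _ in range(n_reads)))]
    return [int(c > n_reads / 2) for c in counts]

fingerprint = enroll()
errors = sum(f != t for f, t in zip(fingerprint, true_bits))
print(errors)  # Hamming distance to the stable bias; voting drives it toward 0
```

    Real deployments additionally apply fuzzy extractors or error-correcting codes on top of such raw responses before deriving cryptographic keys.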

    Mark Oliphant and the Invisible College of the Peaceful Atom

    The weapon first created by atomic scientists of the 1940s was unprecedented in its power and potential to kill. Not only can it destroy infrastructure and all living things over a wide area, it leaves a haunting invisible footprint of radiation that can continue to harm long after its heat has dissipated. The atomic bomb was first conceptualised, proven and built by civilian scientists, overseen by an ambitious military and wary bureaucrats. The scientists belligerently lobbied their governments to take the potential of atomic weaponry seriously, and it is hence not surprising that they are often portrayed as ghoulishly mad savants who strung the bow of mass destruction.1 The atomic bomb proved such an effective killing machine that it provoked the Anglo-Australian physicist, Sir Ernest Titterton, to include a chapter in his 1956 book, Facing an Atomic Future, entitled ‘The Economics of Slaughter’.2 Titterton presented grotesque calculations that suggested atomic weaponry could kill for as little as ‘2½ d [pence] per man, woman and child’.3 The atomic bomb, as we know, played a decisive hand in the end of the world’s most deadly war—World War Two. During the Cold War the role of the atomic bomb—and its even more devastating offspring, the thermonuclear hydrogen bomb—caused tension, anxiety and outright fear as the world’s superpowers faced off in an arms race in which all-out conflict could have resulted in the end of humanity. The story of the twentieth century is, in many respects, the story of the atom. During the early years, investigations into the structure of the atom were centred in powerful European nations such as Britain, Germany and France. But during the war the United States borrowed scientists and knowledge from Europe and combined them with resources and enterprise to efficiently produce the technology for the final vanquishing moments of World War Two. This rise of American atomic utility continued into the Cold War arms race.
    In addition, postwar industry looked in wonderment at the technology achieved during the war and saw how productive large groups of collaborating scientists could be. The postwar technological age was, in part, a product of a change of mode in scientific research from the university to government, military, and private enterprise. The origins of the atomic age can be traced to Henri Becquerel and Marie and Pierre Curie’s discovery of radiation in the late nineteenth century; Albert Einstein’s Special Theory of Relativity in 1905; and Ernest Rutherford’s proof of the structure of the atom in 1909.4 The atomic age reached a crescendo with the dropping of the atomic bombs that smote Japan in August 1945. There are several names that history links particularly to the atomic bomb, including the Germans Otto Hahn and Friedrich Strassmann, who split the uranium atom in 1938; Austrians Lise Meitner and Otto Frisch, who first explained this as nuclear fission in 1939; the Hungarian Leo Szilard, who theorised an uncontrolled nuclear explosion in the same year; Enrico Fermi, the Italian who built the first nuclear reactor; and the eccentric American polymath, Robert Oppenheimer, who led the Manhattan Project to build the first bombs. Yet in the background was Mark Oliphant—a remarkable Australian scientist whose intellect, likeable and roguish personality, and international friendships helped stitch together this vast patchwork of scientists that made the bomb possible

    Entangled Matters: Analogue Futures & Political Pasts

    Theorised as an "ontology of the output", my research project conceptually repurposes media machines in order to activate new or alternate entanglements between historical media artefacts and events. Although the particular circumstances that produced these materials may have changed, the project asks why these analogue media artefacts might still be a matter of concern. What is their relevance for problematizing debates within media philosophy today and, by extension, the politics that underscore the operations of the digital? Does the analogue, as I intuit, have the capacity to release history and propose alternate pathways through mediatic time? Case studies: ARCHIVAL FUTURES considers the missing or 'silent' erasure of 18½ minutes in Watergate Tape No. 342 (1972). TELE-TRANSMISSIONS explores the 14-minute audio transmission produced by the Muirhead K220 Picture Transmitter to relay the image of napalm victim Kim Phuc from Saigon to Tokyo (June 8, 1972). RADIOLOGICAL EVENTS examines thirty-three seconds of irradiated film shot at Chernobyl Reactor Unit 4 by the late Soviet filmmaker Vladimir Shevchenko (April 26, 1986). This research turns upon a reconsideration of the ontological temporalities of media matter: a concern both in and with time, which acknowledges that each of the now-historic machinic artefacts and related case studies has always-already been entangled with the present and coming events of the future. The thesis project as such performs itself as a kind of "tape cut-up" that reorganises and consequently troubles the historical record by bringing ostensibly unrelated events into creative juxtaposition with one another. Recording asserts temporality; it is the formal means by which time is engineered, how it is both retroactively repotentialised and prospectively activated.
Recording in effect produces a saturated ontology of time in which the reverberations of past, present, and future elide to become enfolded within the temporal vectors of the artefact

    Sovereignty in Ruins

    Featuring essays by some of the most prominent names in contemporary political and cultural theory, Sovereignty in Ruins presents a form of critique grounded in the conviction that political thought is itself an agent of crisis. Aiming to develop a political vocabulary capable of critiquing and transforming contemporary political frameworks, the contributors advance a politics of crisis that collapses the false dichotomies between sovereignty and governmentality and between critique and crisis. Their essays address a wide range of topics, such as the role history plays in the development of a politics of crisis; Arendt's controversial judgment of Adolf Eichmann; Strauss's and Badiou's readings of Plato's Laws; the acceptance of the unacceptable; the human and nonhuman; and flesh as a biopolitical category representative of the ongoing crisis of modernity

    What do we mean when we talk about “safe space”?: a philosophical exploration of a contentious metaphor in education

    Educators have described their classes and institutions as “safe spaces” with increasing frequency and certainty since the 1990s. However, philosophers of education such as Eamon Callan, Cris Mayo, and Sigal Ben-Porath have found “safe space” to be conceptually and pedagogically lacking when interpreted from intersectional positionalities operating within the hegemonic white, masculine, and consumerist discourses permeating a modern educational system that strives for greater equity, diversity, and inclusion. This work operationalizes “safe space” by recognizing it as what linguists Max Black, George Lakoff and Mark Johnson, and philosopher Paul Ricoeur would term a conceptual metaphor, which structures thinking about education. Critical pedagogues such as Michael Apple, Raymond Callahan, Paulo Freire, Ivan Illich, Herbert Kliebard, and Peter McLaren have argued how this type of structured thinking can influence pedagogical practices; but to date, no in-depth philosophical analysis of “safe space” exists in the literature. Interrogating modern debates about the nature of “space” inherited from Isaac Newton (who viewed it as an absolute container filled with independent subjects/objects), and Gottfried Leibniz (who viewed space as an infinite set of relations between subjects/objects), the implications for any educationally worthwhile understanding and practice of “safety” or “safe space” are shown to be suspect due to the Newtonian inheritances. Ultimately, I posit that “safe space” is unavoidably Newtonian – assumed to be capable of formulation a priori such that students are entitled to a guarantee that a class space will be safe in some sense that can be unambiguously stated, irrespective of who is taking the class, what the class is about, and what is going on in the world. 
    This a priori safe space is then one that institutions feel responsible for guaranteeing and teachers feel responsible for creating and maintaining, while students feel no responsibility other than reaping its benefits. Linking this work’s conceptual analysis of the Leibnizian inheritances to “space” and “safety” (understood as infinitely relational) to that of critical pedagogues such as bell hooks, I argue for a more philosophically grounded and educationally worthwhile understanding of “safe space”.
    Keywords: Safe Space; Philosophy of Education; Paul Ricoeur; Isaac Newton; Gottfried Leibniz; bell hooks; Conceptual Metaphor; Equity, Diversity, and Inclusion (EDI)
