
    The Many Worlds of Uncertainty

    The status of the uncertainty relations varies between the different interpretations of quantum mechanics. The aim of the current paper is to explore their meanings within a certain neo-Everettian many worlds interpretation. We will also look at questions that have been linked with the uncertainty relations since Heisenberg's uncertainty principle: those of joint and repeated measurement of non-commuting (or otherwise `incompatible') observables. This will have implications beyond the uncertainty relations, as we will see the fundamentally different way in which statistical statements are interpreted in the neo-Everettian theory that we use. Comment: 9 pages

    Quantum Computation and Many Worlds

    An Everett (`Many Worlds') interpretation of quantum mechanics due to Saunders and Zurek is presented in detail. This is used to give a physical description of the process of a quantum computation. Objections to such an understanding are discussed. Comment: This paper has been superseded by arXiv:0802.2504v1 [quant-ph]

    DEMONIC programming: a computational language for single-particle equilibrium thermodynamics, and its formal semantics

    Maxwell's Demon, 'a being whose faculties are so sharpened that he can follow every molecule in its course', has been the centre of much debate about its ability to violate the second law of thermodynamics. Landauer's hypothesis, that the Demon must erase its memory and incur a thermodynamic cost, has become the standard response to Maxwell's dilemma, and its implications for the thermodynamics of computation reach into many areas of quantum and classical computing. It remains, however, a hypothesis. Debate has often centred on simple toy models of a single particle in a box. Despite their simplicity, the ability of these systems to accurately represent thermodynamics (specifically, to satisfy the second law), and whether or not they display Landauer erasure, has been a matter of ongoing argument. The recent Norton-Ladyman controversy is one such example. In this paper we introduce a programming language to describe these simple thermodynamic processes, and give a formal operational semantics and program logic as a basis for formal reasoning about thermodynamic systems. We formalise the basic single-particle operations as statements in the language, and then show that the second law must be satisfied by any composition of these basic operations. This is done by finding a computational invariant of the system. We show, furthermore, that this invariant requires an erasure cost to exist within the system, equal to kT ln 2 for a bit of information: Landauer erasure becomes a theorem of the formal system. The Norton-Ladyman controversy can therefore be resolved in a rigorous fashion, and moreover the formalism we introduce gives a set of reasoning tools for further analysis of Landauer erasure, which are provably consistent with the second law of thermodynamics. Comment: In Proceedings QPL 2015, arXiv:1511.01181. Dominic Horsman published previously as Clare Horsman
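    The kT ln 2 erasure cost mentioned in the abstract can be evaluated numerically. The sketch below is an illustration only (the function name and interface are assumptions, not part of the DEMONIC formalism), using the exact SI value of Boltzmann's constant:

    ```python
    import math

    K_B = 1.380649e-23  # Boltzmann constant in J/K (exact, SI 2019 definition)

    def landauer_cost(temperature_k: float, bits: float = 1.0) -> float:
        """Minimum energy in joules to erase `bits` of information at `temperature_k`.

        This is Landauer's bound: kT ln 2 per bit erased.
        """
        return bits * K_B * temperature_k * math.log(2)

    # At room temperature (300 K), erasing one bit costs roughly 2.87e-21 J.
    cost_per_bit = landauer_cost(300.0)
    ```

    The same bound scales linearly with the number of bits, which is why the thermodynamic cost of computation only becomes significant for logically irreversible operations performed at scale.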

    A Case-Based Reasoning Method for Locating Evidence During Digital Forensic Device Triage

    The role of triage in digital forensics is disputed, with some practitioners questioning its reliability for identifying evidential data. Although successfully implemented in the field of medicine, triage has not established itself to the same degree in digital forensics. This article presents a novel approach to triage for digital forensics. Case-Based Reasoning Forensic Triager (CBR-FT) is a method for collecting and reusing past digital forensic investigation information in order to highlight likely evidential areas on a suspect operating system, thereby helping an investigator decide where to search for evidence. The CBR-FT framework is discussed and the results of twenty test triage examinations are presented. CBR-FT has been shown to be a more effective method of triage than a practitioner using a leading commercial application.
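    The core idea of reusing past investigation information can be sketched as a minimal retrieval step. This is a hypothetical illustration in the spirit of case-based reasoning, not the actual CBR-FT method or its data model; the case types, paths, and function names are all invented for the example:

    ```python
    from collections import Counter

    # Toy case base: each past case records its type and the file-system
    # locations where evidence was actually found (paths are illustrative).
    PAST_CASES = [
        ("fraud",   ["Users/*/Documents", "Users/*/AppData/Roaming"]),
        ("fraud",   ["Users/*/Documents", "Users/*/Downloads"]),
        ("malware", ["Windows/Temp", "Users/*/AppData/Roaming"]),
    ]

    def rank_locations(case_type: str) -> list:
        """Rank candidate search locations for a new case of `case_type`,
        most frequently evidential in similar past cases first."""
        hits = Counter()
        for past_type, locations in PAST_CASES:
            if past_type == case_type:
                hits.update(locations)
        return hits.most_common()

    # For a new fraud case, Users/*/Documents (seen in 2 of 2 fraud cases)
    # ranks first.
    ranked = rank_locations("fraud")
    ```

    A real system would use a richer similarity measure over case features rather than exact type matching, but the retrieve-and-reuse loop above is the essence of the approach.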

    Optimising the Solovay-Kitaev algorithm

    The Solovay-Kitaev algorithm is the standard method used for approximating arbitrary single-qubit gates for fault-tolerant quantum computation. In this paper we introduce a technique called "search space expansion", which modifies the initial stage of the Solovay-Kitaev algorithm, increasing the length of the possible approximating sequences without requiring an exhaustive search over all possible sequences. We show that our technique, combined with a GNAT geometric tree search, outputs gate sequences that are almost an order of magnitude shorter for the same level of accuracy. This significantly reduces the error correction requirements for quantum algorithms on encoded fault-tolerant hardware. Comment: 9 pages
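    The "initial stage" the abstract refers to is a base approximation: finding the short gate sequence, over a fixed universal set, closest to the target unitary. The sketch below shows only that standard brute-force base search (it does not implement the paper's search space expansion or GNAT search); the gate set {H, T} and the phase-insensitive distance are common conventions, assumed here for illustration:

    ```python
    import itertools
    import numpy as np

    # Universal single-qubit gate set (up to inverses): Hadamard and T.
    H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
    T = np.array([[1, 0], [0, np.exp(1j * np.pi / 4)]], dtype=complex)
    GATES = {"H": H, "T": T}

    def distance(u: np.ndarray, v: np.ndarray) -> float:
        """Spectral-norm distance between unitaries, minimised over a
        sampled global phase (global phase is physically irrelevant)."""
        return min(np.linalg.norm(u - np.exp(1j * p) * v, 2)
                   for p in np.linspace(0.0, 2 * np.pi, 200))

    def base_approximation(target: np.ndarray, max_len: int = 6):
        """Exhaustively search sequences over {H, T} up to max_len gates
        and return (best_sequence, its_distance_to_target)."""
        best_seq, best_dist = "", distance(target, np.eye(2))
        for n in range(1, max_len + 1):
            for seq in itertools.product("HT", repeat=n):
                u = np.eye(2, dtype=complex)
                for g in seq:
                    u = GATES[g] @ u
                d = distance(target, u)
                if d < best_dist:
                    best_seq, best_dist = "".join(seq), d
        return best_seq, best_dist
    ```

    Because the number of sequences grows exponentially with length, this exhaustive stage is only feasible for short sequences; that bottleneck is exactly what motivates the paper's technique for expanding the searchable space without full enumeration.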

    Using Graduated Guidance to Teach Imitation of Manual Signs to Children with Intellectual Disabilities

    The purpose of this study was to test the effects of a graduated guidance procedure on teaching imitation of manual signs to students with moderate to severe disabilities. Sessions began with student initiation and were embedded within already established reinforcement routines across the student's day. A multiple baseline across participants design was used to evaluate these effects.