
    Theories of Reference: What Was the Question?

    The new theory of reference has won popularity. However, a number of noted philosophers have also attempted to reply to the critical arguments of Kripke and others, and aimed to vindicate the description theory of reference. Such responses are often based on ingenious novel kinds of descriptions, such as rigidified descriptions, causal descriptions, and metalinguistic descriptions. This prolonged debate raises the doubt of whether the different parties really have any shared understanding of the central question of the philosophical theory of reference: what is the main question to which descriptivism and the causal-historical theory have presented competing answers? One aim of the paper is to clarify this issue. The most influential objections to the new theory of reference are critically reviewed. Special attention is also paid to certain important later advances in the new theory of reference, due to Devitt and others.

    Integration of highly probabilistic sources into optical quantum architectures: perpetual quantum computation

    In this paper we introduce a design for an optical topological cluster state computer constructed exclusively from a single quantum component. Unlike previous efforts, we eliminate the need for on-demand, high-fidelity photon sources and detectors and replace them with the same device used to create photon/photon entanglement. This introduces highly probabilistic elements into the optical architecture while maintaining complete specificity of the structure and operation of a large-scale computer. Photons in this system are continually recycled back into the preparation network, allowing an arbitrarily deep 3D cluster to be prepared using a comparatively small number of photonic qubits and, consequently, eliminating the need for high-frequency, deterministic photon sources.
    Comment: 19 pages, 13 figures (2 appendices with additional figures). Comments welcome.
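
    A rough sense of the trade-off can be had from a toy Monte Carlo: if each entangling attempt succeeds with some probability p and failed photons are recycled rather than demanded afresh from a deterministic source, the expected cost per cluster bond is 1/p attempts. The sketch below is our illustration only, not the paper's architecture model; the value of p and the recycling rule are assumptions.

        import random

        # Toy model: each entangling attempt on recycled photons succeeds with
        # probability p; count attempts until one bond of the cluster attaches.
        def attempts_to_attach(p, rng):
            n = 1
            while rng.random() > p:
                n += 1
            return n

        rng = random.Random(1)
        for p in (0.1, 0.25, 0.5):
            trials = [attempts_to_attach(p, rng) for _ in range(100_000)]
            mean = sum(trials) / len(trials)
            print(f"p = {p}: mean attempts per bond = {mean:.2f} (expected 1/p = {1/p:.2f})")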

    Effects of imperfections for Shor's factorization algorithm

    We study the effects of imperfections induced by residual couplings between qubits on the accuracy of Shor's algorithm, using numerical simulations of realistic quantum computations with up to 30 qubits. Factoring numbers up to N = 943 shows that the width of the peaks whose frequencies allow one to determine the factors grows exponentially with the number of qubits. However, the algorithm remains operational up to a critical coupling strength $\epsilon_c$ which drops only polynomially with $\log_2 N$. The numerical dependence of $\epsilon_c$ on $\log_2 N$ is explained by analytical estimates that allow one to obtain the scaling for the functionality of Shor's algorithm on realistic quantum computers with a large number of qubits.
    Comment: 10 pages, 10 figures, 1 table. Added references and new data. Erratum added as appendix. 1 figure and 1 table added. Research is available at http://www.quantware.ups-tlse.fr
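
    For readers wondering why peak frequencies determine the factors: the measured value in Shor's algorithm approximates a multiple of 2^t/r, and continued fractions recover the order r, from which the factors follow. The sketch below shows only this standard classical post-processing step on toy values (N = 15, a = 7, ideal noiseless peaks); it is not the paper's simulation code.

        from fractions import Fraction
        from math import gcd

        def factor_from_peak(y, t, a, N):
            """Recover factors of N from a measured value y ~ k * 2^t / r."""
            r = Fraction(y, 2 ** t).limit_denominator(N).denominator
            if r % 2 == 1 or pow(a, r, N) != 1:
                return None  # unlucky k: rerun the quantum part
            x = pow(a, r // 2, N)
            if x == N - 1:
                return None
            p, q = gcd(x - 1, N), gcd(x + 1, N)
            return (p, q) if p * q == N else None

        t = 8  # counting-register bits; a = 7 has order r = 4 modulo N = 15
        for k in range(1, 4):
            print(k, factor_from_peak(k * 2 ** t // 4, t, a=7, N=15))
        # k = 2 collapses to r = 2 and fails, showing why several runs are needed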

    Simulating chemistry efficiently on fault-tolerant quantum computers

    Quantum computers can in principle simulate quantum physics exponentially faster than their classical counterparts, but some technical hurdles remain. Here we consider methods to make proposed chemical simulation algorithms computationally fast on fault-tolerant quantum computers in the circuit model. Fault tolerance constrains the choice of available gates, so that arbitrary gates required for a simulation algorithm must be constructed from sequences of fundamental operations. We examine techniques for constructing arbitrary gates which perform substantially faster than circuits based on the conventional Solovay-Kitaev algorithm [C.M. Dawson and M.A. Nielsen, \emph{Quantum Inf. Comput.}, \textbf{6}:81, 2006]. For a given approximation error $\epsilon$, arbitrary single-qubit gates can be produced fault-tolerantly, using a limited set of gates, in time which is $O(\log \epsilon)$ or $O(\log \log \epsilon)$; with sufficient parallel preparation of ancillas, constant average depth is possible using a method we call programmable ancilla rotations. Moreover, we construct and analyze efficient implementations of first- and second-quantized simulation algorithms using the fault-tolerant arbitrary gates and other techniques, such as implementing various subroutines in constant time. A specific example we analyze is the ground-state energy calculation for lithium hydride.
    Comment: 33 pages, 18 figures
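
    To make concrete what constructing an arbitrary gate from a limited set involves, here is a deliberately naive sketch: exhaustive search over words in {H, T} for the best approximation to a small Z-rotation. This is far less efficient than Solovay-Kitaev, let alone the paper's constructions; the target angle and the distance measure are our choices.

        import itertools
        import numpy as np

        H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
        T = np.diag([1, np.exp(1j * np.pi / 4)])
        GATES = {'H': H, 'T': T}
        target = np.diag([1, np.exp(0.5j)])  # R_z(0.5), outside the discrete set

        def dist(U, V):
            """Distance up to global phase; 0 iff U and V agree as gates."""
            return np.sqrt(abs(1 - abs(np.trace(U.conj().T @ V)) / 2))

        best_word, best_err = '', np.inf
        for length in range(1, 13):
            for word in itertools.product('HT', repeat=length):
                U = np.eye(2)
                for g in word:
                    U = GATES[g] @ U
                err = dist(U, target)
                if err < best_err:
                    best_word, best_err = ''.join(word), err
            print(f"words up to length {length:2d}: best error {best_err:.4f}")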

    Precision characterisation of two-qubit Hamiltonians via entanglement mapping

    We show that the general Heisenberg Hamiltonian with non-uniform couplings can be characterised by mapping the entanglement it generates as a function of time. Identification of the Hamiltonian in this way is possible because the coefficients of each operator control the oscillation frequencies of the entanglement function. The number of measurements required to achieve a given precision in the Hamiltonian parameters is determined, and an efficient measurement strategy is designed. We derive the relationship between the number of measurements, the resulting precision, and the ultimate discrete error probability generated by a systematic mis-characterisation when implementing two-qubit gates for quantum computing.
    Comment: 6 pages, 3 figures
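
    A minimal numerical sketch of the idea (our toy, not the paper's estimation protocol): evolve |01> under an anisotropic two-qubit Heisenberg Hamiltonian, record the concurrence over time, and read off the dominant oscillation frequency, which shifts when a coupling is changed.

        import numpy as np

        X = np.array([[0, 1], [1, 0]], dtype=complex)
        Y = np.array([[0, -1j], [1j, 0]])
        Z = np.diag([1.0 + 0j, -1.0])

        def heisenberg(Jx, Jy, Jz):
            return Jx * np.kron(X, X) + Jy * np.kron(Y, Y) + Jz * np.kron(Z, Z)

        def concurrence(psi):
            """Concurrence of a two-qubit pure state: |<psi| Y x Y |psi*>|."""
            return abs(psi.conj() @ np.kron(Y, Y) @ psi.conj())

        ts = np.linspace(0, 50, 4096)
        for Jx, Jy, Jz in [(1.0, 1.0, 0.5), (1.3, 1.0, 0.5)]:
            evals, evecs = np.linalg.eigh(heisenberg(Jx, Jy, Jz))
            amps = evecs.conj().T @ np.array([0, 1, 0, 0], dtype=complex)  # |01>
            C = np.array([concurrence(evecs @ (np.exp(-1j * evals * t) * amps))
                          for t in ts])
            spectrum = np.abs(np.fft.rfft(C - C.mean()))
            freqs = np.fft.rfftfreq(len(ts), ts[1] - ts[0])
            # starting from |01>, the entanglement oscillation tracks Jx + Jy
            print((Jx, Jy, Jz), "dominant frequency:", round(freqs[spectrum.argmax()], 3))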

    Laparoscopic repair of very large hiatus hernia with sutures versus absorbable mesh versus nonabsorbable mesh: a randomized controlled trial

    Author version made available in accordance with publisher policy. A 12 month embargo applies from the date of publication (1 Feb 2015).
    Objective: To determine whether absorbable or non-absorbable mesh in the repair of large hiatus hernias reduces the risk of recurrence, compared with suture repair.
    Summary Background Data: Repair of large hiatus hernia is associated with radiological recurrence rates of up to 30%, and mesh repair has been recommended to improve outcomes. Previous trials have shown less short-term recurrence with mesh, but adverse outcomes limit mesh use.
    Methods: Multicentre prospective double-blind randomized controlled trial of 3 methods of repair: sutures vs. absorbable mesh vs. non-absorbable mesh. Primary outcome: hernia recurrence assessed by barium meal X-ray and endoscopy at 6 months. Secondary outcomes: clinical symptom scores at 1, 3, 6 and 12 months.
    Results: 126 patients were enrolled: 43 sutures, 41 absorbable mesh and 42 non-absorbable mesh. 96.0% were followed to 12 months, with objective follow-up data in 92.9%. A recurrent hernia (any size) was identified in 23.1% following suture repair, 30.8% following absorbable mesh, and 12.8% following non-absorbable mesh (p=0.161). Clinical outcomes were similar, except for less heartburn at 3 and 6 months and less bloating at 12 months with non-absorbable mesh, and more heartburn at 3 months, odynophagia at 1 month, nausea at 3 and 12 months, wheezing at 6 months, and inability to belch at 12 months following absorbable mesh. The magnitude of the clinical differences was small.
    Conclusions: No significant differences were seen for recurrent hiatus hernia, and the clinical differences were unlikely to be clinically significant. Overall outcomes following sutured repair were similar to mesh repair.
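
    As a consistency check on the reported statistics (our reconstruction, not the authors' analysis): 92.9% objective follow-up of 126 patients is 117, i.e. 39 per arm if follow-up was even, and the reported recurrence percentages then correspond to counts of 9/39, 12/39 and 5/39. A chi-square test on that table reproduces the quoted p-value.

        import numpy as np

        # Rows: sutures, absorbable mesh, non-absorbable mesh; columns: recurrence yes/no.
        # Counts are inferred from the percentages under an equal-follow-up assumption.
        observed = np.array([[9, 30], [12, 27], [5, 34]])
        expected = np.outer(observed.sum(1), observed.sum(0)) / observed.sum()
        chi2 = ((observed - expected) ** 2 / expected).sum()
        p = np.exp(-chi2 / 2)  # chi-square survival function is exp(-x/2) for 2 dof
        print(f"chi2 = {chi2:.3f}, p = {p:.3f}")  # p ~ 0.161, as in the abstract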

    Justifying the Special Theory of Relativity with Unconceived Methods

    Many realists argue that present scientific theories will not follow the fate of past scientific theories because the former are more successful than the latter. Critics object that realists need to show that present theories have reached the level of success that warrants their truth. I reply that the special theory of relativity has been repeatedly reinforced by unconceived scientific methods, so it will be reinforced by infinitely many unconceived scientific methods. This argument for the special theory of relativity overcomes the critics’ objection, and has advantages over the no-miracle argument and the selective induction for it.

    Surface code quantum computing by lattice surgery

    In recent years, surface codes have become a leading method for quantum error correction in theoretical large-scale computational and communications architecture designs. Their comparatively high fault-tolerant thresholds and their natural two-dimensional nearest-neighbour (2DNN) structure make them an obvious choice for large-scale designs in experimentally realistic systems. While fundamentally based on the toric code of Kitaev, there are many variants, two of which are the planar- and defect-based codes. Planar codes require fewer qubits to implement (for the same strength of error correction), but are restricted to encoding a single qubit of information. Interactions between encoded qubits are achieved via transversal operations, thus destroying the inherent 2DNN nature of the code. In this paper we introduce a new technique enabling the coupling of two planar codes without transversal operations, maintaining the 2DNN nature of the encoded computer. Our lattice surgery technique comprises splitting and merging planar code surfaces, and enables us to perform universal quantum computation (including magic state injection) while removing the need for braided logic in a strictly 2DNN design, hence reducing the overall qubit resources for logic operations. Those resources are further reduced by the use of a rotated lattice for the planar encoding. We show how lattice surgery allows us to distribute encoded GHZ states in a more direct (and overhead-friendly) manner, and how a demonstration of an encoded CNOT between two distance-3 logical states is possible with 53 physical qubits, half the number required in any other known construction in 2D.
    Comment: Published version. 29 pages, 18 figures
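
    The merge and split operations of lattice surgery act on the encoded qubits as joint parity measurements of logical Z x Z or X x X (which one depends on the boundary type). As a hedged illustration of why that suffices for a CNOT, the sketch below runs the corresponding measurement sequence on bare, unencoded qubits with a numpy statevector; the Pauli-correction rule is obtained here by standard stabilizer tracking, not quoted from the paper.

        import numpy as np

        I2 = np.eye(2)
        X = np.array([[0, 1], [1, 0]], dtype=complex)
        Z = np.diag([1.0 + 0j, -1.0])

        def kron(*ops):
            out = np.array([[1.0 + 0j]])
            for op in ops:
                out = np.kron(out, op)
            return out

        rng = np.random.default_rng(7)

        def measure(state, obs):
            """Projective measurement of a +/-1 observable; returns (bit m, state),
            where the observed eigenvalue is (-1)^m."""
            proj_plus = (np.eye(len(state)) + obs) / 2
            p_plus = np.real(state.conj() @ proj_plus @ state)
            m = 0 if rng.random() < p_plus else 1
            proj = proj_plus if m == 0 else (np.eye(len(state)) - obs) / 2
            post = proj @ state
            return m, post / np.linalg.norm(post)

        def cnot_by_parity(psi_ct):
            """CNOT on (control, target) via parity measurements through an
            ancilla prepared in |+>; register qubit order is (C, A, T)."""
            plus = np.array([1, 1], dtype=complex) / np.sqrt(2)
            state = np.einsum('ct,a->cat', psi_ct.reshape(2, 2), plus).reshape(8)
            a, state = measure(state, kron(Z, Z, I2))   # Z_C Z_A parity (merge-like)
            b, state = measure(state, kron(I2, X, X))   # X_A X_T parity (merge-like)
            c, state = measure(state, kron(I2, Z, I2))  # read out the ancilla
            if b:                                       # outcome-dependent corrections
                state = kron(Z, I2, I2) @ state
            if a ^ c:
                state = kron(I2, I2, X) @ state
            return state.reshape(2, 2, 2)[:, c, :].reshape(4)  # ancilla left in |c>

        CNOT = np.array([[1, 0, 0, 0], [0, 1, 0, 0],
                         [0, 0, 0, 1], [0, 0, 1, 0]], dtype=complex)
        for _ in range(100):
            psi = rng.normal(size=4) + 1j * rng.normal(size=4)
            psi /= np.linalg.norm(psi)
            out = cnot_by_parity(psi)
            assert abs(np.vdot(CNOT @ psi, out)) > 1 - 1e-9  # equal up to global phase
        print("parity-measurement CNOT agrees with CNOT on 100 random states")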