Surface code implementation of block code state distillation
State distillation is the process of taking a number of imperfect copies of a particular quantum state and producing fewer, better copies. Until recently, the lowest-overhead method of distilling the state |A> = (|0> + e^{i\pi/4}|1>)/\sqrt{2} produced a single improved |A> state from 15 input copies. New block code state distillation methods can produce k improved |A> states from 3k+8 input copies, potentially significantly reducing the overhead associated with state distillation. We construct an explicit surface code implementation of block code state distillation and quantitatively compare its overhead to that of the old approach. We find that, using the best available techniques and for parameters of practical interest, block code state distillation does not always lead to lower overhead, and, when it does, the overhead reduction is typically less than a factor of three.
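As a back-of-the-envelope illustration of the counting argument in this abstract, the sketch below compares the raw input-state cost of the two schemes (15 inputs per output versus (3k+8)/k inputs per output). The function names are ours, and the comparison deliberately ignores the surface-code spacetime costs the paper actually measures, which is why the realized savings reported above are smaller than the naive ratio.

```python
# Raw |A>-state input counts for the two distillation schemes. This is
# only the counting argument from the abstract; the paper compares full
# surface-code spacetime overhead, which shrinks the apparent advantage.

def inputs_per_output_15_to_1() -> float:
    """Classic 15-to-1 distillation: 15 raw states per improved state."""
    return 15.0

def inputs_per_output_block(k: int) -> float:
    """Block code distillation: 3k + 8 raw states yield k improved states."""
    return (3 * k + 8) / k

if __name__ == "__main__":
    for k in (1, 2, 4, 8, 16, 64):
        block = inputs_per_output_block(k)
        print(f"k={k:3d}: block code uses {block:5.2f} inputs/output "
              f"(naive ratio vs 15-to-1: {inputs_per_output_15_to_1() / block:.2f}x)")
    # As k grows, (3k+8)/k -> 3, so the naive input-count ratio approaches
    # 5x; the paper's surface-code analysis finds the practical reduction,
    # when there is one, is typically below 3x.
```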
Theories of Reference: What Was the Question?
The new theory of reference has won popularity. However, a number of noted philosophers have also attempted to reply to the critical arguments of Kripke and others, and aimed to vindicate the description theory of reference. Such responses are often based on ingenious novel kinds of descriptions, such as rigidified descriptions, causal descriptions, and metalinguistic descriptions. This prolonged debate raises the doubt of whether the different parties share any common understanding of what the central question of the philosophical theory of reference is: that is, what is the main question to which descriptivism and the causal-historical theory have presented competing answers? One aim of the paper is to clarify this issue. The most influential objections to the new theory of reference are critically reviewed. Special attention is also paid to certain important later advances in the new theory of reference, due to Devitt and others.
Subspace confinement: how good is your qubit?
The basic operating element of standard quantum computation is the qubit, an isolated two-level system that can be accurately controlled, initialized, and measured. However, the majority of proposed physical architectures for quantum computation are built from systems that contain much more complicated Hilbert space structures. Hence, defining a qubit requires the identification of an appropriate controllable two-dimensional sub-system. This prompts the obvious question of how well a qubit, thus defined, is confined to this subspace, and whether we can experimentally quantify the potential leakage into states outside the qubit subspace. We demonstrate how subspace leakage can be characterized using minimal theoretical assumptions by examining the Fourier spectrum of the oscillation experiment.
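To make the Fourier-spectrum idea concrete, here is a minimal numerical sketch: an ideal two-level oscillation (e.g. Rabi flopping) contributes a single frequency, so additional spectral peaks in the measured signal flag population leaking outside the qubit subspace. The frequencies, amplitudes, and noise level below are illustrative assumptions, not values from the paper.

```python
# Synthetic oscillation data with a small spurious tone standing in for
# leakage dynamics; any significant peak away from the qubit frequency
# indicates imperfect subspace confinement. All numbers are illustrative.
import numpy as np

rng = np.random.default_rng(0)

t = np.linspace(0.0, 10.0, 2048)          # evolution times (arb. units)
f_qubit, f_leak = 1.0, 2.6                # qubit vs hypothetical leakage freq.
signal = (0.5 * (1 - np.cos(2 * np.pi * f_qubit * t))   # ideal oscillation
          + 0.02 * np.cos(2 * np.pi * f_leak * t)       # small leakage term
          + 0.01 * rng.standard_normal(t.size))         # measurement noise

spectrum = np.abs(np.fft.rfft(signal - signal.mean()))  # drop DC offset
freqs = np.fft.rfftfreq(t.size, d=t[1] - t[0])

# Flag bins well above the noise floor; a clean qubit shows only f_qubit.
peaks = freqs[spectrum > 0.02 * spectrum.max()]
print("spectral peaks (arb. units):", np.round(peaks, 2))
```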
Integration of highly probabilistic sources into optical quantum architectures: perpetual quantum computation
In this paper we introduce a design for an optical topological cluster state computer constructed exclusively from a single quantum component. Unlike previous efforts, we eliminate the need for on-demand, high-fidelity photon sources and detectors and replace them with the same device utilised to create photon/photon entanglement. This introduces highly probabilistic elements into the optical architecture while maintaining complete specificity of the structure and operation of a large-scale computer. Photons in this system are continually recycled back into the preparation network, allowing an arbitrarily deep 3D cluster to be prepared using a comparatively small number of photonic qubits and, consequently, eliminating the need for high-frequency, deterministic photon sources.
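The following toy Monte Carlo illustrates the recycling trade-off described above: with a probabilistic entangling operation of per-attempt success probability p (a stand-in for the paper's single quantum component; the value of p and the bond count are our assumptions), failed photons return to the preparation pool, so the cost of building a deep cluster shows up as repeated attempts rather than as a growing stock of photon sources.

```python
# Toy model: each bond of the cluster is attempted with a probabilistic
# entangling operation (success probability p); photons from failed
# attempts are recycled and retried. Expected cost is ~1/p attempts per
# bond. `p` and `num_bonds` are illustrative, not values from the paper.
import random

def attempts_to_build(num_bonds: int, p: float, rng: random.Random) -> int:
    """Count total entangling attempts until every bond has succeeded."""
    attempts = 0
    for _ in range(num_bonds):
        while True:                 # failed photons are recycled, so we
            attempts += 1           # only pay in repeated attempts
            if rng.random() < p:
                break
    return attempts

rng = random.Random(42)
for p in (0.5, 0.1, 0.01):
    runs = [attempts_to_build(1000, p, rng) for _ in range(20)]
    mean = sum(runs) / len(runs)
    print(f"p={p:>5}: ~{mean:8.0f} attempts for 1000 bonds "
          f"(expected {1000 / p:.0f})")
```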
