5,014 research outputs found

    Relaying and routing in wireless networks: a throughput comparison


    Quantum Nonlocality for a Mixed Entangled Coherent State

    Quantum nonlocality is tested for an entangled coherent state interacting with a dissipative environment. A pure entangled coherent state violates Bell's inequality regardless of its coherent amplitude. The higher the initial nonlocality, the more rapidly quantum nonlocality is lost. The entangled coherent state can also be investigated in the framework of a 2×2 Hilbert space, where the quantum nonlocality persists longer. When it decoheres, the entangled coherent state is found to fail the nonlocality test, in contrast with the fact that the decohered entangled state is always entangled.
    Comment: 20 pages, 7 figures. To be published in J. Mod. Op
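    The 2×2 Hilbert-space framing above can be illustrated with a textbook CHSH test on a maximally entangled two-qubit state. This is a generic sketch of a Bell-inequality check, not the paper's coherent-state calculation; the observables and angle choices are the standard ones that maximise the violation.

    ```python
    import numpy as np

    # Pauli matrices
    X = np.array([[0, 1], [1, 0]], dtype=complex)
    Z = np.array([[1, 0], [0, -1]], dtype=complex)

    def obs(theta):
        """Spin observable cos(theta)*Z + sin(theta)*X (eigenvalues ±1)."""
        return np.cos(theta) * Z + np.sin(theta) * X

    # Maximally entangled Bell state |Phi+> = (|00> + |11>)/sqrt(2)
    phi = np.zeros(4, dtype=complex)
    phi[0] = phi[3] = 1 / np.sqrt(2)

    def corr(a, b):
        """Correlation <A(a) ⊗ B(b)> in the state |Phi+>; equals cos(a - b)."""
        M = np.kron(obs(a), obs(b))
        return np.real(phi.conj() @ M @ phi)

    # Standard CHSH measurement angles maximising the violation
    a0, a1 = 0.0, np.pi / 2
    b0, b1 = np.pi / 4, -np.pi / 4

    # CHSH combination: any local hidden-variable model obeys |S| <= 2
    S = corr(a0, b0) + corr(a0, b1) + corr(a1, b0) - corr(a1, b1)
    print(S)  # ≈ 2.828 (= 2√2, the Tsirelson bound), violating the classical bound 2
    ```

    Modelling the decoherence discussed in the abstract would replace the pure state with a density matrix whose off-diagonal terms decay, shrinking S below 2 over time.
    
    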

    Efficient optical quantum information processing

    Quantum information offers the promise of performing certain communication and computation tasks that cannot be done with conventional information technology (IT). Optical quantum information processing (QIP) holds particular appeal, since it offers the prospect of communicating and computing with the same type of qubit. Linear optical techniques have been shown to be scalable, but the corresponding quantum computing circuits need many auxiliary resources. Here we present an alternative approach to optical QIP, based on the use of weak cross-Kerr nonlinearities and homodyne measurements. We show how this approach provides the fundamental building blocks for highly efficient non-absorbing single-photon number-resolving detectors, two-qubit parity detectors, Bell-state measurements and, finally, near-deterministic controlled-NOT (CNOT) gates. These are essential QIP devices.
    Comment: Accepted to the Journal of Optics B special issue on optical quantum computation; References update

    Single photon quantum non-demolition in the presence of inhomogeneous broadening

    Electromagnetically induced transparency (EIT) has often been proposed for generating nonlinear optical effects at the single-photon level; in particular, as a means to effect a quantum non-demolition measurement of a single-photon field. Previous treatments have usually considered homogeneously broadened samples, but realisations in any medium will have to contend with inhomogeneous broadening. Here we reappraise an earlier scheme [Munro et al., Phys. Rev. A 71, 033819 (2005)] with respect to inhomogeneities and show an alternative mode of operation that is preferred in an inhomogeneous environment. We further show the implications of these results for a potential implementation in diamond containing nitrogen-vacancy colour centres. Our modelling shows that single-mode waveguide structures of length 200 μm in single-crystal diamond containing a dilute ensemble of only 200 NV⁻ centres are sufficient for quantum non-demolition measurements using EIT-based weak nonlinear interactions.
    Comment: 21 pages, 9 figures (some in colour) at low resolution for arXiv purposes

    On the integration of concurrency, distribution and persistence

    The principal tenet of the persistence model is that it abstracts over all the physical properties of data: how long it is stored, where and how it is stored, what form it is kept in, and who is using it. Experience with programming systems that support orthogonal persistence has shown that the simpler semantics and reduced complexity can often lead to a significant reduction in software production costs. Persistent systems are relatively new, and it is not yet clear which of the many models of concurrency and distribution best suit the persistence paradigm. Previous work in this area has tended to build one chosen model into the system, which may then be applicable only to a particular set of problems. This thesis challenges the orthodoxy by designing a persistent framework into which all models of concurrency and distribution can be integrated in an add-on fashion. The provision of such a framework is complicated by a tension between the conceptual ideas of persistence and the intrinsic properties of concurrency and distribution. The approach taken is to integrate the spectra of concurrency and distribution abstractions into the persistence model in a manner that does not prevent the user from being able to reason about program behaviour. As examples of the reference model, a number of different styles of concurrency and distribution have been designed and incorporated into the persistent programming system Napier88. A detailed treatment of these models and their implementations is given.

    qBitcoin: A Peer-to-Peer Quantum Cash System

    A decentralized online quantum cash system, called qBitcoin, is presented. We design the system to gain the following benefits from quantization. First, quantum teleportation is used for coin transactions, which prevents the owner of a coin from keeping the original coin data after sending the coin to another party. This was a main problem in the classical setting, and the blockchain was introduced to solve it. In qBitcoin the double-spending problem never arises, and its security is guaranteed theoretically by virtue of quantum information theory. Making a block is time-consuming, so qBitcoin is based on a quantum chain instead of blocks; a payment can therefore be completed much faster than in Bitcoin. Moreover, we employ a quantum digital signature, so the system naturally inherits the properties of the peer-to-peer (P2P) cash system originally proposed in Bitcoin.
    Comment: 11 pages, 2 figures

    Integration of highly probabilistic sources into optical quantum architectures: perpetual quantum computation

    In this paper we introduce a design for an optical topological cluster-state computer constructed exclusively from a single quantum component. Unlike previous efforts, we eliminate the need for on-demand, high-fidelity photon sources and detectors and replace them with the same device used to create photon-photon entanglement. This introduces highly probabilistic elements into the optical architecture while maintaining complete specificity of the structure and operation of a large-scale computer. Photons in this system are continually recycled back into the preparation network, allowing an arbitrarily deep 3D cluster to be prepared using a comparatively small number of photonic qubits, and consequently eliminating the need for high-frequency, deterministic photon sources.
    Comment: 19 pages, 13 Figs (2 Appendices with additional Figs.). Comments welcome