
    Quantum key distribution using gaussian-modulated coherent states

    Quantum continuous variables are being explored as an alternative means to implement quantum key distribution, which is usually based on single photon counting. The former approach is potentially advantageous because it should enable higher key distribution rates. Here we propose and experimentally demonstrate a quantum key distribution protocol based on the transmission of gaussian-modulated coherent states (consisting of laser pulses containing a few hundred photons) and shot-noise-limited homodyne detection; squeezed or entangled beams are not required. Complete secret key extraction is achieved using a reverse reconciliation technique followed by privacy amplification. The reverse reconciliation technique is in principle secure for any value of the line transmission, against gaussian individual attacks based on entanglement and quantum memories. Our table-top experiment yields a net key transmission rate of about 1.7 megabits per second for a loss-free line, and 75 kilobits per second for a line with losses of 3.1 dB. We anticipate that the scheme should remain effective for lines with higher losses, particularly because the present limitations are essentially technical, so that significant margin for improvement is available in both hardware and software. (Comment: 8 pages, 4 figures)
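The claim that reverse reconciliation is secure for any line transmission can be illustrated with a toy calculation (not the paper's full security analysis): model the pure-loss channel as a beam splitter of transmission T, with Eve measuring the tapped mode, and work in shot-noise units. The modulation variance and loss values below are illustrative choices, not the experiment's parameters.

```python
import math

def rr_key_rate(T, VA):
    """Per-symbol secret key rate (bits) for a Gaussian-modulated
    coherent-state protocol over a pure-loss channel, assuming an
    individual beam-splitter attack and reverse reconciliation.
    All variances are in shot-noise units; VA is Alice's modulation variance."""
    V = VA + 1.0                        # total variance of Alice's mode
    VB = T * VA + 1.0                   # Bob's measured quadrature variance
    I_AB = 0.5 * math.log2(VB)          # V(B|A) = 1 shot-noise unit
    VE = (1.0 - T) * V + T              # variance of Eve's tapped mode
    cov = math.sqrt(T * (1.0 - T)) * VA # Bob-Eve quadrature correlation
    VB_E = VB - cov**2 / VE             # Bob's variance conditioned on Eve
    I_BE = 0.5 * math.log2(VB / VB_E)
    return I_AB - I_BE                  # reverse-reconciliation rate

for loss_db in (0.0, 3.1, 10.0):
    T = 10 ** (-loss_db / 10.0)
    print(f"{loss_db:4.1f} dB loss: {rr_key_rate(T, VA=10.0):.3f} bits/symbol")
```

In this simplified model the rate stays positive for every T > 0, which is the qualitative point of the abstract: with reverse reconciliation, loss alone never pushes Eve's information above Bob's.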

    Continuous Variable Quantum Cryptography using Two-Way Quantum Communication

    Quantum cryptography has been recently extended to continuous variable systems, e.g., the bosonic modes of the electromagnetic field. In particular, several cryptographic protocols have been proposed and experimentally implemented using bosonic modes with Gaussian statistics. Such protocols have shown the possibility of reaching very high secret-key rates, even in the presence of strong losses in the quantum communication channel. Despite this robustness to loss, their security can be affected by more general attacks where extra Gaussian noise is introduced by the eavesdropper. In this general scenario we show a "hardware solution" for enhancing the security thresholds of these protocols. This is possible by extending them to a two-way quantum communication where subsequent uses of the quantum channel are suitably combined. In the resulting two-way schemes, one of the honest parties assists the secret encoding of the other with the chance of a non-trivial superadditive enhancement of the security thresholds. Such results enable the extension of quantum cryptography to more complex quantum communications. (Comment: 12 pages, 7 figures, REVTeX)

    Continuous variable quantum key distribution with two-mode squeezed states

    Quantum key distribution (QKD) enables two remote parties to grow a shared key which they can use for unconditionally secure communication [1]. The applicable distance of a QKD protocol depends on the loss and the excess noise of the connecting quantum channel [2-10]. Several QKD schemes based on coherent states and continuous variable (CV) measurements are resilient to high loss in the channel, but strongly affected by small amounts of channel excess noise [2-6]. Here we propose and experimentally address a CV QKD protocol which uses fragile squeezed states combined with a large coherent modulation to greatly enhance the robustness to channel noise. As a proof of principle we experimentally demonstrate that the resulting QKD protocol can tolerate more noise than the benchmark set by the ideal CV coherent state protocol. Our scheme represents a very promising avenue for extending the distance for which secure communication is possible. (Comment: 8 pages, 5 figures)

    Quantum memory for entangled two-mode squeezed states

    A quantum memory for light is a key element for the realization of future quantum information networks. Requirements for a good quantum memory are (i) versatility (allowing a wide range of inputs) and (ii) true quantum coherence (preserving quantum information). Here we demonstrate such a quantum memory for states possessing Einstein-Podolsky-Rosen (EPR) entanglement. These multi-photon states are two-mode squeezed by 6.0 dB with a variable orientation of squeezing and displaced by a few vacuum units. This range encompasses typical input alphabets for a continuous variable quantum information protocol. The memory consists of two cells, one for each mode, filled with cesium atoms at room temperature, with a memory time of about 1 ms. The preservation of quantum coherence is rigorously proven by showing that the experimental memory fidelity 0.52(2) significantly exceeds the benchmark of 0.45 for the best possible classical memory for a range of displacements. (Comment: main text 5 pages, supplementary information 3 pages)

    The RR Lyrae Distance Scale

    We review seven methods of measuring the absolute magnitude M_V of RR Lyrae stars in light of the Hipparcos mission and other recent developments. We focus on identifying possible systematic errors and rank the methods by relative immunity to such errors. For the three most robust methods, statistical parallax, trigonometric parallax, and cluster kinematics, we find M_V (at [Fe/H] = -1.6) of 0.77 +/- 0.13, 0.71 +/- 0.15, and 0.67 +/- 0.10, respectively. These methods cluster consistently around 0.71 +/- 0.07. We find that Baade-Wesselink and theoretical models both yield a broad range of possible values (0.45-0.70 and 0.45-0.65) due to systematic uncertainties in the temperature scale and input physics. Main-sequence fitting gives a much brighter M_V = 0.45 +/- 0.04, but this may be due to a difference in the metallicity scales of the cluster giants and the calibrating subdwarfs. White-dwarf cooling-sequence fitting gives 0.67 +/- 0.13 and is potentially very robust, but at present is too new to be fully tested for systematics. If the three most robust methods are combined with Walker's mean measurement for 6 LMC clusters, V_{0,LMC} = 18.98 +/- 0.03 at [Fe/H] = -1.9, then mu_{LMC} = 18.33 +/- 0.08. (Comment: Invited review article to appear in: `Post-Hipparcos Cosmic Candles', A. Heck & F. Caputo (Eds), Kluwer Academic Publ., Dordrecht, in press. 21 pages including 1 table; uses Kluwer's crckapb.sty LaTeX style file, enclosed)
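The quoted combination of the three most robust methods is an inverse-variance weighted mean, which can be checked directly from the three values given above:

```python
import math

# (M_V, sigma) for statistical parallax, trigonometric parallax,
# and cluster kinematics, all at [Fe/H] = -1.6 (values from the review)
estimates = [(0.77, 0.13), (0.71, 0.15), (0.67, 0.10)]

weights = [1.0 / s**2 for _, s in estimates]           # inverse-variance weights
mean = sum(w * m for (m, _), w in zip(estimates, weights)) / sum(weights)
sigma = 1.0 / math.sqrt(sum(weights))                  # uncertainty of the mean

print(f"M_V = {mean:.2f} +/- {sigma:.2f}")             # M_V = 0.71 +/- 0.07
```

This reproduces the 0.71 +/- 0.07 quoted in the abstract; note the combined uncertainty assumes the three methods' errors are independent.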

    Mathematical modelling of polyamine metabolism in bloodstream-form Trypanosoma brucei: An application to drug target identification

    We present the first computational kinetic model of polyamine metabolism in bloodstream-form Trypanosoma brucei, the causative agent of human African trypanosomiasis. We systematically extracted the polyamine pathway from the complete metabolic network while still maintaining the predictive capability of the pathway. The kinetic model is constructed on the basis of information gleaned from the experimental biology literature and defined as a set of ordinary differential equations. We applied Michaelis-Menten kinetics featuring regulatory factors to describe enzymatic activities that are well defined. Uncharacterised enzyme kinetics were approximated and justified with available physiological properties of the system. Optimisation-based dynamic simulations were performed to train the model with experimental data, and inconsistent predictions prompted an iterative procedure of model refinement. Good agreement between simulation results and measured data reported in various experimental conditions shows that the model has good applicability in spite of there being gaps in the required data. With this kinetic model, the relative importance of the individual pathway enzymes was assessed. We observed that, at low-to-moderate levels of inhibition, the enzymes catalysing de novo AdoMet production (MAT) and ornithine production (OrnPt) have a more efficient inhibitory effect on total trypanothione content than other enzymes in the pathway. In our model, prozyme and TSHSyn (the production catalyst of total trypanothione) were also found to exhibit potent control on total trypanothione content, but only when they were strongly inhibited.
    Different chemotherapeutic strategies against T. brucei were investigated using this model, and interruption of polyamine synthesis via joint inhibition of MAT or OrnPt together with other polyamine enzymes was identified as an optimal therapeutic strategy. The work was carried out under a PhD programme partly funded by Prof. Ray Welland, School of Computing Science, University of Glasgow.
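The Michaelis-Menten rate law on which such models are built can be sketched for a single hypothetical enzyme step; the parameter values below are illustrative and not taken from the T. brucei model, and the integrator is a deliberately simple forward-Euler loop rather than the optimisation machinery the authors use.

```python
def simulate_mm(s0, vmax, km, t_end=50.0, dt=0.01):
    """Integrate dS/dt = -Vmax*S/(Km+S), dP/dt = +Vmax*S/(Km+S)
    with forward Euler; returns final substrate and product levels."""
    s, p = s0, 0.0
    for _ in range(int(t_end / dt)):
        rate = vmax * s / (km + s)   # Michaelis-Menten rate law
        s -= rate * dt               # substrate consumed...
        p += rate * dt               # ...appears as product (mass conserved)
    return s, p

s_end, p_end = simulate_mm(s0=10.0, vmax=1.0, km=2.5)
print(f"substrate {s_end:.3f}, product {p_end:.3f}")
```

The rate is nearly zero-order while S >> Km and decays exponentially (rate constant ~ Vmax/Km) once S << Km, which is why enzyme inhibition (lowering the effective Vmax) has very different effects at different substrate levels.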

    Paleophysical Oceanography with an Emphasis on Transport Rates

    Paleophysical oceanography is the study of the behavior of the fluid ocean of the past, with a specific emphasis on its climate implications, leading to a focus on the general circulation. Even if the circulation is not of primary concern, heavy reliance on deep-sea cores for past climate information means that knowledge of the oceanic state when the sediments were laid down is a necessity. Like the modern problem, paleoceanography depends heavily on observations, and central difficulties lie with the very limited data types and coverage that are, and perhaps ever will be, available. An approximate separation can be made into static descriptors of the circulation (e.g., its water-mass properties and volumes) and the more difficult problem of determining transport rates of mass and other properties. Determination of the circulation of the Last Glacial Maximum is used to outline some of the main challenges to progress. Apart from sampling issues, major difficulties lie with physical interpretation of the proxies, transferring core depths to an accurate timescale (the “age-model problem”), and understanding the accuracy of time-stepping oceanic or coupled-climate models when run unconstrained by observations. Despite the existence of many plausible explanatory scenarios, few features of the paleocirculation in any period are yet known with certainty. (National Science Foundation (U.S.) grant OCE-0645936)

    Single view silhouette fitting techniques for estimating tennis racket position

    Stereo camera systems have been used to track markers attached to a racket, allowing its position to be obtained in three-dimensional (3D) space. Typically, markers are manually selected on the image plane, but this can be time-consuming. A markerless system based on one stationary camera estimating 3D racket position data is desirable for research and play. The markerless method presented in this paper relies on a set of racket silhouette views in a common reference frame captured with a calibrated camera, and a silhouette of a racket captured with a camera whose relative pose is outside the common reference frame. The aim of this paper is to provide validation of these single view fitting techniques to estimate the pose of a tennis racket. This includes the development of a calibration method to provide the relative pose of a stationary camera with respect to a racket. Mean static racket position was reconstructed to within ±2 mm. Computer-generated camera poses and silhouette views of a full-size racket model were used to demonstrate the potential of the method to estimate 3D racket position during a simplified serve scenario. From a camera distance of 14 m, 3D racket position was estimated with a spatial accuracy of 1.9 ± 0.14 mm, similar to that of recent 3D video marker-tracking studies of tennis.
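Silhouette fitting of this kind rests on a standard pinhole projection of the racket model into the camera image. A minimal sketch, in which the focal length and image centre are hypothetical values (only the 14 m stand-off is borrowed from the abstract) and the rotation is taken as the identity for simplicity:

```python
def project(point, f_px, cx, cy, cam_dist):
    """Project a 3D point (metres, racket frame) to pixel coordinates
    for a pinhole camera cam_dist metres away along the optical axis,
    with identity rotation for simplicity."""
    x, y, z = point
    zc = z + cam_dist            # depth in the camera frame
    u = f_px * x / zc + cx       # perspective division, then shift
    v = f_px * y / zc + cy       # to the principal point (cx, cy)
    return u, v

# a point on the racket face, 10 cm off-axis, viewed from 14 m
u, v = project((0.10, 0.0, 0.0), f_px=2000.0, cx=960.0, cy=540.0, cam_dist=14.0)
print(f"pixel: ({u:.1f}, {v:.1f})")
```

At this hypothetical focal length one pixel subtends roughly 14/2000 ≈ 7 mm at 14 m, so the millimetre-level accuracies reported above imply sub-pixel silhouette fitting.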