
    Measuring Measurement: Theory and Practice

    Recent efforts have applied quantum tomography techniques to the calibration and characterization of complex quantum detectors using minimal assumptions. In this work we provide detail and insight concerning the formalism, the experimental and theoretical challenges, and the scope of these tomographic tools. Our focus is on the detection of photons with avalanche photodiodes and photon-number-resolving detectors, and our approach is to fully characterize the quantum operators describing these detectors with a minimal set of well-specified assumptions. The formalism is completely general and can be applied to a wide range of detectors. Comment: 22 pages, 27 figures
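    As a concrete illustration of the tomographic idea, the sketch below reconstructs the POVM of a detector by linear inversion. The assumptions are mine, not the paper's: a phase-insensitive on/off detector (so the "click" POVM element is diagonal in the Fock basis), noiseless coherent-state probes, and an illustrative efficiency value.

```python
import numpy as np
from math import factorial

N = 8          # Fock-space cutoff
eta = 0.6      # simulated detector efficiency (assumption for the demo)

# "True" POVM diagonal for the click outcome of a lossy on/off detector:
# theta_n = 1 - (1 - eta)^n.
theta_true = 1.0 - (1.0 - eta) ** np.arange(N)

# Probe with coherent states of mean photon number mu; each row of F holds
# the Poissonian photon-number distribution of one probe state.
mus = np.linspace(0.05, 6.0, 40)
F = np.array([[np.exp(-m) * m**n / factorial(n) for n in range(N)]
              for m in mus])
p_click = F @ theta_true                 # simulated click probabilities

# Tomographic reconstruction: solve F @ theta = p_click in least squares.
# (Real reconstructions use constrained/regularised fits; plain lstsq with
# clipping is the bare-bones version.)
theta_est = np.clip(np.linalg.lstsq(F, p_click, rcond=None)[0], 0.0, 1.0)
```

    With noiseless data the inversion recovers the POVM diagonal essentially exactly; the point of the constrained fits discussed in the paper is to keep the estimate physical once statistical noise enters.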

    Photon Number Statistics of Multimode Parametric Down-Conversion

    We experimentally analyze the complete photon number statistics of parametric down-conversion and ascertain the influence of multimode effects. Our results clearly reveal a difference between the single-mode theoretical description and the measured distributions. Further investigations confirm the applicability of loss-tolerant photon number reconstruction and demonstrate strict photon-number correlation between the signal and idler modes. Comment: 5 pages, 3 figures
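    The multimode effect on the photon-number statistics can be illustrated with a standard result: the marginal statistics of M identical thermal modes form a negative binomial distribution, whose second-order correlation g(2) drops from 2 (single mode) toward 1 as M grows. A minimal sketch (the mean photon number and mode counts are illustrative):

```python
import numpy as np
from math import lgamma

def multimode_pdc_pn(mean_n, modes, cutoff=60):
    """Photon-number distribution of `modes` identical thermal modes
    (a negative binomial; modes=1 is the single-mode thermal case that
    describes one arm of single-mode PDC)."""
    mu = mean_n / modes                      # mean photons per mode
    logp = [lgamma(k + modes) - lgamma(k + 1) - lgamma(modes)
            + k * np.log(mu / (1 + mu)) - modes * np.log(1 + mu)
            for k in range(cutoff)]
    return np.exp(logp)

def g2(p):
    # g(2) = <n(n-1)> / <n>^2 for a photon-number distribution p.
    n = np.arange(len(p))
    return float((n * (n - 1) * p).sum() / (n * p).sum() ** 2)

p_single = multimode_pdc_pn(2.0, 1)   # thermal: g2 -> 2
p_multi = multimode_pdc_pn(2.0, 5)    # five modes: g2 -> 1 + 1/5
```

    Measuring a g(2) below the single-mode thermal value of 2 is exactly the kind of deviation that signals multimode behaviour in the measured distributions.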

    Manipulating the quantum information of the radial modes of trapped ions: Linear phononics, entanglement generation, quantum state transmission and non-locality tests

    We present a detailed study on the possibility of manipulating quantum information encoded in the "radial" modes of arrays of trapped ions (i.e., in the ions' oscillations orthogonal to the trap's main axis). In such systems, because of the tightness of transverse confinement, the radial modes pertaining to different ions can be addressed individually. In the first part of the paper we show that, if local control of the radial trapping frequencies is available, any linear optical and squeezing operation on the locally defined modes (on single as well as on many modes) can be reproduced by manipulating the frequencies. Then, we proceed to describe schemes able to generate unprecedented degrees of bipartite and multipartite continuous variable entanglement under realistic noisy working conditions, even when restricted to global control of the trapping frequencies. Furthermore, we consider the transmission of the quantum information encoded in the radial modes along the array of ions, and show it to be possible to a remarkable degree of accuracy, for both finite-dimensional and continuous variable quantum states. Finally, as an application, we show that the states which can be generated in this setting allow for the violation of multipartite non-locality tests, by feasible displaced parity measurements. Such a demonstration would be a first test of quantum non-locality for "massive" degrees of freedom (i.e., for degrees of freedom describing the motion of massive particles). Comment: 21 pages; this paper, presenting a far more extensive and detailed analysis, completely supersedes arXiv:0708.085
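    The statement that frequency manipulation reproduces linear optical and squeezing operations means the achievable transformations act as symplectic matrices on the mode quadratures. A minimal numerical check of this property (the quadrature ordering and the specific parameter values are assumptions of this sketch, not taken from the paper):

```python
import numpy as np

# Quadrature ordering (x1, p1, x2, p2); Omega is the symplectic form.
Omega = np.array([[0, 1, 0, 0],
                  [-1, 0, 0, 0],
                  [0, 0, 0, 1],
                  [0, 0, -1, 0]], dtype=float)

def beamsplitter(theta):
    # Mode-mixing (beam-splitter-like) transformation of two radial modes.
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, 0, s, 0],
                     [0, c, 0, s],
                     [-s, 0, c, 0],
                     [0, -s, 0, c]])

def squeezer(r):
    # Single-mode squeezing of mode 1 by parameter r.
    return np.diag([np.exp(-r), np.exp(r), 1.0, 1.0])

# Any composition of such operations preserves the symplectic form.
S = beamsplitter(0.7) @ squeezer(0.3)
assert np.allclose(S.T @ Omega @ S, Omega)   # S is symplectic
```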

    Avalanche Photo-Detection for High Data Rate Applications

    Avalanche photodetection is commonly used in applications which require single-photon sensitivity. We examine the limits of using avalanche photodiodes (APDs) for characterising photon statistics at high data rates. To identify the regime of linear APD operation we employ a ps-pulsed diode laser with variable repetition rates between 0.5 MHz and 80 MHz. We modify the mean optical power of the coherent pulses by applying different levels of well-calibrated attenuation. The linearity at high repetition rates is limited by the APD dead time, and a non-linear response arises at higher photon numbers due to multiphoton events. Assuming Poissonian input light statistics, we ascertain the effective mean photon number of the incident light with high accuracy. Time-multiplexed detectors (TMDs) achieve photon-number resolution by photon chopping. This detection setup extends the linear response function to higher photon numbers, and statistical methods may be used to compensate for the non-linearity. We investigate this effect, compare it to the single-APD case, and show the validity of the convolution treatment in the TMD data analysis. Comment: 16 pages, 5 figures
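    The photon-chopping idea behind the TMD can be sketched directly: n photons spread uniformly over B time bins produce k clicks with a probability given by inclusion-exclusion, and the measured click distribution is this matrix applied to the input photon-number distribution. The bin count and mean photon number below are illustrative choices, not values from the paper.

```python
import numpy as np
from math import comb, factorial

def chop_matrix(bins, nmax):
    # C[k, n] = probability that n photons, distributed uniformly over
    # `bins` time bins, produce clicks in exactly k distinct bins
    # ("photon chopping"; each bin is read out by a binary APD).
    C = np.zeros((bins + 1, nmax + 1))
    for n in range(nmax + 1):
        for k in range(bins + 1):
            s = sum((-1) ** j * comb(k, j) * ((k - j) / bins) ** n
                    for j in range(k + 1))
            C[k, n] = comb(bins, k) * s
    return C

# Coherent input with mean photon number 2 on an 8-bin TMD.
nmax, mean_n = 30, 2.0
pn = np.array([np.exp(-mean_n) * mean_n**n / factorial(n)
               for n in range(nmax + 1)])
C = chop_matrix(8, nmax)
p_click = C @ pn                                      # click-number statistics
mean_clicks = float((np.arange(9) * p_click).sum())   # < mean_n: saturation
```

    The gap between `mean_clicks` and the true mean photon number is the saturation effect that the statistical (convolution) treatment compensates for.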

    Reduced admixture of North Atlantic Deep Water to the deep central South Pacific during the last two glacial periods

    Key Points:
    ‱ Little change in deep-water circulation in the central South Pacific over the past 240,000 years
    ‱ Reduced North Atlantic Deep Water admixture to the Southern Ocean during glacials
    ‱ South Pacific lithogenic material mainly sourced from SE Australia and southern New Zealand

    The South Pacific is a sensitive location for the variability of the global oceanic thermohaline circulation, given that deep waters from the Atlantic Ocean, the Southern Ocean, and the Pacific basin are exchanged there. Here we reconstruct the deep-water circulation of the central South Pacific over the last two glacial cycles (from 240,000 years ago to the Holocene) based on radiogenic neodymium (Nd) and lead (Pb) isotope records, complemented by benthic stable carbon isotope data, obtained from two sediment cores located on the flanks of the East Pacific Rise. The records show small but consistent glacial/interglacial changes in all three isotopic systems, with interglacial average values of -5.8 and 18.757 for ÎNd and 206Pb/204Pb, respectively, whereas the glacial averages are -5.3 and 18.744. Comparison of this variability of Circumpolar Deep Water (CDW) with previously published records along the pathway of the global thermohaline circulation is consistent with reduced admixture of North Atlantic Deep Water (NADW) to CDW during cold stages. The absolute values and amplitudes of the benthic ÎŽ13C variations are essentially indistinguishable from other Southern Hemisphere records and confirm that the low central South Pacific sedimentation rates did not significantly reduce the amplitude of any of the measured proxies. In addition, the combined detrital Nd and strontium (87Sr/86Sr) isotope signatures imply that Australian and New Zealand dust has remained the principal contributor of lithogenic material to the central South Pacific.
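    For reference, the epsilon-Nd (ÎNd) values quoted above express the measured 143Nd/144Nd ratio as a parts-per-ten-thousand deviation from the chondritic uniform reservoir (CHUR); the CHUR value below is the commonly used reference value and is an assumption of this note, not stated in the abstract:

```latex
\epsilon_{\mathrm{Nd}} =
\left(
  \frac{(^{143}\mathrm{Nd}/^{144}\mathrm{Nd})_{\mathrm{sample}}}
       {(^{143}\mathrm{Nd}/^{144}\mathrm{Nd})_{\mathrm{CHUR}}} - 1
\right) \times 10^{4},
\qquad
(^{143}\mathrm{Nd}/^{144}\mathrm{Nd})_{\mathrm{CHUR}} = 0.512638
```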

    Measuring measurement

    Measurement connects the world of quantum phenomena to the world of classical events. It plays both a passive role, observing quantum systems, and an active one, preparing quantum states and controlling them. Surprisingly, given the central status of measurement in quantum mechanics, there is no general recipe for designing a detector that measures a given observable. Compounding this, the characterization of existing detectors is typically based on partial calibrations or elaborate models. Thus, experimental specification (i.e. tomography) of a detector is of fundamental and practical importance. Here, we present the realization of quantum detector tomography: we identify the optimal positive-operator-valued measure describing the detector, with no ancillary assumptions. This result completes the triad of state, process, and detector tomography required to fully specify an experiment. We characterize an avalanche photodiode and a photon-number-resolving detector capable of detecting up to eight photons. This creates a new set of tools for accurately detecting and preparing non-classical light. Comment: 6 pages, 4 figures; see video abstract at http://www.quantiki.org/video_abstracts/0807244

    Detector decoy quantum key distribution

    Photon number resolving detectors can enhance the performance of many practical quantum cryptographic setups. In this paper, we employ a simple method to estimate the statistics provided by such a photon number resolving detector using only a threshold detector together with a variable attenuator. This idea is similar in spirit to that of the decoy state technique, and is especially suited to scenarios where only a few parameters of the photon number statistics of the incoming signals have to be estimated. As an illustration of the potential applicability of the method in quantum communication protocols, we use it to prove security of an entanglement-based quantum key distribution scheme with an untrusted source, without the need for a squash model and solely by using this extra idea. In this sense, the detector decoy method can be seen as a different conceptual approach to adapting a single-photon security proof to its physical, full optical implementation. We show that in this scenario the legitimate users can even discard the double-click events from the raw key data without compromising the security of the scheme, and we present simulations of the performance of the BB84 and the 6-state quantum key distribution protocols. Comment: 27 pages, 7 figures
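    The estimation step described above is linear in the unknown photon-number statistics, which is what makes the decoy-style inversion work: a threshold detector behind an attenuator of transmittance eta stays silent with probability sum_n p_n (1 - eta)^n. A minimal sketch (the Poissonian demo input and the specific attenuation settings are assumptions, and real data would need a constrained fit rather than plain least squares):

```python
import numpy as np
from math import factorial

# Demo input statistics: Poissonian with mean 1.5 (assumption for the demo).
nmax, mean_n = 10, 1.5
p_true = np.array([np.exp(-mean_n) * mean_n**n / factorial(n)
                   for n in range(nmax)])
p_true /= p_true.sum()

# Threshold detector behind a variable attenuator with transmittance eta:
# P(no click | eta) = sum_n p_n (1 - eta)^n, linear in the unknown p_n.
etas = np.linspace(0.05, 1.0, 25)
A = np.array([[(1 - eta) ** n for n in range(nmax)] for eta in etas])
p_noclick = A @ p_true                   # ideal (noiseless) observations

# Decoy-style estimate of p_n, clipped and renormalised to a distribution.
p_est = np.clip(np.linalg.lstsq(A, p_noclick, rcond=None)[0], 0.0, None)
p_est /= p_est.sum()
```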

    Integrated Photonic Sensing

    Loss is a critical roadblock to achieving photonic quantum-enhanced technologies. We explore a modular platform for implementing integrated photonics experiments and consider the effects of loss at different stages of these experiments, including state preparation, manipulation and measurement. We frame our discussion mainly in the context of quantum sensing and focus particularly on the use of loss-tolerant Holland-Burnett states for optical phase estimation. In particular, we discuss spontaneous four-wave mixing in standard birefringent fibre as a source of pure, heralded single photons and present methods of optimising such sources. We also outline a route to programmable circuits which allow the control of photonic interactions even in the presence of fabrication imperfections, and describe a ratiometric characterisation method for beam splitters which allows the characterisation of complex circuits without the need for full process tomography. Finally, we present a framework for performing state tomography on heralded states using lossy measurement devices. This is motivated by a calculation of the effects of fabrication imperfections on precision measurement using Holland-Burnett states. Comment: 19 pages, 7 figures
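    The ratiometric idea can be illustrated in its simplest form: the ratio of reflected counts to total counts estimates a beam splitter's splitting ratio while shot-to-shot source-power drift cancels, so no stable calibrated source is needed. The sketch below shows only this cancellation, not the full multi-configuration method of the paper; all numbers and the setup are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical measurement: light of drifting power enters one port of a
# beam splitter; photons are counted at both outputs over many shots.
r_true = 0.30                                # assumed reflectivity
power = rng.uniform(0.5, 1.5, size=1000)     # shot-to-shot source drift
refl = rng.poisson(1e4 * power * r_true)
trans = rng.poisson(1e4 * power * (1 - r_true))

# The count ratio cancels the common power factor shot by shot.
r_est = refl.sum() / (refl + trans).sum()
```

    Note that cancelling detector efficiencies as well requires comparing counts across multiple circuit configurations, which is where the method in the paper goes beyond this sketch.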

    On Sustainable Ring-based Anonymous Systems

    Anonymous systems (e.g. anonymous cryptocurrencies and updatable anonymous credentials) often follow a construction template where an account can only perform a single anonymous action, which in turn potentially spawns new (and still single-use) accounts (e.g. a UTXO with a balance to spend or a session with a score to claim). Due to the anonymous nature of the action, no party can be sure which account has taken part in an action and must therefore maintain an ever-growing list of potentially unused accounts to ensure that the system keeps running correctly. Consequently, anonymous systems constructed from this common template are seemingly not sustainable. In this work, we study the sustainability of ring-based anonymous systems, where a user performing an anonymous action is hidden within a set of decoy users, traditionally called a "ring". On the positive side, we propose a general technique for ring-based anonymous systems to achieve sustainability. Along the way, we define a general model of decentralised anonymous systems (DAS) for arbitrary anonymous actions, and provide a generic construction which provably achieves sustainability. As a special case, we obtain the first construction of anonymous cryptocurrencies achieving sustainability without compromising availability. We also demonstrate the generality of our model by constructing sustainable decentralised anonymous social networks. On the negative side, we show empirically that Monero, one of the most popular anonymous cryptocurrencies, is unlikely to be sustainable without altering its current ring sampling strategy. The main subroutine is a sub-quadratic-time algorithm for detecting used accounts in a ring-based anonymous system.
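    The used-account detection that the paper's sub-quadratic subroutine performs at scale can be illustrated by a simple (but slower) fixed-point deduction, often called the "chain reaction" analysis in the literature on ring-signature traceability: whenever all but one member of a ring is already known to be spent, the remaining member must be that ring's true spender. `detect_spent` and the toy rings below are hypothetical, not from the paper.

```python
def detect_spent(rings):
    # Each ring (a list of account ids) hides one true spender among decoys.
    # If every member of a ring except one is already known to be spent,
    # the remaining member must be the spender; iterate until no more
    # accounts can be deduced.
    spent = set()
    changed = True
    while changed:
        changed = False
        for ring in rings:
            unspent = [a for a in ring if a not in spent]
            if len(unspent) == 1:
                spent.add(unspent[0])
                changed = True
    return spent

# "a" is forced (ring of size 1), which then forces "b" in the second ring;
# the remaining rings keep their spenders ambiguous.
rings = [["a"], ["a", "b"], ["b", "c", "d"], ["c", "d", "e"]]
```

    Each deduced account shrinks other rings' effective anonymity sets, which is the mechanism behind the empirical sustainability analysis of Monero's ring sampling.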

    Antiphased dust deposition and productivity in the Antarctic Zone over 1.5 million years

    Southern Ocean paleoceanography provides key insights into how iron fertilization and oceanic productivity developed through Pleistocene ice ages, and into their role in influencing the carbon cycle. We report a high-resolution record of dust deposition and ocean productivity for the Antarctic Zone, close to the main dust source, Patagonia. Our deep-ocean records cover the last 1.5 Ma, thus doubling the span of the Antarctic ice-core records. We find a 5- to 15-fold increase in dust deposition during glacials and a 2- to 5-fold increase in biogenic silica deposition, reflecting higher ocean productivity, during interglacials. This antiphasing persisted throughout the last 25 glacial cycles. Dust deposition became more pronounced across the Mid-Pleistocene Transition (MPT) in the Southern Hemisphere, with an abrupt shift suggesting more severe glaciations since ~0.9 Ma. Productivity was intermediate before the MPT, lowest during the MPT, and highest since 0.4 Ma. Generally, glacials experienced extended sea-ice cover, reduced bottom-water export and Weddell Gyre dynamics, which helped lower atmospheric CO2 levels.
    • 

    corecore