1,467 research outputs found

    Long-time Low-latency Quantum Memory by Dynamical Decoupling

    Quantum memory is a central component of quantum information processing devices and will be required to provide high-fidelity storage of arbitrary states, long storage times, and small access latencies. Despite growing interest in applying physical-layer error-suppression strategies to boost fidelities, it has not previously been possible to meet such competing demands with a single approach. Here we use an experimentally validated theoretical framework to identify periodic repetition of a high-order dynamical decoupling sequence as a systematic strategy to meet these challenges. We provide analytic bounds, validated by numerical calculations, on the characteristics of the relevant control sequences and show that a "stroboscopic saturation" of coherence, or coherence plateau, can be engineered even in the presence of experimental imperfection. This permits high-fidelity storage for exceptionally long times, meaning that our device-independent results should prove instrumental in producing practically useful quantum technologies.
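
    The filter-function picture behind this result can be illustrated numerically. The sketch below is a minimal toy model, not the authors' framework: the noise spectrum, its cutoff, and the normalization convention for chi are all assumptions. It computes the coherence W(T) = exp(-chi(T)) of a dephasing qubit under CPMG-style pulse sequences and shows how a hard high-frequency cutoff in the noise can produce the plateau-like saturation the abstract describes.

```python
# Toy filter-function calculation of dephasing under dynamical decoupling.
# Spectrum, cutoff, and the 1/(2*pi) normalization are illustrative choices;
# conventions differ between papers.
import numpy as np

def switching_times(total_time, n_pulses):
    """Ideal CPMG pi-pulse times: t_k = (k - 1/2) * T / n."""
    k = np.arange(1, n_pulses + 1)
    return (k - 0.5) * total_time / n_pulses

def filter_integrand(omega, total_time, pulse_times):
    """|Y(omega)|^2 for a piecewise-constant +/-1 switching function."""
    edges = np.concatenate(([0.0], pulse_times, [total_time]))
    signs = (-1.0) ** np.arange(len(edges) - 1)       # +1, -1, +1, ...
    Y = np.zeros_like(omega, dtype=complex)
    for s, t0, t1 in zip(signs, edges[:-1], edges[1:]):
        Y += s * (np.exp(1j * omega * t1) - np.exp(1j * omega * t0)) / (1j * omega)
    return np.abs(Y) ** 2

def coherence(total_time, n_pulses, S):
    """W(T) = exp(-chi), chi = (1/2pi) * integral S(w) |Y(w)|^2 dw (one convention)."""
    omega = np.linspace(1e-3, 200.0, 20000)
    taus = switching_times(total_time, n_pulses)
    chi = np.trapz(S(omega) * filter_integrand(omega, total_time, taus), omega) / (2 * np.pi)
    return np.exp(-chi)

# Toy 1/f-like spectrum with a sharp high-frequency cutoff; a cutoff is what
# makes a coherence plateau possible in the first place.
S = lambda w: (1.0 / w) * (w < 20.0)

for T in [1.0, 2.0, 4.0, 8.0]:
    print(f"T={T:4.1f}  free evolution W={coherence(T, 0, S):.3f}  "
          f"CPMG n=32 W={coherence(T, 32, S):.3f}")
```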

    Generation and Suppression of Decoherence in Artificial Environment for Qubit System

    It is known that a quantum system with finite degrees of freedom can simulate a composite of a system and an environment if the state of the hypothetical environment is randomized by external manipulation. We show theoretically that any phase decoherence phenomenon of a single qubit can be simulated with a two-qubit system and experimentally demonstrate two examples: phase decoherence of a single qubit in a transmission line, and phase decoherence in a quantum memory. We perform NMR experiments employing a two-spin molecule and clearly measure decoherence in both cases. We also prove experimentally that bang-bang control efficiently suppresses decoherence.
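
    The bang-bang suppression the authors demonstrate can be mimicked with a classical dephasing toy model. In the sketch below, the Ornstein-Uhlenbeck noise process, its parameters, and the pulse spacing are all assumptions chosen for illustration; the point is only that rapidly flipping the sign of the accumulated phase refocuses slow noise.

```python
# Toy model (not the paper's NMR experiment): a qubit dephasing under a
# slowly fluctuating classical frequency shift, with and without ideal
# "bang-bang" pi pulses that flip the sign of phase accumulation.
import numpy as np

rng = np.random.default_rng(0)

def simulate(n_traj=2000, T=4.0, dt=1e-3, tau_c=1.0, sigma=2.0, pulse_every=None):
    """Return |<exp(i*phi)>| at time T over an ensemble of noise trajectories."""
    steps = int(T / dt)
    phases = np.zeros(n_traj)
    delta = rng.normal(0.0, sigma, n_traj)           # initial frequency offsets
    sign = 1.0                                        # flips at each pi pulse
    for step in range(1, steps + 1):
        # Ornstein-Uhlenbeck update: correlated noise, correlation time tau_c
        delta += (-delta / tau_c) * dt + sigma * np.sqrt(2 * dt / tau_c) * rng.normal(size=n_traj)
        phases += sign * delta * dt
        if pulse_every and step % pulse_every == 0:
            sign = -sign                              # instantaneous ideal pi pulse
    return abs(np.mean(np.exp(1j * phases)))

print("free evolution  :", simulate(pulse_every=None))
print("bang-bang pulses:", simulate(pulse_every=50))  # one pulse per 0.05 time units
```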

    Decoherence-protected quantum gates for a hybrid solid-state spin register

    Protecting the dynamics of coupled quantum systems from decoherence by the environment is a key challenge for solid-state quantum information processing. An idle qubit can be efficiently insulated from the outside world via dynamical decoupling, as has recently been demonstrated for individual solid-state qubits. However, protecting qubit coherence during a multi-qubit gate poses a non-trivial problem: in general the decoupling disrupts the inter-qubit dynamics and hence conflicts with gate operation. This problem is particularly salient for hybrid systems, wherein different types of qubits evolve and decohere at vastly different rates. Here we present the integration of dynamical decoupling into quantum gates for a paradigmatic hybrid system, the electron-nuclear spin register. Our design harnesses the internal resonance in the coupled-spin system to resolve the conflict between gate operation and decoupling. We experimentally demonstrate these gates on a two-qubit register in diamond operating at room temperature. Quantum tomography reveals that the qubits involved in the gate operation are protected as accurately as idle qubits. We further illustrate the power of our design by executing Grover's quantum search algorithm, achieving fidelities above 90% even though the execution time exceeds the electron spin dephasing time by two orders of magnitude. Our results directly enable decoherence-protected interface gates between different types of promising solid-state qubits. Ultimately, quantum gates with integrated decoupling may enable reaching the accuracy threshold for fault-tolerant quantum information processing with solid-state devices.
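
    The Grover demonstration mentioned above is easy to state concretely. The following sketch simulates the ideal, noiseless two-qubit Grover search on a state vector (the marked index is an arbitrary choice, and no decoherence is modeled); it shows why a single oracle-plus-diffusion iteration suffices on two qubits, which is the algorithm the register executed under decoupling.

```python
# Ideal two-qubit Grover search as a plain state-vector calculation.
import numpy as np

n = 2                                    # qubits
N = 2 ** n
marked = 3                               # index of the "marked" item (arbitrary)

psi = np.full(N, 1 / np.sqrt(N))         # uniform superposition after Hadamards

oracle = np.eye(N)
oracle[marked, marked] = -1              # phase-flip the marked basis state

s = np.full(N, 1 / np.sqrt(N))
diffusion = 2 * np.outer(s, s) - np.eye(N)   # inversion about the mean

psi = diffusion @ (oracle @ psi)         # one Grover iteration suffices for N=4
probs = np.abs(psi) ** 2
print("P(marked) =", probs[marked])      # -> 1.0 in the ideal, noiseless case
```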

    World radiocommunication conference 12 : implications for the spectrum eco-system

    Spectrum allocation is once more a key issue facing the global telecommunications industry. Largely overlooked in current debates, however, is the World Radiocommunication Conference (WRC). Decisions taken by the WRC shape the future roadmap of the telecommunications industry, not least because it has the ability to shape the global spectrum allocation framework. In the debates of WRC-12 it is possible to identify three main issues: enhancement of the international spectrum regulatory framework; regulatory measures required to introduce Cognitive Radio Systems (CRS) technologies; and additional spectrum allocation to the mobile service. WRC-12 eventually decided not to change the current international radio regulations with regard to the first two issues and agreed to the third. The main implications of WRC-12 for the spectrum ecosystem are that most actors do not support the concept of spectrum flexibility associated with trading, and that the concept of spectrum open access is not under consideration. This is explained by the observation that spectrum trading and spectrum commons weaken state control over spectrum and challenge the main principles and norms of the international spectrum management regime. In addition, the mobile allocation issue exposed a lack of conformity with the main rules of the regime: spectrum allocation across the International Telecommunication Union (ITU) three regions, and resistance to the slow decision-making procedures. In conclusion, while the rules and decision-making procedures of the international spectrum management regime were challenged at WRC-12, the main principles and norms are still accepted by the majority of countries.

    Measuring the predictability of life outcomes with a scientific mass collaboration.

    How predictable are life trajectories? We investigated this question with a scientific mass collaboration using the common task method; 160 teams built predictive models for six life outcomes using data from the Fragile Families and Child Wellbeing Study, a high-quality birth cohort study. Despite using a rich dataset and applying machine-learning methods optimized for prediction, the best predictions were not very accurate and were only slightly better than those from a simple benchmark model. Within each outcome, prediction error was strongly associated with the family being predicted and weakly associated with the technique used to generate the prediction. Overall, these results suggest practical limits to the predictability of life outcomes in some settings and illustrate the value of mass collaborations in the social sciences.
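
    The benchmark comparison at the heart of the common task method can be sketched in a few lines. The example below uses synthetic data (the real Fragile Families data are access-restricted) and standard scikit-learn models; the covariate structure and noise level are assumptions chosen so that, as in the study, a flexible model barely beats a simple benchmark when irreducible noise dominates.

```python
# Common-task-style comparison: flexible ML model vs. simple benchmark,
# scored on a held-out set. Synthetic data stands in for the real survey.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
n, p = 2000, 50
X = rng.normal(size=(n, p))
# Outcome dominated by noise, echoing what the study found for life outcomes
y = 0.3 * X[:, 0] + 0.2 * X[:, 1] + rng.normal(scale=1.0, size=n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

benchmark = LinearRegression().fit(X_tr[:, :4], y_tr)   # a few basic covariates
ml_model = GradientBoostingRegressor(random_state=0).fit(X_tr, y_tr)

print("benchmark R^2:", r2_score(y_te, benchmark.predict(X_te[:, :4])))
print("ML model  R^2:", r2_score(y_te, ml_model.predict(X_te)))
# When noise dominates, the gap between the two stays small -- the paper's point.
```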

    Multimessenger NuEM Alerts with AMON

    The Astrophysical Multimessenger Observatory Network (AMON) has developed a real-time multi-messenger alert system. The system performs coincidence analyses of datasets from gamma-ray and neutrino detectors, forming the Neutrino-Electromagnetic (NuEM) alert channel. For these analyses, AMON takes advantage of sub-threshold events, i.e., events that by themselves are not significant in the individual detectors. The main purpose of this channel is to search for gamma-ray counterparts of neutrino events. We describe the different analyses that make up this channel and present a selection of recent results.
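
    At its core, a coincidence analysis of this kind pairs events across detectors within a time window and an angular radius. The sketch below is a schematic stand-in, not AMON's pipeline: the event tuples, window, and radius are invented placeholders.

```python
# Schematic temporal/spatial coincidence test between two event streams.
import numpy as np

def angular_sep(ra1, dec1, ra2, dec2):
    """Great-circle separation in degrees (inputs in degrees)."""
    ra1, dec1, ra2, dec2 = map(np.radians, (ra1, dec1, ra2, dec2))
    cos_d = (np.sin(dec1) * np.sin(dec2) +
             np.cos(dec1) * np.cos(dec2) * np.cos(ra1 - ra2))
    return np.degrees(np.arccos(np.clip(cos_d, -1.0, 1.0)))

# (time [s], RA [deg], Dec [deg]) -- toy sub-threshold events
nu_events    = [(100.0, 45.0, 10.0), (500.0, 120.0, -5.0)]
gamma_events = [(102.5, 45.3, 10.2), (900.0, 200.0, 30.0)]

DT_MAX, SEP_MAX = 100.0, 1.0        # coincidence window (s) and radius (deg)

for t_nu, ra_nu, dec_nu in nu_events:
    for t_g, ra_g, dec_g in gamma_events:
        sep = angular_sep(ra_nu, dec_nu, ra_g, dec_g)
        if abs(t_nu - t_g) < DT_MAX and sep < SEP_MAX:
            print(f"coincidence: dt={t_nu - t_g:+.1f} s, sep={sep:.2f} deg")
```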

    Guidelines for the use and interpretation of assays for monitoring autophagy (3rd edition)

    In 2008 we published the first set of guidelines for standardizing research in autophagy. Since then, research on this topic has continued to accelerate, and many new scientists have entered the field. Our knowledge base and relevant new technologies have also been expanding. Accordingly, it is important to update these guidelines for monitoring autophagy in different organisms. Various reviews have described the range of assays that have been used for this purpose. Nevertheless, there continues to be confusion regarding acceptable methods to measure autophagy, especially in multicellular eukaryotes. For example, a key point that needs to be emphasized is that there is a difference between measurements that monitor the numbers or volume of autophagic elements (e.g., autophagosomes or autolysosomes) at any stage of the autophagic process versus those that measure flux through the autophagy pathway (i.e., the complete process including the amount and rate of cargo sequestered and degraded). In particular, a block in macroautophagy that results in autophagosome accumulation must be differentiated from stimuli that increase autophagic activity, defined as increased autophagy induction coupled with increased delivery to, and degradation within, lysosomes (in most higher eukaryotes and some protists such as Dictyostelium) or the vacuole (in plants and fungi). In other words, it is especially important that investigators new to the field understand that the appearance of more autophagosomes does not necessarily equate with more autophagy. In fact, in many cases, autophagosomes accumulate because of a block in trafficking to lysosomes without a concomitant change in autophagosome biogenesis, whereas an increase in autolysosomes may reflect a reduction in degradative activity. It is worth emphasizing here that lysosomal digestion is a stage of autophagy and evaluating its competence is a crucial part of the evaluation of autophagic flux, or complete autophagy. Here, we present a set of guidelines for the selection and interpretation of methods for use by investigators who aim to examine macroautophagy and related processes, as well as for reviewers who need to provide realistic and reasonable critiques of papers that are focused on these processes. These guidelines are not meant to be a formulaic set of rules, because the appropriate assays depend in part on the question being asked and the system being used. In addition, we emphasize that no individual assay is guaranteed to be the most appropriate one in every situation, and we strongly recommend the use of multiple assays to monitor autophagy. Along these lines, because of the potential for pleiotropic effects due to blocking autophagy through genetic manipulation, it is imperative to delete or knock down more than one autophagy-related gene. In addition, some individual Atg proteins, or groups of proteins, are involved in other cellular pathways, so not all Atg proteins can be used as a specific marker for an autophagic process. In these guidelines, we consider these various methods of assessing autophagy and what information can, or cannot, be obtained from them. Finally, by discussing the merits and limits of particular autophagy assays, we hope to encourage technical innovation in the field.
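
    The flux-versus-accumulation distinction the guidelines stress can be made concrete with a small numeric example. In the sketch below, all values are invented, and the inhibitor-difference quantification is one common convention rather than a prescription from the guidelines; it contrasts a stimulus that raises flux with one that merely blocks degradation.

```python
# Toy flux calculation: compare an autophagosome marker (e.g., LC3-II level,
# arbitrary densitometry units) with and without a lysosomal inhibitor such
# as bafilomycin A1. All numbers are invented for illustration.
lc3_ii = {
    "control":              {"untreated": 1.0, "plus_inhibitor": 3.0},
    "stimulus_A (induced)": {"untreated": 2.0, "plus_inhibitor": 6.0},
    "stimulus_B (blocked)": {"untreated": 2.0, "plus_inhibitor": 2.1},
}

for condition, vals in lc3_ii.items():
    flux = vals["plus_inhibitor"] - vals["untreated"]   # marker turned over by lysosomes
    print(f"{condition:22s} steady-state={vals['untreated']:.1f}  flux={flux:.1f}")

# Both stimuli double steady-state LC3-II, but only stimulus_A increases flux;
# stimulus_B accumulates autophagosomes because degradation is blocked.
```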

    AI is a viable alternative to high throughput screening: a 318-target study

    High throughput screening (HTS) is routinely used to identify bioactive small molecules. This requires physical compounds, which limits coverage of accessible chemical space. Computational approaches combined with vast on-demand chemical libraries can access far greater chemical space, provided that the predictive accuracy is sufficient to identify useful molecules. Through the largest and most diverse virtual HTS campaign reported to date, comprising 318 individual projects, we demonstrate that our AtomNet® convolutional neural network successfully finds novel hits across every major therapeutic area and protein class. We address historical limitations of computational screening by demonstrating success for target proteins without known binders, high-quality X-ray crystal structures, or manual cherry-picking of compounds. We show that the molecules selected by the AtomNet® model are novel drug-like scaffolds rather than minor modifications to known bioactive compounds. Our empirical results suggest that computational methods can substantially replace HTS as the first step of small-molecule drug discovery.
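
    Virtual screens of this kind are typically judged by early enrichment: rank the library by predicted score and count known actives in the top fraction. The sketch below uses random scores as a stand-in for model predictions (library size, active count, and score shift are all assumptions) to show how an enrichment factor is computed.

```python
# Enrichment-factor calculation for a ranked virtual screen; scores here are
# synthetic stand-ins, not AtomNet predictions.
import numpy as np

rng = np.random.default_rng(0)
n_library, n_actives = 100_000, 100

is_active = np.zeros(n_library, dtype=bool)
is_active[:n_actives] = True
# Pretend the model gives actives modestly higher scores on average
scores = rng.normal(size=n_library) + 2.0 * is_active

def enrichment_factor(scores, is_active, top_frac=0.01):
    top_n = int(len(scores) * top_frac)
    top_idx = np.argsort(scores)[::-1][:top_n]      # highest-scoring compounds
    hit_rate_top = is_active[top_idx].mean()
    hit_rate_all = is_active.mean()
    return hit_rate_top / hit_rate_all

print("EF@1% =", round(enrichment_factor(scores, is_active), 1))
```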