
    The Long-Baseline Neutrino Experiment: Exploring Fundamental Symmetries of the Universe

    The preponderance of matter over antimatter in the early Universe, the dynamics of the supernova bursts that produced the heavy elements necessary for life, and whether protons eventually decay --- these mysteries at the forefront of particle physics and astrophysics are key to understanding the early evolution of our Universe, its current state, and its eventual fate. The Long-Baseline Neutrino Experiment (LBNE) represents an extensively developed plan for a world-class experiment dedicated to addressing these questions. LBNE is conceived around three central components: (1) a new, high-intensity neutrino source generated from a megawatt-class proton accelerator at Fermi National Accelerator Laboratory, (2) a near neutrino detector just downstream of the source, and (3) a massive liquid argon time-projection chamber deployed as a far detector deep underground at the Sanford Underground Research Facility. This facility, located at the site of the former Homestake Mine in Lead, South Dakota, is approximately 1,300 km from the neutrino source at Fermilab -- a distance (baseline) that delivers optimal sensitivity to neutrino charge-parity symmetry violation and mass ordering effects. This ambitious yet cost-effective design incorporates scalability and flexibility and can accommodate a variety of upgrades and contributions. With its exceptional combination of experimental configuration, technical capabilities, and potential for transformative discoveries, LBNE promises to be a vital facility for the field of particle physics worldwide, providing physicists from around the globe with opportunities to collaborate in a twenty- to thirty-year program of exciting science. In this document we provide a comprehensive overview of LBNE's scientific objectives, its place in the landscape of neutrino physics worldwide, the technologies it will incorporate, and the capabilities it will possess.
    Comment: Major update of previous version. This is the reference document for the LBNE science program and current status. Chapters 1, 3, and 9 provide a comprehensive overview of LBNE's scientific objectives, its place in the landscape of neutrino physics worldwide, the technologies it will incorporate, and the capabilities it will possess. 288 pages, 116 figures.
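As a rough illustration of why a roughly 1,300 km baseline suits this physics, a two-flavor vacuum oscillation sketch places the first oscillation maximum at a few GeV, well matched to an accelerator neutrino beam. The parameter values below are illustrative textbook numbers, not LBNE's full three-flavor, matter-effect calculation:

```python
import math

def appearance_prob(L_km, E_GeV, dm2_eV2=2.5e-3, sin2_2theta=0.085):
    """Two-flavor vacuum oscillation probability:
    P = sin^2(2*theta) * sin^2(1.27 * dm2 * L / E),
    with L in km, E in GeV, dm2 in eV^2."""
    return sin2_2theta * math.sin(1.27 * dm2_eV2 * L_km / E_GeV) ** 2

# The first oscillation maximum occurs where 1.27 * dm2 * L / E = pi/2.
E_peak = 1.27 * 2.5e-3 * 1300 / (math.pi / 2)
print(f"first oscillation maximum near {E_peak:.1f} GeV")
```

At 1,300 km the peak falls near 2.6 GeV, comfortably within the energy reach of a Fermilab beam; the actual sensitivity argument also involves matter effects and the CP-violating phase, which this two-flavor formula omits.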

    Overcoming leakage in scalable quantum error correction

    Leakage of quantum information out of computational states into higher energy states represents a major challenge in the pursuit of quantum error correction (QEC). In a QEC circuit, leakage builds over time and spreads through multi-qubit interactions. This leads to correlated errors that degrade the exponential suppression of logical error with scale, challenging the feasibility of QEC as a path towards fault-tolerant quantum computation. Here, we demonstrate the execution of a distance-3 surface code and distance-21 bit-flip code on a Sycamore quantum processor where leakage is removed from all qubits in each cycle. This shortens the lifetime of leakage and curtails its ability to spread and induce correlated errors. We report a ten-fold reduction in steady-state leakage population on the data qubits encoding the logical state and an average leakage population of less than $1 \times 10^{-3}$ throughout the entire device. The leakage removal process itself efficiently returns leakage population back to the computational basis, and adding it to a code circuit prevents leakage from inducing correlated error across cycles, restoring a fundamental assumption of QEC. With this demonstration that leakage can be contained, we resolve a key challenge for practical QEC at scale.
    Comment: Main text: 7 pages, 5 figures.
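The buildup-versus-removal dynamic described above can be caricatured with a one-parameter rate equation. This is a toy model, not the paper's calibrated error model: if a qubit leaks with probability gamma per cycle and leaked population is returned to the computational basis with probability r per cycle, the population saturates at gamma / (gamma + r) instead of accumulating without bound.

```python
def leakage_population(gamma, r, n_cycles, p0=0.0):
    """Toy rate model of leakage: each cycle, unleaked population leaks
    with probability gamma and leaked population is removed with
    probability r. Returns the per-cycle leakage population history."""
    p = p0
    history = []
    for _ in range(n_cycles):
        p = p * (1 - r) + (1 - p) * gamma
        history.append(p)
    return history

# Illustrative (made-up) numbers: a small per-cycle leakage rate and an
# efficient per-cycle removal probability.
gamma, r = 2e-4, 0.5
hist = leakage_population(gamma, r, 100)
steady_state = gamma / (gamma + r)
```

With an efficient removal step the steady state sits at roughly gamma / r, which for these assumed numbers is a few times 10^-4, i.e. the same order as the sub-10^-3 device-average population the abstract reports; without removal (r -> 0) the same recursion grows toward full leakage.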

    Suppressing quantum errors by scaling a surface code logical qubit

    Practical quantum computing will require error rates that are well below what is achievable with physical qubits. Quantum error correction offers a path to algorithmically-relevant error rates by encoding logical qubits within many physical qubits, where increasing the number of physical qubits enhances protection against physical errors. However, introducing more qubits also increases the number of error sources, so the density of errors must be sufficiently low in order for logical performance to improve with increasing code size. Here, we report the measurement of logical qubit performance scaling across multiple code sizes, and demonstrate that our system of superconducting qubits has sufficient performance to overcome the additional errors from increasing qubit number. We find our distance-5 surface code logical qubit modestly outperforms an ensemble of distance-3 logical qubits on average, both in terms of logical error probability over 25 cycles and logical error per cycle ($2.914\% \pm 0.016\%$ compared to $3.028\% \pm 0.023\%$). To investigate damaging, low-probability error sources, we run a distance-25 repetition code and observe a $1.7 \times 10^{-6}$ logical error per round floor set by a single high-energy event ($1.6 \times 10^{-7}$ when excluding this event). We are able to accurately model our experiment, and from this model we can extract error budgets that highlight the biggest challenges for future systems. These results mark the first experimental demonstration where quantum error correction begins to improve performance with increasing qubit number, illuminating the path to reaching the logical error rates required for computation.
    Comment: Main text: 6 pages, 4 figures. v2: Update author list, references, Fig. S12, Table I
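The two figures of merit quoted above are related: under the common simplifying assumption that logical flips occur independently each cycle, the logical error probability after n cycles is P_n = (1 - (1 - 2*eps)^n) / 2, so a per-cycle rate eps can be extracted from a multi-cycle measurement. A sketch using the per-cycle rates quoted in the abstract (the experiment's numbers come from fits to fidelity decay, so this is only illustrative):

```python
def logical_error_after(n_cycles, eps_per_cycle):
    """Probability of a net logical flip (odd number of flips) after
    n cycles, assuming independent per-cycle error eps:
    P_n = (1 - (1 - 2*eps)^n) / 2."""
    return 0.5 * (1.0 - (1.0 - 2.0 * eps_per_cycle) ** n_cycles)

def eps_per_cycle(n_cycles, p_n):
    """Invert the relation to recover a per-cycle rate from P_n."""
    return 0.5 * (1.0 - (1.0 - 2.0 * p_n) ** (1.0 / n_cycles))

# Per-cycle rates quoted in the abstract, projected out to 25 cycles.
p25_d5 = logical_error_after(25, 0.02914)  # distance-5
p25_d3 = logical_error_after(25, 0.03028)  # distance-3 ensemble
```

Note how quickly small per-cycle differences compound: after 25 cycles both codes are approaching the P = 0.5 depolarized limit, which is why per-cycle error is the more informative scaling metric.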

    Measurement-induced entanglement and teleportation on a noisy quantum processor

    Measurement has a special role in quantum theory: by collapsing the wavefunction it can enable phenomena such as teleportation and thereby alter the "arrow of time" that constrains unitary evolution. When integrated in many-body dynamics, measurements can lead to emergent patterns of quantum information in space-time that go beyond established paradigms for characterizing phases, either in or out of equilibrium. On present-day NISQ processors, the experimental realization of this physics is challenging due to noise, hardware limitations, and the stochastic nature of quantum measurement. Here we address each of these experimental challenges and investigate measurement-induced quantum information phases on up to 70 superconducting qubits. By leveraging the interchangeability of space and time, we use a duality mapping to avoid mid-circuit measurement and access different manifestations of the underlying phases -- from entanglement scaling to measurement-induced teleportation -- in a unified way. We obtain finite-size signatures of a phase transition with a decoding protocol that correlates the experimental measurement record with classical simulation data. The phases display sharply different sensitivity to noise, which we exploit to turn an inherent hardware limitation into a useful diagnostic. Our work demonstrates an approach to realize measurement-induced physics at scales that are at the limits of current NISQ processors.

    Two-particle correlations in azimuthal angle and pseudorapidity in inelastic p + p interactions at the CERN Super Proton Synchrotron

    Results on two-particle Δη-Δϕ correlations in inelastic p + p interactions at 20, 31, 40, 80, and 158 GeV/c are presented. The measurements were performed using the large-acceptance NA61/SHINE hadron spectrometer at the CERN Super Proton Synchrotron. The data show structures which can be attributed mainly to effects of resonance decays, momentum conservation, and quantum statistics. The results are compared with the EPOS and UrQMD models.
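For readers unfamiliar with the observable, Δη-Δϕ correlations are built from pairwise differences within an event, with the azimuthal difference wrapped onto a fixed interval. A minimal sketch of the pair computation (a hypothetical helper, not NA61/SHINE's analysis code, which additionally normalizes by a mixed-event background):

```python
import numpy as np

def pair_observables(etas, phis):
    """Delta-eta and Delta-phi for all unordered particle pairs in one
    event. Delta-phi is wrapped into [-pi, pi) so that pairs close in
    azimuth across the 2*pi boundary get a small difference."""
    i, j = np.triu_indices(len(etas), k=1)  # all unordered index pairs
    d_eta = etas[i] - etas[j]
    d_phi = (phis[i] - phis[j] + np.pi) % (2 * np.pi) - np.pi
    return d_eta, d_phi

# Example: two particles whose azimuths straddle the 2*pi wrap-around.
d_eta, d_phi = pair_observables(np.array([0.5, -0.3]),
                                np.array([0.1, 6.2]))
```

Histogramming (d_eta, d_phi) over many events and dividing by the same histogram built from mixed-event pairs yields the correlation function in which the resonance-decay and momentum-conservation structures mentioned above appear.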

    Non-Abelian braiding of graph vertices in a superconducting processor

    Indistinguishability of particles is a fundamental principle of quantum mechanics. For all elementary and quasiparticles observed to date - including fermions, bosons, and Abelian anyons - this principle guarantees that the braiding of identical particles leaves the system unchanged. However, in two spatial dimensions, an intriguing possibility exists: braiding of non-Abelian anyons causes rotations in a space of topologically degenerate wavefunctions. Hence, it can change the observables of the system without violating the principle of indistinguishability. Despite the well-developed mathematical description of non-Abelian anyons and numerous theoretical proposals, the experimental observation of their exchange statistics has remained elusive for decades. Controllable many-body quantum states generated on quantum processors offer another path for exploring these fundamental phenomena. While efforts on conventional solid-state platforms typically involve Hamiltonian dynamics of quasiparticles, superconducting quantum processors allow for directly manipulating the many-body wavefunction via unitary gates. Building on predictions that stabilizer codes can host projective non-Abelian Ising anyons, we implement a generalized stabilizer code and unitary protocol to create and braid them. This allows us to experimentally verify the fusion rules of the anyons and braid them to realize their statistics. We then study the prospect of employing the anyons for quantum computation and utilize braiding to create an entangled state of anyons encoding three logical qubits. Our work provides new insights about non-Abelian braiding and - through the future inclusion of error correction to achieve topological protection - could open a path toward fault-tolerant quantum computing.
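The fusion rules being verified are those of the Ising anyon model, and they are compact enough to write down directly. These are the standard textbook rules, not code from the experiment:

```python
# Ising anyon model: particle types are 1 (vacuum), sigma (the
# non-Abelian anyon), and psi (a fermion). Fusing two particles can
# yield more than one outcome, which is the source of the protected
# degenerate space used for logical qubits.
FUSION = {
    ("1", "1"): {"1"},
    ("1", "sigma"): {"sigma"},
    ("1", "psi"): {"psi"},
    ("sigma", "sigma"): {"1", "psi"},  # two outcomes -> degeneracy
    ("sigma", "psi"): {"sigma"},
    ("psi", "psi"): {"1"},
}

def fuse(a, b):
    """Fusion outcomes for an unordered pair of particle types."""
    return FUSION.get((a, b)) or FUSION[(b, a)]

def logical_qubits(n_sigma):
    """n sigma anyons with trivial total charge span a 2^(n/2 - 1)-
    dimensional space, i.e. n/2 - 1 logical qubits."""
    return n_sigma // 2 - 1
```

The two-outcome sigma x sigma channel is what makes the model non-Abelian; by this standard counting, eight sigma anyons encode three logical qubits, matching the entangled three-logical-qubit state described in the abstract.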

    The neutron and its role in cosmology and particle physics

    Experiments with cold and ultracold neutrons have reached a level of precision such that problems far beyond the scale of the present Standard Model of particle physics become accessible to experimental investigation. Due to the close links between particle physics and cosmology, these studies also permit a deep look into the very first instances of our universe. First addressed in this article, both in theory and experiment, is the problem of baryogenesis ... The question of how baryogenesis could have happened is open to experimental tests, and it turns out that this problem can be curbed by the very stringent limits on an electric dipole moment of the neutron, a quantity that also has deep implications for particle physics. Then we discuss the recent spectacular observation of neutron quantization in the Earth's gravitational field and of resonance transitions between such gravitational energy states. These measurements, together with new evaluations of neutron scattering data, set new constraints on deviations from Newton's gravitational law at the picometer scale. Such deviations are predicted in modern theories with extra-dimensions that propose unification of the Planck scale with the scale of the Standard Model ... Another main topic is the weak-interaction parameters in various fields of physics and astrophysics that must all be derived from measured neutron decay data. Up to now, about 10 different neutron decay observables have been measured, many more than needed in the electroweak Standard Model. This allows various precise tests for new physics beyond the Standard Model, competing with or surpassing similar tests at high energy. The review ends with a discussion of neutron and nuclear data required in the synthesis of the elements during the "first three minutes" and later on in stellar nucleosynthesis.
    Comment: 91 pages, 30 figures, accepted by Reviews of Modern Physics

    HIV-1 Tat phosphorylation on Ser-16 residue modulates HIV-1 transcription

    Background: HIV-1 transcription activator protein Tat is phosphorylated in vitro by CDK2 and DNA-PK on the Ser-16 residue and by PKR on the Tat Ser-46 residue. Here we analyzed Tat phosphorylation in cultured cells and its functionality. Results: Mass spectrometry analysis showed primarily Tat Ser-16 phosphorylation in cultured cells. In vitro, CDK2/cyclin E predominantly phosphorylated Tat Ser-16, and PKR Tat Ser-46. Alanine mutations of either Ser-16 or Ser-46 decreased overall Tat phosphorylation. Phosphorylation of Tat Ser-16 was reduced in cultured cells treated by a small molecule inhibitor of CDK2 and, to a lesser extent, an inhibitor of DNA-PK. Conditional knock-downs of CDK2 and PKR inhibited and induced one-round HIV-1 replication, respectively. HIV-1 proviral transcription was inhibited by Tat alanine mutants and partially restored by the S16E mutation. Pseudotyped HIV-1 with the Tat S16E mutation replicated well, whereas HIV-1 with Tat S46E replicated poorly, and no live viruses were obtained with the Tat S16A or Tat S46A mutations. TAR RNA binding was affected by the Tat Ser-16 alanine mutation. All Ser-16 and Ser-46 Tat mutants showed decreased binding to cyclin T1, with the S16D and S46D mutants showing the strongest effect. Molecular modelling and molecular dynamics analysis revealed significant structural changes in the Tat/CDK9/cyclin T1 complex with a phosphorylated Ser-16 residue, but not with a phosphorylated Ser-46 residue. Conclusion: Phosphorylation of Tat Ser-16 induces HIV-1 transcription, facilitates binding to TAR RNA, and rearranges the CDK9/cyclin T1/Tat complex. Thus, phosphorylation of Tat Ser-16 regulates HIV-1 transcription and may serve as a target for HIV-1 therapeutics.