
    Topological code Autotune

    Many quantum systems are being investigated in the hope of building a large-scale quantum computer. All of these systems suffer from decoherence, resulting in errors during the execution of quantum gates. Quantum error correction enables reliable quantum computation given unreliable hardware. Unoptimized topological quantum error correction (TQEC), while still effective, performs well below its potential, especially at low error rates. Hand-optimizing the classical processing associated with a TQEC scheme for a specific system to achieve better error tolerance can be extremely laborious. We describe a tool, Autotune, capable of performing this optimization automatically, and give two highly distinct examples of its use, in both of which it greatly outperforms unoptimized TQEC. Autotune is designed to facilitate the precise study of real hardware running TQEC, with every quantum gate having a realistic, physics-based error model.
    Comment: 13 pages, 17 figures, version accepted for publication

    Quantum computing with nearest neighbor interactions and error rates over 1%

    Large-scale quantum computation will only be achieved if experimentally implementable quantum error correction procedures are devised that can tolerate experimentally achievable error rates. We describe a quantum error correction procedure that requires only a 2-D square lattice of qubits that can interact with their nearest neighbors, yet can tolerate quantum gate error rates over 1%. The precise maximum tolerable error rate depends on the error model, and we calculate values in the range 1.1-1.4% for various physically reasonable models. Even the lowest value represents the highest threshold error rate calculated to date in a geometrically constrained setting, and a 50% improvement over the previous record.
    Comment: 4 pages, 8 figures
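    The significance of a threshold error rate can be illustrated with a short sketch. Below the threshold p_th, the logical error rate of a distance-d topological code is commonly approximated as p_L ≈ A (p/p_th)^((d+1)/2); the prefactor A, the code distance d, and the exact exponent used below are illustrative assumptions for this scaling form, not values taken from the paper.

```python
# Hedged sketch: the abstract reports per-gate thresholds of roughly 1.1-1.4%
# for a 2-D nearest-neighbor scheme. Below threshold, a larger code distance d
# suppresses the logical error rate; above threshold, it does not help.
# The prefactor A and threshold value here are illustrative placeholders.

def logical_error_rate(p, p_th=0.011, d=5, A=0.1):
    """Approximate logical error rate for physical error rate p (heuristic)."""
    return A * (p / p_th) ** ((d + 1) // 2)

# Operating well below threshold: error suppression improves with distance.
p = 0.001  # 0.1% physical gate error rate
rates = {d: logical_error_rate(p, d=d) for d in (3, 5, 7)}
```

    At p = p_th the ratio p/p_th is 1, so distance gives no benefit; the further below threshold the hardware operates, the faster errors are suppressed as d grows.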

    Subglacial floods beneath ice sheets

    Subglacial floods (jökulhlaups) are well documented as occurring beneath present-day glaciers and ice caps. In addition, it is known that massive floods have occurred from ice-dammed lakes proximal to the Laurentide ice sheet during the last ice age, and it has been suggested that at least one such flood below the waning ice sheet was responsible for a dramatic cooling event some 8000 years ago. We propose that drainage of lakes from beneath ice sheets will generally occur in a time-periodic fashion, and that such floods can be of severe magnitude. Such hydraulic eruptions are likely to have caused severe climatic disturbances in the past, and may well do so in the future.

    The development of a model to infer precipitation from microwave measurements

    To permit the inference of precipitation amounts from radiometric measurements, a radiative interaction model was developed. This model uses a simple computational scheme to determine the effects of rain upon brightness temperatures and can be used with a statistical inversion procedure to invert for rain rate. Precipitating cloud models were also developed and used with the microwave model at frequencies of 19.35 and 37 GHz to determine the variability of the microwave-rain rate relationship on a global and seasonal basis.
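    The statistical inversion step described above can be sketched in miniature: fit a regression between model-simulated brightness temperatures and rain rates, then apply the fitted relation to invert a new measurement. The single-channel linear form and all numbers below are made-up illustrative data, not values from the report.

```python
# Hedged sketch of statistical inversion: given (brightness temperature, rain
# rate) pairs produced by a radiative model, fit a linear relation and use it
# to estimate rain rate from a new brightness temperature measurement.
# The training pairs below are illustrative, not from the report.

def fit_linear(xs, ys):
    """Ordinary least-squares fit y = a + b*x (closed form, one predictor)."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    b = sxy / sxx      # slope: mm/h of rain per kelvin
    a = my - b * mx    # intercept
    return a, b

# Illustrative pairs: brightness temperature (K) at 19.35 GHz vs rain rate (mm/h)
tb = [180.0, 200.0, 220.0, 240.0]
rr = [0.0, 2.0, 4.0, 6.0]
a, b = fit_linear(tb, rr)
estimate = a + b * 210.0  # invert a new measurement of 210 K
```

    A real retrieval would use multiple channels (here, 19.35 and 37 GHz) and a statistical inversion matrix rather than a single-channel fit, but the invert-via-regression idea is the same.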

    Estimation of the geophysical properties of the ocean surface using aircraft microwave measurements

    An improved model of the effects of sea state on microwave signature has been developed which incorporates the different effects of whitecaps and streaks to define the response of microwave channels to wind speed. This model has been demonstrated to agree with recent measurements. An approximation model has also been incorporated to describe the effects of precipitation on microwave radiation through a computationally rapid routine. The use of these models, together with a new technique that allows selection of the most climatologically appropriate D-matrix, is demonstrated in the inversion of data collected over the Bering Sea. Surface wind speed agrees very well with observations, while good results are obtained for integrated water vapor and liquid water.

    Emotional dysfunction in schizophrenia spectrum psychosis: the role of illness perceptions

    Background. Assessing illness perceptions has been useful in a range of medical disorders. This study of people with a recent relapse of their psychosis examines the relationship between illness perceptions, their emotional responses and their attitudes to medication. Method. One hundred patients diagnosed with a non-affective psychotic disorder were assessed within 3 months of relapse. Measures included insight, self-reported illness perceptions, medication adherence, depression, self-esteem and anxiety. Results. Illness perceptions about psychosis explained 46, 36 and 34% of the variance in depression, anxiety and self-esteem respectively. However, self-reported medication adherence was more strongly associated with a measure of insight. Conclusions. Negative illness perceptions in psychosis are clearly related to depression, anxiety and self-esteem. These in turn have been linked to symptom maintenance and recurrence. Clinical interventions that foster appraisals of recovery rather than of chronicity and severity may therefore improve emotional well-being in people with psychosis. Medication adherence might be better addressed through direct attempts at helping patients understand their need for treatment.

    Quantum Error Correction on Linear Nearest Neighbor Qubit Arrays

    A minimal-depth quantum circuit implementing 5-qubit quantum error correction in a manner optimized for a linear nearest neighbor architecture is described. The canonical decomposition is used to construct fast and simple gates that incorporate the necessary swap operations. Simulations of the circuit's performance when subjected to discrete and continuous errors are presented. The relationship between the error rate of a physical qubit and that of a logical qubit is investigated, with emphasis on determining the concatenated error correction threshold.
    Comment: 4 pages, 5 figures
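    The notion of a concatenated threshold can be made concrete with a toy model. For a code correcting a single error per block, the logical error rate after k levels of concatenation is often modeled as p_k = p_th (p/p_th)^(2^k), so errors are suppressed doubly exponentially when p < p_th and amplified when p > p_th. The threshold value below is an illustrative placeholder, not the paper's result.

```python
# Hedged sketch of concatenated error-correction scaling: each level of
# concatenation squares the (normalized) error rate. The threshold p_th here
# is an illustrative placeholder, not a value from the paper.

def concatenated_error_rate(p, k, p_th=1e-4):
    """Toy model of the logical error rate after k levels of concatenation."""
    return p_th * (p / p_th) ** (2 ** k)
```

    With a physical rate ten times below this toy threshold, one level of concatenation already improves on the bare qubit, and each further level squares the gain; above threshold, concatenation makes things worse.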

    Novel approach for deriving genome wide SNP analysis data from archived blood spots.

    BACKGROUND: The ability to transport and store DNA at room temperature in low volumes has the advantage of optimising cost, time and storage space. Blood spots on adapted filter papers are popular for this, with FTA (Flinders Technology Associates) Whatman™ technology being one of the most recent. Plant material, plasmids, viral particles, bacteria and animal blood have been stored and transported successfully using this technology; however, porcine DNA extraction from FTA Whatman™ cards, preparing nucleic acids for downstream applications such as PCR, whole genome amplification, sequencing and subsequent application to single nucleotide polymorphism microarrays, is a relatively new approach that has hitherto been under-explored. FINDINGS: DNA was extracted from FTA Whatman™ cards (following adaptations of the manufacturer's instructions), whole genome amplified and subsequently analysed to validate the integrity of the DNA for downstream SNP analysis. DNA was successfully extracted from 288/288 samples and amplified by WGA. Allele dropout post-WGA was observed in less than 2% of samples, and there was no clear evidence of amplification bias or contamination. Acceptable call rates on porcine SNP chips were also achieved using DNA extracted and amplified in this way. CONCLUSIONS: DNA extracted from FTA Whatman™ cards is of high enough quality and quantity following whole genomic amplification to perform meaningful SNP chip studies.

    Studies in the use of cloud type statistics in mission simulation

    A study to further improve NASA's global cloud statistics for mission simulation is reported. Regional homogeneity in cloud types was examined; most of the original region boundaries defined for cloud cover amount in previous studies were supported by the statistics on cloud types and the number of cloud layers. Conditionality in cloud statistics was also examined, with special emphasis on temporal and spatial dependencies and cloud type interdependence. Temporal conditionality was found up to 12 hours, and spatial conditionality up to 200 miles; the diurnal cycle in convective cloudiness was clearly evident. As expected, the joint occurrence of different cloud types reflected the dynamic processes which form the clouds. Other phases of the study improved the cloud type statistics for several regions and proposed a mission simulation scheme combining the 4-dimensional atmospheric model, sponsored by MSFC, with the global cloud model.