
    Fragile boundaries of tailored surface codes

    Biased noise is common in physical qubits, and tailoring a quantum code to the bias by locally modifying stabilizers or changing boundary conditions has been shown to greatly increase error correction thresholds. In this work, we explore the challenges of using a specific tailored code, the XY surface code, for fault-tolerant quantum computation. We introduce efficient and fault-tolerant decoders, belief-matching and belief-find, which exploit correlated hyperedge fault mechanisms present in circuit-level noise. Using belief-matching, we find that the XY surface code has a higher threshold and lower overhead than the square CSS surface code for moderately biased noise. However, the rectangular CSS surface code has a lower qubit overhead than the XY surface code when below threshold. We identify a contributor to the reduced performance that we call fragile boundary errors. These are string-like errors that can occur along spatial or temporal boundaries in planar architectures or during logical state preparation and measurement. While we make partial progress towards mitigating these errors by deforming the boundaries of the XY surface code, our work suggests that fragility could remain a significant obstacle, even for other tailored codes. We expect that our decoders will have other uses; belief-find has an almost-linear running time, and we show that it increases the threshold of the surface code to 0.937(2)% in the presence of circuit-level depolarising noise, compared to 0.817(5)% for the more computationally expensive minimum-weight perfect matching decoder.
    Comment: 16 pages, 17 figures
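    The almost-linear running time quoted for belief-find comes from the union-find (disjoint-set) data structure underlying union-find-style decoders. As an illustration only, not the authors' implementation, here is a minimal disjoint-set with union by rank and path halving, the structure behind the near-linear O(n α(n)) scaling:

```python
class UnionFind:
    """Disjoint-set with union by rank and path halving.

    This is a sketch of the data structure that gives union-find-based
    decoders (such as belief-find) their almost-linear running time;
    it is not the decoder itself.
    """

    def __init__(self, n: int):
        self.parent = list(range(n))
        self.rank = [0] * n

    def find(self, x: int) -> int:
        # Path halving: point every other node at its grandparent.
        while self.parent[x] != x:
            self.parent[x] = self.parent[self.parent[x]]
            x = self.parent[x]
        return x

    def union(self, a: int, b: int) -> bool:
        ra, rb = self.find(a), self.find(b)
        if ra == rb:
            return False  # already in the same cluster
        if self.rank[ra] < self.rank[rb]:
            ra, rb = rb, ra
        self.parent[rb] = ra  # attach the shallower tree under the deeper one
        if self.rank[ra] == self.rank[rb]:
            self.rank[ra] += 1
        return True
```

In a decoder of this family, each syndrome defect starts as its own cluster, and clusters are grown and merged with `union` until they can be matched, which is why the amortized cost per merge stays nearly constant.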

    Metabolism of ticagrelor in patients with acute coronary syndromes.

    © The Author(s) 2018. Ticagrelor is a state-of-the-art antiplatelet agent used for the treatment of patients with acute coronary syndromes (ACS). Unlike the other oral P2Y12 receptor inhibitors, ticagrelor does not require metabolic activation to exert its antiplatelet action. Still, ticagrelor is extensively metabolized by hepatic CYP3A enzymes, and AR-C124910XX is its only active metabolite. A post hoc analysis of patient-level (n = 117) pharmacokinetic data pooled from two prospective studies was performed to identify clinical characteristics affecting the degree of AR-C124910XX formation during the first six hours after a 180 mg ticagrelor loading dose in the setting of ACS. Both linear and multiple regression analyses indicated that ACS patients presenting with ST-elevation myocardial infarction or suffering from diabetes mellitus are more likely to have a decreased rate of ticagrelor metabolism during the acute phase of ACS. Administration of morphine during ACS was found to negatively influence transformation of ticagrelor into AR-C124910XX when assessed with linear regression analysis, but not with multiple regression analysis. On the other hand, smoking appears to increase the degree of ticagrelor transformation in ACS patients. Mechanisms underlying our findings and their clinical significance warrant further research.

    Who approves/pays for additional monitoring?

    Major considerations in the provision of healthcare are availability, affordability, accessibility, and appropriateness, especially in heart failure, where disease burden is growing, developments have been rapid, and newer biomarkers, diagnostic and imaging techniques, monitoring systems, devices, procedures, and drugs have all been developed in a relatively short period of time. Many monitoring and diagnostic systems have been developed, but the disproportionate cost of conducting trials of their effectiveness has limited their uptake. There are added complexities: the use of doctors to supervise monitoring results may be optimal in one setting and not in another because healthcare provision is organized differently, making even the trials we do have difficult to interpret. New technologies are continuously changing the approach to healthcare and will reshape the structure of healthcare systems in the future. Mobile technologies can empower patients and carers by giving them more control over their health and social care needs and reducing their dependence on healthcare professionals for monitoring their health, but a significant problem is the integration of the multitude of monitored parameters with clinical data and the recognition of intervention thresholds. Digital technology can help, but we need to prove its cost-effectiveness and how it will be paid for. Governments in many European countries and worldwide are trying to establish frameworks that promote the convergence of standards and regulations for telemedicine solutions, yet simultaneously health authorities are closely scrutinizing healthcare spending with the objective of reducing and optimizing expenditure on health services. Multiple factors must be considered in the reimbursement models associated with physiological monitoring, and reimbursement remains a challenge in cash-strapped health systems.

    INVESTIGATION OF THE TRANSMISSION AND STOPPING OF LIGHT IONS PASSING THROUGH A PLASMA TARGET

    Transmission and energy losses of 2 MeV/u carbon and sulphur beams passing through a plasma target have been extensively investigated. A hydrogen plasma ignited by an electrical discharge was coupled to the Orsay Tandem beam accelerator. Fluctuations in beam transmission were observed and attributed to a magnetic focusing effect generated during the plasma evolution. Energy loss measurements, performed using time-of-flight techniques, indicate an enhanced stopping power of the plasma relative to its cold-matter equivalent.
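    As a rough illustration of the time-of-flight principle behind such energy-loss measurements (the function name, flight path, and times below are assumptions for the sketch, not values from the experiment): a slower arrival after the target translates directly into a lower exit energy per nucleon.

```python
# Non-relativistic time-of-flight energy estimate (adequate at ~2 MeV/u,
# where beta ~ 0.065 and relativistic corrections are well below 1%).
AMU_MEV = 931.494    # atomic mass constant, MeV/c^2
C_M_PER_NS = 0.2998  # speed of light, metres per nanosecond

def energy_per_u_from_tof(path_m: float, tof_ns: float) -> float:
    """Kinetic energy per nucleon (MeV/u) from flight path and flight time."""
    beta = (path_m / tof_ns) / C_M_PER_NS  # v/c
    return 0.5 * AMU_MEV * beta ** 2       # E/u = (1/2) m_u v^2

# Illustrative: over a hypothetical 1 m flight path, ~50.9 ns corresponds
# to ~2 MeV/u; the energy loss in the target is E_in minus E_out.
```

The stopping power is then obtained by comparing flight times with and without the target, divided by the target's areal density.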

    Low-molecular-weight heparins vs. unfractionated heparin in the setting of percutaneous coronary intervention for ST-elevation myocardial infarction: a meta-analysis

    Summary. Background: The aim of the current study was to perform two separate meta-analyses of available studies comparing low-molecular-weight heparins (LMWHs) vs. unfractionated heparin (UFH) in ST-elevation myocardial infarction (STEMI) patients treated (i) with primary percutaneous coronary intervention (pPCI) or (ii) with PCI after thrombolysis. Methods: All-cause mortality was the prespecified primary endpoint and major bleeding complications were recorded as the secondary endpoints. Relative risk (RR) with a 95% confidence interval (CI) and absolute risk reduction (ARR) were chosen as the effect measures. Results: Ten studies comprising 16 286 patients were included. The median follow-up was 2 months for the primary endpoint. Among LMWHs, enoxaparin was the compound most frequently used. In the pPCI group, LMWHs were associated with a reduction in mortality [RR (95% CI) = 0.51 (0.41–0.64), P < 0.001, ARR = 3%] and major bleeding [RR (95% CI) = 0.68 (0.49–0.94), P = 0.02, ARR = 2.0%] as compared with UFH. Conversely, no clear evidence of benefits with LMWHs was observed in the PCI group after thrombolysis. Meta-regression showed that patients with a higher baseline risk had greater benefits from LMWHs (r = 0.72, P = 0.02). Conclusions: LMWHs were associated with greater efficacy and safety than UFH in STEMI patients treated with pPCI, with a significant relationship between risk profile and clinical benefits. Based on this meta-analysis, LMWHs may be considered as a preferred anticoagulant among STEMI patients undergoing pPCI.
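    The effect measures used above can be made concrete. A minimal sketch (with made-up illustrative counts, not data from the included trials) of computing a relative risk with its 95% confidence interval via the usual log-normal approximation, together with the absolute risk reduction:

```python
import math

def relative_risk(events_a, n_a, events_b, n_b):
    """RR of arm A vs arm B with a 95% CI (log-normal approximation),
    plus the absolute risk reduction (risk in B minus risk in A).

    Counts here are illustrative; this is the textbook formula, not a
    reconstruction of the meta-analysis pipeline.
    """
    p_a, p_b = events_a / n_a, events_b / n_b
    rr = p_a / p_b
    # Standard error of log(RR) for two independent binomial arms.
    se = math.sqrt(1 / events_a - 1 / n_a + 1 / events_b - 1 / n_b)
    lo = math.exp(math.log(rr) - 1.96 * se)
    hi = math.exp(math.log(rr) + 1.96 * se)
    arr = p_b - p_a
    return rr, (lo, hi), arr

# e.g. 30/1000 deaths vs 60/1000 deaths gives RR = 0.5, ARR = 3%.
```

An RR of 0.51 with ARR = 3%, as reported for mortality in the pPCI group, thus means roughly a halved relative risk and 3 fewer deaths per 100 patients treated.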

    The Pan-STARRS Moving Object Processing System

    We describe the Pan-STARRS Moving Object Processing System (MOPS), a modern software package that produces automatic asteroid discoveries and identifications from catalogs of transient detections from next-generation astronomical survey telescopes. MOPS achieves > 99.5% efficiency in producing orbits from a synthetic but realistic population of asteroids whose measurements were simulated for a Pan-STARRS4-class telescope. Additionally, using a non-physical grid population, we demonstrate that MOPS can detect populations of currently unknown objects such as interstellar asteroids. MOPS has been adapted successfully to the prototype Pan-STARRS1 telescope despite differences in expected false detection rates, fill-factor loss and relatively sparse observing cadence compared to a hypothetical Pan-STARRS4 telescope and survey. MOPS remains > 99.5% efficient at detecting objects on a single night but drops to 80% efficiency at producing orbits for objects detected on multiple nights. This loss is primarily due to configurable MOPS processing limits that are not yet tuned for the Pan-STARRS1 mission. The core MOPS software package is the product of more than 15 person-years of software development and incorporates countless additional years of effort in third-party software to perform lower-level functions such as spatial searching or orbit determination. We describe the high-level design of MOPS and its essential subcomponents, discuss the suitability of MOPS for other survey programs, and suggest a road map for future MOPS development.
    Comment: 57 pages, 26 figures, 13 tables

    Demonstrating a long-coherence dual-rail erasure qubit using tunable transmons

    Quantum error correction with erasure qubits promises significant advantages over standard error correction due to favorable thresholds for erasure errors. To realize this advantage in practice requires a qubit for which nearly all errors are such erasure errors, and the ability to check for erasure errors without dephasing the qubit. We experimentally demonstrate that a "dual-rail qubit" consisting of a pair of resonantly-coupled transmons can form a highly coherent erasure qubit, where the erasure error rate is given by the transmon T1 but for which residual dephasing is strongly suppressed, leading to millisecond-scale coherence within the qubit subspace. We show that single-qubit gates are limited primarily by erasure errors, with erasure probability p_erasure = 2.19(2) × 10^−3 per gate, while the residual errors are ~40 times lower. We further demonstrate mid-circuit detection of erasure errors while introducing < 0.1% dephasing error per check. Finally, we show that the suppression of transmon noise allows this dual-rail qubit to preserve high coherence over a broad tunable operating range, offering an improved capacity to avoid frequency collisions. This work establishes transmon-based dual-rail qubits as an attractive building block for hardware-efficient quantum error correction.
    Comment: 8+12 pages, 16 figures

    Computing Covers under Substring Consistent Equivalence Relations

    Covers are a kind of quasiperiodicity in strings. A string C is a cover of another string T if every position of T is inside some occurrence of C in T. The shortest and longest cover arrays of T hold the lengths of the shortest and longest covers of each prefix of T, respectively. The literature has proposed linear-time algorithms computing longest and shortest cover arrays taking border arrays as input. An equivalence relation ≈ over strings is called a substring consistent equivalence relation (SCER) iff X ≈ Y implies (1) |X| = |Y| and (2) X[i:j] ≈ Y[i:j] for all 1 ≤ i ≤ j ≤ |X|. In this paper, we generalize the notion of covers to SCERs and prove that the existing algorithms to compute the shortest cover array and the longest cover array of a string T under the identity relation work for any SCER, taking the accordingly generalized border arrays.
    Comment: 16 pages
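    Under the identity relation (ordinary string equality), the cover condition above is easy to check directly: scan the occurrences of C in T left to right and verify they leave no gap. A minimal sketch, with the function name `is_cover` chosen here for illustration:

```python
def is_cover(c: str, t: str) -> bool:
    """Return True iff every position of t lies inside some occurrence
    of c in t (the cover definition, identity-relation special case)."""
    m, n = len(c), len(t)
    if m == 0 or m > n:
        return False
    covered_up_to = 0  # positions t[0:covered_up_to] are already covered
    i = t.find(c)
    while i != -1:
        if i > covered_up_to:
            return False  # gap: position covered_up_to is never covered
        covered_up_to = i + m
        i = t.find(c, i + 1)  # allow overlapping occurrences
    return covered_up_to == n

assert is_cover("aba", "ababa")     # occurrences at 0 and 2 cover all of t
assert not is_cover("ab", "ababa")  # the final 'a' is left uncovered
```

The paper's point is that this notion, and the cover-array algorithms, survive when "equality" of the occurrences is relaxed to any SCER, such as parameterized or order-preserving matching.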