Measurement-induced entanglement and teleportation on a noisy quantum processor
Measurement has a special role in quantum theory: by collapsing the
wavefunction it can enable phenomena such as teleportation and thereby alter
the "arrow of time" that constrains unitary evolution. When integrated in
many-body dynamics, measurements can lead to emergent patterns of quantum
information in space-time that go beyond established paradigms for
characterizing phases, either in or out of equilibrium. On present-day NISQ
processors, the experimental realization of this physics is challenging due to
noise, hardware limitations, and the stochastic nature of quantum measurement.
Here we address each of these experimental challenges and investigate
measurement-induced quantum information phases on up to 70 superconducting
qubits. By leveraging the interchangeability of space and time, we use a
duality mapping to avoid mid-circuit measurement and access different
manifestations of the underlying phases -- from entanglement scaling to
measurement-induced teleportation -- in a unified way. We obtain finite-size
signatures of a phase transition with a decoding protocol that correlates the
experimental measurement record with classical simulation data. The phases
display sharply different sensitivity to noise, which we exploit to turn an
inherent hardware limitation into a useful diagnostic. Our work demonstrates an
approach to realize measurement-induced physics at scales that are at the
limits of current NISQ processors.
Non-Abelian braiding of graph vertices in a superconducting processor
Indistinguishability of particles is a fundamental principle of quantum
mechanics. For all elementary and quasiparticles observed to date - including
fermions, bosons, and Abelian anyons - this principle guarantees that the
braiding of identical particles leaves the system unchanged. However, in two
spatial dimensions, an intriguing possibility exists: braiding of non-Abelian
anyons causes rotations in a space of topologically degenerate wavefunctions.
Hence, it can change the observables of the system without violating the
principle of indistinguishability. Despite the well-developed mathematical
description of non-Abelian anyons and numerous theoretical proposals, the
experimental observation of their exchange statistics has remained elusive for
decades. Controllable many-body quantum states generated on quantum processors
offer another path for exploring these fundamental phenomena. While efforts on
conventional solid-state platforms typically involve Hamiltonian dynamics of
quasi-particles, superconducting quantum processors allow for directly
manipulating the many-body wavefunction via unitary gates. Building on
predictions that stabilizer codes can host projective non-Abelian Ising anyons,
we implement a generalized stabilizer code and unitary protocol to create and
braid them. This allows us to experimentally verify the fusion rules of the
anyons and braid them to realize their statistics. We then study the prospect
of employing the anyons for quantum computation and utilize braiding to create
an entangled state of anyons encoding three logical qubits. Our work provides
new insights into non-Abelian braiding and - through the future inclusion of
error correction to achieve topological protection - could open a path toward
fault-tolerant quantum computing.
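The fusion rules verified in the experiment can be made concrete. In the Ising anyon model relevant here, the labels are the vacuum 1, the fermion psi, and the non-Abelian anyon sigma, with sigma x sigma = 1 + psi supplying the topological degeneracy that braiding rotates. A small sketch (function names are illustrative) that counts fusion channels:

```python
# Fusion rules of the Ising anyon model: "1" is the vacuum, "psi" the
# fermion, "sigma" the non-Abelian anyon.
FUSION = {
    ("1", "1"): ["1"],
    ("1", "psi"): ["psi"],
    ("1", "sigma"): ["sigma"],
    ("psi", "psi"): ["1"],
    ("psi", "sigma"): ["sigma"],
    ("sigma", "sigma"): ["1", "psi"],  # two outcomes -> degeneracy
}

def fuse(a, b):
    """Possible fusion outcomes of anyons a and b (order-independent)."""
    return FUSION.get((a, b)) or FUSION[(b, a)]

def fusion_space_dim(anyons):
    """Number of ways a list of anyons can fuse to the vacuum, i.e. the
    dimension of the topologically degenerate space they span."""
    channels = {"1": 1}  # multiplicity of each intermediate total charge
    for a in anyons:
        new = {}
        for charge, mult in channels.items():
            for out in fuse(charge, a):
                new[out] = new.get(out, 0) + mult
        channels = new
    return channels.get("1", 0)
```

Counting channels this way, 2n sigma anyons with total charge fixed to the vacuum span a 2^(n-1)-dimensional space; eight sigmas give 2^3 = 8, consistent with the three logical qubits entangled in the experiment.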
Suppressing quantum errors by scaling a surface code logical qubit
Practical quantum computing will require error rates that are well below what
is achievable with physical qubits. Quantum error correction offers a path to
algorithmically-relevant error rates by encoding logical qubits within many
physical qubits, where increasing the number of physical qubits enhances
protection against physical errors. However, introducing more qubits also
increases the number of error sources, so the density of errors must be
sufficiently low in order for logical performance to improve with increasing
code size. Here, we report the measurement of logical qubit performance scaling
across multiple code sizes, and demonstrate that our system of superconducting
qubits has sufficient performance to overcome the additional errors from
increasing qubit number. We find our distance-5 surface code logical qubit
modestly outperforms an ensemble of distance-3 logical qubits on average, both
in terms of logical error probability over 25 cycles and logical error per
cycle ( compared to ). To investigate
damaging, low-probability error sources, we run a distance-25 repetition code
and observe a logical error per round floor set by a single
high-energy event ( when excluding this event). We are able
to accurately model our experiment, and from this model we can extract error
budgets that highlight the biggest challenges for future systems. These results
mark the first experimental demonstration where quantum error correction begins
to improve performance with increasing qubit number, illuminating the path to
reaching the logical error rates required for computation.
Comment: Main text: 6 pages, 4 figures. v2: Update author list, references, Fig. S12, Table I
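The scaling logic above, that adding qubits helps only when the physical error density is low enough, can be illustrated with a classical analogue: a distance-d repetition code under i.i.d. bit flips with majority-vote decoding. A minimal Monte Carlo sketch (a toy model, not the surface code experiment itself):

```python
import random

def repetition_logical_error(distance, p_phys, trials, rng):
    """Estimate the logical error rate of a distance-d repetition code:
    each of the d bits flips independently with probability p_phys, and
    majority-vote decoding fails when more than half the bits flip."""
    failures = 0
    for _ in range(trials):
        flips = sum(rng.random() < p_phys for _ in range(distance))
        if flips > distance // 2:
            failures += 1
    return failures / trials

rng = random.Random(0)
# Below threshold: increasing the distance suppresses the logical error.
low_d3 = repetition_logical_error(3, 0.05, 20000, rng)
low_d5 = repetition_logical_error(5, 0.05, 20000, rng)
# Above threshold: more bits means more error sources and worse performance.
high_d3 = repetition_logical_error(3, 0.60, 20000, rng)
high_d5 = repetition_logical_error(5, 0.60, 20000, rng)
```

For p well below 1/2 the distance-5 code beats distance-3 (roughly 10p^3 versus 3p^2 failure probability), mirroring the abstract's distance-5 versus distance-3 comparison; above threshold the ordering reverses.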
Increasing frailty is associated with higher prevalence and reduced recognition of delirium in older hospitalised inpatients: results of a multi-centre study
Purpose:
Delirium is a neuropsychiatric disorder characterised by an acute change in cognition, attention, and consciousness. It is common, particularly in older adults, but poorly recognised. Frailty is the accumulation of deficits conferring an increased risk of adverse outcomes. We set out to determine how severity of frailty, as measured using the Clinical Frailty Scale (CFS), affected delirium rates and recognition in hospitalised older people in the United Kingdom.
Methods:
Adults over 65 years were included in an observational multi-centre audit across UK hospitals, comprising two prospective rounds and one retrospective note review. Clinical Frailty Scale (CFS) score, delirium status, and 30-day outcomes were recorded.
Results:
The overall prevalence of delirium was 16.3% (483 patients). Patients with delirium were more frail than patients without delirium (median CFS 6 vs 4). The risk of delirium increased with frailty [OR 2.9 (1.8–4.6) for CFS 4 vs 1–3; OR 12.4 (6.2–24.5) for CFS 8 vs 1–3]. Higher CFS was associated with reduced recognition of delirium [OR 0.7 (0.3–1.9) for CFS 4 compared to 0.2 (0.1–0.7) for CFS 8]. Both associations were independent of age and dementia.
Conclusion:
We have demonstrated an incremental increase in the risk of delirium with increasing frailty. This has important clinical implications, suggesting that frailty may provide a more nuanced measure of vulnerability to delirium and poor outcomes. However, the most frail patients are the least likely to have their delirium diagnosed, and there remains a significant lack of research into the underlying pathophysiology of both of these common geriatric syndromes.
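Odds ratios of the kind quoted above, e.g. OR 2.9 (1.8–4.6), come from standard 2x2-table machinery. A minimal sketch of the unadjusted calculation with a Wald confidence interval; the counts in the usage example are hypothetical, and the study's actual ORs were adjusted, so this is illustrative only:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Unadjusted odds ratio and Wald 95% CI from a 2x2 table:
    a = exposed cases, b = exposed controls,
    c = unexposed cases, d = unexposed controls."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi
```

With hypothetical counts of 40 frail cases, 60 frail controls, 20 non-frail cases, and 80 non-frail controls, `odds_ratio_ci(40, 60, 20, 80)` gives OR about 2.67 with a CI of roughly 1.42 to 5.02, the same shape of estimate reported in the abstract.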
A rapid methodology for the characterization of dialkyl tertiary amine-N-oxide metabolites using structurally dependent dissociation pathways and reconstructed ion current chromatograms
A high-performance liquid chromatography-electrospray ionization-tandem mass spectrometry (HPLC-ESI-MS/MS) approach to the characterization of dialkyl tertiary amine-N-oxides is presented. The methodology is based upon forming reconstructed ion current chromatograms (RICCs) of m/z values of product ions known to form through diagnostic losses from dialkyl tertiary amine-N-oxides. The diagnostic losses of N,N-dimethylhydroxylamine and N,N-diethylhydroxylamine were identified through the analysis of a structurally diverse library of compounds by ESI-low-energy collision-induced dissociation (CID)-MS/MS using quadrupole ion trap-mass spectrometry (QIT-MS) and quadrupole time-of-flight-mass spectrometry (QqTOF-MS). The library consisted of dialkyl tertiary amine-containing commercially available pharmaceuticals, along with a number of model, synthetic N-oxides. The loss of the nitrogen-containing group was observed in 89% of the low-energy CID product ion spectra acquired using various collision energies. Further, the resultant product ions, formed through the loss of the nitrogen-containing group, were shown to be unstable because of the observation of second-generation dissociation. These observations regarding gas-phase ion chemistry could be useful to developers of in silico programs for fragmentation prediction by allowing the creation of improved algorithms and models for predicting dissociation. Using the information derived from the library analysis, the characterization methodology was developed and demonstrated using tetracaine. The approach is rapid, MS/MS-platform independent, utilizes existing technology, and could be automated. Further, it is definitive and overcomes the limitations of other tools for N-oxide identification by localizing the site of oxidation. Thus, it provides a useful addition to the existing approaches for metabolite identification.
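The RICC construction described above can be sketched in a few lines: for each MS/MS scan, sum the intensity of product ions sitting at the precursor m/z minus the diagnostic neutral-loss mass. The loss masses below are the standard monoisotopic values for the two hydroxylamines; the scan data in the usage example are hypothetical:

```python
# Diagnostic neutral losses (monoisotopic mass, Da) for dialkyl tertiary
# amine-N-oxides, per the two losses identified in the library analysis.
LOSSES = {
    "N,N-dimethylhydroxylamine": 61.0528,  # C2H7NO
    "N,N-diethylhydroxylamine": 89.0841,   # C4H11NO
}

def ricc(scans, loss_mass, tol=0.01):
    """Reconstructed ion current trace: for each MS/MS scan, sum the
    intensity of product ions at (precursor m/z - neutral-loss mass).
    `scans` is a list of (precursor_mz, [(product_mz, intensity), ...])."""
    trace = []
    for precursor_mz, peaks in scans:
        target = precursor_mz - loss_mass
        trace.append(sum(i for mz, i in peaks if abs(mz - target) <= tol))
    return trace
```

A scan whose product-ion spectrum contains a peak at the expected loss-shifted m/z contributes its intensity to the trace; scans without the diagnostic loss contribute zero, which is what makes the chromatogram selective for N-oxides.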
Towards a Meta Cost-Benefit Analysis: The Case of Brexit
Brexit is doubtless one of the most important regulatory challenges faced by an entire country since the beginning of the twenty-first century. Equally important are the implications, in terms of costs and benefits, of this democratic decision for the UK’s economy and for its regulatory environment. So far, some cost-benefit analyses have attempted to measure the post-Brexit situation. Few studies have proposed a ‘meta cost-benefit analysis’, which would encompass current studies in one aggregated study. No study has yet provided a meta cost-benefit analysis that tracks the Brexit negotiations as they unfold and offers a comprehensive discussion of the regulatory issues at stake. This article intends to fill this gap. The originality of this article lies in both its content and its timing. The content is original because it discusses the scientific possibility of a meta cost-benefit analysis of Brexit together with the inherent limits associated with such an endeavour. The timing is appropriate as we are at the critical halfway point of the two-year negotiation period (2017-2019) during which the EU and the UK must secure the relevant deals to ensure a smooth and frictionless Brexit for both sides of the Channel.
Estimating and explaining the effect of education and income on head and neck cancer risk: INHANCE consortium pooled analysis of 31 case-control studies from 27 countries
Low socioeconomic status has been reported to be associated with head and neck cancer risk. However, previous studies have been too small to examine the associations by cancer subsite, age, sex, global region and calendar time, and to explain the association in terms of behavioral risk factors. Individual participant data of 23,964 cases with head and neck cancer and 31,954 controls from 31 studies in 27 countries were pooled using random-effects models. Overall, low education was associated with an increased risk of head and neck cancer (OR = 2.50; 95% CI = 2.02 – 3.09). Overall, one-third of the increased risk was not explained by differences in the distribution of cigarette smoking and alcohol behaviors, and the risk remained elevated among never users of tobacco and nondrinkers (OR = 1.61; 95% CI = 1.13 – 2.31). A larger share of the estimated education effect was left unexplained by cigarette smoking and alcohol behaviors in women than in men, in older than in younger groups, in the oropharynx than in other sites, and in South/Central America than in Europe/North America, and the unexplained effect was strongest in countries with greater income inequality. Similar findings were observed for the estimated effect of low versus high household income. The lowest levels of income and educational attainment were associated with more than 2-fold increased risk of head and neck cancer, which is not entirely explained by differences in the distributions of behavioral risk factors for these cancers and which varies across cancer sites, sexes, countries and country income inequality levels.
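The pooling step above ("pooled with random effects models") can be sketched with the DerSimonian-Laird estimator, a common choice for random-effects meta-analysis of study-level log odds ratios; the consortium's actual model and covariate adjustments may differ, and the numbers in the test are hypothetical:

```python
import math

def pool_random_effects(log_ors, variances, z=1.96):
    """DerSimonian-Laird random-effects pooling of study-level log ORs.
    Returns the pooled OR and its 95% CI (all back-transformed from logs)."""
    w = [1.0 / v for v in variances]            # fixed-effect weights
    sw = sum(w)
    fixed = sum(wi * y for wi, y in zip(w, log_ors)) / sw
    q = sum(wi * (y - fixed) ** 2 for wi, y in zip(w, log_ors))  # Cochran's Q
    df = len(log_ors) - 1
    c = sw - sum(wi ** 2 for wi in w) / sw
    tau2 = max(0.0, (q - df) / c)               # between-study variance
    w_star = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * y for wi, y in zip(w_star, log_ors)) / sum(w_star)
    se = math.sqrt(1.0 / sum(w_star))
    return (math.exp(pooled),
            math.exp(pooled - z * se),
            math.exp(pooled + z * se))
```

When the studies agree exactly, the between-study variance estimate is zero and the pooled OR equals the common study OR; heterogeneous studies inflate tau-squared, widening the interval, which is the behavior that makes random-effects pooling appropriate for 31 studies from 27 countries.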