Antimicrobial resistance in commensal opportunistic pathogens isolated from non-sterile sites can be an effective proxy for surveillance in bloodstream infections
Antimicrobial resistance (AMR) surveillance in bloodstream infections (BSIs) is challenging in low/middle-income countries (LMICs) given limited laboratory capacity. Other specimens are easier to collect and process and are more likely to be culture-positive. In 8102 E. coli BSIs, 322,087 E. coli urinary tract infections, 6952 S. aureus BSIs and 112,074 S. aureus non-sterile site cultures from Oxfordshire (1998–2018), and 55,296 isolates of other, rarer commensal opportunistic pathogens, antibiotic resistance trends over time in blood were strongly associated with those in other specimens (maximum cross-correlation per drug 0.51–0.99). Resistance prevalence was congruent across drug-years for each species (276/312 (88%) species-drug-years with prevalence within ±10% between blood/other isolates). Results were similar across multiple countries in high-, middle- and low-income settings in the independent ATLAS dataset (103,559 isolates, 2004–2017) and three further LMIC hospitals/programmes (6154 isolates, 2008–2019). AMR in commensal opportunistic pathogens cultured from BSIs is strongly associated with AMR in the same pathogens cultured from non-sterile sites over calendar time, suggesting the latter could be used as an effective proxy for AMR surveillance in BSIs.
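The headline statistic here, the maximum cross-correlation between resistance trends in blood and non-sterile-site isolates, can be sketched in a few lines. The series, lag window and estimator below are illustrative assumptions, not the study's actual data or code:

```python
import numpy as np

def max_cross_correlation(a, b, max_lag=3):
    """Return (r, lag) maximising the Pearson correlation between
    annual series a and series b shifted by `lag` years.
    Illustrative estimator; the paper's exact method may differ."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    n = len(a)
    best_r, best_lag = -np.inf, 0
    for lag in range(-max_lag, max_lag + 1):
        if lag >= 0:
            x, y = a[:n - lag], b[lag:]
        else:
            x, y = a[-lag:], b[:n + lag]
        r = np.corrcoef(x, y)[0, 1]
        if r > best_r:
            best_r, best_lag = r, lag
    return best_r, best_lag

# Hypothetical annual resistance prevalences (%) for one drug:
blood = [10, 12, 15, 19, 24, 28, 33, 37]   # bloodstream isolates
urine = [11, 13, 16, 20, 25, 29, 34, 38]   # urinary isolates
r, lag = max_cross_correlation(blood, urine)
```

With parallel trends like these, the correlation peaks near 1 at zero lag, which is the pattern the abstract reports for most drug-species pairs.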
Bayesian approach to determining penetrance of pathogenic SDH variants.
BACKGROUND: Until recently, determining penetrance required large observational cohort studies. Data from the Exome Aggregation Consortium (ExAC) allows a Bayesian approach to calculate penetrance, in that population frequencies of pathogenic germline variants should be inversely proportional to their penetrance for disease. We tested this hypothesis using data from two cohorts for succinate dehydrogenase subunits A, B and C (SDHA-C) genetic variants associated with hereditary pheochromocytoma/paraganglioma (PC/PGL). METHODS: The two cohorts comprised 575 unrelated Australian subjects and 1240 unrelated UK subjects with PC/PGL in whom genetic testing had been performed. Penetrance of pathogenic SDHA-C variants was calculated by comparing allelic frequencies in cases versus controls from ExAC (removing those variants contributed by The Cancer Genome Atlas). RESULTS: Pathogenic SDHA-C variants were identified in 106 subjects (18.4%) in cohort 1 and 317 subjects (25.6%) in cohort 2. Of 94 different pathogenic variants from both cohorts (seven in SDHA, 75 in SDHB and 12 in SDHC), 13 are reported in ExAC (two in SDHA, nine in SDHB and two in SDHC) accounting for 21% of subjects with SDHA-C variants. Combining data from both cohorts, estimated lifetime disease penetrance was 22.0% (95% CI 15.2% to 30.9%) for SDHB variants, 8.3% (95% CI 3.5% to 18.5%) for SDHC variants and 1.7% (95% CI 0.8% to 3.8%) for SDHA variants. CONCLUSION: Pathogenic variants in SDHB are more penetrant than those in SDHC and SDHA. Our findings have important implications for counselling and surveillance of subjects carrying these pathogenic variants.
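The Bayesian step described above is a direct application of Bayes' rule. The sketch below uses hypothetical carrier frequencies and an illustrative disease prevalence, not the cohorts' figures; it also shows why a variant's population frequency is inversely proportional to its estimated penetrance:

```python
def penetrance(f_case, f_pop, prevalence):
    """Bayes' rule: P(disease | variant) =
       P(variant | disease) * P(disease) / P(variant).
    f_case     : variant carrier frequency among cases
    f_pop      : variant carrier frequency in the reference
                 population (e.g. ExAC)
    prevalence : lifetime disease prevalence in the population
    All numbers used below are hypothetical, for illustration only."""
    return f_case * prevalence / f_pop

# Illustrative inputs: variant carried by 2% of cases, 0.005% of the
# reference population; assumed lifetime PC/PGL prevalence of 0.05%.
p = penetrance(f_case=0.02, f_pop=0.00005, prevalence=0.0005)
```

Holding the case frequency and prevalence fixed, doubling `f_pop` halves the penetrance estimate, which is the inverse-proportionality the abstract's hypothesis rests on.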
Development and validation of a targeted gene sequencing panel for application to disparate cancers
Next-generation sequencing has revolutionised genomic studies of cancer, having facilitated the development of precision oncology treatments based on a tumour’s molecular profile. We aimed to develop a targeted gene sequencing panel for application to disparate cancer types with particular focus on tumours of the head and neck, plus test for utility in liquid biopsy. The final panel designed through Roche/NimbleGen combined 451 cancer-associated genes (2.01 Mb target region). 136 patient DNA samples were collected for performance and application testing. Panel sensitivity and precision were measured using well-characterised DNA controls (n = 47), and specificity by Sanger sequencing of the Aryl Hydrocarbon Receptor Interacting Protein (AIP) gene in 89 patients. Assessment of liquid biopsy application employed a pool of synthetic circulating tumour DNA (ctDNA). Library preparation and sequencing were conducted on Illumina-based platforms prior to analysis with our accredited (ISO15189) bioinformatics pipeline. We achieved a mean coverage of 395x, with sensitivity and specificity of >99% and precision of >97%. Liquid biopsy revealed detection to 1.25% variant allele frequency. Application to head and neck tumours/cancers resulted in detection of mutations aligned to published databases. In conclusion, we have developed an analytically validated panel for application to cancers of disparate types with utility in liquid biopsy.
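The reported sensitivity, specificity and precision follow from standard confusion-matrix definitions for variant calls judged against a truth set. A minimal sketch, with made-up counts rather than the study's data:

```python
def panel_metrics(tp, fp, fn, tn):
    """Standard validation metrics from variant-call counts against a
    truth set. The counts passed in below are illustrative only."""
    sensitivity = tp / (tp + fn)    # fraction of true variants detected
    precision   = tp / (tp + fp)    # fraction of calls that are real (PPV)
    specificity = tn / (tn + fp)    # fraction of non-variant sites passed
    return sensitivity, precision, specificity

# Hypothetical tallies for a control sample:
sens, prec, spec = panel_metrics(tp=990, fp=20, fn=5, tn=99000)
```

With counts in these ranges the metrics land in the >99% sensitivity/specificity and >97% precision regime the abstract reports.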
Readout of a quantum processor with high dynamic range Josephson parametric amplifiers
We demonstrate a high dynamic range Josephson parametric amplifier (JPA) in
which the active nonlinear element is implemented using an array of rf-SQUIDs.
The device is matched to the 50 Ω environment with a Klopfenstein-taper
impedance transformer and achieves a bandwidth of 250-300 MHz, with input
saturation powers up to -95 dBm at 20 dB gain. A 54-qubit Sycamore processor
was used to benchmark these devices, providing a calibration for readout power,
an estimate of amplifier added noise, and a platform for comparison against
standard impedance matched parametric amplifiers with a single dc-SQUID. We
find that the high power rf-SQUID array design has no adverse effect on system
noise, readout fidelity, or qubit dephasing, and we estimate an upper bound on
amplifier added noise at 1.6 times the quantum limit. Lastly, amplifiers with
this design show no degradation in readout fidelity due to gain compression,
which can occur in multi-tone multiplexed readout with traditional JPAs.
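For a sense of scale, the quoted -95 dBm saturation power can be converted to watts (dBm is power referenced to 1 mW) and to a photon flux at an assumed ~5 GHz readout frequency; that frequency is an illustrative choice, not a figure from the paper:

```python
def dbm_to_watts(p_dbm):
    """dBm is power referenced to 1 mW: P = 1 mW * 10**(dBm / 10)."""
    return 1e-3 * 10 ** (p_dbm / 10)

# Planck constant and an assumed ~5 GHz readout tone (illustrative):
h, f_readout = 6.62607015e-34, 5e9

p_sat = dbm_to_watts(-95)               # saturation power, in watts
photon_flux = p_sat / (h * f_readout)   # photons per second at saturation
```

At this power level the amplifier is handling on the order of 10^11 readout photons per second, which is what allows many multiplexed tones without gain compression.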
Measurement-Induced State Transitions in a Superconducting Qubit: Within the Rotating Wave Approximation
Superconducting qubits typically use a dispersive readout scheme, where a
resonator is coupled to a qubit such that its frequency is qubit-state
dependent. Measurement is performed by driving the resonator, where the
transmitted resonator field yields information about the resonator frequency
and thus the qubit state. Ideally, we could use arbitrarily strong resonator
drives to achieve a target signal-to-noise ratio in the shortest possible time.
However, experiments have shown that when the average resonator photon number
exceeds a certain threshold, the qubit is excited out of its computational
subspace, which we refer to as a measurement-induced state transition. These
transitions degrade readout fidelity, and constitute leakage which precludes
further operation of the qubit in, for example, error correction. Here we study
these transitions using a transmon qubit by experimentally measuring their
dependence on qubit frequency, average photon number, and qubit state, in the
regime where the resonator frequency is lower than the qubit frequency. We
observe signatures of resonant transitions between levels in the coupled
qubit-resonator system that exhibit noisy behavior when measured repeatedly in
time. We provide a semi-classical model of these transitions based on the
rotating wave approximation and use it to predict the onset of state
transitions in our experiments. Our results suggest the transmon is excited to
levels near the top of its cosine potential following a state transition, where
the charge dispersion of higher transmon levels explains the observed noisy
behavior of state transitions. Moreover, occupation in these higher energy
levels poses a major challenge for fast qubit reset.
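The trade-off motivating this study, that stronger resonator drives shorten measurement but risk state transitions above a photon-number threshold, can be illustrated with a toy model in which SNR grows as the square root of photon number times integration time. The scaling and constants below are assumptions for illustration, not the paper's model:

```python
def integration_time(target_snr, n_bar, k=1.0):
    """Toy dispersive-readout scaling (assumed, not the paper's):
    SNR = k * sqrt(n_bar * tau), so the integration time tau needed
    to reach a target SNR falls inversely with photon number n_bar."""
    return (target_snr / k) ** 2 / n_bar

t_low  = integration_time(target_snr=10, n_bar=10)   # weak drive
t_high = integration_time(target_snr=10, n_bar=40)   # 4x the photons
```

Quadrupling the photon number quarters the required integration time in this model, which is exactly why one would push the drive power right up to the measurement-induced-transition threshold and no further.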
Overcoming leakage in scalable quantum error correction
Leakage of quantum information out of computational states into higher energy
states represents a major challenge in the pursuit of quantum error correction
(QEC). In a QEC circuit, leakage builds over time and spreads through
multi-qubit interactions. This leads to correlated errors that degrade the
exponential suppression of logical error with scale, challenging the
feasibility of QEC as a path towards fault-tolerant quantum computation. Here,
we demonstrate the execution of a distance-3 surface code and distance-21
bit-flip code on a Sycamore quantum processor where leakage is removed from all
qubits in each cycle. This shortens the lifetime of leakage and curtails its
ability to spread and induce correlated errors. We report a ten-fold reduction
in steady-state leakage population on the data qubits encoding the logical
state and an average leakage population of less than
throughout the entire device. The leakage removal process itself efficiently
returns leakage population back to the computational basis, and adding it to a
code circuit prevents leakage from inducing correlated error across cycles,
restoring a fundamental assumption of QEC. With this demonstration that leakage
can be contained, we resolve a key challenge for practical QEC at scale.
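The effect of per-cycle leakage removal can be illustrated with a toy two-state model: a small leakage population is injected each cycle, then a fixed fraction of it is returned to the computational basis. The rates below are hypothetical and the dynamics are far simpler than the experiment's:

```python
def steady_state_leakage(leak_per_cycle, removal_eff, cycles=200):
    """Toy model (assumed dynamics, not the paper's): each cycle a
    fraction `leak_per_cycle` of the computational population leaks,
    then the removal step returns a fraction `removal_eff` of the
    leaked population to the computational basis."""
    p = 0.0
    for _ in range(cycles):
        p = p + (1 - p) * leak_per_cycle   # leakage injected this cycle
        p = p * (1 - removal_eff)          # leakage removal step
    return p

without = steady_state_leakage(1e-3, 0.0)   # leakage accumulates
with_lrc = steady_state_leakage(1e-3, 0.9)  # leakage reset each cycle
```

Without removal the leaked population grows steadily with circuit depth, while with removal it saturates at a small steady-state value after a few cycles, mirroring the reduction in steady-state leakage the abstract reports.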
Effects of ocean sprawl on ecological connectivity: impacts and solutions
The growing number of artificial structures in estuarine, coastal and marine environments is causing “ocean sprawl”. Artificial structures do not only modify marine and coastal ecosystems at the sites of their placement, but may also produce larger-scale impacts through their alteration of ecological connectivity - the movement of organisms, materials and energy between habitat units within seascapes. Despite the growing awareness of the capacity of ocean sprawl to influence ecological connectivity, we lack a comprehensive understanding of how artificial structures modify ecological connectivity in near- and off-shore environments, and when and where their effects on connectivity are greatest. We review the mechanisms by which ocean sprawl may modify ecological connectivity, including trophic connectivity associated with the flow of nutrients and resources. We also review demonstrated, inferred and likely ecological impacts of such changes to connectivity, at scales from genes to ecosystems, and potential strategies of management for mitigating these effects. Ocean sprawl may alter connectivity by: (1) creating barriers to the movement of some organisms and resources - by adding physical barriers or by modifying and fragmenting habitats; (2) introducing new structural material that acts as a conduit for the movement of other organisms or resources across the landscape; and (3) altering trophic connectivity. Changes to connectivity may, in turn, influence the genetic structure and size of populations, the distribution of species, and community structure and ecological functioning. Two main approaches to the assessment of ecological connectivity have been taken: (1) measurement of structural connectivity - the configuration of the landscape and habitat patches and their dynamics; and (2) measurement of functional connectivity - the response of organisms or particles to the landscape. 
Our review reveals the paucity of studies directly addressing the effects of artificial structures on ecological connectivity in the marine environment, particularly at large spatial and temporal scales. With the ongoing development of estuarine and marine environments, there is a pressing need for additional studies that quantify the effects of ocean sprawl on ecological connectivity. Understanding the mechanisms by which structures modify connectivity is essential if marine spatial planning and eco-engineering are to be effectively utilised to minimise impacts.
Suppressing quantum errors by scaling a surface code logical qubit
Practical quantum computing will require error rates that are well below what
is achievable with physical qubits. Quantum error correction offers a path to
algorithmically-relevant error rates by encoding logical qubits within many
physical qubits, where increasing the number of physical qubits enhances
protection against physical errors. However, introducing more qubits also
increases the number of error sources, so the density of errors must be
sufficiently low in order for logical performance to improve with increasing
code size. Here, we report the measurement of logical qubit performance scaling
across multiple code sizes, and demonstrate that our system of superconducting
qubits has sufficient performance to overcome the additional errors from
increasing qubit number. We find our distance-5 surface code logical qubit
modestly outperforms an ensemble of distance-3 logical qubits on average, both
in terms of logical error probability over 25 cycles and logical error per
cycle ( compared to ). To investigate
damaging, low-probability error sources, we run a distance-25 repetition code
and observe a logical error per round floor set by a single
high-energy event ( when excluding this event). We are able
to accurately model our experiment, and from this model we can extract error
budgets that highlight the biggest challenges for future systems. These results
mark the first experimental demonstration where quantum error correction begins
to improve performance with increasing qubit number, illuminating the path to
reaching the logical error rates required for computation.
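The benefit of increasing code distance is often summarized by a suppression factor Λ: each step up in distance multiplies the logical error rate by 1/Λ, so suppression requires Λ > 1. The sketch below uses illustrative values of the prefactor A and of Λ, not the experiment's fitted values; the "modest" distance-5 advantage in the abstract corresponds to a Λ only slightly above 1:

```python
def logical_error_per_cycle(d, Lambda, A=0.1):
    """Assumed scaling model for surface-code error suppression:
    eps_d = A / Lambda**((d + 1) / 2). A and Lambda are illustrative."""
    return A / Lambda ** ((d + 1) / 2)

eps3 = logical_error_per_cycle(3, Lambda=2.0)   # distance-3 logical error
eps5 = logical_error_per_cycle(5, Lambda=2.0)   # distance-5 logical error
```

In this model the distance-3 to distance-5 error ratio equals Λ itself, so measuring that ratio across code sizes is a direct probe of whether scaling up is helping.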
Measurement-induced entanglement and teleportation on a noisy quantum processor
Measurement has a special role in quantum theory: by collapsing the
wavefunction it can enable phenomena such as teleportation and thereby alter
the "arrow of time" that constrains unitary evolution. When integrated in
many-body dynamics, measurements can lead to emergent patterns of quantum
information in space-time that go beyond established paradigms for
characterizing phases, either in or out of equilibrium. On present-day NISQ
processors, the experimental realization of this physics is challenging due to
noise, hardware limitations, and the stochastic nature of quantum measurement.
Here we address each of these experimental challenges and investigate
measurement-induced quantum information phases on up to 70 superconducting
qubits. By leveraging the interchangeability of space and time, we use a
duality mapping to avoid mid-circuit measurement and access different
manifestations of the underlying phases -- from entanglement scaling to
measurement-induced teleportation -- in a unified way. We obtain finite-size
signatures of a phase transition with a decoding protocol that correlates the
experimental measurement record with classical simulation data. The phases
display sharply different sensitivity to noise, which we exploit to turn an
inherent hardware limitation into a useful diagnostic. Our work demonstrates an
approach to realize measurement-induced physics at scales that are at the
limits of current NISQ processors.
Non-Abelian braiding of graph vertices in a superconducting processor
Indistinguishability of particles is a fundamental principle of quantum
mechanics. For all elementary and quasiparticles observed to date - including
fermions, bosons, and Abelian anyons - this principle guarantees that the
braiding of identical particles leaves the system unchanged. However, in two
spatial dimensions, an intriguing possibility exists: braiding of non-Abelian
anyons causes rotations in a space of topologically degenerate wavefunctions.
Hence, it can change the observables of the system without violating the
principle of indistinguishability. Despite the well-developed mathematical
description of non-Abelian anyons and numerous theoretical proposals, the
experimental observation of their exchange statistics has remained elusive for
decades. Controllable many-body quantum states generated on quantum processors
offer another path for exploring these fundamental phenomena. While efforts on
conventional solid-state platforms typically involve Hamiltonian dynamics of
quasi-particles, superconducting quantum processors allow for directly
manipulating the many-body wavefunction via unitary gates. Building on
predictions that stabilizer codes can host projective non-Abelian Ising anyons,
we implement a generalized stabilizer code and unitary protocol to create and
braid them. This allows us to experimentally verify the fusion rules of the
anyons and braid them to realize their statistics. We then study the prospect
of employing the anyons for quantum computation and utilize braiding to create
an entangled state of anyons encoding three logical qubits. Our work provides
new insights into non-Abelian braiding and - through the future inclusion of
error correction to achieve topological protection - could open a path toward
fault-tolerant quantum computing.