Quantum Bootstrap Aggregation
We set out a strategy for quantizing attribute bootstrap aggregation to enable variance-resilient quantum machine learning. To do so, we utilise the linear decomposability of decision boundary parameters in the Rebentrost et al. Support Vector Machine to guarantee that stochastic measurement of the output quantum state will give rise to an ensemble decision without destroying the superposition over projective feature subsets induced within the chosen SVM implementation. We achieve a linear performance advantage, O(d), in addition to the existing O(log(n)) advantages of quantization as applied to Support Vector Machines. The approach extends to any form of quantum learning giving rise to linear decision boundaries.
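The classical ingredient this relies on is linear decomposability: when every ensemble member is a linear classifier, averaging the members' decision functions is the same as classifying once with the averaged weight vector and bias. A minimal classical sketch of that property, using scikit-learn linear SVMs as a hypothetical stand-in for the quantum SVM described in the abstract (none of this is the authors' quantum implementation):

    import numpy as np
    from sklearn.svm import LinearSVC

    rng = np.random.default_rng(0)
    X = rng.normal(size=(600, 4))
    y = (X @ np.array([1.0, -2.0, 0.5, 0.0]) > 0).astype(int)

    # Bootstrap aggregation: train each member on a resample of the data.
    members = []
    for _ in range(25):
        idx = rng.integers(0, len(X), size=len(X))
        members.append(LinearSVC().fit(X[idx], y[idx]))

    # Linear decomposability: each member scores x with w_k . x + b_k, so the
    # mean of the member scores equals the score of the averaged parameters.
    w_bar = np.mean([m.coef_[0] for m in members], axis=0)
    b_bar = np.mean([m.intercept_[0] for m in members])

    mean_of_scores = np.mean([m.decision_function(X) for m in members], axis=0)
    score_of_mean = X @ w_bar + b_bar
    assert np.allclose(mean_of_scores, score_of_mean)

In the quantum setting of the abstract, the claim is that this averaging is effected by measurement of a superposition over feature-subset models rather than by explicit classical summation; the sketch only illustrates why a linear boundary makes such an ensemble decision well defined.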
Towards a large-scale quantum simulator on diamond surface at room temperature
Strongly-correlated quantum many-body systems exhibit a variety of exotic
phases with long-range quantum correlations, such as spin liquids and
supersolids. Despite the rapid increase in computational power of modern
computers, the numerical simulation of these complex systems becomes
intractable even for a few dozens of particles. Feynman's idea of quantum
simulators offers an innovative way to bypass this computational barrier.
However, the proposed realizations of such devices either require very low
temperatures (ultracold gases in optical lattices, trapped ions,
superconducting devices) and considerable technological effort, or are
extremely hard to scale in practice (NMR, linear optics). In this work, we
propose a new architecture for a scalable quantum simulator that can operate at
room temperature. It consists of strongly-interacting nuclear spins attached to
the diamond surface by its direct chemical treatment, or by means of a
functionalized graphene sheet. The initialization, control and read-out of this
quantum simulator can be accomplished with nitrogen-vacancy centers implanted
in diamond. The system can be engineered to simulate a wide variety of
interesting strongly-correlated models with long-range dipole-dipole
interactions. Due to the superior coherence time of nuclear spins and
nitrogen-vacancy centers in diamond, our proposal offers new opportunities
towards large-scale quantum simulation at room temperature.
Multiscale photosynthetic exciton transfer
Photosynthetic light harvesting provides a natural blueprint for
bioengineered and biomimetic solar energy and light detection technologies.
Recent evidence suggests some individual light harvesting protein complexes
(LHCs) and LHC subunits efficiently transfer excitons towards chemical reaction
centers (RCs) via an interplay between excitonic quantum coherence, resonant
protein vibrations, and thermal decoherence. The role of coherence in vivo is
unclear, however, where excitons are transferred through multi-LHC/RC aggregates
over distances typically large compared with intra-LHC scales. Here we assess
the possibility of long-range coherent transfer in a simple chromophore network
with disordered site and transfer coupling energies. Through renormalization we
find that, surprisingly, decoherence is diminished at larger scales, and
long-range coherence is facilitated by chromophoric clustering. Conversely,
static disorder in the site energies grows with length scale, forcing
localization. Our results suggest sustained coherent exciton transfer may be
possible over distances large compared with nearest-neighbour (n-n) chromophore
separations, at physiological temperatures, in a clustered network with small
static disorder. This may support findings suggesting long-range coherence in
algal chloroplasts, and provides a framework for engineering large chromophore
or quantum dot high-temperature exciton transfer networks.
Comment: 9 pages, 6 figures. A significantly updated version is now published online by Nature Physics (2012).
The Road to Quantum Computational Supremacy
We present an idiosyncratic view of the race for quantum computational
supremacy. Google's approach and IBM's challenge are examined. An unexpected
side-effect of the race is the significant progress in designing fast classical
algorithms. Quantum supremacy, if achieved, won't make classical computing
obsolete.
Comment: 15 pages, 1 figure
A Measurement of Rb using a Double Tagging Method
The fraction of Z to bbbar events in hadronic Z decays has been measured by
the OPAL experiment using the data collected at LEP between 1992 and 1995. The
Z to bbbar decays were tagged using displaced secondary vertices, and high
momentum electrons and muons. Systematic uncertainties were reduced by
measuring the b-tagging efficiency using a double tagging technique. Efficiency
correlations between opposite hemispheres of an event are small, and are well
understood through comparisons between real and simulated data samples. A value
of Rb = 0.2178 +- 0.0011 +- 0.0013 was obtained, where the first error is
statistical and the second systematic. The uncertainty on Rc, the fraction of Z
to ccbar events in hadronic Z decays, is not included in the errors. The
dependence on Rc is Delta(Rb)/Rb = -0.056*Delta(Rc)/Rc where Delta(Rc) is the
deviation of Rc from the value 0.172 predicted by the Standard Model. The
result for Rb agrees with the value of 0.2155 +- 0.0003 predicted by the
Standard Model.
Comment: 42 pages, LaTeX, 14 eps figures included, submitted to European Physical Journal
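The double-tagging technique used above can be illustrated with a toy calculation. If only Z to bbbar events were tagged and the two hemispheres of an event tagged independently, the per-hemisphere tag rate f1 = Rb*eps_b and the double-tag rate f2 = Rb*eps_b^2 would determine both Rb and the b-tagging efficiency eps_b directly from data. A minimal Python sketch with hypothetical counts (not OPAL data), ignoring the hemisphere correlations and charm/light-flavour backgrounds that the real analysis treats explicitly:

    def rb_from_double_tags(n_hadronic, n_tagged_hemispheres, n_double_tagged):
        """Solve the simplified double-tag equations for (Rb, eps_b)."""
        f1 = n_tagged_hemispheres / (2.0 * n_hadronic)  # single-hemisphere tag rate
        f2 = n_double_tagged / n_hadronic               # both-hemispheres tag rate
        eps_b = f2 / f1      # b-tag efficiency, measured from the data itself
        rb = f1**2 / f2      # Rb = (Rb*eps_b)^2 / (Rb*eps_b^2)
        return rb, eps_b

    # Hypothetical counts, chosen only to make the arithmetic concrete.
    rb, eps_b = rb_from_double_tags(n_hadronic=1_000_000,
                                    n_tagged_hemispheres=100_200,
                                    n_double_tagged=11_520)
    print(f"Rb = {rb:.4f}, eps_b = {eps_b:.3f}")  # Rb = 0.2179, eps_b = 0.230

    # The abstract also quotes the residual charm dependence,
    # Delta(Rb)/Rb = -0.056 * Delta(Rc)/Rc, i.e. the quoted Rb would shift by
    # rb * (-0.056 * delta_rc / 0.172) if Rc deviated from its SM value 0.172.

In the actual measurement the hemisphere correlations are small but non-zero, and the charm and light-flavour tag rates enter the counting equations, which is why the quoted result carries the Rc dependence above.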
Measurement of the B+ and B0 lifetimes and search for CP(T) violation using reconstructed secondary vertices
The lifetimes of the B+ and B0 mesons, and their ratio, have been measured in the OPAL experiment using 2.4 million hadronic Z0 decays recorded at LEP. Z0 to bbbar decays were tagged using displaced secondary vertices and high momentum electrons and muons. The lifetimes were then measured using well-reconstructed charged and neutral secondary vertices selected in this tagged data sample. The results are
tau(B+) = 1.643 +/- 0.037 +/- 0.025 ps,
tau(B0) = 1.523 +/- 0.057 +/- 0.053 ps,
tau(B+)/tau(B0) = 1.079 +/- 0.064 +/- 0.041,
where in each case the first error is statistical and the second systematic. A larger data sample of 3.1 million hadronic Z0 decays has been used to search for CP and CPT violating effects by comparison of inclusive b and bbar hadron decays. No evidence for such effects is seen. The CP violation parameter Re(epsilon_B) is measured to be Re(epsilon_B) = 0.001 +/- 0.014 +/- 0.003, and the fractional difference between b and bbar hadron lifetimes is measured to be
(Delta tau/tau)_b = [tau(b hadron) - tau(bbar hadron)] / tau(average) = -0.001 +/- 0.012 +/- 0.008.
Measurement of the inclusive and dijet cross-sections of b-jets in pp collisions at sqrt(s) = 7 TeV with the ATLAS detector
The inclusive and dijet production cross-sections have been measured for jets
containing b-hadrons (b-jets) in proton-proton collisions at a centre-of-mass
energy of sqrt(s) = 7 TeV, using the ATLAS detector at the LHC. The
measurements use data corresponding to an integrated luminosity of 34 pb^-1.
The b-jets are identified using either a lifetime-based method, where secondary
decay vertices of b-hadrons in jets are reconstructed using information from
the tracking detectors, or a muon-based method where the presence of a muon is
used to identify semileptonic decays of b-hadrons inside jets. The inclusive
b-jet cross-section is measured as a function of transverse momentum in the
range 20 < pT < 400 GeV and rapidity in the range |y| < 2.1. The bbbar-dijet
cross-section is measured as a function of the dijet invariant mass in the
range 110 < m_jj < 760 GeV, the azimuthal angle difference between the two jets
and the angular variable chi in two dijet mass regions. The results are
compared with next-to-leading-order QCD predictions. Good agreement is observed
between the measured cross-sections and the predictions obtained using POWHEG +
Pythia. MC@NLO + Herwig shows good agreement with the measured bbbar-dijet
cross-section. However, it does not reproduce the measured inclusive
cross-section well, particularly for central b-jets with large transverse
momenta.
Comment: 10 pages plus author list (21 pages total), 8 figures, 1 table, final version published in European Physical Journal
A randomised, double-blind, placebo-controlled trial of repeated nebulisation of non-viral cystic fibrosis transmembrane conductance regulator (CFTR) gene therapy in patients with cystic fibrosis
BACKGROUND: Cystic fibrosis (CF) is a chronic, life-limiting disease caused by mutations in the CF
transmembrane conductance regulator (CFTR) gene leading to abnormal airway surface ion transport,
chronic lung infections, inflammation and eventual respiratory failure. With the exception of the
small-molecule potentiator, ivacaftor (Kalydeco®, Vertex Pharmaceuticals, Boston, MA, USA), which is
suitable for a small proportion of patients, there are no licensed therapies targeting the basic defect.
The UK Cystic Fibrosis Gene Therapy Consortium has taken a cationic lipid-mediated CFTR gene therapy
formulation through preclinical and clinical development.
OBJECTIVE: To determine clinical efficacy of the formulation delivered to the airways over a period of
1 year in patients with CF.
DESIGN: This was a randomised, double-blind, placebo-controlled Phase IIb trial of the CFTR gene–liposome
complex pGM169/GL67A. Randomisation was performed via InForm™ version 4.6 (Phase Forward
Incorporated, Oracle, CA, USA) and was 1 : 1, except for patients in the mechanistic subgroups (2 : 1).
Allocation was blinded by masking nebuliser chambers.
SETTINGS: Data were collected at the clinical and scientific sites and entered into a trial-specific InForm version 4.6 database.
PARTICIPANTS: Patients with CF aged ≥ 12 years with forced expiratory volume in the first second (FEV1)
between 50% and 90% predicted and any combination of CFTR mutations. The per-protocol group
(≥ 9 doses) consisted of 54 patients receiving placebo (62 randomised) and 62 patients receiving gene
therapy (78 randomised).
INTERVENTIONS: Subjects received 5 ml of nebulised pGM169/GL67A (active) or 0.9% saline (placebo) at
28 (±5)-day intervals over 1 year.
MAIN OUTCOME MEASURES: The primary end point was the relative change in percentage predicted FEV1
over the 12-month period. A number of secondary clinical outcomes were assessed alongside safety
measures: other spirometric values; lung clearance index (LCI) assessed by multibreath washout; structural
disease on computed tomography (CT) scan; the Cystic Fibrosis Questionnaire – Revised (CFQ-R), a
validated quality-of-life questionnaire; exercise capacity and monitoring; systemic and sputum inflammatory
markers; and adverse events (AEs). A mechanistic study was performed in a subgroup in whom transgene
deoxyribonucleic acid (DNA) and messenger ribonucleic acid (mRNA) were measured alongside nasal and
lower airway potential difference.
RESULTS: There was a significant (p = 0.046) treatment effect (TE) of 3.7% [95% confidence interval (CI)
0.1% to 7.3%] in the primary end point at 12 months and in secondary end points, including forced vital
capacity (FVC) (p = 0.031) and CT gas trapping (p = 0.048). Other outcomes, although not reaching
statistical significance, favoured active treatment. Effects were noted by 1 month and were irrespective
of sex, age or CFTR mutation class. Subjects with a more severe baseline FEV1 had a FEV1 TE of 6.4%
(95% CI 0.8% to 12.1%) and greater changes in many other secondary outcomes. However, the more
mildly affected group also demonstrated benefits, particularly in small airway disease markers such as LCI.
The active group showed a significantly (p = 0.032) greater bronchial chloride secretory response. No
difference in treatment-attributable AEs was seen between the placebo and active groups.
CONCLUSIONS: Monthly application of the pGM169/GL67A gene therapy formulation was associated with
an improvement in lung function, other clinically relevant parameters and bronchial CFTR function,
compared with placebo.
LIMITATIONS: Although encouraging, the improvement in FEV1 was modest and was not accompanied by
detectable improvement in patients’ quality of life.
FUTURE WORK: Future work will focus on attempts to increase efficacy by increasing dose or frequency,
the coadministration of a CFTR potentiator, or the use of modified viral vectors capable of
repeated administration.
TRIAL REGISTRATION: ClinicalTrials.gov NCT01621867
Synthesising practice guidelines for the development of community-based exercise programmes after stroke
Multiple guidelines are often available to inform practice in complex interventions. Guidance implementation may be facilitated if it is tailored to particular clinical issues and contexts. It should also aim to specify all elements of interventions that may mediate and modify effectiveness, including both their content and delivery. We conducted a focused synthesis of recommendations from stroke practice guidelines to produce a structured and comprehensive account to facilitate the development of community-based exercise programmes after stroke.
Funding: National Institute for Health Research (NIHR) Collaboration for Leadership in Applied Health Research and Care (CLAHRC) for the South West Peninsula.
