Scalable Noise Estimation with Random Unitary Operators
We describe a scalable stochastic method for the experimental measurement of
generalized fidelities characterizing the accuracy of the implementation of a
coherent quantum transformation. The method is based on the motion reversal of
random unitary operators. In the simplest case our method enables direct
estimation of the average gate fidelity. The more general fidelities are
characterized by a universal exponential rate of fidelity loss. In all cases
the measurable fidelity decrease is directly related to the strength of the
noise affecting the implementation -- quantified by the trace of the
superoperator describing the non--unitary dynamics. While the scalability of
our stochastic protocol makes it most relevant in large Hilbert spaces (when
quantum process tomography is infeasible), our method should be immediately
useful for evaluating the degree of control that is achievable in any prototype
quantum processing device. By varying over different experimental arrangements
and error-correction strategies additional information about the noise can be
determined. Comment: 8 pages; v2: published version (typos corrected; reference added)
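The motion-reversal idea can be sketched numerically. The single-qubit setting and depolarizing noise model below are illustrative assumptions, not the paper's experimental details: a sequence of random unitaries is applied with noise, then exactly reversed, and the surviving population of the initial state decays with sequence length.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_unitary(n=2):
    # Haar-random unitary via QR decomposition of a Ginibre matrix
    z = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
    q, r = np.linalg.qr(z)
    return q * (np.diag(r) / np.abs(np.diag(r)))

def depolarize(rho, p):
    # single-qubit depolarizing noise of strength p
    return (1 - p) * rho + p * np.eye(2) / 2

def survival(m, p=0.02):
    # motion reversal: m noisy random unitaries, then their exact inverses
    rho = np.array([[1, 0], [0, 0]], dtype=complex)
    us = [random_unitary() for _ in range(m)]
    for u in us:
        rho = depolarize(u @ rho @ u.conj().T, p)
    for u in reversed(us):
        rho = u.conj().T @ rho @ u
    return rho[0, 0].real  # population returned to |0>

lengths = (1, 5, 10, 20)
probs = [np.mean([survival(m) for _ in range(50)]) for m in lengths]
```

Because depolarizing noise commutes with unitaries, the averaged survival probability here follows (1 + (1 - p)^m)/2, i.e. the measurable fidelity decrease is a universal exponential set by the noise strength, which is the quantity the protocol estimates.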
Sinuosity and the affect grid: A method for adjusting repeated mood scores
Copyright @ 2012 Ammons Scientific. This article has been made available through the Brunel Open Access Publishing Fund. Sinuosity is a measure of how much a travelled pathway deviates from a straight line. In this paper, sinuosity is applied to the measurement of mood. The Affect Grid is a mood scale that requires participants to place a mark on a 9 x 9 grid to indicate their current mood. The grid has two dimensions: pleasure-displeasure (horizontal) and arousal-sleepiness (vertical). In studies where repeated measurements are required, some participants may exaggerate their mood shifts owing to faulty interpretation of the scale or a feeling of social obligation to the experimenter. A new equation is proposed, based on the sinuosity measure used in hydrology to quantify the meandering of rivers. The equation takes an individual's presumed tendency to exaggerate and meander into account to correct the score and reduce outliers. The usefulness of the equation is demonstrated by applying it to Affect Grid data from another study.
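Sinuosity itself is simple to compute: the travelled path length divided by the straight-line distance between the endpoints. The paper's full adjustment equation is not reproduced here; the sketch below shows only the underlying measure, applied to hypothetical Affect Grid coordinates:

```python
import math

def sinuosity(points):
    """Ratio of travelled path length to straight-line endpoint distance.

    `points` are (pleasure, arousal) Affect Grid coordinates in visit
    order; 1.0 means a perfectly straight mood trajectory.
    """
    path = sum(math.dist(a, b) for a, b in zip(points, points[1:]))
    chord = math.dist(points[0], points[-1])
    return path / chord if chord else float("inf")

moods = [(2, 3), (5, 7), (3, 4), (8, 8)]  # hypothetical repeated scores
s = sinuosity(moods)
```

A participant who meanders or exaggerates produces a long path relative to the net mood change, hence a high sinuosity, which the proposed correction would then damp.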
Randomized benchmarking of single and multi-qubit control in liquid-state NMR quantum information processing
Being able to quantify the level of coherent control in a proposed device
implementing a quantum information processor (QIP) is an important task for
both comparing different devices and assessing a device's prospects with
regards to achieving fault-tolerant quantum control. We implement in a
liquid-state nuclear magnetic resonance QIP the randomized benchmarking
protocol presented by Knill et al (PRA 77: 012307 (2008)). We report an error
per randomized pulse of with a
single qubit QIP and show an experimentally relevant error model where the
randomized benchmarking gives a signature fidelity decay which is not possible
to interpret as a single error per gate. We explore and experimentally
investigate multi-qubit extensions of this protocol and report an average error
rate for one and two qubit gates of for a three
qubit QIP. We estimate that these error rates are still not decoherence limited
and thus can be improved with modifications to the control hardware and
software. Comment: 10 pages, 6 figures, submitted version
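Randomized-benchmarking data of this kind are conventionally fitted to a single exponential to extract an error per gate. A minimal sketch with synthetic data (the model F(m) = A p^m + B and all numbers below are illustrative assumptions, not the paper's measured rates, which are not recoverable from this abstract):

```python
import numpy as np

# synthetic benchmarking data following the standard single-exponential
# model F(m) = A * p**m + B (values are illustrative, not measured ones)
m = np.array([1, 2, 4, 8, 16, 32, 64])
A_true, p_true, B_true = 0.5, 0.99, 0.5
F = A_true * p_true**m + B_true

# log-linear fit for the decay parameter p, assuming the asymptote B
# (the fully depolarized limit 1/2 for one qubit) is known
slope, intercept = np.polyfit(m, np.log(F - B_true), 1)
p_fit = np.exp(slope)

# average error per randomized gate under the depolarizing interpretation
error_per_gate = (1 - p_fit) / 2
```

The paper's point about error models that are not interpretable as a single error per gate corresponds to data that deviate from this single-exponential form, so the fit above is the null model against which such signatures are judged.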
Modelling thermal flow in a transition regime using a lattice Boltzmann approach
Lattice Boltzmann models are already able to capture important rarefied flow phenomena, such as velocity-slip and temperature jump, provided the effects of the Knudsen layer are minimal. However, both conventional hydrodynamics, as exemplified by the Navier-Stokes-Fourier equations, and the lattice Boltzmann method fail to predict the nonlinear velocity and temperature variations in the Knudsen layer that have been observed in kinetic theory. In the present paper, we propose an extension to the lattice Boltzmann method that will enable the simulation of thermal flows in the transition regime where Knudsen layer effects are significant. A correction function is introduced that accounts for the reduction in the mean free path near a wall. This new approach is compared with direct simulation Monte Carlo data for Fourier flow and good qualitative agreement is obtained for Knudsen numbers up to 1.58.
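The idea of a near-wall correction can be sketched as an effective mean free path that is reduced at the wall and recovers the bulk value away from it. The exponential form and the 0.5 wall coefficient below are assumptions for illustration, not the paper's actual correction function:

```python
import numpy as np

def effective_mfp(y, lam, wall_cut=0.5):
    """Illustrative near-wall reduction of the mean free path.

    At the wall (y = 0) roughly half of the molecular flight paths are
    truncated, so lambda is halved; far from the wall the bulk value is
    recovered. The exponential form and the 0.5 coefficient are
    assumptions for illustration only.
    """
    return lam * (1.0 - wall_cut * np.exp(-y / lam))

y = np.linspace(0.0, 5.0, 51)       # wall-normal distance, in mean free paths
lam_eff = effective_mfp(y, lam=1.0)
```

In a lattice Boltzmann implementation, the local relaxation time would then be scaled in proportion to lam_eff / lam, which is the sense in which a correction function of this kind modifies the transition-regime dynamics.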
Characterization of complex quantum dynamics with a scalable NMR information processor
We present experimental results on the measurement of fidelity decay under
contrasting system dynamics using a nuclear magnetic resonance quantum
information processor. The measurements were performed by implementing a
scalable circuit in the model of deterministic quantum computation with only
one quantum bit. The results show measurable differences between regular and
complex behaviour and for complex dynamics are faithful to the expected
theoretical decay rate. Moreover, we illustrate how the experimental method can
be seen as an efficient way for either extracting coarse-grained information
about the dynamics of a large system, or measuring the decoherence rate from
engineered environments. Comment: 4 pages, 3 figures, revtex4; updated with version closer to that published
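The model of deterministic quantum computation with one quantum bit (DQC1) used above estimates normalised traces of a unitary with a single clean qubit. A small classical simulation of the basic circuit identity (illustrative only; the paper's circuits probe fidelity decay rather than a bare trace):

```python
import numpy as np

def dqc1_expectations(U):
    """Simulate the DQC1 circuit: control qubit in |+>, register maximally
    mixed, then controlled-U; the control's <X> and <Y> expectation values
    equal Re Tr(U)/d and Im Tr(U)/d for a d-dimensional register."""
    d = U.shape[0]
    Id = np.eye(d)
    cU = np.block([[Id, np.zeros((d, d))],
                   [np.zeros((d, d)), U]])        # controlled-U
    plus = np.full((2, 2), 0.5, dtype=complex)    # |+><+| on the control
    rho = np.kron(plus, Id / d)                   # register maximally mixed
    rho = cU @ rho @ cU.conj().T
    X = np.kron(np.array([[0, 1], [1, 0]]), Id)
    Y = np.kron(np.array([[0, -1j], [1j, 0]]), Id)
    return np.trace(X @ rho).real, np.trace(Y @ rho).real

ex, ey = dqc1_expectations(np.diag([1.0, 1j]))  # Tr U = 1 + i, d = 2
```

Because only the single control qubit is measured while the register stays maximally mixed, the scheme scales to large Hilbert spaces, which is what makes it an efficient probe of coarse-grained dynamics and decoherence rates.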
Symmetrised Characterisation of Noisy Quantum Processes
A major goal of developing high-precision control of many-body quantum
systems is to realise their potential as quantum computers. Probably the most
significant obstacle in this direction is the problem of "decoherence": the
extreme fragility of quantum systems to environmental noise and other control
limitations. The theory of fault-tolerant quantum error correction has shown
that quantum computation is possible even in the presence of decoherence
provided that the noise affecting the quantum system satisfies certain
well-defined theoretical conditions. However, existing methods for noise
characterisation have become intractable already for the systems that are
controlled in today's labs. In this paper we introduce a technique based on
symmetrisation that enables direct experimental characterisation of key
properties of the decoherence affecting a multi-body quantum system. Our method
reduces the number of experiments required by existing methods from exponential
to polynomial in the number of subsystems. We demonstrate the application of
this technique to the optimisation of control over nuclear spins in the solid
state. Comment: About 12 pages, 5 figures
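Symmetrisation of this kind can be illustrated with the standard single-qubit Pauli twirl, which averages a channel over the Pauli group and collapses it to a Pauli channel with only a few parameters left to estimate. This generic sketch is an assumption for illustration, not the paper's specific protocol:

```python
import numpy as np

# single-qubit Pauli operators
I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]])
Z = np.array([[1, 0], [0, -1]], dtype=complex)
PAULIS = (I2, X, Y, Z)

def amp_damp(rho, g=0.3):
    # example noise: amplitude-damping channel via its Kraus operators
    K0 = np.array([[1, 0], [0, np.sqrt(1 - g)]])
    K1 = np.array([[0, np.sqrt(g)], [0, 0]])
    return K0 @ rho @ K0.conj().T + K1 @ rho @ K1.conj().T

def twirl(channel, rho):
    # Pauli twirl: average P' Lambda(P rho P') P over the Pauli group,
    # where P' denotes the conjugate transpose of P
    return sum(P.conj().T @ channel(P @ rho @ P.conj().T) @ P
               for P in PAULIS) / len(PAULIS)
```

Twirled, the amplitude-damping channel acts as a Pauli channel: it scales X and Y by sqrt(1 - g) and Z by 1 - g while leaving the identity fixed, so a polynomial number of symmetrised parameters characterises the noise instead of the exponentially many of full process tomography.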
When images work faster than words: The integration of content-based image retrieval with the Northumbria Watermark Archive
Information on the manufacture, history, provenance, identification, care and conservation of paper-based artwork/objects is disparate and not always readily available. The Northumbria Watermark Archive will incorporate such material into a database, which will be made freely available on the Internet, providing an invaluable resource for conservation, research and education. The efficiency of a database is highly dependent on its search mechanism. Text-based mechanisms are frequently ineffective when a range of descriptive terminologies might be used, e.g. when describing images or translating from foreign languages. In such cases a Content-Based Image Retrieval (CBIR) system can be more effective. Watermarks provide paper with unique visual identification characteristics and have been used to provide a point of entry to the archive that is more efficient and effective than a text-based search mechanism. The research carried out has the potential to be applied to any numerically large collection of images with distinctive features of colour, shape or texture, e.g. coins, architectural features, picture frame profiles, hallmarks, Japanese artists' stamps, etc. Although the establishment of an electronic archive incorporating a CBIR system can undoubtedly improve access to large collections of images and related data, the development is rarely trouble free. This paper discusses some of the issues that must be considered, e.g. collaboration between disciplines; project management; copying and digitising objects; content-based image retrieval; the Northumbria Watermark Archive; the use of standardised terminology within a database; and copyright issues.
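A CBIR system of the kind described ranks images by visual-feature similarity rather than by text. A minimal sketch using grey-level histogram intersection as the signature (a generic choice for illustration; the archive's actual watermark features of shape and texture are not specified here):

```python
import numpy as np

def grey_histogram(img, bins=16):
    """Normalised grey-level histogram as a simple global image signature."""
    h, _ = np.histogram(img, bins=bins, range=(0.0, 1.0))
    return h / h.sum()

def similarity(sig_a, sig_b):
    """Histogram intersection: 1.0 for identical signatures, 0.0 if disjoint."""
    return float(np.minimum(sig_a, sig_b).sum())

def rank(query, collection):
    """Indices of `collection`, most similar to `query` first."""
    q = grey_histogram(query)
    scores = [similarity(q, grey_histogram(img)) for img in collection]
    return sorted(range(len(collection)), key=lambda i: -scores[i])

rng = np.random.default_rng(0)
images = [rng.random((32, 32)) for _ in range(5)]  # stand-ins for scans
order = rank(images[0], images)
```

Replacing the histogram with shape or texture descriptors gives the same retrieval pipeline for watermarks, coins, hallmarks, or any other visually distinctive collection.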
Peer review and citation data in predicting university rankings: a large-scale analysis
Most Performance-based Research Funding Systems (PRFS) draw on peer review and bibliometric indicators, two different methodologies which are sometimes combined. A common argument against the use of indicators in such research evaluation exercises is their low correlation at the article level with peer review judgments. In this study, we analyse 191,000 papers from 154 higher education institutes which were peer reviewed in a national research evaluation exercise. We combine these data with 6.95 million citations to the original papers. We show that when citation-based indicators are applied at the institutional or departmental level, rather than at the level of individual papers, surprisingly large correlations with peer review judgments can be observed, up to r <= 0.802, n = 37, p < 0.001 for some disciplines. In our evaluation of ranking prediction performance based on citation data, we show we can reduce the mean rank prediction error by 25% compared to previous work. This suggests that citation-based indicators are sufficiently aligned with peer review results at the institutional level to be used to lessen the overall burden of peer review on national evaluation exercises, leading to considerable cost savings.
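The institutional-level effect reported above can be illustrated with synthetic data: article-level noise washes out the correlation between citation and peer-review scores, while averaging within institutions recovers it. All numbers below are illustrative assumptions, not the study's data:

```python
import numpy as np

rng = np.random.default_rng(0)

# synthetic article-level data: a per-institution quality signal buried
# under large article-level noise in both peer scores and citation counts
n_inst, n_papers = 40, 200
quality = rng.normal(size=n_inst)
peer = quality[:, None] + rng.normal(scale=3.0, size=(n_inst, n_papers))
cites = quality[:, None] + rng.normal(scale=3.0, size=(n_inst, n_papers))

def pearson(x, y):
    return float(np.corrcoef(x, y)[0, 1])

r_article = pearson(peer.ravel(), cites.ravel())         # weak at article level
r_inst = pearson(peer.mean(axis=1), cites.mean(axis=1))  # strong per institution
```

Averaging over n_papers articles shrinks the independent noise by a factor of sqrt(n_papers), so the shared quality signal dominates at the institutional level, which is the mechanism behind the large correlations the study reports.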