Using the EQ-5D as a performance measurement tool in the NHS
In a landmark move, the UK Department of Health (DH) is introducing the routine use of Patient Reported Outcome Measures (PROMs) as a means of measuring the performance of health care providers in improving patient health. From April 2009 all patients will be asked to complete both generic (EQ-5D) and condition-specific PROMs before and after surgery for four elective procedures; the intention is to extend this to a wide range of other NHS services. The aim of this paper is to report analysis of the EQ-5D data generated from a pilot study commissioned by the DH, and to consider the implications of the results for their use as performance indicators and measures of patient benefit. The EQ-5D has the potential advantage, in the context of PROMs, of enabling comparisons of performance across services as well as between providers, and of facilitating assessments of the cost-effectiveness of NHS services. We present two new methods we have developed for analysing and displaying EQ-5D profile data: a Paretian Classification of Health Change, and a Health Profile Grid. Using these methods, we show that EQ-5D data can readily be used to generate useful insights into differences between providers in the overall health changes they achieve; the results are also suggestive of striking differences in health changes between surgical procedures. We conclude by noting a number of issues that remain to be addressed in the use of PROMs data as a basis for performance indicators.
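A Paretian classification rests on dimension-by-dimension comparison of a patient's pre- and post-operative EQ-5D profiles. Below is a minimal sketch of that comparison logic, assuming the standard five-dimension EQ-5D coding in which a lower level means better health; the function name, category labels, and example data are illustrative, not taken from the paper:

```python
def paretian_change(pre, post):
    """Classify a pre/post pair of EQ-5D profiles.

    pre, post: sequences of 5 integer levels, one per dimension
    (mobility, self-care, usual activities, pain/discomfort,
    anxiety/depression); a LOWER level means better health.

    Returns one of: 'no change', 'improved', 'worsened', 'mixed'.
    """
    better = any(b < a for a, b in zip(pre, post))  # some dimension improved
    worse = any(b > a for a, b in zip(pre, post))   # some dimension worsened
    if not better and not worse:
        return "no change"
    if better and not worse:
        return "improved"   # Pareto improvement: better on >=1, worse on none
    if worse and not better:
        return "worsened"   # Pareto worsening: worse on >=1, better on none
    return "mixed"          # changes in opposite directions

# Example: pain improves, anxiety worsens -> 'mixed'
print(paretian_change((2, 1, 2, 3, 1), (2, 1, 2, 1, 2)))
```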
The distributional effect of the 2008 Pre-Budget Report
The Pre-Budget Report given by the Chancellor on 24th November 2008 contained a number of changes to the tax and benefit system to come into effect at various points over the next three years. This briefing note expands on the information provided at a briefing given by IFS researchers on the day after the Pre-Budget Report. It gives details of the changes to taxes, benefits and tax credits directly affecting households, and the total distributional impact of measures announced in PBR 2008, together with pre-announced changes, by income and expenditure decile and household type, at three points in time: January 2009, April 2009 and April 2011. It also discusses what PBR 2008 does to our impression of all tax and benefit changes under this Government. Finally, it discusses what PBR 2008 did for child poverty in 2010/11 and the likely effects of the income tax changes for those earning more than £100,000 a year.
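Distributional analysis of this kind groups households into income (or expenditure) deciles and reports the average cash gain or loss from the measures in each decile. A minimal sketch of that grouping step, assuming a simple table of household incomes and modelled gains; all column names and figures are hypothetical, not from the briefing note:

```python
import pandas as pd

# Illustrative household-level data: equivalised income and the
# modelled cash change from the announced measures (both hypothetical).
households = pd.DataFrame({
    "income": [150, 220, 310, 400, 520, 640, 790, 950, 1200, 1800],  # pounds/week
    "gain":   [2.1, 1.8, 1.5, 1.2, 0.9, 0.4, 0.1, -0.3, -0.8, -4.5], # pounds/week
})

# Assign each household to an income decile (1 = poorest, 10 = richest)
households["decile"] = pd.qcut(households["income"], 10, labels=range(1, 11))

# Average gain/loss per decile: the quantity shown in distributional charts
print(households.groupby("decile", observed=True)["gain"].mean())
```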
How good must single photon sources and detectors be for efficient linear optical quantum computation?
We present a scheme for linear optical quantum computation (LOQC) which is highly robust to imperfect single photon sources and inefficient detectors. In particular, we show that if the product of the detector efficiency with the source efficiency is greater than 2/3, then efficient LOQC is possible. This threshold is many orders of magnitude more relaxed than those which could be inferred by application of standard results in fault tolerance. The result is achieved within the cluster state paradigm for quantum computation.
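The headline condition is a simple product bound on the two efficiencies, so it can be stated as a one-line check. A minimal sketch, using only the threshold quoted in the abstract (the function name is illustrative):

```python
def loqc_possible(source_eff: float, detector_eff: float) -> bool:
    """Per this paper's threshold, efficient LOQC is possible when the
    product of source and detector efficiency exceeds 2/3."""
    return source_eff * detector_eff > 2 / 3

print(loqc_possible(0.9, 0.8))   # True:  0.72  > 2/3
print(loqc_possible(0.8, 0.8))   # False: 0.64 <= 2/3
```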
First Principles LCGO Calculation of the Magneto-optical Properties of Nickel and Iron
We report a first principles, self-consistent, all electron, linear combination of Gaussian orbitals (LCGO) calculation of a comprehensive collection of magneto-optical properties of nickel and iron based on density functional theory. Among the many magneto-optical effects, we have studied the equatorial Kerr effect for absorption in the optical as well as the soft X-ray region, where it is called X-ray magnetic linear dichroism (X-MLD). In the optical region the effect is of the order of 2%, while in the X-ray region it is of the order of 1% for the incident angles considered. In addition, the polar Kerr effect, X-ray magnetic circular dichroism (X-MCD), total X-ray absorption at the L edges, and the soft X-ray Faraday effect at the L edges have also been calculated. Our results are in good agreement with experiments and with other first principles methods that have been used to calculate some of these properties.
Finding Optimal Flows Efficiently
Among the models of quantum computation, the One-way Quantum Computer is one of the most promising proposals for physical realization, and opens new perspectives for parallelization by taking advantage of quantum entanglement. Since a one-way quantum computation is based on quantum measurement, which is a fundamentally nondeterministic evolution, a sufficient condition for global determinism has been introduced: the existence of a causal flow in the graph that underlies the computation. An O(n^3) algorithm was known for finding such a causal flow when the numbers of output and input vertices in the graph are equal; otherwise, no polynomial time algorithm was known for deciding whether a graph has a causal flow. Our main contribution is an O(n^2) algorithm for finding a causal flow, if one exists, whatever the numbers of input and output vertices. This answers the open question stated by Danos and Kashefi and by de Beaudrap. Moreover, we prove that our algorithm produces an optimal flow (a flow of minimal depth).
Whereas the existence of a causal flow is a sufficient condition for determinism, it is not a necessary one. A weaker version of the causal flow, called gflow (generalized flow), has been introduced and has been proved to be a necessary and sufficient condition for a family of deterministic computations. Moreover, the depth of the quantum computation is upper bounded by the depth of the gflow. However, the existence of a polynomial time algorithm that finds a gflow had been stated as an open question. In this paper we answer this positively, with a polynomial time algorithm that outputs an optimal gflow of a given graph and thus finds an optimal correction strategy for the nondeterministic evolution due to measurements.
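A causal flow on an open graph (G, I, O) is a map f from measured (non-output) vertices to prepared (non-input) vertices, together with a partial order, such that (i) f(i) is a neighbour of i, (ii) i precedes f(i), and (iii) i precedes every other neighbour of f(i). The sketch below checks these three conditions for a candidate flow, encoding the partial order as measurement layers; it is a verifier for the standard Danos-Kashefi definition, not the paper's O(n^2) search algorithm, and all names are illustrative:

```python
def is_causal_flow(adj, inputs, outputs, f, layer):
    """Verify the causal flow conditions for a candidate (f, order).

    adj:     dict vertex -> set of neighbouring vertices
    inputs:  set of input vertices
    outputs: set of output vertices
    f:       dict mapping each non-output vertex i to f(i)
    layer:   dict vertex -> measurement layer; i precedes j in the
             induced partial order iff layer[i] < layer[j]
    """
    for i in adj:
        if i in outputs:
            continue                       # outputs are not measured
        fi = f[i]
        if fi in inputs:
            return False                   # f must map into non-inputs
        if fi not in adj[i]:
            return False                   # (i) f(i) must neighbour i
        if not layer[i] < layer[fi]:
            return False                   # (ii) i measured before f(i)
        for k in adj[fi]:
            if k != i and not layer[i] < layer[k]:
                return False               # (iii) i precedes N(f(i)) \ {i}
    return True

# Path graph 1-2-3 with input {1}, output {3}: f(1)=2, f(2)=3 is a flow
adj = {1: {2}, 2: {1, 3}, 3: {2}}
print(is_causal_flow(adj, {1}, {3}, {1: 2, 2: 3},
                     {1: 0, 2: 1, 3: 2}))   # True
```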
Loss tolerant linear optical quantum memory by measurement-based quantum computing
We give a scheme for building, in a loss tolerant way, a linear optical quantum memory which is itself tolerant to qubit loss. We use the encoding recently introduced in Varnava et al 2006 Phys. Rev. Lett. 97 120501, and give a method for efficiently achieving this. The entire approach resides within the 'one-way' model for quantum computing (Raussendorf and Briegel 2001 Phys. Rev. Lett. 86 5188–91; Raussendorf et al 2003 Phys. Rev. A 68 022312). Our results suggest that it is possible to build a loss tolerant quantum memory such that, if the requirement is to keep the data stored over arbitrarily long times, this is possible with only polynomially increasing resources and logarithmically increasing individual photon lifetimes.
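The encoding of Varnava et al is based on tree graph states described by a branching vector, so the resource cost of one logical qubit is simply the number of photons in the tree. A toy count of that cost, assuming a branching vector (b_1, ..., b_d); the helper is illustrative and not taken from the paper:

```python
def tree_photons(branching):
    """Number of photons in a tree graph state with the given
    branching vector, e.g. (2, 3): a root with 2 children, each of
    which has 3 children -> 1 + 2 + 6 = 9 photons."""
    total, level = 1, 1
    for b in branching:
        level *= b        # photons at this depth
        total += level
    return total

print(tree_photons((2, 3)))      # 9
print(tree_photons((4, 14, 4)))  # 1 + 4 + 56 + 224 = 285
```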