
    Applying economic evaluation to public health interventions: The case of interventions to promote physical activity

    Copyright © 2012 The Authors. This is an Open Access article distributed under the terms of the Creative Commons Attribution Non-Commercial License (http://creativecommons.org/licenses/by-nc/2.5/), which permits unrestricted non-commercial use, distribution, and reproduction in any medium, provided the original work is properly cited. This article has been made available through the Brunel Open Access Publishing Fund. BACKGROUND: This paper explores the application of alternative approaches to the economic evaluation of public health interventions, using a worked example of exercise referral schemes (ERSs). METHODS: Cost-utility analysis (CUA) and cost-consequence analysis (CCA) were used to assess the cost-effectiveness of ERSs. For the CUA, evidence was synthesized using a decision-analytic model that adopts a lifetime horizon and an NHS/Personal Social Services perspective. Outcomes were expressed as incremental cost per quality-adjusted life-year (QALY). The CCA was conducted from a partial-societal perspective, including health and non-healthcare costs and benefits. Outcomes were reported in natural units, such as cases of stroke or CHD avoided. RESULTS: Compared with usual care, the incremental cost per QALY of ERS is £20 876. Based on a cohort of 100 000 individuals, the CCA estimates the cost of ERS at £22 million to the healthcare provider and £12 million to participants. The benefits of ERS include an additional 3900 people becoming physically active, 51 cases of CHD avoided, 16 cases of stroke avoided, 86 cases of diabetes avoided, and a gain of ∼800 QALYs. CONCLUSIONS: CCA might provide greater transparency than CUA in reporting the outcomes of public health interventions and have greater resonance with the stakeholders involved in commissioning these interventions. This work was supported by the NIHR Health Technology Assessment programme (project number 08/72/01).
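The headline cost-utility result is an incremental cost-effectiveness ratio (ICER): the extra cost of ERS over usual care divided by the extra QALYs gained. A minimal sketch of that arithmetic in Python, with hypothetical per-person figures (the function and all numbers below are illustrative, not taken from the study):

```python
# Incremental cost-effectiveness ratio (ICER): incremental cost per QALY gained.
# All figures below are hypothetical, chosen only to illustrate the arithmetic.
def icer(cost_new, cost_usual, qaly_new, qaly_usual):
    """Return the incremental cost per QALY of a new intervention vs usual care."""
    return (cost_new - cost_usual) / (qaly_new - qaly_usual)

# Hypothetical per-person lifetime values: £250 of extra cost buying
# 0.012 extra QALYs gives roughly £20,800 per QALY, the same order of
# magnitude as the £20,876 reported in the abstract.
print(round(icer(cost_new=320.0, cost_usual=70.0,
                 qaly_new=12.012, qaly_usual=12.000)))
```

A decision-maker then compares this ratio against a willingness-to-pay threshold per QALY to judge cost-effectiveness.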

    Finding the Median (Obliviously) with Bounded Space

    We prove that any oblivious algorithm using space $S$ to find the median of a list of $n$ integers from $\{1,\ldots,2n\}$ requires time $\Omega(n \log\log_S n)$. This bound also applies to the problem of determining whether the median is odd or even. It is nearly optimal, since Chan, following Munro and Raman, has shown that there is a (randomized) selection algorithm using only $s$ registers, each of which can store an input value or an $O(\log n)$-bit counter, that makes only $O(\log\log_s n)$ passes over the input. The bound also implies a size lower bound for read-once branching programs computing the low-order bit of the median, and implies the analog of $P \ne NP \cap coNP$ for length-$o(n \log\log n)$ oblivious branching programs.
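The bound concerns oblivious, space-bounded, multi-pass computation over a read-only input. The $O(\log\log_s n)$-pass algorithm cited above is involved, but the model itself can be illustrated by a much simpler range-narrowing selection that keeps only $s$ counters per pass and makes $O(\log_s n)$ passes (this sketch is illustrative only and is not the algorithm from the paper):

```python
# Limited-memory, multi-pass selection over a read-only input: a simple
# range-narrowing scheme using s counters per pass (O(log_s n) passes).
# Illustrative only -- NOT the O(log log_s n)-pass algorithm of Chan/Munro-Raman.
def median_multipass(values, lo, hi, s=4):
    """Find the (lower) median of integer values, all lying in [lo, hi]."""
    k = (len(values) - 1) // 2   # rank of the lower median
    below = 0                    # elements already known to lie below [lo, hi]
    while hi > lo:
        width = (hi - lo + s) // s          # bucket width (ceiling of range / s)
        counts = [0] * s                    # the only O(s) working memory
        for x in values:                    # one sequential pass over the input
            if lo <= x <= hi:
                counts[(x - lo) // width] += 1
        for b, c in enumerate(counts):      # locate the bucket holding rank k
            if below + c > k:
                new_lo = lo + b * width
                lo, hi = new_lo, min(hi, new_lo + width - 1)
                break
            below += c
    return lo

print(median_multipass([5, 1, 9, 3, 7, 2, 8], lo=1, hi=14))  # -> 5
```

Each pass scans the input once, retaining only the counters and the current candidate range [lo, hi]; the paper's lower bound shows how little room there is to beat the far faster range-narrowing of the cited $O(\log\log_s n)$-pass algorithms.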

    Van der Waals Frictional Drag induced by Liquid Flow in Low-Dimensional Systems

    We study the van der Waals frictional drag force induced by liquid flow in low-dimensional systems (2D and 1D electron systems, and 2D and 1D channels with liquid). We find that for both 1D and 2D systems, the frictional drag force induced by liquid flow may be several orders of magnitude larger than the frictional drag induced by electronic current. Comment: 10 pages, 4 figures

    Sediment resuspension and erosion by vortex rings

    Particle resuspension and erosion induced by a vortex ring interacting with a sediment layer were investigated experimentally using flow visualization (particle image velocimetry), high-speed video, and a recently developed light-attenuation method for measuring displacements in bed level. Near-spherical sediment particles were used throughout, with relative densities of 1.2–7 and diameters (d) ranging between 90 and 1600 μm. Attention was focused on initially smooth, horizontal bedforms with the vortex ring aligned to approach the bed vertically. Interaction characteristics were investigated in terms of the dimensionless Shields parameter, defined using the vortex-ring propagation speed. The critical conditions for resuspension (whereby particles are only just resuspended) were determined as a function of particle Reynolds number (based on the particle settling velocity and d). The effects of viscous damping were found to be significant for d/δ < 15, where δ denotes the viscous sublayer thickness. Measurements of bed deformation were obtained during the interaction period for a range of impact conditions. The (azimuthal) mean crater profile is shown to be generally self-similar during the interaction period, except for the most energetic impacts and larger sediment types. Loss of similarity occurs when the local bed slope approaches the repose limit, leading to collapse. Erosion, deposition, and resuspension volumes are analyzed as a function of interaction time, impact condition, and sediment size.
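The abstract characterizes impacts by a Shields parameter formed from the vortex-ring propagation speed rather than a bed shear velocity, and by a particle Reynolds number built from the settling velocity. A minimal sketch of those two non-dimensionalizations (the function names and all numerical values are hypothetical, not from the study):

```python
# Dimensionless groups used to characterize impacts, as described in the
# abstract: a Shields parameter built from a characteristic speed u (here
# the vortex-ring propagation speed), and a particle Reynolds number built
# from the settling velocity. All numbers below are hypothetical.
def shields_parameter(u, s, d, g=9.81):
    """Return u**2 / ((s - 1) * g * d), with s = rho_particle / rho_fluid."""
    return u**2 / ((s - 1.0) * g * d)

def particle_reynolds(w_s, d, nu=1.0e-6):
    """Particle Reynolds number from settling velocity w_s, diameter d,
    and kinematic viscosity nu (default: water, ~1e-6 m^2/s)."""
    return w_s * d / nu

# Hypothetical impact: ring speed 0.1 m/s, relative density 2.5, d = 500 um
print(shields_parameter(u=0.1, s=2.5, d=500e-6))
print(particle_reynolds(w_s=0.05, d=500e-6))
```

Critical values of the Shields parameter mark the resuspension threshold discussed above, while the d/δ < 15 criterion distinguishes grains small enough to be sheltered within the viscous sublayer.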

    Entanglement generation in persistent current qubits

    In this paper we investigate the generation of entanglement between two persistent current qubits. The qubits are coupled inductively to each other and to a common bias field, which is used to control the qubit behaviour and is represented schematically by a linear oscillator mode. We consider the use of classical and quantum representations for the qubit control fields, and how fluctuations in the control fields tend to suppress entanglement. In particular, we demonstrate how fluctuations in the bias fields affect the entanglement generated between persistent current qubits and may limit the ability to design practical systems. Comment: 7 pages, 4 figures, minor changes in reply to referees' comments

    High Temperature Macroscopic Entanglement

    In this paper I intend to show that macroscopic entanglement is possible at high temperatures. I analyze multipartite entanglement produced by the η-pairing mechanism, which features strongly in fermionic lattice models of high-T_c superconductivity. This problem is shown to be equivalent to calculating multipartite entanglement in totally symmetric states of qubits. I demonstrate that we can conclusively calculate the relative entropy of entanglement within any subset of qubits in an overall symmetric state. Three main results then follow. First, I show that the condition for superconductivity, namely the existence of off-diagonal long-range order (ODLRO), does not depend on two-site entanglement, but only on classical correlations, as the sites become more and more distant. Secondly, the entanglement that does survive in the thermodynamic limit is the entanglement of the total lattice and, at half filling, it scales with the log of the number of sites. It is this entanglement that will exist at temperatures below the superconducting critical temperature, which can currently be as high as 160 Kelvin. Thirdly, I prove that a complete mixture of symmetric states does not contain any entanglement in the macroscopic limit. On the other hand, the same mixture of symmetric states possesses the same two-qubit entanglement features as the pure states involved, in the sense that the mixing does not destroy entanglement for a finite number of qubits, although it does decrease it. Maximal mixing of symmetric states also does not destroy ODLRO and classical correlations. I discuss various other inequalities between different entanglements, as well as generalizations to subsystems of any dimensionality (i.e. higher than spin half). Comment: 14 pages, no figures

    Proteomic profiling of urinary proteins in renal cancer by surface enhanced laser desorption ionisation (SELDI) and neural-network analysis: Identification of key issues affecting potential clinical utility.

    Recent advances in proteomic profiling technologies, such as surface-enhanced laser desorption ionization (SELDI) mass spectrometry, have allowed preliminary profiling and identification of tumor markers in biological fluids in several cancer types, and the establishment of clinically useful diagnostic computational models. There are currently no routinely used circulating tumor markers for renal cancer, which is often detected incidentally and is frequently advanced at the time of presentation, with over half of patients having local or distant tumor spread. We have investigated the clinical utility of SELDI profiling of urine samples in conjunction with neural-network analysis, either to detect renal cancer or to identify proteins of potential use as markers, using samples from a total of 218 individuals, and examined critical technical factors affecting the potential utility of this approach. Samples from patients before undergoing nephrectomy for clear cell renal cell carcinoma (RCC; n = 48), normal volunteers (n = 38), and outpatients attending with benign diseases of the urogenital tract (n = 20) were used to successfully train neural-network models based on either the presence/absence of peaks or peak intensity values, resulting in sensitivity and specificity values of 98.3–100%. Using an initial "blind" group of samples from 12 patients with RCC, 11 healthy controls, and 9 patients with benign diseases to test the models, sensitivities and specificities of 81.8–83.3% were achieved. The robustness of the approach was subsequently evaluated with a group of 80 samples analyzed "blind" 10 months later (36 patients with RCC, 31 healthy volunteers, and 13 patients with benign urological conditions). However, sensitivities and specificities declined markedly, ranging from 41.0% to 76.6%. Possible contributing factors, including sample stability, changing laser performance, and chip variability, were examined; these may be important for the long-term robustness of such approaches, and this study highlights the need for rigorous evaluation of such factors in future studies.

    Integration of highly probabilistic sources into optical quantum architectures: perpetual quantum computation

    In this paper we introduce a design for an optical topological cluster state computer constructed exclusively from a single quantum component. Unlike previous efforts, we eliminate the need for on-demand, high-fidelity photon sources and detectors and replace them with the same device utilised to create photon/photon entanglement. This introduces highly probabilistic elements into the optical architecture while maintaining complete specificity of the structure and operation of a large-scale computer. Photons in this system are continually recycled back into the preparation network, allowing an arbitrarily deep 3D cluster to be prepared using a comparatively small number of photonic qubits, and consequently the elimination of high-frequency, deterministic photon sources. Comment: 19 pages, 13 figures (2 appendices with additional figures). Comments welcome