1,421 research outputs found

    Use of Quadrupolar Nuclei for Quantum Information Processing by Nuclear Magnetic Resonance: Implementation of a Quantum Algorithm

    Physical implementation of Quantum Information Processing (QIP) by liquid-state Nuclear Magnetic Resonance (NMR), using weakly coupled spin-1/2 nuclei of a molecule, is well established. Nuclei with spin > 1/2 oriented in liquid crystalline matrices are another possibility. Such systems offer multiple qubits per nucleus and large quadrupolar couplings, resulting in well-separated lines in the spectrum. So far, creation of pseudopure states and logic gates has been demonstrated in such systems using transition-selective radio-frequency pulses. In this paper we report two novel developments. First, we implement a quantum algorithm which needs coherent superposition of states. Second, we use evolution under the quadrupolar coupling to implement multi-qubit gates. We implement the Deutsch-Jozsa algorithm on a spin-3/2 (2-qubit) system. The controlled-NOT operation needed to implement this algorithm has been implemented here by evolution under the quadrupolar Hamiltonian, for the first time in quadrupolar systems. Since the quadrupolar coupling is several orders of magnitude greater than the coupling in weakly coupled spin-1/2 nuclei, the gate time decreases, increasing the clock speed of the quantum computer. Comment: 16 pages, 3 figures
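    The two-qubit Deutsch-Jozsa algorithm this abstract implements can be sketched numerically. The following is an illustrative gate-level simulation of the abstract circuit (Hadamards, oracle, final Hadamard, measurement), not of the NMR pulse sequence itself; the function names are ours:

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard gate
I = np.eye(2)

def oracle(f):
    """Permutation matrix U_f |x>|y> = |x>|y XOR f(x)> for f: {0,1} -> {0,1}."""
    U = np.zeros((4, 4))
    for x in (0, 1):
        for y in (0, 1):
            U[2 * x + (y ^ f(x)), 2 * x + y] = 1
    return U

def deutsch_jozsa(f):
    state = np.kron(np.array([1, 0]), np.array([0, 1]))  # |0>|1>
    state = np.kron(H, H) @ state                        # superpose both registers
    state = oracle(f) @ state                            # single oracle query
    state = np.kron(H, I) @ state                        # interfere the query qubit
    p0 = state[0] ** 2 + state[1] ** 2                   # P(query qubit measured |0>)
    return "constant" if p0 > 0.5 else "balanced"

# One oracle call suffices, versus two classical evaluations of f:
assert deutsch_jozsa(lambda x: 0) == "constant"
assert deutsch_jozsa(lambda x: x) == "balanced"
```

    In the NMR realization described above, the oracle and controlled-NOT steps are carried out by radio-frequency pulses and free evolution under the quadrupolar Hamiltonian rather than by matrix multiplication.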

    Inhaled PGE1 in neonates with hypoxemic respiratory failure: two pilot feasibility randomized clinical trials.

    Background: Inhaled nitric oxide (INO), a selective pulmonary vasodilator, has revolutionized the treatment of neonatal hypoxemic respiratory failure (NHRF). However, there is a lack of sustained improvement in 30 to 46% of infants. Aerosolized prostaglandins I2 (PGI2) and E1 (PGE1) have been reported to be effective selective pulmonary vasodilators. The objective of this study was to evaluate the feasibility of a randomized controlled trial (RCT) of inhaled PGE1 (IPGE1) in NHRF.
    Methods: Two pilot multicenter phase II RCTs are included in this report. In the first pilot, late preterm and term neonates with NHRF, who had an oxygenation index (OI) of ≥15 and <25 on two arterial blood gases and had not previously received INO, were randomly assigned to receive two doses of IPGE1 (300 and 150 ng/kg/min) or placebo. The primary outcome was the enrollment of 50 infants in six to nine months at 10 sites. The first pilot was halted after four months for failure to enroll a single infant; the most common cause for non-enrollment was prior initiation of INO. In a re-designed second pilot, co-administration of IPGE1 and INO was permitted. Infants with suboptimal response to INO received either aerosolized saline or IPGE1 at a low (150 ng/kg/min) or high dose (300 ng/kg/min) for a maximum duration of 72 hours. The primary outcome was the recruitment of an adequate number of patients (n = 50) in a nine-month period, with fewer than 20% protocol violations.
    Results: No infants were enrolled in the first pilot. Seven patients were enrolled in the second pilot: three in the control, two in the low-dose IPGE1, and two in the high-dose IPGE1 groups. The study was halted for recruitment futility after approximately six months as enrollment targets were not met. No serious adverse events, one minor protocol deviation, and one pharmacy protocol violation were reported.
    Conclusions: These two pilot RCTs failed to recruit adequate eligible newborns with NHRF. Complex management RCTs of novel therapies for persistent pulmonary hypertension of the newborn (PPHN) may require novel study designs and a longer period of time from study approval to commencement of enrollment.
    Trial registration: ClinicalTrials.gov. Pilot one: NCT00598429, registered on 10 January 2008; last updated 3 February 2011. Pilot two: NCT01467076, registered on 17 October 2011; last updated 13 February 2013.
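    The eligibility criterion above (OI ≥15 and <25) uses the standard clinical oxygenation index. The formula below is the conventional definition, included for orientation only; it is not taken from the trial protocol:

```python
def oxygenation_index(fio2, map_cmh2o, pao2_mmhg):
    """Standard oxygenation index: OI = (FiO2 * mean airway pressure * 100) / PaO2.

    fio2       -- fraction of inspired oxygen (0.21 to 1.0)
    map_cmh2o  -- mean airway pressure in cm H2O
    pao2_mmhg  -- arterial partial pressure of oxygen in mm Hg
    """
    return fio2 * map_cmh2o * 100.0 / pao2_mmhg

# A hypothetical infant on 100% oxygen with MAP 15 cm H2O and PaO2 60 mm Hg:
# OI = (1.0 * 15 * 100) / 60 = 25, at the upper eligibility bound above.
assert oxygenation_index(1.0, 15.0, 60.0) == 25.0
```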

    Unconstrained Hamiltonian Formulation of SU(2) Gluodynamics

    SU(2) Yang-Mills field theory is considered in the framework of the generalized Hamiltonian approach, and the equivalent unconstrained system is obtained using the method of Hamiltonian reduction. A canonical transformation to a set of adapted coordinates is performed, in terms of which the Abelianization of the Gauss law constraints reduces to an algebraic operation and the pure gauge degrees of freedom drop out from the Hamiltonian after projection onto the constraint shell. For the remaining gauge-invariant fields, two representations are introduced in which the three fields that transform as scalars under spatial rotations are separated from the three rotational fields. An effective low-energy nonlinear sigma-model-type Lagrangian is derived which, out of the six physical fields, involves only one of the three scalar fields and two rotational fields summarized in a unit vector. Its possible relation to the effective Lagrangian proposed recently by Faddeev and Niemi is discussed. Finally, the unconstrained analog of the well-known non-normalizable ground-state wave functional which solves the Schrödinger equation with zero energy is given and analysed in the strong coupling limit. Comment: 20 pages REVTEX, no figures; final version to appear in Phys. Rev. D; minor changes, notations simplified
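    For orientation, the Gauss law constraints whose Abelianization the abstract refers to take the standard SU(2) form (conventional notation, with coupling constant $g$ and chromoelectric fields $E_i^a$, not necessarily the paper's own):

```latex
G_a(\mathbf{x}) \;=\; \partial_i E_i^a \;+\; g\,\epsilon_{abc}\, A_i^b\, E_i^c \;\approx\; 0,
\qquad a = 1, 2, 3 .
```

    Abelianization means finding equivalent constraint functions with mutually vanishing Poisson brackets that define the same constraint surface, which is what reduces the gauge fixing to an algebraic operation here.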

    Helicity Modulus and Fluctuating Type II Superconductors: Elastic Approximation and Numerical Simulations

    We develop the helicity modulus as a criterion for superconducting order in the mixed phase of a fluctuating type II superconductor. We show that there is a duality relation between this helicity modulus and the superfluid density of a system of analog 2D bosons. We show that the vortex line lattice exhibits a perfect Meissner effect with respect to a shearing perturbation of the applied magnetic field, and this becomes our criterion for "longitudinal superconductivity" parallel to the applied field. We present arguments based on the 2D boson analogy, as well as the results of numerical simulations, that suggest that longitudinal superconductivity can persist into the vortex line liquid state for systems of finite thickness, comparable to those commonly found in experiments. Comment: 63 pages, 22 postscript figures
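    The helicity modulus invoked here is conventionally defined as the stiffness of the free energy against a phase twist. In the standard textbook form (not necessarily the paper's exact construction):

```latex
\Upsilon_\mu \;=\; \left.\frac{\partial^2 F(\theta_\mu)}{\partial \theta_\mu^2}\right|_{\theta_\mu = 0},
```

    where $F(\theta_\mu)$ is the free energy computed with a uniform phase twist $\theta_\mu$ imposed across the sample in direction $\mu$; a nonzero $\Upsilon_\mu$ signals superconducting phase coherence along that direction, which is why it can serve as the order criterion in the mixed phase.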

    Designing a complex intervention for dementia case management in primary care

    Background: Community-based support will become increasingly important for people with dementia, but currently services are fragmented and the quality of care is variable. Case management is a popular approach to care co-ordination, but evidence to date on its effectiveness in dementia has been equivocal. Case management interventions need to be designed to overcome obstacles to care co-ordination and maximise benefit. A successful case management methodology was adapted from the United States (US) version for use in English primary care, with a view to a definitive trial. Medical Research Council guidance on the development of complex interventions was implemented in the adaptation process, to capture the skill sets, person characteristics and learning needs of primary care based case managers. Methods: Co-design of the case manager role in a single NHS provider organisation, with external peer review by professionals and carers, in an iterative technology development process. Results: The generic skills and personal attributes were described for practice nurses taking up the case manager role in their workplaces, and for social workers seconded to general practice teams, together with a method of assessing their learning needs. A manual of information material for people with dementia and their family carers was also created using the US intervention as its source. Conclusions: Co-design produces rich products that have face validity and map onto the complexities of dementia and of health and care services. The feasibility of the case manager role, as described and defined by this process, needs evaluation in ‘real life’ settings

    Spatial Epidemiology: Current Approaches and Future Challenges

    Spatial epidemiology is the description and analysis of geographic variations in disease with respect to demographic, environmental, behavioral, socioeconomic, genetic, and infectious risk factors. We focus on small-area analyses, encompassing disease mapping, geographic correlation studies, disease clusters, and clustering. Advances in geographic information systems, statistical methodology, and availability of high-resolution, geographically referenced health and environmental quality data have created unprecedented new opportunities to investigate environmental and other factors in explaining local geographic variations in disease. They also present new challenges. Problems include the large random component that may dominate disease rates across small areas. Though this can be dealt with appropriately using Bayesian statistics to provide smoothed estimates of disease risks, sensitivity to detect areas at high risk is limited when expected numbers of cases are small. Potential biases and confounding, particularly due to socioeconomic factors, and a detailed understanding of data quality are important. Data errors can result in large apparent disease excess in a locality. Disease cluster reports often arise nonsystematically because of media, physician, or public concern. One ready means of investigating such concerns is the replication of analyses in different areas based on routine data, as is done in the United Kingdom through the Small Area Health Statistics Unit (and increasingly in other European countries, e.g., through the European Health and Environment Information System collaboration). In the future, developments in exposure modeling and mapping, enhanced study designs, and new methods of surveillance of large health databases promise to improve our ability to understand the complex relationships of environment to health
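    The Bayesian smoothing of small-area disease risks mentioned above can be illustrated with the simplest conjugate model, a gamma-Poisson (empirical Bayes) shrinkage estimator. The prior parameters and counts below are invented for illustration and are not from any study cited here:

```python
# Gamma-Poisson smoothing of small-area disease risks.
# Observed counts y_i ~ Poisson(E_i * r_i), with relative risks r_i given a
# Gamma(alpha, beta) prior; the posterior mean shrinks the raw ratio y_i / E_i
# toward the prior mean alpha / beta.

def smooth_risks(observed, expected, alpha=2.0, beta=2.0):
    """Posterior-mean relative risks (alpha, beta are illustrative prior values)."""
    return [(y + alpha) / (e + beta) for y, e in zip(observed, expected)]

observed = [0, 1, 9, 40]           # case counts in four small areas
expected = [0.5, 1.0, 10.0, 38.0]  # expected counts from reference rates

raw = [y / e for y, e in zip(observed, expected)]
smoothed = smooth_risks(observed, expected)
# Areas with small expected counts are pulled strongly toward the prior mean
# (alpha / beta = 1); well-supported areas barely move: the raw ratio 0 / 0.5
# becomes 0.8, while 40 / 38 barely changes (42 / 40 = 1.05).
```

    This shrinkage is exactly the trade-off the abstract notes: smoothing stabilizes noisy small-area rates, but it also limits sensitivity to detect genuinely high-risk areas when expected counts are small.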

    Smart homes and their users: a systematic analysis and key challenges

    Published research on smart homes and their users is growing exponentially, yet a clear understanding of who these users are and how they might use smart home technologies is missing from a field being overwhelmingly pushed by technology developers. Through a systematic analysis of peer-reviewed literature on smart homes and their users, this paper takes stock of the dominant research themes and the linkages and disconnects between them. Key findings within each of nine themes are analysed, grouped into three clusters: (1) views of the smart home (functional, instrumental, socio-technical); (2) users and the use of the smart home (prospective users, interactions and decisions, using technologies in the home); and (3) challenges for realising the smart home (hardware and software, design, domestication). These themes are integrated into an organising framework for future research that identifies the presence or absence of cross-cutting relationships between different understandings of smart homes and their users. The usefulness of the organising framework is illustrated in relation to two major concerns, privacy and control, that have been narrowly interpreted to date, precluding deeper insights and potential solutions. Future research on smart homes and their users can benefit by exploring and developing cross-cutting relationships between the research themes identified.