
    Observation of coherent many-body Rabi oscillations

    A two-level quantum system coherently driven by a resonant electromagnetic field oscillates sinusoidally between the two levels at a frequency Ω that is proportional to the field amplitude [1]. This phenomenon, known as the Rabi oscillation, has been at the heart of atomic, molecular and optical physics since the seminal work of its namesake and coauthors [2]. Notably, Rabi oscillations in isolated single atoms or dilute gases form the basis for metrological applications such as atomic clocks and precision measurements of physical constants [3]. Both an inhomogeneous distribution of coupling strengths to the field and interactions between individual atoms reduce the visibility of the oscillation and may even suppress it completely. A remarkable transformation takes place in the limit where only a single excitation can be present in the sample, due to either initial conditions or atomic interactions: there arises a collective, many-body Rabi oscillation at a frequency √N Ω involving all N ≫ 1 atoms in the sample [4]. This is true even for inhomogeneous atom-field coupling distributions, where single-atom Rabi oscillations may be invisible. When one of the two levels is a strongly interacting Rydberg level, many-body Rabi oscillations emerge as a consequence of the Rydberg excitation blockade. Lukin and coauthors outlined an approach to quantum information processing based on this effect [5]. Here we report initial observations of coherent many-body Rabi oscillations between the ground level and a Rydberg level using several hundred cold rubidium atoms. The strongly pronounced oscillations indicate a nearly complete excitation blockade of the entire mesoscopic ensemble by a single excited atom. The results pave the way towards quantum computation and simulation using ensembles of atoms.
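    As a minimal sketch of where the √N enhancement comes from, assuming homogeneous atom-field coupling Ω and a perfect blockade that restricts the ensemble to the collective ground state and a single shared excitation (a textbook-style illustration, not the analysis of the paper):

```latex
% Single atom under resonant drive: excited-state probability
\[
P_e(t) = \sin^2\!\left(\frac{\Omega t}{2}\right).
\]
% Blockaded ensemble: only the collective ground state |G> and the
% symmetric singly excited state |W> are accessible.
\[
|G\rangle = |g_1 \cdots g_N\rangle, \qquad
|W\rangle = \frac{1}{\sqrt{N}} \sum_{i=1}^{N} |g_1 \cdots r_i \cdots g_N\rangle .
\]
% The drive couples them with a sqrt(N)-enhanced matrix element:
\[
H = \frac{\hbar\Omega}{2} \sum_{i=1}^{N} \bigl(|r_i\rangle\langle g_i| + \text{h.c.}\bigr)
\quad\Longrightarrow\quad
\langle W|H|G\rangle = \frac{\hbar\Omega}{2}\cdot\frac{N}{\sqrt{N}}
= \frac{\hbar\sqrt{N}\,\Omega}{2},
\]
% so the blockaded ensemble oscillates between |G> and |W> at frequency sqrt(N)*Omega.
```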

    Interventions aimed at improving the nursing work environment: a systematic review

    Background: Nursing work environments (NWEs) in Canada and other Western countries have increasingly received attention following years of restructuring and reported high workloads, high absenteeism, and shortages of nursing staff. Despite numerous efforts to improve NWEs, little is known about the effectiveness of such interventions. The aim of this study was to systematically review the scientific literature on implemented interventions aimed at improving the NWE and their effectiveness. Methods: An online search of the databases CINAHL, Medline, Scopus, ABI, Academic Search Complete, HEALTHstar, ERIC, PsycINFO, and Embase, and a manual search of Emerald and Longwoods, was conducted. (Quasi-)experimental studies with pre/post measures of interventions aimed at improving the NWE, study populations of nurses, and quantitative outcome measures of the nursing work environment were required for inclusion. Each study was assessed for methodological strength using a quality assessment and validity tool for intervention studies. A taxonomy of NWE characteristics was developed to identify which part of the NWE each intervention targeted for improvement, after which the effects of the interventions were examined. Results: Over 9,000 titles and abstracts were screened. Eleven controlled intervention studies met the inclusion criteria, of which eight used a quasi-experimental design and three an experimental design. In total, nine different interventions were reported in the included studies. The interventions most effective at improving the NWE were primary nursing (two studies), an educational toolbox (one study), individualized care and clinical supervision (one study), and a violence prevention intervention (one study). Conclusions: Little is known about the effectiveness of interventions aimed at improving the NWE, and published studies on this topic show weaknesses in their design. To advance the field, we recommend that investigators use controlled studies with pre/post measures to evaluate interventions aimed at improving the NWE. In this way, more evidence-based knowledge about the implementation of interventions will become available for healthcare leaders to use in rebuilding nursing work environments.

    Evaluation of a Medical and Mental Health Unit compared with standard care for older people whose emergency admission to an acute general hospital is complicated by concurrent 'confusion': a controlled clinical trial. Acronym: TEAM: Trial of an Elderly Acute care Medical and mental health unit

    Background: Patients with delirium and dementia admitted to general hospitals have poor outcomes, and their carers report poor experiences. We developed an acute geriatric medical ward into a specialist Medical and Mental Health Unit over an eighteen-month period. Additional specialist mental health staff were employed, other staff were trained in the ‘person-centred’ dementia care approach, a programme of meaningful activity was devised, the environment was adapted to the needs of people with cognitive impairment, and attention was given to communication with family carers. We hypothesise that patients managed on this ward will have better outcomes than those receiving standard care, and that such care will be cost-effective. Methods/design: We will perform a controlled clinical trial comparing in-patient management on a specialist Medical and Mental Health Unit with standard care. Study participants are patients over the age of 65, admitted as an emergency to a single general hospital, and identified on the Acute Medical Admissions Unit as being ‘confused’. Sample size is 300 per group. The evaluation design has been adapted to accommodate pressures on bed management and patient flows. If beds are available on the specialist Unit, the clinical service allocates patients at random between the Unit and standard care on general or geriatric medical wards. Once admitted, randomised patients and their carers are invited to take part in a follow-up study, and baseline data are collected. Quality of care and patient experience are assessed in a non-participant observer study. Outcomes are ascertained at a follow-up home visit 90 days after randomisation, by a researcher blind to allocation. The primary outcome is days spent at home (for those admitted from home), or days spent in the same care home (if admitted from a care home). Secondary outcomes include mortality, institutionalisation, resource use, and scaled outcome measures, including quality of life, cognitive function, disability, behavioural and psychological symptoms, carer strain and carer satisfaction with hospital care. Analyses will comprise comparisons of process, outcomes and costs between the specialist unit and standard care treatment groups. Trial Registration number: ClinicalTrials.gov: NCT0113614

    Can Plan Recommendations Improve the Coverage Decisions of Vulnerable Populations in Health Insurance Marketplaces?

    OBJECTIVE: The Affordable Care Act's marketplaces present an important opportunity for expanding coverage, but consumers face enormous challenges in navigating enrollment and re-enrollment. We tested the effectiveness of a behaviorally informed policy tool--plan recommendations--in improving marketplace decisions. STUDY SETTING: Data were gathered from a community sample of 656 lower-income, minority, rural residents of Virginia. STUDY DESIGN: We conducted an incentive-compatible, computer-based experiment using a hypothetical marketplace like the one consumers face in the federally facilitated marketplaces, and examined their decision quality. Participants were randomly assigned to a control condition or to one of three plan recommendation conditions: social normative, physician, and government. For participants randomized to a plan recommendation condition, the plan that maximized expected earnings (i.e., minimized total expected annual health care costs) was recommended. DATA COLLECTION: Primary data were gathered using an online choice experiment and questionnaire. PRINCIPAL FINDINGS: Plan recommendations resulted in a 21-percentage-point increase in the probability of choosing the earnings-maximizing plan, after controlling for participant characteristics. Two conditions, government and provider recommendations of the lowest-cost plan, resulted in plan choices that lowered annual costs compared with marketplaces where no recommendations were made. CONCLUSIONS: As millions of adults grapple with choosing plans in marketplaces and whether to switch plans during open enrollment, it is time to consider marketplace redesigns and leverage insights from the behavioral sciences to facilitate consumers' decisions.
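    As a rough illustration of what "earnings-maximizing" means in this design (the plan with the lowest total expected annual cost), here is a minimal sketch; the plan names, premiums, and out-of-pocket figures are hypothetical and are not data from the study:

```python
# Minimal sketch: pick the plan that minimizes total expected annual cost.
# All plan names and dollar amounts are hypothetical illustrations.

plans = {
    "Bronze": {"annual_premium": 2400, "expected_out_of_pocket": 1800},
    "Silver": {"annual_premium": 3000, "expected_out_of_pocket": 900},
    "Gold":   {"annual_premium": 3900, "expected_out_of_pocket": 400},
}

def total_expected_cost(plan):
    """Total expected annual cost = premium + expected out-of-pocket spending."""
    return plan["annual_premium"] + plan["expected_out_of_pocket"]

# The 'earnings-maximizing' choice is the plan with the lowest total expected cost.
best = min(plans, key=lambda name: total_expected_cost(plans[name]))
for name, plan in plans.items():
    print(f"{name}: ${total_expected_cost(plan):,}")
print(f"Cost-minimizing (earnings-maximizing) plan: {best}")
```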

    The Signaller's Dilemma: A Cost–Benefit Analysis of Public and Private Communication

    Understanding the diversity of animal signals requires knowledge of the factors that may influence the different stages of communication, from the production of a signal by the sender up to the detection, identification and final decision-making in the receiver. Yet many studies on signalling systems focus exclusively on the sender, and often ignore the receiver side and the ecological conditions under which signals evolve. We study a neotropical katydid which uses airborne sound for long-distance communication, but also an alternative form of private signalling through substrate vibration. We quantified the strength of predation by bats which eavesdrop on the airborne sound signal, by analysing insect remains at roosts of a bat family. Males do not arbitrarily use one or the other channel for communication, but spend more time with private signalling under full-moon conditions, when the nocturnal rainforest favours predation by visually hunting predators. Measurements of metabolic CO₂ production rate indicate that the energy necessary for signalling increases 3-fold on full-moon nights, when private signalling is favoured. The background noise level for the airborne sound channel can amount to 70 dB SPL, whereas it is low in the vibration channel within the low-frequency range of the vibration signal. The active space of the airborne sound signal varies between 22 and 35 meters, contrasting with about 4 meters for the vibration signal transmitted on the insect's favourite roost plant. Signal perception was studied using neurophysiological methods under outdoor conditions, which is more reliable for the private mode of communication. Our results demonstrate the complex effects of ecological conditions, such as predation, nocturnal ambient light levels, and masking noise levels, on the performance of receivers in detecting mating signals, and show that the net advantage or disadvantage of a mode of communication strongly depends on these conditions.
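    To give a feel for how an active space of tens of meters can arise, here is a minimal back-of-the-envelope sketch assuming pure spherical spreading (-6 dB per doubling of distance) and treating the reported ~70 dB SPL background as the detection limit; the source level is a hypothetical value, not a measurement from the study, and excess attenuation in vegetation is ignored:

```python
# Back-of-the-envelope active-space estimate under spherical spreading only.
# The source level is a hypothetical illustration, not study data.
import math

source_level_db = 100.0   # hypothetical SPL at the 1 m reference distance
reference_m = 1.0
noise_floor_db = 70.0     # background level reported for the airborne channel

def spl_at(distance_m):
    """SPL after spherical spreading: -20*log10(d/d0) relative to the reference."""
    return source_level_db - 20.0 * math.log10(distance_m / reference_m)

# Active space ~ distance at which the signal drops to the noise floor:
# source - 20*log10(d) = noise  =>  d = 10**((source - noise) / 20)
active_space_m = reference_m * 10 ** ((source_level_db - noise_floor_db) / 20.0)
print(f"SPL at 10 m: {spl_at(10.0):.1f} dB")
print(f"Estimated active space: {active_space_m:.0f} m")
```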

    Emergent versus delayed lithotripsy for obstructing ureteral stones: a cumulative analysis of comparative studies

    Objective To analyze the current evidence on the use of ureteroscopy (URS) and extracorporeal shock wave lithotripsy (ESWL) for the management of obstructing ureteral stones in the emergent setting. Methods A systematic literature review was performed up to June 2016 using the Pubmed and Ovid databases to identify pertinent studies. The PRISMA criteria were followed for article selection. Separate searches were done using combinations of several search terms: "laser lithotripsy", "ureteroscopy", "extracorporeal shock wave lithotripsy", "ESWL", "rapid", "immediate", "early", "delayed", "late", "ureteral stones", "kidney stones", "renal stones". Only titles related to emergent/rapid/immediate/early (as variably defined in each study) versus delayed/late treatment of ureteral stones with either URS or ESWL were considered for screening. Demographics and operative outcomes were compared between emergent and delayed lithotripsy. The RevMan (Review Manager) software was used to perform data analysis. Results Four studies comparing emergent (n = 526) versus delayed (n = 987) URS and six studies comparing emergent (n = 356) versus delayed (n = 355) ESWL were included in the analysis. Emergent URS did not show any significant difference in terms of stone-free rate (91.2 versus 90.9%; OR 1.04; CI 0.71, 1.52; p = 0.84), complication rate (8.7% for emergent versus 11.5% for delayed; OR 0.94; CI 0.65, 1.36; p = 0.74) or need for auxiliary procedures (OR 0.85; CI 0.42, 1.7; p = 0.85) when compared to delayed URS. Emergent ESWL was associated with a higher likelihood of stone-free status (OR 2.2; CI 1.55, 3.17; p < 0.001) and a lower likelihood of need for auxiliary maneuvers (OR 0.49; CI 0.33, 0.72; p < 0.001) than the delayed procedure. No differences in complication rates were noticed between emergent and delayed ESWL (p = 0.37). Conclusions Emergent lithotripsy, either ureteroscopic or extracorporeal, can be offered as an effective and safe treatment for patients with symptomatic ureteral stones. If amenable to ESWL, based on stone and patient characteristics, an emergent approach should be strongly considered. Ureteroscopy in the emergent setting is mostly reserved for distally located stones. The implementation of these therapeutic approaches is likely to be dictated by their availability.
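    For readers less familiar with how such pooled comparisons are summarized, here is a minimal sketch of computing an odds ratio and its 95% confidence interval from a single 2×2 table; the counts are hypothetical and do not reproduce the pooled results above:

```python
# Minimal sketch: odds ratio and 95% CI from one 2x2 table
# (events vs. non-events in emergent vs. delayed groups).
# The counts below are hypothetical, not the pooled study data.
import math

emergent_success, emergent_failure = 480, 46    # hypothetical
delayed_success, delayed_failure = 897, 90      # hypothetical

odds_ratio = (emergent_success * delayed_failure) / (emergent_failure * delayed_success)

# Standard error of log(OR): sqrt(1/a + 1/b + 1/c + 1/d)
se_log_or = math.sqrt(
    1 / emergent_success + 1 / emergent_failure
    + 1 / delayed_success + 1 / delayed_failure
)
log_or = math.log(odds_ratio)
ci_low = math.exp(log_or - 1.96 * se_log_or)
ci_high = math.exp(log_or + 1.96 * se_log_or)

print(f"OR = {odds_ratio:.2f} (95% CI {ci_low:.2f} to {ci_high:.2f})")
```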

    Sociodemographic and geographic characteristics associated with patient visits to osteopathic physicians for primary care

    Background: Health care reform promises to dramatically increase the number of Americans covered by health insurance. Osteopathic physicians (DOs) are recognized for primary care, including a "hands-on" style with an emphasis on patient-centered care. Thus, DOs may be well positioned to deliver primary care in this emerging health care environment. Methods: We used data from the National Ambulatory Medical Care Survey (2002-2006) to study sociodemographic and geographic characteristics associated with patient visits to DOs for primary care. Descriptive analyses were initially performed to derive national population estimates (NPEs) for overall patient visits, primary care patient visits, and patient visits according to specialty status. Osteopathic and allopathic physician (MD) patient visits were compared using cross-tabulations and multiple logistic regression to compute odds ratios (ORs) and 95% confidence intervals (CIs) for DO patient visits. The latter analyses were also conducted separately for each geographic characteristic to assess the potential for effect modification based on these factors. Results: Overall, 134,369 ambulatory medical care visits were surveyed, representing 4.6 billion (NPE) ± 220 million (SE) patient visits when patient visit weights were applied. Osteopathic physicians provided 336 million ± 30 million (7%) of these patient visits. Osteopathic physicians provided 217 million ± 21 million (10%) patient visits for primary care services, including 180 million ± 17 million (12%) primary care visits for adults (21 years of age or older) and 37 million ± 5 million (5%) primary care visits for minors. Osteopathic physicians were more likely than MDs to provide primary care visits in family and general medicine (OR, 6.03; 95% CI, 4.67-7.78), but were less likely to provide visits in internal medicine (OR, 0.37; 95% CI, 0.24-0.58) or pediatrics (OR, 0.21; 95% CI, 0.11-0.40). Overall, patients in the pediatric and geriatric age groups, Blacks, Hispanics, and persons in the South and West were less likely to utilize DOs, although there was some evidence of effect modification according to United States Census region. Conclusions: Health care reform provides unprecedented opportunities for DOs to reach historically underserved populations and to overcome the "pediatric primary-care paradox."
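    As a rough illustration of how survey visit weights turn sampled records into national population estimates and percentage shares, here is a minimal sketch; the records and weights are hypothetical, and real NAMCS weighting and variance estimation (the ± standard errors above) are considerably more involved:

```python
# Minimal sketch: survey-weighted national estimates and shares.
# Records and weights are hypothetical placeholders, not NAMCS data.

# Each record: (physician_type, is_primary_care_visit, visit_weight)
records = [
    ("DO", True, 35_000.0),
    ("DO", False, 28_000.0),
    ("MD", True, 40_000.0),
    ("MD", True, 31_000.0),
    ("MD", False, 45_000.0),
]

total_visits = sum(w for _, _, w in records)
do_visits = sum(w for ptype, _, w in records if ptype == "DO")
do_primary = sum(w for ptype, primary, w in records if ptype == "DO" and primary)

print(f"Estimated total visits: {total_visits:,.0f}")
print(f"DO share of all visits: {do_visits / total_visits:.1%}")
print(f"Estimated DO primary-care visits: {do_primary:,.0f}")
```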

    Dementia in Swedish Twins: Predicting Incident Cases

    Thirty same-sex twin pairs were identified in which both members were assessed at baseline and one twin subsequently developed dementia, at least three years after the baseline measurement, while the partner remained cognitively intact for at least three additional years. Eighteen of the 30 cases were diagnosed with Alzheimer’s disease. Baseline assessments, conducted when the twins’ average age was 70.6 (SD = 6.8), included a mailed questionnaire and in-person testing. Which twin would develop dementia was predicted by less favorable lipid values (higher apoB, ratio of apoB to apoA1, and total cholesterol), poorer grip strength, and—to a lesser extent—higher emotionality on the EAS Temperament Scale. Given the long preclinical period that characterizes Alzheimer’s disease, these findings may suggest late-life risk factors for dementia, or may reflect changes that are part of preclinical disease.

    Comparative genomics of the bacterial genus Listeria: Genome evolution is characterized by limited gene acquisition and limited gene loss

    Background: The bacterial genus Listeria contains pathogenic and non-pathogenic species, including the pathogens L. monocytogenes and L. ivanovii, both of which carry homologous virulence gene clusters such as the prfA cluster and clusters of internalin genes. Initial evidence for multiple deletions of the prfA cluster during the evolution of Listeria indicates that this genus provides an interesting model for studying the evolution of virulence and also presents practical challenges with regard to the definition of pathogenic strains. Results: To better understand genome evolution and the evolution of virulence characteristics in Listeria, we used a next-generation sequencing approach to generate draft genomes for seven strains representing Listeria species or clades for which genome sequences were not available. Comparative analyses of these draft genomes and six publicly available genomes, which together represent the main Listeria species, showed evidence for (i) a pan-genome with 2,032 core and 2,918 accessory genes identified to date, (ii) a critical role of gene loss events in the transition of Listeria species from facultative pathogen to saprotroph, even though a consistent pattern of gene loss seemed to be absent and a number of isolates representing non-pathogenic species still carried some virulence-associated genes, and (iii) divergence of modern pathogenic and non-pathogenic Listeria species and strains, most likely circa 47 million years ago, from a pathogenic common ancestor that contained key virulence genes. Conclusions: Genome evolution in Listeria involved limited gene loss and acquisition, as supported by (i) a relatively high coverage of the predicted pan-genome by the observed pan-genome, (ii) a conserved genome size (between 2.8 and 3.2 Mb), and (iii) a highly syntenic genome. Limited gene loss in Listeria did include loss of virulence-associated genes, likely associated with multiple transitions to a saprotrophic lifestyle. The genus Listeria thus provides an example of a group of bacteria that appears to evolve through loss of virulence rather than acquisition of virulence characteristics. While Listeria includes a number of species-like clades, many of these putative species include clades or strains with atypical virulence-associated characteristics. This information will allow for the development of genetic and genomic criteria for pathogenic strains, including the development of assays that specifically detect pathogenic Listeria strains.
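    To make the core/accessory distinction concrete, here is a minimal sketch of a pan-genome tally from gene presence/absence sets: genes present in every genome are "core", and the remainder of the pan-genome is "accessory". The gene names and genomes are hypothetical placeholders, not Listeria data:

```python
# Minimal sketch: core vs. accessory genes from presence/absence sets.
# Gene and genome names are hypothetical placeholders, not Listeria data.

genomes = {
    "strain_A": {"geneA", "geneB", "geneC", "geneD"},
    "strain_B": {"geneA", "geneB", "geneD", "geneE"},
    "strain_C": {"geneA", "geneB", "geneF"},
}

# Pan-genome: every gene seen in at least one genome.
pan_genome = set().union(*genomes.values())

# Core genome: genes present in all genomes; accessory: the rest.
core_genome = set.intersection(*genomes.values())
accessory_genome = pan_genome - core_genome

print(f"Pan-genome size: {len(pan_genome)}")
print(f"Core genes: {sorted(core_genome)}")
print(f"Accessory genes: {sorted(accessory_genome)}")
```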

    An Indirect Cue of Predation Risk Counteracts Female Preference for Conspecifics in a Naturally Hybridizing Fish Xiphophorus birchmanni

    Mate choice is context dependent, but the importance of current context to interspecific mating and hybridization is largely unexplored. An important influence on mate choice is predation risk. We investigated how variation in an indirect cue of predation risk, distance to shelter, influences mate choice in the swordtail Xiphophorus birchmanni, a species which sometimes hybridizes with X. malinche in the wild. We conducted mate choice experiments to determine whether females attend to the distance to shelter and whether this cue of predation risk can counteract female preference for conspecifics. Females were sensitive to shelter distance independent of male presence. When conspecific and heterospecific X. malinche males were in equally risky habitats (i.e., equally distant from shelter), females associated primarily with conspecifics, suggesting an innate preference for conspecifics. However, when heterospecific males were in less risky habitat (i.e., closer to shelter) than conspecific males, females no longer exhibited a preference, suggesting that females calibrate their mate choices in response to predation risk. Our findings illustrate the potential for hybridization to arise, not necessarily through reproductive “mistakes”, but as one of many potential outcomes of a context-dependent mate choice strategy