
    Long-time Low-latency Quantum Memory by Dynamical Decoupling

    Quantum memory is a central component of quantum information processing devices, and will be required to provide high-fidelity storage of arbitrary states, long storage times and small access latencies. Despite growing interest in applying physical-layer error-suppression strategies to boost fidelities, it has not previously been possible to meet such competing demands with a single approach. Here we use an experimentally validated theoretical framework to identify periodic repetition of a high-order dynamical decoupling sequence as a systematic strategy to meet these challenges. We provide analytic bounds, validated by numerical calculations, on the characteristics of the relevant control sequences, and show that a "stroboscopic saturation" of coherence, or coherence plateau, can be engineered even in the presence of experimental imperfection. This permits high-fidelity storage for times that can be exceptionally long, meaning that our device-independent results should prove instrumental in producing practically useful quantum technologies.
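
    To make the periodic-decoupling idea concrete, the following is a minimal Python sketch (not the paper's framework or data): it Monte Carlo-simulates a single dephasing qubit under an assumed Ornstein-Uhlenbeck noise model and compares free evolution with periodic repetition of a four-pulse decoupling block; all rates, times and pulse spacings are illustrative assumptions.

```python
import numpy as np

# Minimal sketch (not the paper's framework): Monte Carlo dephasing of a single qubit
# under an assumed Ornstein-Uhlenbeck frequency-noise model, comparing free evolution
# with periodic repetition of a four-pulse decoupling block. All rates, times and
# pulse spacings below are illustrative assumptions, not values from the paper.

rng = np.random.default_rng(0)

def ou_noise(n_traj, n_steps, dt, tau_c=1.0, sigma=1.0):
    """Generate n_traj Ornstein-Uhlenbeck frequency-noise trajectories beta(t)."""
    beta = np.zeros((n_traj, n_steps))
    a = np.exp(-dt / tau_c)
    b = sigma * np.sqrt(1.0 - a**2)
    for i in range(1, n_steps):
        beta[:, i] = a * beta[:, i - 1] + b * rng.standard_normal(n_traj)
    return beta

def coherence(pulse_times, t_max=10.0, dt=0.01, n_traj=500):
    """Ensemble-averaged coherence <cos phi(t_max)> with instantaneous pi pulses."""
    n_steps = int(t_max / dt)
    t = np.arange(n_steps) * dt
    sign = np.ones(n_steps)               # toggling frame: sign flips at each pi pulse
    for tp in pulse_times:
        sign[t >= tp] *= -1.0
    phi = (ou_noise(n_traj, n_steps, dt) * sign).sum(axis=1) * dt
    return np.cos(phi).mean()

t_max, n_rep, n_pulses = 10.0, 10, 4      # repeat a 4-pulse block 10 times
block = t_max / n_rep
pulses = [r * block + (k + 0.5) * block / n_pulses
          for r in range(n_rep) for k in range(n_pulses)]

print(f"free evolution:         coherence ~ {coherence([], t_max):.3f}")
print(f"repeated 4-pulse block: coherence ~ {coherence(pulses, t_max):.3f}")
```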

    Dispelling urban myths about default uncertainty factors in chemical risk assessment - Sufficient protection against mixture effects?

    Assessing the detrimental health effects of chemicals requires the extrapolation of experimental data in animals to human populations. This is achieved by applying a default uncertainty factor of 100 to doses not found to be associated with observable effects in laboratory animals. It is commonly assumed that the toxicokinetic and toxicodynamic sub-components of this default uncertainty factor represent worst-case scenarios and that the multiplication of those components yields conservative estimates of safe levels for humans. It is sometimes claimed that this conservatism also offers adequate protection from mixture effects. By analysing the evolution of uncertainty factors from a historical perspective, we show that the default factor and its sub-components are intended to represent adequate rather than worst-case scenarios. The intention of using assessment factors for mixture effects was abandoned thirty years ago. It is also often ignored that the conservatism (or otherwise) of uncertainty factors can only be considered in relation to a defined level of protection. A protection equivalent to an effect magnitude of 0.001-0.0001% over background incidence is generally considered acceptable. However, it is impossible to say whether this level of protection is in fact realised with the tolerable doses that are derived by employing uncertainty factors. Accordingly, it is difficult to assess whether uncertainty factors overestimate or underestimate the sensitivity differences in human populations. It is also often not appreciated that the outcome of probabilistic approaches to the multiplication of sub-factors depends on the choice of probability distributions. Therefore, the idea that default uncertainty factors are overly conservative worst-case scenarios which can both account for the lack of statistical power in animal experiments and protect against potential mixture effects is ill-founded. We contend that precautionary regulation should provide an incentive to generate better data, and we recommend adopting a pragmatic, but scientifically better founded, approach to mixture risk assessment.
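
    As a small illustration of the point about distributional choices (a hedged sketch, not taken from the paper), the Python snippet below multiplies the four WHO/IPCS-style sub-factors of the 100-fold default by Monte Carlo under two different assumed distributions, each anchored so that the default value sits at the 95th percentile, and shows that the resulting combined factor differs markedly between the two choices.

```python
import numpy as np

# Hedged sketch (not from the paper): Monte Carlo multiplication of the four
# WHO/IPCS-style sub-factors of the 100-fold default uncertainty factor
# (interspecies TK x TD = 4.0 x 2.5, intraspecies TK x TD = 3.16 x 3.16),
# under two different assumed distributions, each anchored so that the default
# value sits at the 95th percentile. The combined percentile depends strongly
# on that distributional choice, which is the point being illustrated.

rng = np.random.default_rng(42)
N = 200_000
defaults = {"inter_TK": 4.0, "inter_TD": 2.5, "intra_TK": 3.16, "intra_TD": 3.16}

def sample_lognormal(p95, n):
    # Lognormal with median 1 and the default value at the 95th percentile (assumption).
    sigma = np.log(p95) / 1.6449
    return rng.lognormal(mean=0.0, sigma=sigma, size=n)

def sample_uniform(p95, n):
    # Uniform on [1, upper], again with the default at the 95th percentile (assumption).
    upper = 1.0 + (p95 - 1.0) / 0.95
    return rng.uniform(1.0, upper, size=n)

for name, sampler in [("lognormal", sample_lognormal), ("uniform", sample_uniform)]:
    combined = np.ones(N)
    for p95 in defaults.values():
        combined *= sampler(p95, N)
    print(f"{name:9s}: 95th percentile of the combined factor = {np.percentile(combined, 95):.1f}")
```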

    The merger that led to the formation of the Milky Way's inner stellar halo and thick disk

    The assembly process of our Galaxy can be retrieved using the motions and chemistry of individual stars. Chemo-dynamical studies of the nearby halo have long hinted at the presence of multiple components such as streams, clumps, duality and correlations between the stars' chemical abundances and orbital parameters. More recently, the analysis of two large stellar surveys has revealed the presence of a well-populated chemical elemental abundance sequence, of two distinct sequences in the colour-magnitude diagram, and of a prominent, slightly retrograde kinematic structure, all in the nearby halo, which may trace an important accretion event experienced by the Galaxy. Here we report an analysis of the kinematics, chemistry, age and spatial distribution of stars in a relatively large volume around the Sun that are mainly linked to two major Galactic components, the thick disk and the stellar halo. We demonstrate that the inner halo is dominated by debris from an object which at infall was slightly more massive than the Small Magellanic Cloud, and which we refer to as Gaia-Enceladus. The stars originating in Gaia-Enceladus cover nearly the full sky, and their motions reveal the presence of streams and slightly retrograde and elongated trajectories. Hundreds of RR Lyrae stars and thirteen globular clusters following a consistent age-metallicity relation can be associated with Gaia-Enceladus on the basis of their orbits. With an estimated 4:1 mass ratio, the merger with Gaia-Enceladus must have led to the dynamical heating of the precursor of the Galactic thick disk, and therefore contributed to the formation of this component approximately 10 Gyr ago. These findings are in line with simulations of galaxy formation, which predict that the inner stellar halo should be dominated by debris from just a few massive progenitors. (Published in Nature, 1 November 2018; this is the authors' version before final editing.)
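
    For readers who want to experiment with the kind of kinematic selection described here, the following toy Python sketch (not the authors' pipeline, and using simulated rather than Gaia data) computes the azimuthal angular momentum L_z from Galactocentric positions and velocities and flags stars on halo-like, slightly retrograde orbits; all thresholds, the sign convention and the mock data are illustrative assumptions.

```python
import numpy as np

# Toy sketch (not the authors' pipeline, simulated rather than Gaia data): compute the
# azimuthal angular momentum L_z from Galactocentric positions (kpc) and velocities
# (km/s) and flag stars on halo-like, slightly retrograde orbits -- the kind of
# kinematic selection used to isolate Gaia-Enceladus debris. All thresholds, the sign
# convention and the mock data below are illustrative assumptions.

rng = np.random.default_rng(1)
n = 5000
pos = rng.normal(scale=3.0, size=(n, 3)) + np.array([8.2, 0.0, 0.0])      # kpc (mock)
vel = rng.normal(scale=120.0, size=(n, 3)) + np.array([0.0, 120.0, 0.0])  # km/s (mock)

x, y = pos[:, 0], pos[:, 1]
vx, vy = vel[:, 0], vel[:, 1]

L_z = x * vy - y * vx                          # kpc km/s; here positive means prograde
v_rel_disk = np.linalg.norm(vel - np.array([0.0, 232.0, 0.0]), axis=1)  # speed relative to a circular disk orbit

# Halo-like kinematics (fast relative to the disk) plus mildly retrograde to low L_z.
selected = (v_rel_disk > 210.0) & (L_z < 150.0) & (L_z > -1500.0)
print(f"{selected.sum()} of {n} mock stars fall in the slightly retrograde halo selection")
```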

    Dairy products, calcium and prostate cancer risk

    In a prospective study of 10 011 men with 815 prostate cancer cases, and despite plausible biological mechanisms, neither increasing intake of dairy products nor of calcium from dairy products (P for trend = 0.23 and 0.64, respectively) nor calcium supplements was associated with prostate cancer risk (relative risk, 1.05; 95% confidence interval, 0.84–1.31).
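
    For context on the quantities reported, the sketch below shows how a relative risk and a Wald-type 95% confidence interval are computed from case counts; the counts used are hypothetical and are not the study's published group-level data.

```python
import numpy as np

# Hedged sketch with hypothetical counts (not the study's group-level data): how a
# relative risk and a Wald-type 95% confidence interval are computed for a higher-
# versus lower-intake comparison. Only the formulas are the point here.

a, n1 = 420, 5000    # cases / total men in the higher-intake group (hypothetical)
b, n0 = 395, 5011    # cases / total men in the lower-intake group (hypothetical)

rr = (a / n1) / (b / n0)
se_log_rr = np.sqrt(1 / a - 1 / n1 + 1 / b - 1 / n0)
ci_low, ci_high = np.exp(np.log(rr) + np.array([-1.96, 1.96]) * se_log_rr)
print(f"RR = {rr:.2f}, 95% CI {ci_low:.2f}-{ci_high:.2f}")
```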

    Feedback as intervention for team learning in virtual teams: the role of team cohesion and personality

    Scholars and practitioners agree that virtual teams (VTs) have become commonplace in today's digital workplace. The relevant literature argues that learning is a significant contributor to team member satisfaction and performance, and that, at least in face-to-face teams, team cohesion fosters team learning. Given the additional challenges VTs face, e.g. geographical dispersion, which are likely to have a negative influence on cohesion, in this paper we shed light on the relationship between team cohesion and team learning. We adopted a quantitative approach and studied 54 VTs to understand the role of feedback in mediating this relationship and, more specifically, the role of personality traits in moderating the indirect effect of a team feedback and guided reflection intervention on team learning through team cohesion within the VT context. Our findings highlight the importance of considering team composition when devising intervention strategies for VTs, and provide empirical support for an interactionist model between personality and emergent states such as cohesion. Implications for theory and practice are also discussed.
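
    As a rough illustration of the statistical model described (a moderated mediation), the Python sketch below estimates bootstrapped conditional indirect effects of an intervention on team learning through cohesion, with a personality trait moderating the first path; the data are simulated and every coefficient is an illustrative assumption, not an estimate from the paper.

```python
import numpy as np

# Toy sketch (not the study's analysis): bootstrapped conditional indirect effects of a
# feedback/guided-reflection intervention (X) on team learning (Y) through team cohesion
# (M), with a personality trait (W) moderating the X -> M path. The data are simulated
# and every coefficient below is an illustrative assumption, not a result from the paper.

rng = np.random.default_rng(7)
n = 54                                        # number of virtual teams, as in the study
X = rng.integers(0, 2, n).astype(float)       # intervention indicator (0/1)
W = rng.normal(size=n)                        # standardized team-level personality trait
M = 0.5 * X + 0.3 * X * W + rng.normal(size=n)    # team cohesion (simulated)
Y = 0.6 * M + 0.1 * X + rng.normal(size=n)        # team learning (simulated)

def conditional_indirect(idx):
    x, w, m, y = X[idx], W[idx], M[idx], Y[idx]
    # a-path with moderation: M ~ 1 + X + W + X*W
    a0, a1, a2, a3 = np.linalg.lstsq(np.column_stack([np.ones(len(x)), x, w, x * w]), m, rcond=None)[0]
    # b-path: Y ~ 1 + X + M
    _, _, b = np.linalg.lstsq(np.column_stack([np.ones(len(x)), x, m]), y, rcond=None)[0]
    # conditional indirect effect (a1 + a3*W) * b at W = -1 SD and +1 SD
    return np.array([(a1 + a3 * wv) * b for wv in (-1.0, 1.0)])

boots = np.array([conditional_indirect(rng.integers(0, n, n)) for _ in range(2000)])
for j, label in enumerate(["low trait (-1 SD)", "high trait (+1 SD)"]):
    lo_ci, hi_ci = np.percentile(boots[:, j], [2.5, 97.5])
    print(f"indirect effect at {label}: 95% bootstrap CI [{lo_ci:.2f}, {hi_ci:.2f}]")
```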