
    From evidence-base to practice: implementation of the Nurse Family Partnership programme in England

    The aims of this article are to highlight the issues relevant to the implementation of a rigorously evidence-based programme of support, the Nurse Family Partnership programme, into a national system of care. Methods used are semi-structured interviews with families in receipt of the programme in the first 10 sites, with the nursing staff, with members of the central team guiding the initiative, and with other professionals. Analyses of data collected during programme delivery evaluate fidelity of delivery. The results indicate that the programme is perceived in a positive light and take-up is high, with delivery close to the stated US objectives. Issues pertaining to sustainability are highlighted, in particular local concerns about cost set against long-term rather than immediate gains. However, local investment is predominantly strong, with creative methods being planned for the future. Overall, the study shows that within an NHS system of care it is possible to deliver a targeted evidence-based programme.

    Electronic transport in polycrystalline graphene

    Most materials available in macroscopic quantities are polycrystalline. Graphene, a recently discovered two-dimensional form of carbon with strong potential for replacing silicon in future electronics, is no exception. There is growing evidence of the polycrystalline nature of graphene samples obtained using various techniques. Grain boundaries, intrinsic topological defects of polycrystalline materials, are expected to dramatically alter electronic transport in graphene. Here, we develop a theory of charge carrier transmission through grain boundaries composed of a periodic array of dislocations in graphene, based on the momentum conservation principle. Depending on the grain boundary structure, we find two distinct transport behaviours: either high transparency or perfect reflection of charge carriers over remarkably large energy ranges. First-principles quantum transport calculations are used to verify and further investigate this striking behaviour. Our study sheds light on the transport properties of large-area graphene samples. Furthermore, purposeful engineering of periodic grain boundaries with tunable transport gaps would allow charge currents to be controlled without introducing bulk band gaps in otherwise semimetallic graphene. The proposed approach can be regarded as a step towards practical graphene electronics. (Accepted in Nature Materials.)

    Prey resources are equally important as climatic conditions for predicting the distribution of a broad-ranged apex predator

    Aim A current biogeographic paradigm states that climate regulates species distributions at continental scales and that biotic interactions are undetectable at coarse-grain extents. However, advances in spatial modelling show that incorporating food resource distributions is important for improving model predictions at large distribution scales. This is particularly relevant for understanding the factors limiting the distribution of widespread apex predators whose diets are likely to vary across their range. Location Neotropical Central and South America. Methods The harpy eagle (Harpia harpyja) is a large raptor whose diet is largely comprised of arboreal mammals, all with broad distributions across Neotropical lowland forest. Here, we used a hierarchical modelling approach to determine the relative importance of abiotic factors and prey resource distributions on harpy eagle range limits. Our hierarchical approach consisted of the following modelling sequence of explanatory variables: (a) abiotic covariates, (b) prey resource distributions predicted by an equivalent model for each prey species, (c) the combination of (a) and (b), and (d) as in (c) but with prey resources considered as a single prediction equivalent to prey species richness. Results Incorporating prey distributions improved model predictions, but using solely abiotic covariates still resulted in a high-performing model. In the Abiotic model, Climatic Moisture Index (CMI) was the most important predictor, contributing 76% to model prediction. Three-toed sloths (Bradypus spp.) were the most important prey resource, contributing 64% in a combined Abiotic-Biotic model, followed by CMI contributing 30%. Harpy eagle distribution had high environmental overlap with all individual prey distributions, with highest coincidence through Central America, eastern Colombia, and across the Guiana Shield into northern Amazonia. Main conclusions Given this strong reliance on prey distributions across its range, harpy eagle conservation programs must consider its most important food resources as a key element in the protection of this threatened raptor.

    Using informative behavior to increase engagement while learning from human reward

    In this work, we address a relatively unexplored aspect of designing agents that learn from human reward. We investigate how an agent’s non-task behavior can affect a human trainer’s training and agent learning. We use the TAMER framework, which facilitates the training of agents by human-generated reward signals, i.e., judgements of the quality of the agent’s actions, as the foundation for our investigation. Then, starting from the premise that the interaction between the agent and the trainer should be bi-directional, we propose two new training interfaces to increase a human trainer’s active involvement in the training process and thereby improve the agent’s task performance. One provides information on the agent’s uncertainty, calculated as a data-coverage metric; the other on its performance. Our results from a 51-subject user study show that these interfaces can induce the trainers to train longer and give more feedback. The agent’s performance, however, increases only in response to the addition of performance-oriented information, not by sharing uncertainty levels. These results suggest that the organizational maxim about human behavior, “you get what you measure”—i.e., sharing metrics with people causes them to focus on optimizing those metrics while de-emphasizing other objectives—also applies to the training of agents. Using principal component analysis, we show how trainers in the two conditions train agents differently. In addition, by simulating the influence of the agent’s uncertainty-informative behavior on a human’s training behavior, we show that trainers could be distracted by the agent sharing its uncertainty levels about its actions, giving poor feedback for the sake of reducing the agent’s uncertainty without improving its performance.

    Evidence synthesis as the key to more coherent and efficient research

    Background Systematic review and meta-analysis currently underpin much of evidence-based medicine. Such methodologies bring order to previous research, but future research planning remains relatively incoherent and inefficient. Methods To outline a framework for the evaluation of health interventions, aimed at increasing coherence and efficiency through i) making better use of information contained within the existing evidence base when designing future studies; and ii) maximising the information available and thus potentially reducing the need for future studies. Results The framework presented insists that an up-to-date meta-analysis of existing randomised controlled trials (RCTs) should always be considered before future trials are conducted. Such a meta-analysis should inform critical design issues such as sample size determination. The contexts in which individual patient data meta-analysis and mixed treatment comparisons modelling may be beneficial before further RCTs are conducted are also considered. Consideration should also be given to how any newly planned RCT would contribute to the totality of evidence through its incorporation into an updated meta-analysis. We illustrate how new RCTs can have very low power to change the inferences of an existing meta-analysis, particularly when between-study heterogeneity is taken into consideration. Conclusion While the collation of existing evidence as the basis for clinical practice is now routine, a more coherent and efficient approach to planning future RCTs to strengthen the evidence base needs to be developed. The framework presented is a proposal for how this situation can be improved.
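    The claim that a new RCT can have very low power to change the inferences of an existing heterogeneous meta-analysis can be made concrete with a minimal random-effects pooling sketch. This is an illustration, not the authors' code: the DerSimonian-Laird estimator is a standard choice, and all effect sizes (log odds ratios) and variances below are invented.

    ```python
    # Illustrative sketch (invented data): DerSimonian-Laird random-effects
    # pooling, showing why one new trial has limited power to shift an
    # existing heterogeneous meta-analysis.

    def dersimonian_laird(effects, variances):
        """Return (pooled effect, pooled variance, tau^2) under the
        DerSimonian-Laird random-effects model."""
        w = [1.0 / v for v in variances]               # fixed-effect weights
        fixed = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
        q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, effects))  # Cochran's Q
        df = len(effects) - 1
        c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
        tau2 = max(0.0, (q - df) / c)                  # between-study variance
        w_star = [1.0 / (v + tau2) for v in variances]  # random-effects weights
        pooled = sum(wi * e for wi, e in zip(w_star, effects)) / sum(w_star)
        return pooled, 1.0 / sum(w_star), tau2

    effects = [-0.5, -0.1, -0.4, 0.1, -0.3]    # five heterogeneous trials
    variances = [0.04, 0.05, 0.03, 0.06, 0.04]
    pooled, var, tau2 = dersimonian_laird(effects, variances)

    # Add one new, precise trial (variance 0.01). Under random effects its
    # weight is capped by tau^2, so the pooled estimate moves only modestly.
    pooled_new, var_new, tau2_new = dersimonian_laird(
        effects + [-0.6], variances + [0.01])
    ```

    With these numbers, the new trial would carry almost half the total weight in a fixed-effect analysis, but only roughly a quarter under random effects, which is why between-study heterogeneity blunts the influence of additional trials on the pooled inference.
    
    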

    Association between proton pump inhibitor therapy and Clostridium difficile infection: a contemporary systematic review and meta-analysis.

    Introduction Emerging epidemiological evidence suggests that proton pump inhibitor (PPI) acid-suppression therapy is associated with an increased risk of Clostridium difficile infection (CDI). Methods Ovid MEDLINE, EMBASE, ISI Web of Science, and Scopus were searched from 1990 to January 2012 for analytical studies that reported an adjusted effect estimate of the association between PPI use and CDI. We performed random-effects meta-analyses. We used the GRADE framework to interpret the findings. Results We identified 47 eligible citations (37 case-control and 14 cohort studies) with 51 corresponding effect estimates. The pooled OR was 1.65 (95% CI 1.47 to 1.85), I2 = 89.9%, with evidence of publication bias suggested by a contour funnel plot. A novel regression-based method was used to adjust for publication bias and resulted in an adjusted pooled OR of 1.51 (95% CI 1.26 to 1.83). In a speculative analysis that assumes this association is causal, and based on published baseline CDI incidence, the risk of CDI would be very low in the general population taking PPIs, with an estimated NNH of 3925 at 1 year. Conclusions In this rigorously conducted systematic review and meta-analysis, we found very low quality evidence (GRADE class) for an association between PPI use and CDI that does not support a cause-effect relationship.
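    The speculative NNH calculation above can be reproduced in outline. A minimal sketch, assuming a hypothetical baseline 1-year CDI risk (the abstract does not state the baseline incidence the authors actually used):

    ```python
    # Hedged sketch: converting a pooled odds ratio into a number needed to
    # harm (NNH). The baseline 1-year CDI risk below is an assumed,
    # illustrative value, not the figure from the paper.

    def nnh_from_odds_ratio(odds_ratio, baseline_risk):
        """NNH implied by an odds ratio at a given unexposed (baseline) risk."""
        baseline_odds = baseline_risk / (1.0 - baseline_risk)
        exposed_odds = baseline_odds * odds_ratio        # the OR acts on the odds
        exposed_risk = exposed_odds / (1.0 + exposed_odds)
        return 1.0 / (exposed_risk - baseline_risk)      # 1 / absolute risk increase

    pooled_or = 1.65          # pooled OR reported in the meta-analysis
    baseline_risk = 0.0004    # assumed baseline incidence (0.04% per year)
    nnh = nnh_from_odds_ratio(pooled_or, baseline_risk)
    print(round(nnh))  # on the order of a few thousand, comparable to the reported NNH
    ```

    Because the baseline risk is tiny, the OR behaves almost like a relative risk here, and the NNH is driven mainly by how rare CDI is in the general population.
    
    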

    Single Gene Deletions of Orexin, Leptin, Neuropeptide Y, and Ghrelin Do Not Appreciably Alter Food Anticipatory Activity in Mice

    Timing activity to match resource availability is a widely conserved ability in nature. Scheduled feeding of a limited amount of food induces increased activity prior to feeding time in animals as diverse as fish and rodents. Typically, food anticipatory activity (FAA) is induced by restricted feeding (RF), limiting otherwise unlimited food access to several hours in the middle of the light cycle, a time of day when rodents are not normally active. We compared this model to calorie restriction (CR), giving the mice 60% of their normal daily calorie intake at the same time each day. Measurement of body temperature and home cage behaviors suggests that the RF and CR models are very similar, but CR has the advantage of a clearly defined food intake and a more stable mean body temperature. Using the CR model, we then attempted to verify the published result that orexin deletion diminishes FAA, but observed little to no diminution in the response to CR and, surprisingly, found that orexin KO mice are refractory to body weight loss on a CR diet. Next, we tested the orexigenic peptides neuropeptide Y (NPY) and ghrelin and the anorexigenic hormone leptin, using mouse mutants. NPY deletion did not alter the behavioral or physiological response to CR. Leptin deletion impaired FAA in terms of some activity measures, such as walking and rearing, but did not substantially diminish hanging behavior preceding feeding time, suggesting that leptin knockout mice do anticipate daily meal time but do not manifest the full spectrum of activities that typify FAA. Ghrelin knockout mice do not have impaired FAA on a CR diet. Collectively, these results suggest that the individual hormones and neuropeptides tested do not regulate FAA individually, but this does not rule out the possibility of their concerted action in mediating FAA.

    Systematic review and meta-analysis of the diagnostic accuracy of ultrasonography for deep vein thrombosis

    Background Ultrasound (US) has largely replaced contrast venography as the definitive diagnostic test for deep vein thrombosis (DVT). We aimed to derive a definitive estimate of the diagnostic accuracy of US for clinically suspected DVT and to identify study-level factors that might predict accuracy. Methods We undertook a systematic review, meta-analysis and meta-regression of diagnostic cohort studies that compared US to contrast venography in patients with suspected DVT. We searched Medline, EMBASE, CINAHL, Web of Science, Cochrane Database of Systematic Reviews, Cochrane Controlled Trials Register, Database of Reviews of Effectiveness, the ACP Journal Club, and citation lists (1966 to April 2004). Random-effects meta-analysis was used to derive pooled estimates of sensitivity and specificity. Random-effects meta-regression was used to identify study-level covariates that predicted diagnostic performance. Results We identified 100 cohorts comparing US to venography in patients with suspected DVT. Overall sensitivity (95% confidence interval) was 94.2% (93.2 to 95.0) for proximal DVT and 63.5% (59.8 to 67.0) for distal DVT; specificity was 93.8% (93.1 to 94.4). Duplex US had pooled sensitivity of 96.5% (95.1 to 97.6) for proximal DVT and 71.2% (64.6 to 77.2) for distal DVT, with specificity of 94.0% (92.8 to 95.1). Triplex US had pooled sensitivity of 96.4% (94.4 to 97.1) for proximal DVT and 75.2% (67.7 to 81.6) for distal DVT, with specificity of 94.3% (92.5 to 95.8). Compression US alone had pooled sensitivity of 93.8% (92.0 to 95.3) for proximal DVT and 56.8% (49.0 to 66.4) for distal DVT, with specificity of 97.8% (97.0 to 98.4). Sensitivity was higher in more recently published studies and in cohorts with higher prevalence of DVT and more proximal DVT, and was lower in cohorts that reported interpretation by a radiologist. Specificity was higher in cohorts that excluded patients with previous DVT. No studies were identified that compared repeat US to venography in all patients. Repeat US appears to have a positive yield of 1.3%, with 89% of these cases confirmed by venography. Conclusion Combined colour Doppler US techniques have optimal sensitivity, while compression US has optimal specificity for DVT. However, all estimates are subject to substantial unexplained heterogeneity. The role of repeat scanning is very uncertain and is based upon limited data.
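    Pooled sensitivity figures like those above are conventionally computed on the logit scale before inverse-variance weighting. A simplified fixed-effect sketch (the review itself used random-effects models, and the per-study counts below are invented):

    ```python
    import math

    # Simplified illustration (fixed-effect, invented counts): pooling
    # study-level sensitivities on the logit scale, as is common in
    # diagnostic accuracy meta-analysis.

    def pooled_sensitivity(tp_fn_pairs):
        """Inverse-variance pooled sensitivity from per-study
        (true positive, false negative) counts."""
        num = den = 0.0
        for tp, fn in tp_fn_pairs:
            sens = tp / (tp + fn)
            logit = math.log(sens / (1.0 - sens))
            var = 1.0 / tp + 1.0 / fn       # approximate variance of the logit
            weight = 1.0 / var
            num += weight * logit
            den += weight
        pooled_logit = num / den
        return 1.0 / (1.0 + math.exp(-pooled_logit))   # back-transform

    # Three invented cohorts: (true positives, false negatives)
    studies = [(90, 6), (45, 3), (120, 9)]
    print(round(pooled_sensitivity(studies), 3))  # 0.934
    ```

    Working on the logit scale keeps pooled proportions inside (0, 1) and makes the normal approximation behind the confidence intervals more defensible for sensitivities near 100%.
    
    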