From evidence-base to practice: implementation of the Nurse Family Partnership programme in England
The aims of this article are to highlight the issues relevant to implementing a rigorously evidence-based programme of support, the Nurse Family Partnership programme, within a national system of care. Methods included semi-structured interviews with families in receipt of the programme in the first 10 sites, with the nursing staff, with members of the central team guiding the initiative, and with other professionals. Analyses of data collected during programme delivery were used to evaluate fidelity of delivery. The results indicate that the programme is perceived in a positive light and take-up is high, with delivery close to the stated US objectives. Issues pertaining to sustainability are highlighted - in particular, local concerns about cost set against long-term rather than immediate gains. However, local investment is predominantly strong, with creative methods being planned for the future. Overall, the study shows that within an NHS system of care it is possible to deliver a targeted evidence-based programme.
Electronic transport in polycrystalline graphene
Most materials in available macroscopic quantities are polycrystalline.
Graphene, a recently discovered two-dimensional form of carbon with strong
potential for replacing silicon in future electronics, is no exception. There
is growing evidence of the polycrystalline nature of graphene samples obtained
using various techniques. Grain boundaries, intrinsic topological defects of
polycrystalline materials, are expected to dramatically alter the electronic
transport in graphene. Here, we develop a theory of charge carrier transmission
through grain boundaries composed of a periodic array of dislocations in
graphene based on the momentum conservation principle. Depending on the grain
boundary structure we find two distinct transport behaviours - either high
transparency, or perfect reflection of charge carriers over remarkably large
energy ranges. First-principles quantum transport calculations are used to
verify and further investigate this striking behaviour. Our study sheds light
on the transport properties of large-area graphene samples. Furthermore,
purposeful engineering of periodic grain boundaries with tunable transport gaps
would allow charge currents to be controlled without the need to introduce
bulk band gaps in otherwise semimetallic graphene. The proposed approach can be
regarded as a means towards building practical graphene electronics.
Comment: accepted in Nature Materials
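As a rough illustration of the momentum-conservation argument (a sketch under stated assumptions, not the paper's formalism): for a boundary with period d, the carrier momentum parallel to the boundary is conserved only modulo the boundary's reciprocal-lattice vectors 2πn/d, so transmission near the Dirac point requires the projected Dirac points of the two grains to coincide modulo such a vector; any residual mismatch Δk implies reflection up to an energy of order ħ v_F Δk. All numerical values below are hypothetical.

```python
import numpy as np

# Sketch only: momentum parallel to a periodic grain boundary is conserved
# modulo G = 2*pi*n/d. If the projected Dirac points of the two grains cannot
# be connected by such a G, low-energy carriers are reflected up to an energy
# of order hbar * v_F * delta_k, where delta_k is the residual mismatch.

HBAR_VF_EV_NM = 0.658  # hbar*v_F for graphene in eV*nm (v_F ~ 1e6 m/s), assumed value

def residual_mismatch(k_left, k_right, d, n_max=10):
    """Smallest |k_left - k_right - 2*pi*n/d| over integers n (projections in 1/nm)."""
    n = np.arange(-n_max, n_max + 1)
    return np.min(np.abs(k_left - k_right - 2 * np.pi * n / d))

def transport_gap_estimate(k_left, k_right, d):
    """Order-of-magnitude transport gap (eV) implied by the momentum mismatch."""
    return HBAR_VF_EV_NM * residual_mismatch(k_left, k_right, d)

# Hypothetical projected Dirac-point positions of the two grains (1/nm) and a
# 2 nm boundary period.
print(transport_gap_estimate(k_left=17.0, k_right=17.0, d=2.0))  # matched -> ~0 eV (transparent)
print(transport_gap_estimate(k_left=17.0, k_right=16.2, d=2.0))  # mismatched -> finite gap (reflective)
```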
Single Gene Deletions of Orexin, Leptin, Neuropeptide Y, and Ghrelin Do Not Appreciably Alter Food Anticipatory Activity in Mice
Timing activity to match resource availability is a widely conserved ability in nature. Scheduled feeding of a limited amount of food induces increased activity prior to feeding time in animals as diverse as fish and rodents. Typically, food anticipatory activity (FAA) is induced by temporally restricting otherwise unlimited food access (RF) to several hours in the middle of the light cycle, a time of day when rodents are not normally active. We compared this model to calorie restriction (CR), giving the mice 60% of their normal daily calorie intake at the same time each day. Measurement of body temperature and home cage behaviors suggests that the RF and CR models are very similar, but CR has the advantage of a clearly defined food intake and a more stable mean body temperature. Using the CR model, we then attempted to verify the published result that orexin deletion diminishes FAA but observed little to no diminution in the response to CR and, surprisingly, found that orexin KO mice are refractory to body weight loss on a CR diet. Next we tested the orexigenic peptides neuropeptide Y (NPY) and ghrelin and the anorexigenic hormone leptin using mouse mutants. NPY deletion did not alter the behavioral or physiological response to CR. Leptin deletion impaired FAA in terms of some activity measures, such as walking and rearing, but did not substantially diminish hanging behavior preceding feeding time, suggesting that leptin knockout mice do anticipate daily meal time but do not manifest the full spectrum of activities that typify FAA. Ghrelin knockout mice do not have impaired FAA on a CR diet. Collectively, these results suggest that the individual hormones and neuropeptides tested do not regulate FAA by acting individually, but this does not rule out the possibility of their concerted action in mediating FAA.
Association between proton pump inhibitor therapy and Clostridium difficile infection: a contemporary systematic review and meta-analysis.
Abstract
Introduction
Emerging epidemiological evidence suggests that proton pump inhibitor (PPI) acid-suppression therapy is associated with an increased risk of Clostridium difficile infection (CDI).
Methods
Ovid MEDLINE, EMBASE, ISI Web of Science, and Scopus were searched from 1990 to January 2012 for analytical studies that reported an adjusted effect estimate of the association between PPI use and CDI. We performed random-effects meta-analyses. We used the GRADE framework to interpret the findings.
Results
We identified 47 eligible citations (37 case-control and 14 cohort studies) with 51 corresponding effect estimates. The pooled OR was 1.65 (95% CI, 1.47–1.85), I² = 89.9%, with evidence of publication bias suggested by a contour funnel plot. A novel regression-based method was used to adjust for publication bias and resulted in an adjusted pooled OR of 1.51 (95% CI, 1.26–1.83). In a speculative analysis that assumes this association is causal, and based on published baseline CDI incidence, the risk of CDI would be very low in the general population taking PPIs, with an estimated NNH of 3925 at 1 year.
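To make the NNH arithmetic in the speculative analysis concrete, here is a minimal sketch assuming the odds ratio approximates the risk ratio for a rare outcome; the baseline one-year CDI risk below is a hypothetical placeholder, not the figure used in the review.

```python
# Sketch of the NNH arithmetic: for a rare outcome the OR approximates the
# risk ratio, so the absolute risk increase is baseline_risk * (OR - 1) and
# NNH is its reciprocal. The baseline risk here is a hypothetical placeholder.

def number_needed_to_harm(baseline_risk, odds_ratio):
    """NNH = 1 / absolute risk increase, assuming OR ~ RR for a rare outcome."""
    absolute_risk_increase = baseline_risk * (odds_ratio - 1.0)
    return 1.0 / absolute_risk_increase

print(number_needed_to_harm(baseline_risk=0.0005, odds_ratio=1.51))  # ~3922 at 1 year
```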
Conclusions
In this rigorously conducted systematic review and meta-analysis, we found very low quality evidence (GRADE class) for an association between PPI use and CDI that does not support a cause-effect relationship.
Planck intermediate results. XLI. A map of lensing-induced B-modes
The secondary cosmic microwave background (CMB) B-modes stem from the post-decoupling distortion of the polarization E-modes due to the gravitational lensing effect of large-scale structures. These lensing-induced B-modes constitute both a valuable probe of the dark matter distribution and an important contaminant for the extraction of the primary CMB B-modes from inflation. Planck provides accurate nearly all-sky measurements of both the polarization E-modes and the integrated mass distribution via the reconstruction of the CMB lensing potential. By combining these two data products, we have produced an all-sky template map of the lensing-induced B-modes using a real-space algorithm that minimizes the impact of sky masks. The cross-correlation of this template with an observed (primordial and secondary) B-mode map can be used to measure the lensing B-mode power spectrum up to high multipoles. In particular, when cross-correlating with the B-mode contribution directly derived from the Planck polarization maps, we obtain a significant measurement of the lensing-induced B-mode power spectrum, which agrees with the theoretical expectation derived from the Planck best-fit ΛCDM model. This unique nearly all-sky secondary B-mode template, which includes the lensing-induced information from intermediate to small angular scales, is delivered as part of the Planck 2015 public data release. It will be particularly useful for experiments searching for primordial B-modes, such as BICEP2/Keck Array or LiteBIRD, since it will enable an estimate to be made of the lensing-induced contribution to the measured total CMB B-modes.
Comment: 20 pages, 12 figures; Accepted for publication in A&A; The B-mode map is part of the PR2-2015 Cosmology Products; available as Lensing Products in the Planck Legacy Archive http://pla.esac.esa.int/pla/#cosmology; and described in the 'Explanatory Supplement' https://wiki.cosmos.esa.int/planckpla2015/index.php/Specially_processed_maps#2015_Lensing-induced_B-mode_ma
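As an illustration of how such a template can be used, the sketch below cross-correlates a lensing B-mode template map with an observed B-mode map using healpy; the file names, resolution, and multipole range are assumptions, and a real analysis would also handle masking and noise debiasing.

```python
# Illustrative sketch (file names and parameters are hypothetical): estimate the
# lensing B-mode power spectrum by cross-correlating a delivered template map
# with an observed B-mode map, as the abstract describes.
import healpy as hp
import numpy as np

lmax = 1000  # assumed maximum multipole for the cross-spectrum

template_b = hp.read_map("lensing_bmode_template.fits")   # hypothetical file name
observed_b = hp.read_map("observed_bmode_map.fits")       # hypothetical file name

# Cross power spectrum <template x observed>; for an unbiased template this
# traces the lensing B-mode power spectrum.
cl_cross = hp.anafast(template_b, map2=observed_b, lmax=lmax)

ell = np.arange(lmax + 1)
dl_cross = ell * (ell + 1) * cl_cross / (2 * np.pi)  # conventional D_ell scaling
```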
Systematic review and meta-analysis of the diagnostic accuracy of ultrasonography for deep vein thrombosis
Background
Ultrasound (US) has largely replaced contrast venography as the definitive diagnostic test for deep vein thrombosis (DVT). We aimed to derive a definitive estimate of the diagnostic accuracy of US for clinically suspected DVT and identify study-level factors that might predict accuracy.
Methods
We undertook a systematic review, meta-analysis and meta-regression of diagnostic cohort studies that compared US to contrast venography in patients with suspected DVT. We searched Medline, EMBASE, CINAHL, Web of Science, Cochrane Database of Systematic Reviews, Cochrane Controlled Trials Register, Database of Reviews of Effectiveness, the ACP Journal Club, and citation lists (1966 to April 2004). Random effects meta-analysis was used to derive pooled estimates of sensitivity and specificity. Random effects meta-regression was used to identify study-level covariates that predicted diagnostic performance.
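For readers unfamiliar with the pooling step, the following is a minimal sketch of DerSimonian-Laird random-effects pooling of study-level sensitivities on the logit scale; the function and the toy counts are illustrative and are not the review's analysis code.

```python
# Sketch of random-effects pooling of study-level sensitivities on the logit
# scale (DerSimonian-Laird). Variable names and toy data are illustrative.
import numpy as np

def pool_random_effects(true_pos, false_neg):
    """Pooled sensitivity and 95% CI from a DerSimonian-Laird random-effects model."""
    tp = np.asarray(true_pos, dtype=float)
    fn = np.asarray(false_neg, dtype=float)
    # Logit-transformed sensitivities and their within-study variances
    # (0.5 continuity correction guards against zero cells).
    sens_logit = np.log((tp + 0.5) / (fn + 0.5))
    var_within = 1.0 / (tp + 0.5) + 1.0 / (fn + 0.5)
    w = 1.0 / var_within                                  # fixed-effect weights
    fixed = np.sum(w * sens_logit) / np.sum(w)
    q = np.sum(w * (sens_logit - fixed) ** 2)             # Cochran's Q
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - (len(tp) - 1)) / c)              # between-study variance
    w_star = 1.0 / (var_within + tau2)                    # random-effects weights
    pooled_logit = np.sum(w_star * sens_logit) / np.sum(w_star)
    se = np.sqrt(1.0 / np.sum(w_star))
    inv_logit = lambda x: 1.0 / (1.0 + np.exp(-x))
    return (inv_logit(pooled_logit),
            inv_logit(pooled_logit - 1.96 * se),
            inv_logit(pooled_logit + 1.96 * se))

# Toy example: true positives and false negatives for proximal DVT in three studies.
print(pool_random_effects([90, 45, 180], [5, 4, 9]))
```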
Results
We identified 100 cohorts comparing US to venography in patients with suspected DVT. Overall sensitivity (95% confidence interval) was 94.2% (93.2 to 95.0) for proximal DVT and 63.5% (59.8 to 67.0) for distal DVT, and specificity was 93.8% (93.1 to 94.4). Duplex US had pooled sensitivity of 96.5% (95.1 to 97.6) for proximal DVT, 71.2% (64.6 to 77.2) for distal DVT, and specificity of 94.0% (92.8 to 95.1). Triplex US had pooled sensitivity of 96.4% (94.4 to 97.1) for proximal DVT, 75.2% (67.7 to 81.6) for distal DVT, and specificity of 94.3% (92.5 to 95.8). Compression US alone had pooled sensitivity of 93.8% (92.0 to 95.3) for proximal DVT, 56.8% (49.0 to 66.4) for distal DVT, and specificity of 97.8% (97.0 to 98.4). Sensitivity was higher in more recently published studies and in cohorts with higher prevalence of DVT and more proximal DVT, and was lower in cohorts that reported interpretation by a radiologist. Specificity was higher in cohorts that excluded patients with previous DVT. No studies were identified that compared repeat US to venography in all patients. Repeat US appears to have a positive yield of 1.3%, with 89% of these being confirmed by venography.
Conclusion
Combined colour-Doppler US techniques have optimal sensitivity, while compression US has optimal specificity for DVT. However, all estimates are subject to substantial unexplained heterogeneity. The role of repeat scanning is very uncertain and based upon limited data.
Decadal prediction of the North Atlantic subpolar gyre in the HiGEM high-resolution climate model
This paper presents an analysis of initialised decadal hindcasts of the North Atlantic subpolar gyre (SPG) using the HiGEM model, which has a nominal grid-spacing of 90 km in the atmosphere and 1/3° in the ocean. HiGEM decadal predictions (HiGEM-DP) exhibit significant skill at capturing 0–500 m ocean heat content in the SPG, and outperform historically forced transient integrations and persistence for up to a decade ahead. An analysis of case studies of North Atlantic decadal change, including the 1960s cooling, the mid-1990s warming, and the post-2005 cooling, shows that changes in ocean circulation and heat transport dominate the predictions of the SPG. However, different processes are found to dominate heat content changes in different regions of the SPG. Specifically, ocean advection dominates in the east, but surface fluxes dominate in the west. Furthermore, compared to previous studies, we find a smaller role for ocean heat transport changes due to ocean circulation anomalies at the latitudes of the SPG, and, for the 1960s cooling, a greater role for surface fluxes. Finally, HiGEM-DP predicts the observed positive state of the North Atlantic Oscillation in the early 1990s. These results support an important role for the ocean in driving past changes in the North Atlantic region, and suggest that these changes were predictable.
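As a pointer to what a 0-500 m ocean heat content diagnostic involves, here is a minimal sketch using assumed seawater constants and a toy profile (not the HiGEM diagnostic code); in practice anomalies relative to a climatology are analysed rather than absolute values.

```python
# Illustrative calculation of 0-500 m ocean heat content from a temperature
# profile, OHC = rho * c_p * integral(T dz). Constants and the toy profile are
# assumptions for illustration only.
import numpy as np

RHO_SEAWATER = 1025.0   # kg m^-3, assumed reference density
CP_SEAWATER = 3990.0    # J kg^-1 K^-1, assumed specific heat capacity

def ocean_heat_content(temperature_c, depth_m, max_depth=500.0):
    """Heat content (J m^-2) of the water column above max_depth."""
    mask = depth_m <= max_depth
    # Trapezoidal integration of temperature (converted to kelvin) over depth.
    return RHO_SEAWATER * CP_SEAWATER * np.trapz(temperature_c[mask] + 273.15, depth_m[mask])

depth = np.array([0.0, 50.0, 100.0, 200.0, 300.0, 400.0, 500.0])
temp = np.array([12.0, 11.5, 10.8, 9.5, 8.2, 7.1, 6.3])   # toy subpolar profile, deg C
print(ocean_heat_content(temp, depth))
```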
How do we create, and improve, the evidence base?
Providing best clinical care involves using the best available evidence of effectiveness to inform treatment decisions. Producing this evidence begins with trials and continues through synthesis of their findings towards evidence incorporation within comprehensible, usable guidelines for clinicians and patients at the point of care. However, there is enormous wastage in this evidence production process, with less than 50% of the published biomedical literature considered sufficient in conduct and reporting to be fit for purpose. Over the last 30 years, independent collaborative initiatives have evolved to optimise the evidence to improve patient care. These collaborations each recommend how to improve research quality in a small way at many different stages of the evidence production and distillation process. When we consider these improvements from an 'aggregation of marginal gains' perspective, the accumulation of small enhancements greatly improves the final product of 'best available evidence'. The myriad tools to reduce research quality leakage and evidence loss should be routinely used by all those with responsibility for ensuring that research benefits patients, that is, those who pay for research (funders), produce it (researchers), take part in it (patients/participants) and use it (clinicians, policy makers and service commissioners).
Using informative behavior to increase engagement while learning from human reward
In this work, we address a relatively unexplored aspect of designing agents that learn from human reward. We investigate how an agent’s non-task behavior can affect a human trainer’s training and agent learning. We use the TAMER framework, which facilitates the training of agents by human-generated reward signals, i.e., judgements of the quality of the agent’s actions, as the foundation for our investigation. Then, starting from the premise that the interaction between the agent and the trainer should be bi-directional, we propose two new training interfaces to increase a human trainer’s active involvement in the training process and thereby improve the agent’s task performance. One provides information on the agent’s uncertainty, a metric calculated as data coverage; the other on its performance. Our results from a 51-subject user study show that these interfaces can induce the trainers to train longer and give more feedback. The agent’s performance, however, increases only in response to the addition of performance-oriented information, not to the sharing of uncertainty levels. These results suggest that the organizational maxim about human behavior, “you get what you measure”—i.e., sharing metrics with people causes them to focus on optimizing those metrics while de-emphasizing other objectives—also applies to the training of agents. Using principal component analysis, we show how trainers in the two conditions train agents differently. In addition, by simulating the influence of the agent’s uncertainty-informative behavior on a human’s training behavior, we show that trainers could be distracted by the agent sharing its uncertainty levels about its actions, giving poor feedback for the sake of reducing the agent’s uncertainty without improving the agent’s performance.
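To make the learning setup concrete, below is a minimal TAMER-style sketch: a linear model of the human's reward signal is learned incrementally and the agent acts greedily with respect to it, with a crude data-coverage proxy for uncertainty. The class, feature representation, and coverage metric are stand-ins, not the study's implementation.

```python
# Minimal TAMER-style sketch: learn a linear model H_hat(s, a) of the human's
# reward signal and act greedily with respect to it. The feature function,
# update rule, and coverage metric below are illustrative stand-ins.
import numpy as np

class TamerAgent:
    def __init__(self, n_features, n_actions, lr=0.05):
        self.weights = np.zeros((n_actions, n_features))
        self.lr = lr

    def predict_human_reward(self, features, action):
        return float(self.weights[action] @ features)

    def act(self, features):
        # Greedy action with respect to the predicted human reward.
        return int(np.argmax(self.weights @ features))

    def update(self, features, action, human_reward):
        # Incremental least-squares step toward the observed human reward.
        error = human_reward - self.predict_human_reward(features, action)
        self.weights[action] += self.lr * error * features

    def uncertainty(self, features, visited_features):
        # Crude "data coverage" proxy: distance to the nearest previously
        # trained state (one possible reading of the coverage metric).
        if not visited_features:
            return 1.0
        return float(min(np.linalg.norm(features - f) for f in visited_features))
```

In this sketch, surfacing the value of `uncertainty` to the trainer would play the role of the uncertainty-informative interface, while reporting recent task reward would play the role of the performance-oriented interface.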
Decadal predictions of the cooling and freshening of the North Atlantic in the 1960s and the role of ocean circulation
In the 1960s North Atlantic sea surface temperatures (SST) cooled rapidly. The magnitude of the cooling was largest in the North Atlantic subpolar gyre (SPG), and was coincident with a rapid freshening of the SPG. Here we analyze hindcasts of the 1960s North Atlantic cooling made with the UK Met Office’s decadal prediction system (DePreSys), which is initialised using observations. It is shown that DePreSys captures—with a lead time of several years—the observed cooling and freshening of the North Atlantic SPG. DePreSys also captures changes in SST over the wider North Atlantic and surface climate impacts over the wider region, such as changes in atmospheric circulation in winter and sea ice extent. We show that initialisation of an anomalously weak Atlantic Meridional Overturning Circulation (AMOC), and hence weak northward heat transport, is crucial for DePreSys to predict the magnitude of the observed cooling. Such an anomalously weak AMOC is not captured when ocean observations are not assimilated (i.e. it is not a forced response in this model). The freshening of the SPG is also dominated by ocean salt transport changes in DePreSys; in particular, the simulation of advective freshwater anomalies analogous to the Great Salinity Anomaly was key. Therefore, DePreSys suggests that ocean dynamics played an important role in the cooling of the North Atlantic in the 1960s, and that this event was predictable.
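To illustrate the kind of AMOC diagnostic referred to here, the sketch below computes overturning strength as the maximum of the meridional overturning streamfunction on a zonal section; the grid and velocity values are hypothetical and this is not the DePreSys diagnostic code.

```python
# Illustrative AMOC strength at one latitude: maximum of the meridional
# overturning streamfunction, i.e. the depth-cumulative, zonally integrated
# meridional volume transport. Grids and the toy velocity field are hypothetical.
import numpy as np

def amoc_strength(v, dx, dz):
    """Max overturning streamfunction (Sv) from v(z, x) on a zonal section.

    v  : meridional velocity in m/s, shape (nz, nx), surface layer first
    dx : zonal grid spacing in m, shape (nx,)
    dz : layer thickness in m, shape (nz,)
    """
    zonal_transport = (v * dx).sum(axis=1) * dz   # m^3/s per layer
    psi = np.cumsum(zonal_transport)              # streamfunction vs depth
    return np.abs(psi).max() / 1.0e6              # convert m^3/s to Sverdrups

dx = np.full(3, 100e3)                            # 100 km zonal spacing
dz = np.array([500.0, 500.0, 1000.0, 1000.0])
v = np.array([[0.02, 0.03, 0.02],                 # upper ocean flowing north
              [0.01, 0.02, 0.01],
              [-0.01, -0.015, -0.01],             # deep return flow
              [-0.005, -0.01, -0.005]])
print(amoc_strength(v, dx, dz))                   # a few Sverdrups for this toy section
```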
