
    Review: Soil compaction and controlled traffic farming in arable and grass cropping systems

    There is both circumstantial and direct evidence demonstrating the significant productivity and sustainability benefits associated with adoption of controlled traffic farming (CTF). These benefits may be fully realised when CTF is practiced jointly with no-tillage and assisted by the range of precision agriculture (PA) technologies available. Important contributing factors are those associated with improved trafficability and timeliness of field operations. Adoption of CTF is therefore encouraged as a technically and economically viable option to improve productivity and resource-use efficiency in arable and grass cropping systems. Studies on the economics of CTF consistently show that it is a profitable technological innovation for both grassland and arable land-use. Despite these benefits, global adoption of CTF is still relatively low, with the exception of Australia, where approximately 30% of grain production systems are managed under CTF. The main barriers to adoption of CTF have been equipment incompatibilities and the need to modify machinery to suit a specific system design, often at the farmers’ own risk of loss of product warranty. Other barriers include reliance on contracting operations, land tenure systems, and road transport regulations. However, some of these barriers can be overcome with forward planning when conversion to CTF is built into the machinery replacement programme, and organisations such as ACTFA in Australia and CTF Europe Ltd. in Central and Northern Europe have developed suitable schemes to assist farmers in this process.

    Self-monitoring of blood pressure in hypertension: A systematic review and individual patient data meta-analysis.

    BACKGROUND: Self-monitoring of blood pressure (BP) appears to reduce BP in hypertension, but important questions remain regarding effective implementation and which groups may benefit most. This individual patient data (IPD) meta-analysis was performed to better understand the effectiveness of BP self-monitoring in lowering BP and controlling hypertension. METHODS AND FINDINGS: Medline, Embase, and the Cochrane Library were searched for randomised trials comparing self-monitoring to no self-monitoring in hypertensive patients (June 2016). Two reviewers independently assessed articles for eligibility, and the authors of eligible trials were approached with requests for IPD. Of 2,846 articles in the initial search, 36 were eligible. IPD were provided from 25 trials, including 1 unpublished study. Data for the primary outcomes (change in mean clinic or ambulatory BP and proportion controlled below target at 12 months) were available from 15/19 possible studies (7,138/8,292 [86%] of randomised participants). Overall, self-monitoring was associated with reduced clinic systolic blood pressure (sBP) compared to usual care at 12 months (-3.2 mmHg [95% CI -4.9, -1.6 mmHg]). However, this effect was strongly influenced by the intensity of co-intervention, ranging from no effect with self-monitoring alone (-1.0 mmHg [-3.3, 1.2]) to a 6.1 mmHg (-9.0, -3.2) reduction when monitoring was combined with intensive support. Self-monitoring was most effective in those taking fewer antihypertensive medications and those with higher baseline sBP up to 170 mmHg. No differences in efficacy were seen by sex or by most comorbidities. Ambulatory BP data at 12 months were available from 4 trials (1,478 patients), which assessed self-monitoring with little or no co-intervention. There was no association between self-monitoring and either lower clinic or ambulatory sBP in this group (clinic -0.2 mmHg [-2.2, 1.8]; ambulatory 1.1 mmHg [-0.3, 2.5]). Results for diastolic blood pressure (dBP) were similar. The main limitation of this work was that significant heterogeneity remained, at least in part because of differing inclusion criteria, self-monitoring regimes, and target BPs across the included studies. CONCLUSIONS: Self-monitoring alone is not associated with lower BP or better control, but in conjunction with co-interventions (including systematic medication titration by doctors, pharmacists, or patients; education; or lifestyle counselling) it leads to clinically significant BP reduction that persists for at least 12 months. The implementation of self-monitoring in hypertension should be accompanied by such co-interventions.
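A two-stage IPD analysis of this kind ultimately pools per-trial effect estimates. A minimal fixed-effect inverse-variance sketch, using made-up trial effects rather than the data above, illustrates the pooling step:

```python
import math

def pool_inverse_variance(effects, ses):
    """Fixed-effect inverse-variance pooling of per-trial mean BP
    differences (mmHg) given their standard errors."""
    weights = [1.0 / se**2 for se in ses]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    se_pooled = math.sqrt(1.0 / sum(weights))
    return pooled, (pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled)

# Hypothetical per-trial sBP differences (self-monitoring minus usual care).
effects = [-2.5, -4.0, -1.5]
ses = [1.2, 1.5, 1.0]
est, (lo, hi) = pool_inverse_variance(effects, ses)
```

Precise trials (small standard errors) dominate the pooled estimate; the heterogeneity noted above would usually motivate a random-effects extension of this fixed-effect sketch.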

    Predicting Out-of-Office Blood Pressure in the Clinic for the Diagnosis of Hypertension in Primary Care: An Economic Evaluation

    Clinical guidelines in the United States and United Kingdom recommend that individuals with suspected hypertension should have ambulatory blood pressure (BP) monitoring to confirm the diagnosis. This approach reduces misdiagnosis because of white coat hypertension but will not identify people with masked hypertension who may benefit from treatment. The Predicting Out-of-Office Blood Pressure (PROOF-BP) algorithm predicts masked and white coat hypertension based on patient characteristics and clinic BP, improving the accuracy of diagnosis while limiting subsequent ambulatory BP monitoring. This study assessed the cost-effectiveness of using this tool in diagnosing hypertension in primary care. A Markov cost–utility cohort model was developed to compare diagnostic strategies: the PROOF-BP approach, in which those with clinic BP ≥130/80 mm Hg receive ambulatory BP monitoring as guided by the algorithm, against current standard diagnostic strategies, in which those with clinic BP ≥140/90 mm Hg receive further monitoring (ambulatory BP monitoring as the reference, with clinic and home monitoring also assessed). The model adopted a lifetime horizon with a 3-month time cycle, taking a UK Health Service/Personal Social Services perspective. The PROOF-BP algorithm was cost-effective in screening all patients with clinic BP ≥130/80 mm Hg compared with current strategies that only screen those with clinic BP ≥140/90 mm Hg, provided healthcare providers were willing to pay up to £20 000 ($26 000) per quality-adjusted life year gained. Deterministic and probabilistic sensitivity analyses supported the base-case findings. The PROOF-BP algorithm appears to be cost-effective compared with conventional BP diagnostic options in primary care. Its use in clinical practice is likely to lead to reduced cardiovascular disease, death, and disability.
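The comparison rests on a Markov cohort model cycling every 3 months over a lifetime horizon. A minimal sketch, with entirely hypothetical states, utilities, transition probabilities, and costs (none taken from the paper), shows how discounted QALYs and an incremental cost-effectiveness ratio (ICER) fall out:

```python
def run_cohort(p_event, n_cycles=160, disc_annual=0.035):
    """Three-state Markov cohort (0 = no CVD, 1 = post-event, 2 = dead)
    with a 3-month cycle; returns discounted QALYs per patient."""
    utilities = [0.85, 0.70, 0.0]          # per-state utility weights (hypothetical)
    trans = [                              # per-cycle transition probabilities
        [1 - p_event - 0.002, p_event, 0.002],
        [0.0, 0.97, 0.03],
        [0.0, 0.0, 1.0],                   # death is absorbing
    ]
    state = [1.0, 0.0, 0.0]                # whole cohort starts event-free
    disc_cycle = (1 + disc_annual) ** 0.25 - 1   # annual discount rate per 3-month cycle
    qalys = 0.0
    for t in range(n_cycles):              # 160 cycles = 40-year "lifetime"
        disc = 1.0 / (1 + disc_cycle) ** t
        qalys += disc * sum(s * u for s, u in zip(state, utilities)) * 0.25
        state = [sum(state[i] * trans[i][j] for i in range(3)) for j in range(3)]
    return qalys

# A strategy that diagnoses (and treats) more accurately lowers event risk.
q_standard = run_cohort(p_event=0.004)
q_proofbp = run_cohort(p_event=0.003)
icer = 500.0 / (q_proofbp - q_standard)    # hypothetical 500 GBP extra cost per patient
```

The ICER would then be compared against the £20 000/QALY willingness-to-pay threshold cited above; probabilistic sensitivity analysis repeats this calculation while sampling the inputs from distributions.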

    Quantifying soil hydrology to explain the development of vegetation at an ex-arable wetland restoration site

    Wetland restoration frequently sets well-defined vegetation targets, but where restoration occurs on highly degraded land such targets are not practical and setting looser targets may be more appropriate. Where this more ‘open-ended’ approach to restoration is adopted, surveillance methods that can track developing wetland habitats need to be established. Water regime and soil structure are known to influence the distribution and composition of developing wetland vegetation, and may be quantified using Sum Exceedence Values (SEV), calculated from the position of the water table and knowledge of soil stress thresholds. The use of SEV to explain patterns in naturally colonizing vegetation on restored, ex-arable land was tested at Wicken Fen (UK). Analysis of values from ten locations showed that soil structure was highly heterogeneous. Five locations had shallow aeration stress thresholds and so had the potential to support diverse wetland assemblages. Deep aeration stress thresholds at the other locations precluded the establishment of a diverse wetland flora, but identified areas where species-poor wetland assemblages may develop. SEV was found to be a useful tool for the surveillance of sites where restoration targets are not specified in detail at the outset and may help predict likely habitat outcomes at sites using an open-ended restoration approach.
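The SEV calculation itself accumulates, day by day, the amount by which the water table exceeds a soil's stress-threshold depth. A minimal sketch for aeration stress, with invented depths and threshold:

```python
def sev_aeration(daily_depths, threshold_depth):
    """Sum Exceedence Value for aeration stress: total daily exceedence of
    the water table above the stress-threshold depth (depths in m below
    the soil surface), expressed in metre-weeks."""
    exceedences = [max(0.0, threshold_depth - d) for d in daily_depths]
    return sum(exceedences) / 7.0

# One week of hypothetical daily water-table depths (m below surface).
depths = [0.10, 0.15, 0.25, 0.40, 0.35, 0.20, 0.12]
sev = sev_aeration(depths, threshold_depth=0.30)  # only days shallower than 0.30 m count
```

A site with a deep stress threshold accrues exceedence on many more days than one with a shallow threshold, which is why the heterogeneous soil structure reported above translates into different habitat potential between locations.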

    Global, regional, and national comparative risk assessment of 79 behavioural, environmental and occupational, and metabolic risks or clusters of risks, 1990-2015: a systematic analysis for the Global Burden of Disease Study 2015

    Background The Global Burden of Diseases, Injuries, and Risk Factors Study 2015 provides an up-to-date synthesis of the evidence for risk factor exposure and the attributable burden of disease. By providing national and subnational assessments spanning the past 25 years, this study can inform debates on the importance of addressing risks in context. Methods We used the comparative risk assessment framework developed for previous iterations of the Global Burden of Disease Study to estimate attributable deaths, disability-adjusted life-years (DALYs), and trends in exposure by age group, sex, year, and geography for 79 behavioural, environmental and occupational, and metabolic risks or clusters of risks from 1990 to 2015. This study included 388 risk-outcome pairs that met World Cancer Research Fund-defined criteria for convincing or probable evidence. We extracted relative risk and exposure estimates from randomised controlled trials, cohorts, pooled cohorts, household surveys, census data, satellite data, and other sources. We used statistical models to pool data, adjust for bias, and incorporate covariates. We developed a metric that allows comparisons of exposure across risk factors—the summary exposure value. Using the counterfactual scenario of theoretical minimum risk level, we estimated the portion of deaths and DALYs that could be attributed to a given risk. We decomposed trends in attributable burden into contributions from population growth, population age structure, risk exposure, and risk-deleted cause-specific DALY rates. We characterised risk exposure in relation to a Socio-demographic Index (SDI). Findings Between 1990 and 2015, global exposure to unsafe sanitation, household air pollution, childhood underweight, childhood stunting, and smoking each decreased by more than 25%. Global exposure for several occupational risks, high body-mass index (BMI), and drug use increased by more than 25% over the same period.
All risks jointly evaluated in 2015 accounted for 57·8% (95% CI 56·6–58·8) of global deaths and 41·2% (39·8–42·8) of DALYs. In 2015, the ten largest contributors to global DALYs among Level 3 risks were high systolic blood pressure (211·8 million [192·7 million to 231·1 million] global DALYs), smoking (148·6 million [134·2 million to 163·1 million]), high fasting plasma glucose (143·1 million [125·1 million to 163·5 million]), high BMI (120·1 million [83·8 million to 158·4 million]), childhood undernutrition (113·3 million [103·9 million to 123·4 million]), ambient particulate matter (103·1 million [90·8 million to 115·1 million]), high total cholesterol (88·7 million [74·6 million to 105·7 million]), household air pollution (85·6 million [66·7 million to 106·1 million]), alcohol use (85·0 million [77·2 million to 93·0 million]), and diets high in sodium (83·0 million [49·3 million to 127·5 million]). From 1990 to 2015, attributable DALYs declined for micronutrient deficiencies, childhood undernutrition, unsafe sanitation and water, and household air pollution; reductions in risk-deleted DALY rates rather than reductions in exposure drove these declines. Rising exposure contributed to notable increases in attributable DALYs from high BMI, high fasting plasma glucose, occupational carcinogens, and drug use. Environmental risks and childhood undernutrition declined steadily with SDI; low physical activity, high BMI, and high fasting plasma glucose increased with SDI. In 119 countries, metabolic risks, such as high BMI and fasting plasma glucose, contributed the most attributable DALYs in 2015. Regionally, smoking still ranked among the leading five risk factors for attributable DALYs in 109 countries; childhood underweight and unsafe sex remained primary drivers of early death and disability in much of sub-Saharan Africa. Interpretation Declines in some key environmental risks have contributed to declines in critical infectious diseases. 
Some risks appear to be invariant to SDI. Increasing risks, including high BMI, high fasting plasma glucose, drug use, and some occupational exposures, contribute to rising burden from some conditions, but also provide opportunities for intervention. Some highly preventable risks, such as smoking, remain major causes of attributable DALYs, even as exposure is declining. Public policy makers need to pay attention to the risks that are increasingly major contributors to global burden. Funding Bill & Melinda Gates Foundation.
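The counterfactual attribution step described in the Methods can be illustrated with the standard population attributable fraction, comparing observed exposure against the theoretical-minimum-risk distribution. All exposure levels and relative risks below are illustrative, not GBD estimates:

```python
def paf(observed_prev, counterfactual_prev, rel_risks):
    """Population attributable fraction for a categorically distributed
    exposure, comparing observed prevalences with a counterfactual
    (theoretical-minimum-risk) distribution."""
    observed = sum(p * rr for p, rr in zip(observed_prev, rel_risks))
    counterfactual = sum(p * rr for p, rr in zip(counterfactual_prev, rel_risks))
    return (observed - counterfactual) / observed

# Three illustrative exposure categories: none / moderate / high.
rel_risks = [1.0, 1.4, 2.2]
observed = [0.5, 0.3, 0.2]
tmrel = [1.0, 0.0, 0.0]   # counterfactual: everyone at the minimum-risk level
frac = paf(observed, tmrel, rel_risks)
attributable_dalys = frac * 100000   # applied to a hypothetical cause-specific burden
```

Multiplying the fraction by cause-specific deaths or DALYs yields the attributable burden; repeating this by age, sex, year, and geography is what produces the rankings reported above.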

    Increased Mobility of Metal Oxide Nanoparticles Due to Photo and Thermal Induced Disagglomeration

    Significant advances have been made in our understanding of the fate and transport of engineered nanomaterials. One unexplored aspect of nanoparticle aggregation is how environmental stimuli such as light exposure and temperature variations affect the mobility of engineered nanoparticles. In this study, TiO2, ZnO, and CeO2 were chosen as model materials for investigating the mobility of nanoparticles under three external stimuli: heat, light, and sonication. Sunlight and high-power sonication were able to partially disagglomerate metal oxide clusters, but primary particles bonded by solid-state necks were left intact. A cycle of temperature increase from 25°C to 65°C and back was found to disagglomerate the compact clusters during the heating phase and reagglomerate them as more open fractal structures during the cooling phase. A fractal model summing the pair-wise DLVO interactions between primary particles within two fractal agglomerates predicts weak attractions on the order of a few kT. Our study shows that common environmental stimuli such as light exposure or temperature variation can disagglomerate nanoparticle clusters and enhance their mobility in open waters. This phenomenon warrants attention, since it is likely that metal oxide nanoparticles will experience these natural stimuli during their transport in the environment.
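The pair-wise summation such a fractal model performs can be sketched with textbook sphere-sphere DLVO terms: a nonretarded van der Waals attraction plus a constant-potential screened double-layer repulsion. Every parameter value below (radius, Hamaker constant, surface potential, Debye length, separations) is illustrative rather than taken from the study:

```python
import math

kT = 1.380649e-23 * 298  # thermal energy at 25 deg C (J)

def dlvo_pair(h, a=15e-9, A=6e-20, psi=-0.015, kappa=1e8):
    """Simplified DLVO energy (in units of kT) between two equal spheres of
    radius a at surface separation h (m): van der Waals attraction with
    Hamaker constant A, plus a screened double-layer repulsion for surface
    potential psi (V) and inverse Debye length kappa (1/m)."""
    eps = 78.5 * 8.854e-12                  # permittivity of water (F/m)
    u_vdw = -A * a / (12 * h)               # attractive term (J)
    u_edl = 2 * math.pi * eps * a * psi**2 * math.exp(-kappa * h)  # repulsive term (J)
    return (u_vdw + u_edl) / kT

def agglomerate_interaction(separations):
    """Sum pair-wise energies over the primary-particle separations (m)
    between two agglomerates; distant pairs contribute almost nothing."""
    return sum(dlvo_pair(h) for h in separations)

# A few near-contact pairs dominate the total interaction energy.
u_total = agglomerate_interaction([2e-9, 3e-9, 5e-9, 20e-9])
```

Because the energies decay quickly with separation, only the primary particles near the contact region of two fractal agglomerates contribute, which is consistent with the weak net attractions of order a few kT reported above.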

    Molecular epidemiology of pneumococci obtained from Gambian children aged 2–29 months with invasive pneumococcal disease during a trial of a 9-valent pneumococcal conjugate vaccine

    BACKGROUND: This study describes the molecular epidemiology of Streptococcus pneumoniae causing invasive disease in Gambian children. METHODS: One hundred and thirty-two S. pneumoniae isolates were recovered from children aged 2-29 months during the course of a pneumococcal conjugate vaccine trial conducted in The Gambia, of which 131 were characterized by serotyping, antibiotic susceptibility testing, BOX-PCR, and MLST. RESULTS: Twenty-nine different serotypes were identified; serotypes 14, 19A, 12F, 5, 23F, and 1 were common and together accounted for 58.3% of all isolates. MLST analysis showed 72 sequence types (STs), of which 46 are novel. eBURST analysis using the stringent 6/7 identical loci definition grouped the isolates into 17 clonal complexes and 32 singletons. The population structure of the 8 serotype 1 isolates obtained from 4 vaccinated and 2 unvaccinated children was the same (ST 618), except that one isolate (ST 3336) from an unvaccinated child had a novel ST that is a single locus variant of ST 618. CONCLUSION: We provide the first background data on the genetic structure of S. pneumoniae causing IPD prior to PCV-7 use in The Gambia. These data will be important for assessing the impact of PCV-7 in post-vaccine surveillance in The Gambia.
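The 6/7-identical-loci grouping used in the eBURST analysis can be sketched as follows. The allelic profiles are invented, and the single greedy pass omits the group-merging step that a full eBURST implementation performs:

```python
def shared_loci(profile_a, profile_b):
    """Number of identical alleles across the 7 MLST loci."""
    return sum(a == b for a, b in zip(profile_a, profile_b))

def clonal_complexes(profiles, min_shared=6):
    """Greedy single pass: a sequence type joins the first group containing
    a member with >= min_shared identical loci, else founds a new group."""
    groups = []
    for st, profile in profiles.items():
        for group in groups:
            if any(shared_loci(profile, profiles[m]) >= min_shared for m in group):
                group.add(st)
                break
        else:
            groups.append({st})
    return groups

# Invented 7-locus allelic profiles keyed by sequence type.
profiles = {
    618:  (1, 2, 3, 4, 5, 6, 7),
    3336: (1, 2, 3, 4, 5, 6, 9),   # single-locus variant of ST 618
    42:   (8, 8, 8, 8, 8, 8, 8),   # unrelated singleton
}
complexes = clonal_complexes(profiles)
```

Under this definition ST 3336, differing from ST 618 at a single locus, falls into the same clonal complex, mirroring the serotype 1 population structure described above.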