
    Impact of herbivores on nitrogen cycling: contrasting effects of small and large species

    Herbivores are reported to slow down as well as enhance nutrient cycling in grasslands. These conflicting results may be explained by differences in herbivore type. In this study we focus on herbivore body size as a factor that causes differences in herbivore effects on N cycling. We used an exclosure set-up in a floodplain grassland grazed by cattle, rabbits and common voles, in which we subsequently excluded cattle and rabbits. Exclusion of cattle led to an increase in vole numbers and a 1.5-fold increase in net annual N mineralization at similar herbivore densities (corrected for metabolic weight). Timing and height of the mineralization peak in spring were the same in all treatments, but mineralization in the vole-grazed treatment showed a second peak in autumn, when mineralization had already declined under cattle grazing. This autumn peak coincided with a peak in vole density and high levels of N input through vole faeces distributed at a fine spatial scale, whereas under cattle grazing only a few patches receive all N and most experience net nutrient removal. The other parameters we measured, including potential N mineralization rates under standardized laboratory conditions, soil parameters, and plant biomass and plant nutrient content in the field, were the same for all three grazing treatments and therefore could not explain the observed difference. When cattle were excluded, more litter accumulated in the vegetation. The formation of this litter layer may have added to the higher mineralization rates under vole grazing, through enhanced nutrient return via litter or through modification of the microclimate. We conclude that different-sized herbivores have different effects on N cycling within the same habitat: exclusion of large herbivores resulted in increased annual N mineralization under small-herbivore grazing.

    Gaussian-process-based demand forecasting for predictive control of drinking water networks

    Paper presented at the 9th International Conference on Critical Information Infrastructures Security, held in Limassol (Cyprus), 13-15 October 2014. This paper focuses on short-term water demand forecasting for predictive control of Drinking Water Networks (DWN) using Gaussian processes (GP). In the predictive control strategy, system state predictions over a finite horizon are generated by a DWN model, and demands are regarded as system disturbances. The goal is to provide a demand estimation within a given confidence interval. To obtain the desired forecasting performance, the forecasting process is carried out in two parts: the expected part is forecasted by the Double-Seasonal Holt-Winters (DSHW) method and the stochastic part is forecasted by the GP method. The mean value of water demand is first estimated by DSHW, while GP provides estimations within a confidence interval. GP is applied with random inputs to propagate uncertainty at each step. Results of the application of the proposed approach to a real case study based on the Barcelona DWN show that the general goal has been successfully reached. This work is partially supported by the research projects SHERECS DPI-2011-26243 and ECOCIS DPI-2013-48243-C2-1-R, both of the Spanish Ministry of Education, by EFFINET grant FP7-ICT-2012-318556 of the European Commission, and by AGAUR Doctorat Industrial 2013-DI-041. Ye Wang also thanks the China Scholarship Council for providing a postgraduate scholarship. Peer Reviewed
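
    The two-part decomposition described above can be illustrated with a short, hedged sketch: a Holt-Winters model supplies the expected seasonal demand and a Gaussian process models the residual (stochastic) part, yielding a mean forecast with a confidence band. This is not the authors' implementation; the hourly demand series, the single 24-hour seasonal period (standing in for the double-seasonal daily/weekly cycle of DSHW), and the kernel choice are all assumptions, and the random-input uncertainty propagation used in the paper is omitted.

    import numpy as np
    from statsmodels.tsa.holtwinters import ExponentialSmoothing
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF, WhiteKernel

    def forecast_demand(demand, horizon=24):
        # Expected part: seasonal exponential smoothing (single 24-h season
        # as a stand-in for the double-seasonal Holt-Winters of the paper).
        hw = ExponentialSmoothing(demand, trend="add", seasonal="add",
                                  seasonal_periods=24).fit()
        expected_fut = np.asarray(hw.forecast(horizon))

        # Stochastic part: GP fitted on the residuals of the expected component.
        resid = np.asarray(demand) - np.asarray(hw.fittedvalues)
        t = np.arange(len(resid)).reshape(-1, 1)
        gp = GaussianProcessRegressor(kernel=RBF(length_scale=24.0) + WhiteKernel(),
                                      normalize_y=True).fit(t, resid)
        t_fut = np.arange(len(resid), len(resid) + horizon).reshape(-1, 1)
        resid_mean, resid_std = gp.predict(t_fut, return_std=True)

        # Combined forecast: mean plus an approximate 95% confidence interval.
        mean = expected_fut + resid_mean
        return mean, mean - 1.96 * resid_std, mean + 1.96 * resid_std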

    Changing patterns of home visiting in general practice: an analysis of electronic medical records

    BACKGROUND: In most European countries and North America the number of home visits carried out by GPs has been decreasing sharply. This has been influenced by non-medical factors such as mobility and pressures on time. The objective of this study was to investigate changes in home visiting rates at the level of individual diagnoses in 1987 and in 2001. METHODS: We analysed routinely collected data on diagnoses in home visits and surgery consultations from electronic medical records kept by general practitioners. Data were used from 246,738 contacts among 124,791 patients in 103 practices in 1987, and 77,167 contacts among 58,345 patients in 80 practices in 2001. A total of 246 diagnoses were used. The main outcome measure was the proportion of home visits per diagnosis in 2001. RESULTS: Within the period studied, the proportion of home visits decreased strongly. The size of this decrease varied across diagnoses. The relation between the proportion of home visits for a diagnosis in 1987 and the same proportion in 2001 is curvilinear (J-shaped), indicating that the decrease is weaker at the extreme points and stronger in the middle. CONCLUSION: Compared with 1987, the proportion of home visits shows a distinct decline. However, the results show that this decline is not necessarily a problem. The finding that this decline varied mainly among diagnoses for which home visits are not always urgent shows that medical considerations still play an important role in the decision about whether or not to carry out a home visit.
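
    A hedged sketch of how the curvilinear relation reported above could be examined: regress the 2001 home-visit proportion per diagnosis on the 1987 proportion plus a quadratic term. The data file and the column names p_1987 and p_2001 are hypothetical, introduced only for illustration; this is not the study's analysis code.

    import pandas as pd
    import statsmodels.formula.api as smf

    # One row per diagnosis, with the home-visit proportion in each year.
    per_diagnosis = pd.read_csv("home_visit_proportions.csv")
    model = smf.ols("p_2001 ~ p_1987 + I(p_1987 ** 2)", data=per_diagnosis).fit()
    print(model.summary())  # a significant quadratic term indicates curvilinearity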

    Effectiveness of a school-based physical activity-related injury prevention program on risk behavior and neuromotor fitness a cluster randomized controlled trial

    Background: To investigate the effects of a school-based physical activity-related injury prevention program, called 'iPlay', on risk behavior and neuromotor fitness. Methods: In this cluster randomized controlled trial, 40 primary schools throughout the Netherlands were randomly assigned to an intervention (n = 20) or control group (n = 20). The study included 2,210 children aged 10-12 years. The iPlay intervention takes one school year and consists of a teacher manual, informative newsletters and posters, a website, and simple exercises to be carried out during physical education classes. Outcome measures were self-reported injury-preventing behavior, self-reported behavioral determinants (knowledge, attitude, social influence, self-efficacy, and intention), and neuromotor fitness. Results: The iPlay program was not able to significantly improve injury-preventing behavior. The program did significantly improve knowledge and attitude, two determinants of behavior. The effect of the intervention program on behavior appeared to be significantly mediated by knowledge and attitude. Improved scores on attitude, social norm, self-efficacy and intention were significantly related to changes in injury-preventing behavior. Furthermore, iPlay resulted in small non-significant improvements in neuromotor fitness in favor of the intervention group. Conclusion: This cluster randomized controlled trial showed that the iPlay program significantly improved behavioral determinants; however, this effect on knowledge and attitude was not strong enough to improve injury-preventing behavior. Furthermore, the results confirm the hypothesized model that injury-preventing behavior is determined by intention, attitude, social norm and self-efficacy. Trial registration: ISRCTN78846684

    The prognostic value of blood lactate levels relative to that of vital signs in the pre-hospital setting: a pilot study

    Introduction: A limitation of pre-hospital monitoring is that vital signs often do not change until a patient is in a critical stage. Blood lactate levels have been suggested as a more sensitive parameter with which to evaluate a patient's condition. The aim of this pilot study was to find presumptive evidence for a relation between pre-hospital lactate levels and in-hospital mortality, corrected for vital sign abnormalities. Methods: In this prospective observational study (n = 124), patients who required urgent ambulance dispatching and had a systolic blood pressure below 100 mmHg, a respiratory rate of less than 10 or more than 29 breaths/minute, or a Glasgow Coma Scale (GCS) score below 14 were enrolled. Nurses from the Emergency Medical Services measured capillary or venous lactate levels using a hand-held device on arrival at the scene (T1) and just before or on arrival at the emergency department (T2). The primary outcome measure was in-hospital mortality. Results: The average (standard deviation) time from T1 to T2 was 27 (10) minutes. Non-survivors (n = 32, 26%) had significantly higher lactate levels than survivors at T1 (5.3 vs 3.7 mmol/L) and at T2 (5.4 vs 3.2 mmol/L). Mortality was significantly higher in patients with lactate levels of 3.5 mmol/L or higher compared with those with lactate levels below 3.5 mmol/L (T1: 41 vs 12%; T2: 47 vs 15%). Also in the absence of hypotension, mortality was higher in those with higher lactate levels. In a multivariable Cox proportional hazards analysis including systolic blood pressure, heart rate, GCS (all at T1) and delta lactate level (from T1 to T2), only delta lactate level (hazard ratio (HR) = 0.20, 95% confidence interval (CI) = 0.05 to 0.76, p = 0.018) and GCS (HR = 0.93, 95% CI = 0.88 to 0.99, p = 0.022) were significant independent predictors of in-hospital mortality. Conclusions: In a cohort of patients who required urgent ambulance dispatching, pre-hospital blood lactate levels were associated with in-hospital mortality and provided prognostic information superior to that provided by the patient's vital signs. There is potential for early detection of occult shock and for pre-hospital resuscitation guided by lactate measurement. However, external validation is required before widespread implementation of lactate measurement in the out-of-hospital setting.
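
    As a hedged illustration of the multivariable Cox proportional hazards analysis described above, the sketch below fits such a model with the lifelines package. The data file and column names (systolic blood pressure, heart rate and GCS at T1, lactate at T1 and T2, follow-up time and in-hospital death indicator) are assumptions, not the study's actual variables or code.

    import pandas as pd
    from lifelines import CoxPHFitter

    df = pd.read_csv("prehospital_cohort.csv")
    df["delta_lactate"] = df["lactate_t2"] - df["lactate_t1"]

    covariates = ["sbp_t1", "heart_rate_t1", "gcs_t1", "delta_lactate",
                  "followup_days", "in_hospital_death"]
    cph = CoxPHFitter()
    cph.fit(df[covariates], duration_col="followup_days",
            event_col="in_hospital_death")
    cph.print_summary()  # hazard ratios with 95% confidence intervals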

    Antibody engineering & therapeutics, the annual meeting of The Antibody Society, December 7-10, 2015, San Diego, CA, USA

    The 26th Antibody Engineering & Therapeutics meeting, the annual meeting of The Antibody Society, united over 800 participants from all over the world in San Diego from 6-10 December 2015. The latest innovations and advances in antibody research and development were discussed, covering a myriad of antibody-related topics presented by more than 100 speakers, who were carefully selected by The Antibody Society. As a prelude, attendees could join the pre-conference training course focusing, among other topics, on the engineering and enhancement of antibodies and antibody-like scaffolds, bispecific antibody engineering, and adaptation to generate chimeric antigen receptor constructs. The main event covered four days of scientific sessions that included antibody effector functions, reproducibility of research and diagnostic antibodies, new developments in antibody-drug conjugates (ADCs), preclinical and clinical ADC data, new technologies and applications for bispecific antibodies, antibody therapeutics for non-cancer and orphan indications, antibodies to harness the cellular immune system, building comprehensive IgVH-gene repertoires through discovering, confirming and cataloging new germline IgVH genes, and overcoming resistance to clinical immunotherapy. The Antibody Society's special session focused on "Antibodies to watch" in 2016. Another special session put the spotlight on the limitations of the new definitions for the assignment of antibody international nonproprietary names introduced by the World Health Organization. The convention concluded with workshops on computational antibody design and on the promise and challenges of using next-generation sequencing for antibody discovery and engineering from synthetic and in vivo libraries.

    Non-nucleoside reverse transcriptase inhibitor-based combination antiretroviral therapy is associated with lower cell-associated HIV RNA and DNA levels as compared with therapy based on protease inhibitors

    BACKGROUND: It remains unclear whether combination antiretroviral therapy (ART) regimens differ in their ability to fully suppress HIV replication. Here, we report the results of two cross-sectional studies that compared levels of cell-associated (CA) HIV markers between individuals receiving suppressive ART containing either a non-nucleoside reverse transcriptase inhibitor (NNRTI) or a protease inhibitor (PI). METHODS: CA HIV unspliced RNA and total HIV DNA were quantified in two cohorts (n=100, n=124) of individuals treated with triple ART regimens consisting of two nucleoside reverse transcriptase inhibitors (NRTIs) plus either an NNRTI or a PI. To compare CA HIV RNA and DNA levels between the regimens, we built multivariable models adjusting for age, gender, current and nadir CD4+ count, plasma viral load zenith, duration of virological suppression, NRTI backbone composition, low-level plasma HIV RNA detectability, and electronically measured adherence to ART. RESULTS: In both cohorts, levels of CA HIV RNA and DNA correlated strongly (rho=0.70 and rho=0.54) and both markers were lower in NNRTI-treated than in PI-treated individuals. In the multivariable analysis, CA RNA in both cohorts remained significantly reduced in NNRTI-treated individuals (padj=0.02 in both cohorts), with a similar but weaker association between the ART regimen and total HIV DNA (padj=0.048 and padj=0.10). No differences in CA HIV RNA or DNA levels were observed between individual NNRTIs or individual PIs, but CA HIV RNA was lower in individuals treated with either nevirapine or efavirenz, compared to PI-treated individuals. CONCLUSIONS: All current classes of antiretroviral drugs only prevent infection of new cells but do not inhibit HIV RNA transcription in long-lived reservoir cells. Therefore, these differences in CA HIV RNA and DNA levels by treatment regimen suggest that NNRTIs are more potent in suppressing HIV residual replication than PIs, which may result in a smaller viral reservoir size.
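
    A hedged sketch of the kind of multivariable model described above: log-transformed cell-associated HIV RNA is compared between NNRTI- and PI-treated individuals while adjusting for the listed covariates, and the RNA-DNA correlation is checked with Spearman's rho. The data file and every column name are hypothetical; this is not the authors' analysis code.

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf
    from scipy.stats import spearmanr

    df = pd.read_csv("cohort1.csv")
    df["log_ca_rna"] = np.log10(df["ca_hiv_rna"])

    # Correlation between the two cell-associated markers (reported as rho=0.70 / 0.54).
    rho, p = spearmanr(df["ca_hiv_rna"], df["total_hiv_dna"])

    # Multivariable model: regimen effect adjusted for the covariates listed above.
    model = smf.ols(
        "log_ca_rna ~ C(regimen) + age + C(gender) + cd4_current + cd4_nadir"
        " + log_vl_zenith + years_suppressed + C(nrti_backbone)"
        " + C(low_level_viremia) + adherence",
        data=df,
    ).fit()
    print(model.summary())  # the C(regimen) coefficient contrasts NNRTI vs PI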

    Unmappable ventricular tachycardia after an old myocardial infarction. Long-term results of substrate modification in patients with an implantable cardioverter defibrillator

    Purpose: The frequent occurrence of ventricular tachycardia can create a serious problem in patients with an implantable cardioverter defibrillator. We assessed the long-term efficacy of catheter-based substrate modification, guided by voltage mapping, for infarct-related ventricular tachycardia and recurrent device therapy. Methods: The study population consisted of 27 consecutive patients (age 68 +/- 8 years, 25 men, mean left ventricular ejection fraction 31 +/- 9%) with an old myocardial infarction and multiple and/or hemodynamically not tolerated ventricular tachycardia necessitating repeated device therapy. A total of 31 substrate modification procedures were performed using a three-dimensional electroanatomical mapping system. Patients were followed up for a median of 23.5 (interquartile range 6.5-53.2) months before and 37.8 (interquartile range 11.7-71.8) months after ablation. Antiarrhythmic drugs were not changed after the procedure and were stopped 6 to 9 months later in patients who showed no ventricular tachycardia recurrence. Results: The median number of ventricular tachycardia episodes was 1.6 (interquartile range 0.7-6.7) per month before and 0.2 (interquartile range 0.00-1.3) per month after ablation (P = 0.006). Nine ventricular fibrillation episodes were registered in seven patients before ablation and two after (P = 0.025). Median antitachycardia pacing decreased from 1.6 (interquartile range 0.01-5.5) per month before to 0.18 (interquartile range 0.00-1.6) per month after ablation (P = 0.069). The median number of shocks decreased from 0.19 (interquartile range 0.04-0.81) per month before to 0.00 (interquartile range 0.00-0.09) per month after ablation (P = 0.001). One patient had a transient ischemic attack during the procedure, and another developed pericarditis. Nine patients died during follow-up, eight due to heart failure and one during valve surgery. Conclusion: Catheter-based substrate modification guided by voltage mapping results in a long-lasting reduction of cardioverter defibrillator therapy in patients with multiple and/or hemodynamically not tolerated infarct-related ventricular tachyarrhythmia.