
    Nutrient loss pathways from grazed grasslands and the effects of decreasing inputs: experimental results for three soil types

    Agriculture is a main contributor of diffuse emissions of N and P to the environment. For N, the main loss pathways are NH₃ volatilization, leaching to ground and surface water, and N₂O emissions. Currently, restraints on farm inputs are imposed as a policy tool to decrease N and P leaching to ground water and surface water, and the same measure has been suggested to combat N₂O emissions. The response to these measures, however, depends largely on the soil type. In this study, nutrient flows of three dairy farms of comparable intensity in The Netherlands, on sand, peat and clay soils, were monitored for at least 2 years. The first aim was to provide quantitative data on current nutrient loss pathways. The second aim was to explore the responses in partitioning of the nutrient loss pathways when farm inputs were altered. Mean denitrification rates ranged from 103 kg N ha⁻¹ year⁻¹ for the sandy soil to 170 kg N ha⁻¹ year⁻¹ for the peat soil; leaching to surface water was about 73 kg N ha⁻¹ year⁻¹ for the sandy soil, 15 kg N ha⁻¹ year⁻¹ for the clay soil and 38 kg N ha⁻¹ year⁻¹ for the peat soil. For P, leaching to surface water ranged from 2 kg P ha⁻¹ year⁻¹ for the sandy site to 5 kg P ha⁻¹ year⁻¹ for the peat site. N leaching to surface water was most responsive to changes in N surpluses on the sandy soil, followed by the peat soil, and least responsive on the clay soil. For P, a similar sequence was found. This article demonstrates that similar reductions of N and P inputs result in different responses in N and P loss pathways for different soil types. These differences should be taken into account when evaluating measures to improve the environmental performance of (dairy) farms.

    Attenuated cerebrospinal fluid leukocyte count and sepsis in adults with pneumococcal meningitis: a prospective cohort study

    BACKGROUND: A low cerebrospinal fluid (CSF) white blood cell count (WBC) has been identified as an independent risk factor for adverse outcome in adults with bacterial meningitis. Whereas a low CSF WBC indicates the presence of sepsis with early meningitis in patients with meningococcal infections, the relation between CSF WBC and outcome in patients with pneumococcal meningitis is not understood. METHODS: We examined the relation between CSF WBC, bacteraemia and sepsis in a prospective cohort study that included 352 episodes of pneumococcal meningitis, confirmed by CSF culture, occurring in patients aged >16 years. RESULTS: CSF WBC was recorded in 320 of 352 episodes (91%). Median CSF WBC was 2530 per mm³ (interquartile range 531–6983 per mm³) and 104 patients (33%) had a CSF WBC <1000/mm³. Patients with a CSF WBC <1000/mm³ were more likely to have an unfavourable outcome (defined as a Glasgow Outcome Scale score of 1–4) than those with a higher WBC (74 of 104 [71%] vs. 87 of 216 [43%]; P < 0.001). CSF WBC was significantly correlated with blood WBC (Spearman's rho 0.29), CSF protein level (0.20), thrombocyte count (0.21), erythrocyte sedimentation rate (−0.15), and C-reactive protein level (−0.18). Patients with a CSF WBC <1000/mm³ more often had a positive blood culture (72 of 84 [86%] vs. 138 of 196 [70%]; P = 0.01) and more often developed systemic complications (cardiorespiratory failure, sepsis) than those with a higher WBC (53 of 104 [51%] vs. 69 of 216 [32%]; P = 0.001). In a multivariate analysis, advanced age (odds ratio per 10-year increment 1.22, 95% CI 1.02–1.45), a positive blood culture (odds ratio 2.46, 95% CI 1.17–5.14), and a low thrombocyte count on admission (odds ratio per 100,000/mm³ increment 0.67, 95% CI 0.47–0.97) were associated with a CSF WBC <1000/mm³. CONCLUSION: A low CSF WBC in adults with pneumococcal meningitis is related to the presence of signs of sepsis and systemic complications. Invasive pneumococcal infections should perhaps be regarded as a continuum from meningitis to sepsis.
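    The group comparisons above reduce to 2×2 tables. As an illustrative sketch (not part of the study's own analysis, which also adjusted for covariates), an unadjusted odds ratio with a Wald 95% confidence interval can be computed from the reported outcome counts (74/104 vs. 87/216 unfavourable outcomes):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio with a Wald 95% CI from a 2x2 table:
    a, b = events / non-events in the exposed group,
    c, d = events / non-events in the comparison group."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Unfavourable outcome: 74 of 104 (low CSF WBC) vs. 87 of 216 (higher WBC)
or_, lo, hi = odds_ratio_ci(74, 104 - 74, 87, 216 - 87)
```

    Here the crude odds ratio is about 3.7, consistent with the highly significant difference the abstract reports.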

    Method for coregistration of optical measurements of breast tissue with histopathology: the importance of accounting for tissue deformations

    For the validation of optical diagnostic technologies, experimental results need to be benchmarked against the gold standard. Currently, the gold standard for tissue characterization is assessment of hematoxylin and eosin (H&E)-stained sections by a pathologist. When tissue is processed into H&E sections, its shape deforms with respect to the initial shape in which it was optically measured. We demonstrate the importance of accounting for these tissue deformations when correlating optical measurements with routinely acquired histopathology. We propose a method to register the tissue in the H&E sections to the optical measurements, which corrects for these tissue deformations. We compare the registered H&E sections to H&E sections registered with an algorithm that does not account for tissue deformations, evaluating both the shape and the composition of the tissue and using micro-computed tomography data as an independent measure. The proposed method, which did account for tissue deformations, was more accurate than the method that did not. These results emphasize the need for a registration method that accounts for tissue deformations, such as the method presented in this study, which can aid in validating optical techniques for clinical use. © The Authors. Published by SPIE under a Creative Commons Attribution 4.0 Unported License.

    Frequency of post-stroke pneumonia: Systematic review and meta-analysis of observational studies

    Background: Post-stroke pneumonia and other infectious complications are serious conditions whose reported frequency varies widely across studies. Aims: We conducted a systematic review to estimate the frequency of post-stroke pneumonia and other types of major infection. Summary of review: MEDLINE, EMBASE, CINAHL, and PsycINFO databases were searched for prospective studies with consecutive recruitment of stroke patients. The primary outcome was post-stroke pneumonia. Secondary outcomes were any infection and urinary tract infection. Quality assessment was done using the Newcastle-Ottawa scale. Heterogeneity of estimates across study populations was calculated using Cochran's Q (heterogeneity χ²) and I² statistics. A total of 47 studies (139,432 patients) with 48 sample populations were eligible for inclusion. Mean age of patients was 68.3 years and their mean National Institutes of Health Stroke Scale score was 8.2. The pooled frequency of post-stroke pneumonia was 12.3% (95% confidence interval [CI] 11%–13.6%; I² = 98%). The pooled frequency from 2011 to 2017 was 13.5% (95% CI 11.8%–15.3%; I² = 98%), comparable with earlier periods (P interaction = 0.31). The pooled frequency in studies in stroke units was 8% (95% CI 7.1%–9%; I² = 78%), significantly lower than in other locations (P interaction = 0.001). The pooled frequency of post-stroke infection was 21% (95% CI 13%–29.3%; I² = 99%) and of post-stroke urinary tract infection 7.9% (95% CI 6.7%–9.3%; I² = 96%). Conclusion: Approximately 1 in 10 stroke patients experience pneumonia during the acute period of hospital care. The frequency of post-stroke pneumonia has remained stable in recent decades but is lower in patients receiving stroke unit care compared with management in other ward settings.
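    The Cochran's Q and I² statistics quoted above quantify between-study heterogeneity. A minimal sketch of how they are derived under a fixed-effect inverse-variance model; the per-study proportions and variances below are invented for illustration and are not data from the review:

```python
def cochran_q_i2(effects, variances):
    """Inverse-variance pooled estimate, Cochran's Q, and I^2 (%)
    for a set of study effect estimates and their variances."""
    w = [1.0 / v for v in variances]                     # inverse-variance weights
    pooled = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)
    q = sum(wi * (yi - pooled) ** 2 for wi, yi in zip(w, effects))
    df = len(effects) - 1
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0  # Higgins' I^2
    return pooled, q, i2

# Hypothetical per-study pneumonia proportions and their variances
pooled, q, i2 = cochran_q_i2([0.10, 0.14, 0.08, 0.16],
                             [0.0004, 0.0006, 0.0003, 0.0005])
```

    I² expresses the share of total variability in the estimates attributable to heterogeneity rather than chance, which is why values near 98% signal that the pooled frequency should be read with caution.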

    Assessment of left ventricular ejection fraction in patients eligible for ICD therapy: Discrepancy between cardiac magnetic resonance imaging and 2D echocardiography

    OBJECTIVE: Implantable cardioverter defibrillators (ICD) and cardiac resynchronisation therapy (CRT) have substantially improved the survival of patients with cardiomyopathy. Eligibility for this therapy requires a left ventricular ejection fraction (LVEF) <35%. This is largely based on studies using echocardiography. Cardiac magnetic resonance imaging (CMR) is increasingly utilised for LVEF assessment, but several studies have shown differences between LVEF assessed by CMR and by echocardiography. The present study compared LVEF assessment by CMR and echocardiography in a heart failure population and evaluated the effects on eligibility for device therapy. METHODS: 152 patients (106 male, mean age 65.5 ± 9.9 years) referred for device therapy were included. During evaluation of eligibility they underwent both CMR and echocardiographic LVEF assessment. CMR volumes were computed from a stack of short-axis images. Echocardiographic volumes were computed using Simpson’s biplane method. RESULTS: Echocardiography underestimated end-diastolic volume (EDV) and end-systolic volume (ESV) by 71 ± 53 ml (mean ± SD) and 70 ± 49 ml, respectively. This resulted in an overestimation of LVEF by echocardiography of 6.6 ± 8.3% compared with CMR (echocardiographic LVEF 31.5 ± 8.7% vs. CMR LVEF 24.9 ± 9.6%). 28% of patients had opposing outcomes of eligibility for cardiac device therapy depending on the imaging modality used. CONCLUSION: We found EDV and ESV to be underestimated by echocardiography, and LVEF assessed by CMR to be significantly smaller than by echocardiography. Applying an LVEF cut-off value of 35%, CMR would significantly increase the number of patients eligible for device implantation. Therefore, LVEF cut-off values might need reassessment when using CMR.
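    Echocardiographic volumes here rely on Simpson's biplane method (method of disks): the ventricle is sliced into equal-height disks whose elliptical cross-sections take one diameter from the 4-chamber and one from the 2-chamber view. A minimal geometric sketch, not the clinical software used in the study:

```python
import math

def simpson_biplane_volume(d4ch, d2ch, long_axis_len):
    """LV volume by the biplane method of disks. d4ch and d2ch are
    matched lists of disk diameters from the two orthogonal views;
    long_axis_len is the length of the LV long axis."""
    n = len(d4ch)
    h = long_axis_len / n  # equal disk height
    # each disk is an ellipse: area = pi/4 * a * b
    return sum(math.pi / 4 * a * b * h for a, b in zip(d4ch, d2ch))

def ejection_fraction(edv, esv):
    """LVEF (%) from end-diastolic and end-systolic volumes."""
    return (edv - esv) / edv * 100
```

    With all diameters equal, the formula reduces to a cylinder volume (π/4 · d² · L), which makes a convenient sanity check; the eligibility question in the abstract then comes down to whether `ejection_fraction` computed from CMR or from echo volumes crosses the 35% cut-off.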

    Validation of a Dutch Risk Score Predicting Poor Outcome in Adults with Bacterial Meningitis in Vietnam and Malawi

    We have previously developed and validated a prognostic model to predict the risk of unfavorable outcome in Dutch adults with bacterial meningitis. The aim of the current study was to validate this model in adults with bacterial meningitis from two developing countries, Vietnam and Malawi. Demographic and clinical characteristics of Vietnamese (n = 426) and Malawian (n = 465) patients differed substantially from those of Dutch patients (n = 696). The Dutch model underestimated the risk of poor outcome in both Malawi and Vietnam. The discrimination of the original model (c-statistic [c] 0.84; 95% confidence interval 0.81 to 0.86) fell considerably when re-estimated in the Vietnamese cohort (c = 0.70) or in the Malawian cohort (c = 0.68). Our validation study shows that new prognostic models have to be developed for these countries, in sufficiently large series of unselected patients.
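    The c-statistic used to judge discrimination above is the probability that a randomly chosen patient with an unfavorable outcome received a higher predicted risk than a randomly chosen patient with a favorable one. A minimal pair-counting sketch (toy scores only, not the authors' model):

```python
def c_statistic(scores, outcomes):
    """Concordance (c) statistic from predicted risks and binary
    outcomes (1 = event). Counts concordant event/non-event pairs;
    tied scores count half. Equivalent to the ROC AUC."""
    pos = [s for s, y in zip(scores, outcomes) if y == 1]
    neg = [s for s, y in zip(scores, outcomes) if y == 0]
    concordant = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return concordant / (len(pos) * len(neg))
```

    A value of 0.5 means the model discriminates no better than chance, so the drop from 0.84 to about 0.70 on external validation is a substantial loss.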

    Temporal profile of pneumonia after stroke

    The occurrence of pneumonia after stroke is associated with a higher risk of poor outcome or death. We assessed the temporal profile of pneumonia after stroke and its association with poor outcome at several time points, to identify the optimal period for testing pneumonia prevention strategies. METHODS: We analyzed individual patient data stored in the VISTA (Virtual International Stroke Trials Archive) from randomized acute stroke trials with an inclusion window up to 24 hours after stroke onset and assessed the occurrence of pneumonia in the first 90 days after stroke. Adjusted odds ratios and hazard ratios for the association between pneumonia and poor outcome or death were calculated by means of logistic and Cox proportional hazards regression, respectively, at different times of follow-up. RESULTS: Of 10,821 patients, 1017 (9.4%) had a total of 1076 pneumonias. Six hundred eighty-nine (64.0%) pneumonias occurred in the first week after stroke. The peak incidence was on the third day and the median time of onset was 4.0 days after stroke (interquartile range, 2–12). The presence of a pneumonia was associated with an increased risk of poor outcome (adjusted odds ratio, 4.8 [95% CI, 3.8–6.1]) and death (adjusted hazard ratio, 4.1 [95% CI, 3.7–4.6]). These associations were present throughout the 90 days of follow-up. CONCLUSIONS: Two out of 3 pneumonias in the first 3 months after stroke occur in the first week, with a peak incidence on the third day. The optimal period in which to assess pneumonia prevention strategies is the first 4 days after stroke. However, pneumonia occurring later was also associated with poor functional outcome or death.
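    Summary measures such as the peak incidence day, the median onset time, and the first-week share can be read directly off per-event onset days. A sketch with invented onset data, not the VISTA records:

```python
from collections import Counter
import statistics

def onset_profile(onset_days):
    """Peak incidence day, median onset day, and the fraction of
    events in the first week, from a list of per-event post-stroke
    onset days."""
    counts = Counter(onset_days)
    peak_day = max(counts, key=counts.get)          # day with most events
    first_week = sum(1 for d in onset_days if d <= 7) / len(onset_days)
    return peak_day, statistics.median(onset_days), first_week

# Hypothetical onset days for ten pneumonia events
profile = onset_profile([2, 3, 3, 3, 4, 5, 8, 14, 30, 60])
```

    On the study's real data this kind of summary yields the day-3 peak, 4.0-day median, and 64% first-week share reported above.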

    An adaptive, real-time cadence algorithm for unconstrained sensor placement

    This paper evaluates a new, adaptive real-time cadence detection algorithm (CDA) for unconstrained sensor placement during walking and running. Conventional correlation procedures, which depend on sensor position and orientation, may alternately detect either steps or strides and consequently suffer from false negatives or positives. To overcome this limitation, the CDA validates correlation peaks as strides using Sylvester's criterion (SC). This paper compares the CDA with conventional correlation methods. Twenty-two volunteers completed 7 different circuits (approx. 140 m) at three gaits: walking (1.5 m s⁻¹), running (3.4 m s⁻¹), and sprinting (5.2 and 5.7 m s⁻¹), disturbed by various gait-related activities. The algorithm was simultaneously evaluated for 10 different sensor positions. Reference strides were obtained from a foot sensor using a dedicated offline algorithm. The described algorithm yielded consistent numbers of true positives (85.6–100.0%) and false positives (0.0–2.9%) and was consistently accurate for cadence feedback across all circuits, subjects and sensors (mean ± SD: 98.9 ± 0.2%), compared with conventional cross-correlation (87.3 ± 13.5%), biased (73.0 ± 16.2%) and unbiased (82.2 ± 20.6%) autocorrelation procedures. This study shows that the SC significantly improves cadence detection, giving robust results for various gaits, subjects and sensor positions.
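    The CDA validates correlation peaks using Sylvester's criterion. The criterion itself states that a symmetric matrix is positive definite iff all of its leading principal minors are strictly positive; a standalone illustration of that test (not the authors' implementation, which applies it inside the correlation pipeline):

```python
def det(m):
    """Determinant by cofactor expansion along the first row
    (adequate for the small matrices used here)."""
    n = len(m)
    if n == 1:
        return m[0][0]
    return sum((-1) ** j * m[0][j] *
               det([row[:j] + row[j + 1:] for row in m[1:]])
               for j in range(n))

def is_positive_definite(m):
    """Sylvester's criterion: a symmetric matrix is positive definite
    iff every leading principal minor is strictly positive."""
    n = len(m)
    return all(det([row[:k] for row in m[:k]]) > 0
               for k in range(1, n + 1))
```

    For example, [[2, 1], [1, 2]] has leading minors 2 and 3 and passes, while [[1, 2], [2, 1]] fails on its second minor (−3); candidate peaks whose associated matrix fails the test can be rejected as spurious.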