33 research outputs found

    On the Interaction of Soil Fertility Management and Cultivar-Specific Traits with Phytophthora infestans Infestation in Potatoes

    The influence of Phytophthora infestans infections on the yield and quality of organically grown potatoes is highly variable; losses range from minor effects to complete crop failure. Given the choice of the "right cultivar", the impact of Phytophthora infection on yield may be of lesser importance if system-inherent nitrogen accumulation and supply provide the potatoes with sufficient nitrogen during the tuber-formation phase (mid-June to early July) to ensure a high tuber set and good tuber development. Within an EU project, two-year field trials (2002/2003) at two sites (Germany and the Netherlands) are examining the effect of the crop-rotation position of potato cultivars of different tuber-set types, with grass-clover ley as the preceding crop or as the pre-preceding crop (preceding crop: wheat), in interaction with P. infestans infestation; in addition, extensive fertilization trials have been established in Great Britain. The aim is to assess the effects of different nutrient regimes against the background of the phase-out of copper use against P. infestans.

    Validation of the Body Concealment Scale for Scleroderma (BCSS): Replication in the Scleroderma Patient-centered Intervention Network (SPIN) Cohort

    Body concealment is an important component of appearance distress for individuals with disfiguring conditions, including scleroderma. The objective was to replicate the validation study of the Body Concealment Scale for Scleroderma (BCSS) among 897 scleroderma patients. The factor structure of the BCSS was evaluated using confirmatory factor analysis, and a Multiple-Indicator Multiple-Cause model examined differential item functioning of BCSS items for sex and age. Internal consistency reliability was assessed via Cronbach's alpha. Construct validity was assessed by comparing the BCSS with a measure of body image distress and measures of mental health and pain intensity. Results replicated the original validation study, where a bifactor model provided the best fit. The BCSS demonstrated strong internal consistency reliability and construct validity. Findings further support the BCSS as a valid measure of body concealment in scleroderma and provide new evidence that scores can be compared and combined across sexes and ages.
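
    As a rough illustration of the internal-consistency check reported above, the sketch below computes Cronbach's alpha on simulated item responses; the item count and data are hypothetical and are not the SPIN cohort data.

        # Cronbach's alpha: alpha = k/(k-1) * (1 - sum(item variances) / variance(total score))
        # Hypothetical data only; not the BCSS/SPIN dataset.
        import numpy as np

        def cronbach_alpha(items: np.ndarray) -> float:
            """items: 2-D array, rows = respondents, columns = scale items."""
            k = items.shape[1]                          # number of items
            item_var = items.var(axis=0, ddof=1)        # per-item variances
            total_var = items.sum(axis=1).var(ddof=1)   # variance of the summed score
            return k / (k - 1) * (1 - item_var.sum() / total_var)

        rng = np.random.default_rng(0)
        latent = rng.normal(size=(200, 1))
        responses = latent + rng.normal(scale=0.8, size=(200, 9))  # 9 hypothetical items
        print(round(cronbach_alpha(responses), 2))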

    Outcome in patients perceived as receiving excessive care across different ethical climates: a prospective study in 68 intensive care units in Europe and the USA

    Purpose: Whether the quality of the ethical climate in the intensive care unit (ICU) improves the identification of patients receiving excessive care and affects patient outcomes is unknown. Methods: In this prospective observational study, perceptions of excessive care (PECs) by clinicians working in 68 ICUs in Europe and the USA were collected daily during a 28-day period. The quality of the ethical climate in the ICUs was assessed via a validated questionnaire. We compared the combined endpoint (death, not at home, or poor quality of life at 1 year) of patients with PECs and the time from PECs until written treatment-limitation decisions (TLDs) and death across the four climates defined via cluster analysis. Results: Of the 4747 eligible clinicians, 2992 (63%) evaluated the ethical climate in their ICU. Of the 321 and 623 patients not admitted for monitoring only in ICUs with a good (n = 12, 18%) and poor (n = 24, 35%) climate, 36 (11%) and 74 (12%), respectively, were identified with PECs by at least two clinicians. Of the 35 and 71 identified patients with an available combined endpoint, 100% (95% CI 90.0–100) and 85.9% (75.4–92.0) (P = 0.02) attained that endpoint. The risk of death (HR 1.88, 95% CI 1.20–2.92) or receiving a written TLD (HR 2.32, CI 1.11–4.85) in patients with PECs by at least two clinicians was higher in ICUs with a good climate than in those with a poor one. The differences between ICUs with an average climate, with (n = 12, 18%) or without (n = 20, 29%) nursing involvement at the end of life, and ICUs with a poor climate were less obvious but still in favour of the former. Conclusion: Enhancing the quality of the ethical climate in the ICU may improve both the identification of patients receiving excessive care and the decision-making process at the end of life.
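
    To make the patient-level definitions concrete, here is a minimal sketch of the two flags described above (PECs documented by at least two clinicians, and the combined 1-year endpoint); the record structure and field names are assumptions for illustration, not the study's actual dataset.

        # Hypothetical sketch: a patient counts as having concordant PECs if >= 2 clinicians
        # flagged them; the combined endpoint is death, not being at home, or poor quality
        # of life at 1 year. Field names are illustrative only.
        from dataclasses import dataclass

        @dataclass
        class PatientYear1:
            pec_clinician_ids: set[str]   # clinicians who documented a PEC for this patient
            died: bool
            at_home: bool
            poor_quality_of_life: bool

        def has_concordant_pec(p: PatientYear1) -> bool:
            return len(p.pec_clinician_ids) >= 2

        def combined_endpoint(p: PatientYear1) -> bool:
            return p.died or not p.at_home or p.poor_quality_of_life

        patient = PatientYear1({"clinician_a", "clinician_b"}, died=False,
                               at_home=False, poor_quality_of_life=False)
        print(has_concordant_pec(patient), combined_endpoint(patient))  # True True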

    Hyperoxemia and excess oxygen use in early acute respiratory distress syndrome : Insights from the LUNG SAFE study

    Background: Concerns exist regarding the prevalence and impact of unnecessary oxygen use in patients with acute respiratory distress syndrome (ARDS). We examined this issue in patients with ARDS enrolled in the Large observational study to UNderstand the Global impact of Severe Acute respiratory FailurE (LUNG SAFE). Methods: In this secondary analysis of the LUNG SAFE study, we wished to determine the prevalence of, and the outcomes associated with, hyperoxemia on day 1, sustained hyperoxemia, and excessive oxygen use in patients with early ARDS. Patients who fulfilled the criteria for ARDS on day 1 and day 2 of acute hypoxemic respiratory failure were categorized based on the presence of hyperoxemia (PaO2 > 100 mmHg) on day 1, sustained (i.e., present on day 1 and day 2) hyperoxemia, or excessive oxygen use (FIO2 ≥ 0.60 during hyperoxemia). Results: Of the 2005 patients that met the inclusion criteria, 131 (6.5%) were hypoxemic (PaO2 < 55 mmHg), 607 (30%) had hyperoxemia on day 1, and 250 (12%) had sustained hyperoxemia. Excess FIO2 use occurred in 400 (66%) of the 607 patients with hyperoxemia. Excess FIO2 use decreased from day 1 to day 2 of ARDS, with most hyperoxemic patients on day 2 receiving relatively low FIO2. Multivariate analyses found no independent relationship between day 1 hyperoxemia, sustained hyperoxemia, or excess FIO2 use and adverse clinical outcomes. Mortality was 42% in patients with excess FIO2 use, compared to 39% in a propensity-matched sample of normoxemic (PaO2 55–100 mmHg) patients (P = 0.47). Conclusions: Hyperoxemia and excess oxygen use are both prevalent in early ARDS but are most often non-sustained. No relationship was found between hyperoxemia or excessive oxygen use and patient outcome in this cohort. Trial registration: LUNG SAFE is registered with ClinicalTrials.gov, NCT02010073.
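
    The oxygenation categories used in this analysis follow simple thresholds; the sketch below illustrates them on hypothetical day-1/day-2 values (the function and argument names are assumptions, not part of the LUNG SAFE analysis code).

        # Hypothetical sketch of the categories defined in the abstract:
        #   hypoxemia             : PaO2 < 55 mmHg
        #   hyperoxemia (day 1)   : PaO2 > 100 mmHg
        #   sustained hyperoxemia : PaO2 > 100 mmHg on both day 1 and day 2
        #   excess FIO2 use       : FIO2 >= 0.60 while hyperoxemic
        def classify_oxygenation(pao2_d1: float, pao2_d2: float,
                                 fio2_d1: float) -> dict[str, bool]:
            hyperoxemia_d1 = pao2_d1 > 100.0
            return {
                "hypoxemia_d1": pao2_d1 < 55.0,
                "hyperoxemia_d1": hyperoxemia_d1,
                "sustained_hyperoxemia": hyperoxemia_d1 and pao2_d2 > 100.0,
                "excess_fio2_d1": hyperoxemia_d1 and fio2_d1 >= 0.60,
            }

        print(classify_oxygenation(pao2_d1=120.0, pao2_d2=105.0, fio2_d1=0.7))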

    Evaluation of How Anesthesia Affects Body Temperature in Sows Using Infrared Thermography

    The objective of this experiment was to determine the relationship between rectal temperature and infrared temperature measured at the inner eye, ear center, and ear base of sows undergoing anesthesia. A total of six sows were used. Sows were anaesthetized using a combination of xylazine, tiletamine HCl, and ketamine. Thermal images of the inner eye, ear center, and ear base were taken at 10-minute intervals, starting ten minutes post-anesthetic induction, until the sow was able to stand or reached a body temperature of 91.7 °F. Rectal temperatures were measured using a digital thermometer. Pearson correlations between rectal temperature and the inner eye, ear center, and ear base were determined, with the significance level set at P ≤ 0.05. The percent of variation accounted for by each location was calculated as the correlation coefficient (r) squared and multiplied by 100 (r² × 100). There was a positive correlation between rectal temperature and the inner eye, ear center, and ear base (P ≤ 0.03). The correlation was lowest for the ear base, which accounted for 9% of the variation in the sows' rectal temperature, and greatest for the inner eye, which accounted for 38% of that variation. In conclusion, thermal images of the inner eye provided an effective and less invasive alternative to rectal temperature measurement for sows undergoing anesthesia.
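
    The percent-of-variation figures quoted above follow directly from the Pearson correlation (r² × 100); a minimal sketch of that calculation, using made-up temperature readings rather than the study's data:

        # Pearson correlation between rectal temperature and one infrared site, and the
        # percent of variation explained (r^2 * 100). Values below are made up for
        # illustration; they are not the measurements from this experiment.
        import numpy as np
        from scipy.stats import pearsonr

        rectal    = np.array([100.8, 99.6, 98.9, 98.2, 97.5, 96.9])   # deg F, hypothetical
        inner_eye = np.array([ 97.1, 96.4, 96.0, 95.1, 94.8, 94.0])   # deg F, hypothetical

        r, p_value = pearsonr(rectal, inner_eye)
        percent_variation = r**2 * 100
        print(f"r = {r:.2f}, P = {p_value:.3f}, variation explained = {percent_variation:.0f}%")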

    Autopsy Study Defines Composition and Dynamics of the HIV-1 Reservoir after Allogeneic Hematopoietic Stem Cell Transplantation with CCR5Δ32/Δ32 Donor Cells

    Allo-HSCT with CCR5Δ32/Δ32 donor cells is the only curative HIV-1 intervention. We investigated the impact of allo-HSCT on the viral reservoir in PBMCs and post-mortem tissue in two patients. IciS-05 and IciS-11 both received a CCR5Δ32/Δ32 allo-HSCT. Before allo-HSCT, ultrasensitive HIV-1 RNA quantification, HIV-1-DNA quantification, co-receptor tropism analysis, and deep sequencing and viral characterization were performed in PBMCs and bone marrow; post-allo-HSCT, ultrasensitive RNA and HIV-1-DNA quantification were performed. Proviral quantification, deep sequencing, and viral characterization were done in post-mortem tissue samples. Both patients harbored subtype B CCR5-tropic HIV-1, as determined genotypically and functionally by virus culture. Pre-allo-HSCT, HIV-1-DNA could be detected in both patients in bone marrow, PBMCs, and T-cell subsets. Chimerism correlated with detectable HIV-1-DNA LTR copies in cells and tissues. Post-mortem analysis of IciS-05 revealed proviral DNA in all tissue biopsies, but not in PBMCs. In patient IciS-11, who was transplanted twice, no HIV-1-DNA could be detected in PBMCs at the time of death, whereas HIV-1-DNA was detectable in the lymph node. In conclusion, shortly after CCR5Δ32/Δ32 allo-HSCT, HIV-1-DNA became undetectable in PBMCs. However, HIV-1-DNA variants identical to those present before transplantation persisted in post-mortem-obtained tissues, indicating that these tissues play an important role as viral reservoirs.