
    The Potential of Hyperspectral Patterns of Winter Wheat to Detect Changes in Soil Microbial Community Composition

    Reliable information on soil status and crop health is crucial for detecting and mitigating disasters such as pollution and for minimizing the impact of soil-borne diseases. While infestation with an aggressive soil pathogen can be detected via reflected light spectra, it is unknown to what extent hyperspectral reflectance could be used to detect overall changes in soil biodiversity. We tested the hypotheses that spectra can be used to (1) separate plants growing with microbial communities from different farms; (2) separate plants growing in microbial communities that differ due to land use; and (3) separate plants according to microbial species loss. We measured hyperspectral reflectance patterns of winter wheat plants growing in sterilized soils inoculated with microbial suspensions under controlled conditions. Microbial communities varied due to geographical distance, land use and microbial species loss caused by serial dilution. After 3 months of growth, the hyperspectral reflectance patterns of plants grown with microbes from the two farms differed significantly from each other, while within farms the effects of land use, acting via the microbial community, on plant reflectance spectra were weak. Species loss via dilution, on the other hand, affected a number of spectral indices for some of the soils. Spectral reflectance can thus be indicative of differences in microbial communities, with the Renormalized Difference Vegetation Index being the most frequently responding index. A positive correlation was also found between the Normalized Difference Vegetation Index and bacterial species richness, suggesting that plants perform better with higher microbial diversity. There is considerable variation between soil origins, and it is not yet possible to make sufficiently reliable predictions about the soil microbial community from spectral reflectance alone. We conclude that measuring plant hyperspectral reflectance has potential for detecting changes in soil microbial communities, but its sensitivity demands high replication and a strict sampling design to exclude other ‘noise’ factors.
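
    The indices named in this abstract are simple band-ratio formulas. Below is a minimal sketch of how NDVI and the Renormalized Difference Vegetation Index (RDVI) could be computed from red and near-infrared reflectance and related to bacterial richness; the band centres (~670 nm and ~800 nm) and all numeric values are assumptions for illustration, not data or methods from the study.

```python
import numpy as np
from scipy.stats import pearsonr

def ndvi(nir, red):
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red)."""
    nir, red = np.asarray(nir, float), np.asarray(red, float)
    return (nir - red) / (nir + red)

def rdvi(nir, red):
    """Renormalized Difference Vegetation Index: (NIR - Red) / sqrt(NIR + Red)."""
    nir, red = np.asarray(nir, float), np.asarray(red, float)
    return (nir - red) / np.sqrt(nir + red)

# Hypothetical per-plant mean reflectance near 800 nm (NIR) and 670 nm (red)
nir_r = np.array([0.48, 0.52, 0.45, 0.50])
red_r = np.array([0.06, 0.05, 0.08, 0.07])

# Hypothetical bacterial species richness of the inoculated soils
richness = np.array([120, 150, 90, 135])

# Correlating NDVI with richness mirrors the kind of positive association reported
r, p = pearsonr(ndvi(nir_r, red_r), richness)
print(ndvi(nir_r, red_r), rdvi(nir_r, red_r), r, p)
```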

    Post-implantation clinical cost analysis between transcutaneous and percutaneous bone conduction devices

    Introduction: Bone conduction devices (BCD) are effective for hearing rehabilitation in patients with conductive and mixed hearing loss or single-sided deafness. Transcutaneous bone conduction devices (tBCD) seem to lead to fewer soft tissue complications than percutaneous BCDs (pBCD) but have other drawbacks, such as MRI incompatibility and higher costs. Previous cost analyses have shown a cost advantage of tBCDs. The purpose of this study is to compare long-term post-implantation costs between percutaneous and transcutaneous BCDs. Materials and methods: Retrospective data from 77 patients implanted in a tertiary referral centre with a pBCD (n = 34) or a tBCD (n = 43; passive (tpasBCD; n = 34) and active (tactBCD; n = 9)), together with a reference group who underwent cochlear implantation (CI; n = 34), were included in a clinical cost analysis. Post-implantation costs were determined as the sum of consultation (medical and audiological) and additional (all post-operative care) costs. Median (cumulative) costs per device incurred by the different cohorts were compared at 1, 3 and 5 years after implantation. Results: After 5 years, the total post-implantation costs of the pBCD vs tpasBCD were not significantly different (€1550.7 [IQR 1174.6–2797.4] vs €2266.9 [IQR 1314.1–3535.3], p = 0.185), nor was there a significant difference between pBCD and tactBCD (€1550.7 [1174.6–2797.4] vs €1428.8 [1277.3–1760.4], p = 0.550). Additional post-implantation costs were significantly highest in the tpasBCD cohort at all follow-up time points. Conclusion: Total costs related to post-operative rehabilitation and treatments are comparable between percutaneous and transcutaneous BCDs up to 5 years after implantation. Complications related to passive transcutaneous bone conduction devices appeared significantly more expensive after implantation due to more frequent explantations.
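
    The abstract reports medians with interquartile ranges and p-values but does not name the statistical test. For skewed cost data summarized this way, a non-parametric two-sample comparison such as the Mann-Whitney U test is a common choice; the sketch below uses made-up per-patient costs, not the study data.

```python
import numpy as np
from scipy.stats import mannwhitneyu

# Hypothetical cumulative 5-year post-implantation costs per patient (EUR)
costs_pbcd = np.array([1174.6, 1320.0, 1550.7, 2100.0, 2797.4])
costs_tpas = np.array([1314.1, 1980.0, 2266.9, 2900.0, 3535.3])

def median_iqr(costs):
    """Median and interquartile range, the summary statistic used in the abstract."""
    q1, med, q3 = np.percentile(costs, [25, 50, 75])
    return med, (q1, q3)

# Two-sided non-parametric comparison of the two cost distributions
u_stat, p_value = mannwhitneyu(costs_pbcd, costs_tpas, alternative="two-sided")
print(median_iqr(costs_pbcd), median_iqr(costs_tpas), p_value)
```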

    Living Donor Kidney Transplantation Should Be Promoted among "elderly" Patients

    Background. Age criteria for kidney transplantation have been liberalized over the years, resulting in more waitlisted elderly patients. What are the prospects of elderly patients on the waiting list? Methods. Between 2000 and 2013, 2622 patients had been waitlisted. Waiting time was defined as the period between dialysis onset and being delisted. Patients were categorized according to age upon listing: <55 years, 55-64 years, and >64 years. Furthermore, the influence of ABO blood type and panel reactive antibodies on outflow patterns was studied. Results. At the end of observation (November 2017), 1957 (75%) patients had been transplanted, 333 (13%) had been delisted without a transplantation, 271 (10%) had died, and 61 (2%) were still waiting. When comparing the age categories, outflow patterns were completely different. The percentage of patients transplanted decreased with increasing age, while the percentage of patients that had been delisted or had died increased with increasing age, especially in the population without a living donor. Within 6 years, 93% of the population <55 years had been transplanted. Of the patients >55 years, 39% received a living donor kidney, while >50% of patients without a living donor had been delisted or had died. Multivariable analysis showed that the influence of age, ABO blood type, and panel reactive antibodies on outflow patterns was significant, but the magnitude of the influence of the latter 2 was only modest compared with that of age. Conclusions. "Elderly" patients (not only >64 y but even 55-64 y) received a living donor kidney transplantation less often. Moreover, they cannot bear the waiting time for a deceased donor kidney, resulting in delisting without a transplant in more than half the population of patients without a living donor. Promoting living donor kidney transplantation is the only modification that improves transplantation and decreases delisting/death on the waiting list in this population.
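
    The central result is a cross-tabulation of waiting-list outflow (transplanted, delisted, died, still waiting) against age category. A minimal sketch of that tabulation follows; the outcome labels and age bands mirror the abstract as reconstructed above, but the individual records are invented for illustration and are not the study data.

```python
import pandas as pd

# Hypothetical waiting-list records; only the category labels follow the abstract
records = pd.DataFrame({
    "age_group": ["<55", "<55", "<55", "55-64", "55-64", ">64", ">64", ">64"],
    "outcome":   ["transplanted", "transplanted", "still waiting",
                  "transplanted", "delisted",
                  "delisted", "died", "transplanted"],
})

# Outflow pattern per age category, expressed as row percentages
outflow = pd.crosstab(records["age_group"], records["outcome"], normalize="index") * 100
print(outflow.round(1))
```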

    Inflammatory Profile of Awake Function-Controlled Craniotomy and Craniotomy under General Anesthesia

    Background. Surgical stress triggers an inflammatory response and releases mediators such as interleukins (ILs) into human plasma. Awake craniotomy and craniotomy performed under general anesthesia may be associated with different levels of stress. Our aim was to investigate whether these procedures cause different inflammatory responses. Methods. Twenty patients undergoing craniotomy under general anesthesia and 20 patients undergoing awake function-controlled craniotomy were included in this prospective, observational, two-armed study. Circulating levels of IL-6, IL-8, and IL-10 were determined pre-, peri-, and postoperatively in both patient groups. VAS scores for pain, anxiety, and stress were taken at four time points pre- and postoperatively to evaluate physical pain and mental duress. Results. Plasma IL-6 levels increased significantly with time, and did so similarly in both groups. No significant change in plasma IL-8 or IL-10 was observed in either group. The VAS pain score was significantly lower in the awake group than in the general anesthesia group at 12 hours postoperatively. Postoperative anxiety and stress declined similarly in both groups. Conclusion. This study suggests that awake function-controlled craniotomy does not cause a significantly different inflammatory response than craniotomy performed under general anesthesia. It is also likely that function-controlled craniotomy does not cause a greater emotional challenge than tumor resection under general anesthesia.
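
    The design is a repeated-measures comparison: cytokine levels measured pre-, peri-, and postoperatively in two independent groups. One common way to analyse such data is a linear mixed model with a random intercept per patient; the sketch below uses simulated IL-6 values (not the study data, and not necessarily the model the authors used) with statsmodels.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)

# Simulated long-format IL-6 data (pg/mL): 10 patients per group, three
# perioperative time points each. Values are illustrative only.
n_per_group = 10
patients = np.repeat(np.arange(2 * n_per_group), 3)
group = np.repeat(["awake"] * n_per_group + ["general"] * n_per_group, 3)
time = np.tile(["pre", "peri", "post"], 2 * n_per_group)
time_mean = {"pre": 2.0, "peri": 15.0, "post": 30.0}
il6 = (np.array([time_mean[t] for t in time])
       + rng.normal(0, 2, 2 * n_per_group).repeat(3)   # patient-level variation
       + rng.normal(0, 3, 6 * n_per_group))             # residual noise

data = pd.DataFrame({"patient": patients, "group": group, "time": time, "il6": il6})

# Fixed effects for time, group and their interaction; the random intercept per
# patient accounts for repeated measurements within each subject.
result = smf.mixedlm("il6 ~ C(time) * group", data, groups=data["patient"]).fit()
print(result.summary())
```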