112 research outputs found

    Quality of life of living kidney donors depending on the surgical procedure, open/conventional vs. minimally invasive: "Does the minimally invasive nephrectomy procedure improve the quality of life of living kidney donors?"

    Living kidney donation is a suitable way to shorten the wait for a matching donor organ and to shorten or avoid dialysis treatment. The retroperitoneal minimally invasive approach to living kidney donation offers an attractive alternative to the traditional open procedure: hospitalization and recovery are significantly shorter, the need for analgesic medication is reduced, and this approach yields a good cosmetic result for the donor. This study was conducted to investigate the influence of surgical technique on the quality of life of living donors. Its aim was to use the Short Form-36 version 2 (SF-36v2) questionnaire to detect differences in quality of life between MIDN (minimally invasive donor nephrectomy) and ODN (open donor nephrectomy) donors. We further examined whether quality of life changed over the course of one year relative to the time before nephrectomy, and whether our donors' quality of life differed from the normed results of the general US population. The present work showed that the quality of life of MIDN and ODN donors was equivalent across the study period (preoperatively and one week, one month, and one year postoperatively). Only in the "Bodily Pain" category was there a slight tendency toward better quality of life among MIDN donors. The Physical Component Summary (PCS) declined in both groups between the preoperative assessment and one week after nephrectomy. Regardless of surgical technique, however, physical well-being improved three months and one year after the procedure. The Mental Component Summary (MCS) scores changed in neither the MIDN nor the ODN donors.
Compared with the general US population, the PCS scores of MIDN and ODN donors before and one year after surgery were clearly above those of the general US population. The MCS scores before and one week after donor nephrectomy were comparable to those of the general US population; after three months and one year, they were significantly above those of the normed comparison group. In summary, the results of this study did not show better quality of life for MIDN compared with ODN living donors. Nevertheless, surveying donors with the standardized SF-36v2 questionnaire is a highly suitable, practical, and universally applicable tool for assessing patients' recovery after surgery and monitoring their progress.

    Taurine Enhances Iron-Related Proteins and Reduces Lipid Peroxidation in Differentiated C2C12 Myotubes

    Taurine is a nonproteinogenic amino sulfonic acid in mammals. Interestingly, skeletal muscle is unable to synthesize taurine endogenously, and the processing of muscular taurine changes throughout ageing and under specific pathophysiological conditions, such as muscular dystrophy. Ageing and disease are also associated with altered iron metabolism, especially when there is an excess of labile iron. The present study addresses the question of whether taurine connects cytoprotective effects and redox homeostasis in a previously unknown iron-dependent manner. Using cultured differentiated C2C12 myotubes, the impact of taurine on markers of lipid peroxidation, redox-sensitive enzymes and iron-related proteins was studied. Significant increases in the heme protein myoglobin and the iron storage protein ferritin were observed in response to taurine treatment. Taurine supplementation reduced lipid peroxidation and BODIPY oxidation by ~60% and ~25%, respectively. Furthermore, the mRNA levels of redox-sensitive heme oxygenase (Hmox1), catalase (Cat) and glutamate-cysteine ligase (Gclc), as well as the total cellular glutathione content, were lower in taurine-supplemented cells than in the control cells. We suggest that taurine may inhibit the initiation and propagation of lipid peroxidation by lowering basal levels of cellular stress, perhaps through reduction of the cellular labile iron pool.

    Parameter Identifiability of Artemisinin Synthesis using Design of Experiments

    Artemisinin-based combination therapies are recommended by the World Health Organization to treat malaria, one of the most prevalent infectious diseases in the world. Recently, a novel production route, which combines extraction with catalyzed chemical synthesis, has been shown to be a promising sustainable processing alternative [Triemer, 2018]. To exploit its mechanism, operational settings and limits, mathematical modeling may be beneficial when thorough system insight is required. In a first step, we consider the catalyzed synthesis step from dihydroartemisinic acid to artemisinin, and we show, using a singular value decomposition approach, that only a subset of the parameters of the considered model is identifiable with the available sparse data. In a second step, within the framework of design of experiments (DoE), we demonstrate the effect of additional experimental data in overcoming the non-identifiability of the model parameters.
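The SVD-based identifiability check described above can be sketched in a few lines. This is a minimal illustration on a hypothetical two-parameter toy sensitivity matrix, not the authors' artemisinin model: singular values near zero flag parameter directions that the available data cannot constrain.

```python
import numpy as np

# Hypothetical sensitivity matrix S[i, j] = d y_i / d theta_j for a
# two-parameter toy model (the actual kinetic model is not reproduced here).
S = np.array([
    [1.0, 2.0],
    [2.0, 4.0],   # row is a multiple of the first -> rank deficiency
    [1.0, 2.0],
])

U, sigma, Vt = np.linalg.svd(S, full_matrices=False)

# Singular values below a tolerance indicate non-identifiable parameter
# combinations; the corresponding rows of Vt give those directions.
tol = 1e-10 * sigma.max()
n_identifiable = int(np.sum(sigma > tol))
print(n_identifiable)                       # 1: only one combination is identifiable
unidentifiable_directions = Vt[sigma <= tol]
```

Adding experiments (as in the DoE step) amounts to appending rows to S until the small singular values lift above the tolerance.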

    How to catch more prey with less effective traps: explaining the evolution of temporarily inactive traps in carnivorous pitcher plants.

    Carnivorous Nepenthes pitcher plants capture arthropods with specialized slippery surfaces. The key trapping surface, the pitcher rim (peristome), is highly slippery when wetted by rain, nectar or condensation, but not when dry. As natural selection should favour adaptations that maximize prey intake, the evolution of temporarily inactive traps seems paradoxical. Here, we show that intermittent trap deactivation promotes 'batch captures' of ants. Prey surveys revealed that N. rafflesiana pitchers sporadically capture large numbers of ants of the same species. Continuous experimental wetting of the peristome increased the number of non-recruiting prey but decreased the number of captured ants, and shifted their trapping mode from batch to individual capture events. Ant recruitment was also lower to continuously wetted pitchers. Our experimental data fit a simple model predicting that intermittent, wetness-based trap activation should allow safe access for 'scout' ants under dry conditions, thereby promoting recruitment and ultimately higher prey numbers. The peristome trapping mechanism may therefore represent an adaptation for capturing ants. The relatively rare batch capture events may particularly benefit larger plants with many pitchers. This explains why young plants of many Nepenthes species additionally employ wetness-independent, waxy trapping surfaces.

    Lithium Content of 160 Beverages and Its Impact on Lithium Status in Drosophila melanogaster

    Lithium (Li) is an important micronutrient in human nutrition, although its exact molecular function as a potential essential trace element has not yet been fully elucidated. It has previously been shown that several mineral waters are rich and highly bioavailable sources of Li for human consumption. Nevertheless, little is known about the extent to which other beverages contribute to the dietary Li supply. To this end, the Li content of 160 different beverages, comprising wine, beer, soft and energy drinks, and tea and coffee infusions, was analysed by inductively coupled plasma mass spectrometry (ICP-MS). Furthermore, a feeding study in Drosophila melanogaster was conducted to test whether Li derived from selected beverages changes Li status in flies. In comparison to the average Li concentration in mineral waters (108 µg/L; reference value), the Li concentrations in wine (11.6 ± 1.97 µg/L), beer (8.5 ± 0.77 µg/L), soft and energy drinks (10.2 ± 2.95 µg/L), tea (2.8 ± 0.65 µg/L) and coffee (0.1 ± 0.02 µg/L) infusions were considerably lower. Only Li-rich mineral water (~1600 µg/L) significantly increased Li concentrations in male and female flies. Unlike mineral water, most wine, beer, soft and energy drink, tea and coffee samples were rather Li-poor food items and thus may contribute only moderately to the dietary Li supply. A novelty of this study is that it relates analytical Li concentrations in beverages to Li whole-body retention in Drosophila melanogaster.
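The reported mean concentrations lend themselves to a quick back-of-envelope comparison of daily Li contributions. This sketch uses an assumed, uniform consumption volume per beverage (the study reports no such volumes), purely to make the concentration gap concrete.

```python
# Mean Li concentrations (µg/L) as reported in the abstract above.
li_ug_per_l = {
    "mineral water": 108.0,   # reference value
    "wine": 11.6,
    "beer": 8.5,
    "soft/energy drinks": 10.2,
    "tea": 2.8,
    "coffee": 0.1,
}

ASSUMED_LITRES_PER_DAY = 0.5  # hypothetical serving volume, identical for all beverages

# Daily Li intake implied by each beverage at the assumed volume.
for beverage, conc in li_ug_per_l.items():
    intake_ug = conc * ASSUMED_LITRES_PER_DAY
    print(f"{beverage}: {intake_ug:.2f} µg Li/day")
```

At equal volumes, mineral water contributes roughly an order of magnitude more Li than any of the other beverage groups, which is the abstract's central point.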

    Dietary lithium intake, graft failure and mortality in kidney transplant recipients

    BACKGROUND & AIMS: Long-term high-dose lithium therapy in bipolar disorder is known to adversely affect kidney function. However, recent animal studies revealed that low amounts of lithium are beneficial for the kidney when it is damaged by exposure to nephrotoxic compounds, inflammation, or oxidative stress. This study aimed to investigate whether urinary lithium excretion, reflecting dietary lithium intake, is associated with adverse long-term kidney graft outcomes and patient survival. METHODS: Urinary lithium concentration was measured using inductively coupled plasma mass spectrometry in 642 stable kidney transplant recipients (KTR). Graft failure was defined as start of dialysis or re-transplantation, and kidney function decline was defined as doubling of serum creatinine. RESULTS: Median [interquartile range] urinary lithium excretion was 3.03 [2.31-4.01] μmol/24 h. Urinary lithium excretion was associated with energy, plant protein and water intake. During a median follow-up of 5.3 [4.5-6.0] years, 79 (12%) KTR developed graft failure and 127 (20%) KTR developed kidney function decline. Higher urinary lithium excretion was associated with lower risk of graft failure (hazard ratio [95% confidence interval]: 0.54 [0.38-0.79] per log2 μmol/24 h) and kidney function decline (HR [95% CI]: 0.73 [0.54-0.99] per log2 μmol/24 h). These associations remained independent after adjustment for potential confounders and in sensitivity analyses. There was significant effect modification by use of proliferation inhibitors (P = 0.05) and baseline eGFR (P < 0.001), with higher urinary lithium excretion being more protective in KTR not using proliferation inhibitors and in KTR with lower baseline eGFR.
Furthermore, higher urinary lithium excretion was associated with reduced risk of all-cause mortality (HR [95% CI]: 0.64 [0.49-0.83]; P = 0.001). CONCLUSION: Dietary lithium intake may be a potentially modifiable, yet rather overlooked, risk factor for adverse long-term kidney graft outcomes and patient survival.
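The hazard ratios above are expressed "per log2 μmol/24 h", i.e. per doubling of urinary lithium excretion. A small sketch of how such a coefficient scales to other fold changes (an interpretation aid, not part of the study's analysis):

```python
import math

def implied_hr(hr_per_doubling: float, fold_change: float) -> float:
    """Hazard ratio implied for a k-fold change in exposure when the
    reported HR is per doubling (per log2 unit): HR ** log2(k)."""
    return hr_per_doubling ** math.log2(fold_change)

# HR for graft failure, 0.54 per doubling (from the abstract above):
print(round(implied_hr(0.54, 2), 2))   # 0.54 -> one doubling, as reported
print(round(implied_hr(0.54, 4), 4))   # 0.2916 -> two doublings, 0.54 ** 2
```

This also explains why a halving of excretion (fold_change = 0.5) implies the reciprocal, HR ≈ 1.85, on this scale.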
