Inflammation and premature aging in advanced chronic kidney disease
Systemic inflammation in end-stage renal disease (ESRD) is an established risk factor for mortality and a catalyst for other complications related to a premature aging phenotype, including muscle wasting, vascular calcification and other forms of premature vascular disease, depression, osteoporosis and frailty. Uremic inflammation is also mechanistically linked to processes involved in aging, such as telomere shortening, mitochondrial dysfunction, and altered nutrient sensing, which can have a direct effect on cellular and tissue function. In addition to uremia-specific causes such as abnormalities in the phosphate-Klotho axis, there are remarkable similarities between the pathophysiology of uremic inflammation and so-called "inflammaging" in the general population. Potentially relevant, but still somewhat unexplored in this respect, are abnormal or misplaced protein structures, as well as abnormalities in tissue homeostasis, which evoke danger signals through damage-associated molecular patterns (DAMPs) and the senescence-associated secretory phenotype (SASP). Systemic inflammation, in combination with the loss of kidney function, can impair the body's resilience to external and internal stressors by reducing functional and structural tissue reserve and by impairing normal organ crosstalk, thus providing an explanation for the greatly increased risk of homeostatic breakdown in this population. In this review, the relation between uremic inflammation and a premature aging phenotype, as well as potential causes and consequences, is discussed.
Quantification and classification of potassium and calcium disorders with the electrocardiogram: What do clinical studies, modeling, and reconstruction tell us?
Diseases caused by alterations of ionic concentrations are frequently observed and play an important role in clinical practice. The clinically established method for diagnosing electrolyte concentration imbalance is a blood test; a rapid and non-invasive point-of-care method is still needed. The electrocardiogram (ECG) could meet this need and become an established diagnostic tool allowing home monitoring of electrolyte concentrations, including by wearable devices. In this review, we present the current state of potassium and calcium concentration monitoring using the ECG and summarize results from previous work. Selected clinical studies are presented, supporting or questioning the use of the ECG for monitoring electrolyte concentration imbalances. Differences in the findings from automatic monitoring studies are discussed, and current studies utilizing machine learning are presented, demonstrating the potential of the deep learning approach. Furthermore, we demonstrate the potential of computational modeling approaches to gain insight into the mechanisms underlying relevant clinical findings and as a tool to obtain synthetic data for methodical improvements in monitoring approaches.
Lipid levels are inversely associated with infectious and all-cause mortality: international MONDO study results.
Cardiovascular (CV) events are increased 36-fold in patients with end-stage renal disease. However, randomized controlled trials to lower LDL cholesterol (LDL-C) and serum total cholesterol (TC) have not shown significant mortality improvements. An inverse association of TC and LDL-C with all-cause and CV mortality has been observed in patients on chronic dialysis. Lipoproteins also may protect against infectious diseases. We used data from 37,250 patients in the international Monitoring Dialysis Outcomes (MONDO) database to evaluate the association between lipids and infection-related or CV mortality. The study began on the first day of lipid measurement and continued for up to 4 years. We applied Cox proportional hazards models with time-varying covariates to study associations of LDL-C, HDL cholesterol (HDL-C), and triglycerides (TGs) with all-cause, CV, infectious, and other causes of death. Overall, 6,147 patients died (19.2% from CV causes, 13.2% from infection, and 67.6% from other causes). After multivariable adjustment, higher LDL-C, HDL-C, and TGs were independently associated with lower all-cause death risk. Neither LDL-C nor TGs were associated with CV death, whereas HDL-C was associated with lower CV risk. Higher LDL-C and HDL-C were associated with a lower risk of death from infection or other non-CV causes. Thus, LDL-C was associated with reduced all-cause and infection-related, but not CV, mortality, which accounts for its inverse association with all-cause mortality.
Seasonal variations in mortality and clinical indicators in international hemodialysis populations from the MONDO registry
BACKGROUND: Seasonal mortality differences have been reported in US hemodialysis (HD) patients. Here we examine the effect of seasons on mortality and on clinical and laboratory parameters on a global scale. METHODS: Databases from the international Monitoring Dialysis Outcomes (MONDO) consortium were queried to identify patients who received in-center HD for at least 1 year. Clinics were stratified by hemisphere and climate zone (tropical or temperate). We recorded mortality and computed averages of pre-dialysis systolic blood pressure (pre-SBP), interdialytic weight gain (IDWG), serum albumin, and log C-reactive protein (CRP). We explored seasonal effects using cosinor analysis and adjusted linear mixed models globally, and after stratification. RESULTS: Data from 87,399 patients were included (northern temperate: 63,671; northern tropical: 7,159; southern temperate: 13,917; southern tropical: 2,652 patients). Globally, mortality was highest in winter. Following stratification, mortality was significantly lower in spring and summer compared to winter in temperate, but not in tropical, zones. Globally, pre-SBP and IDWG were lower in summer and spring as compared to winter, although these differences were less pronounced in tropical zones. Except in the southern temperate zone, serum albumin levels were higher in winter. CRP levels were highest in winter. CONCLUSION: Significant global seasonal variations in mortality, pre-SBP, IDWG, albumin and CRP were observed. Seasonal variations in mortality were most pronounced in temperate climate zones.
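As a rough illustration of the cosinor technique mentioned above (a sketch with synthetic data, not the study's actual analysis code), a single-component 12-month cosinor model y(t) = mesor + A·cos(ωt) + B·sin(ωt) can be fitted by ordinary least squares:

```python
import numpy as np

def fit_cosinor(t_months, y, period=12.0):
    """Fit y(t) = mesor + a*cos(w*t) + b*sin(w*t) by least squares.

    Returns (mesor, amplitude, acrophase_rad) for a single-component
    cosinor, the standard way to model a 12-month seasonal rhythm.
    """
    omega = 2.0 * np.pi / period
    # Design matrix: intercept (mesor), cosine term, sine term
    X = np.column_stack([np.ones_like(t_months),
                         np.cos(omega * t_months),
                         np.sin(omega * t_months)])
    mesor, a, b = np.linalg.lstsq(X, y, rcond=None)[0]
    amplitude = np.hypot(a, b)          # A = sqrt(a^2 + b^2)
    acrophase = np.arctan2(b, a)        # timing of the seasonal peak
    return mesor, amplitude, acrophase

# Synthetic example: an index peaking in winter (month 0), 4 years of data
t = np.arange(48.0)
rng = np.random.default_rng(0)
y = 10.0 + 2.0 * np.cos(2.0 * np.pi * t / 12.0) + rng.normal(0.0, 0.1, t.size)
mesor, amp, phase = fit_cosinor(t, y)
```

Because the model is linear in its cosine and sine coefficients, no iterative fitting is needed; this is why cosinor analysis is well suited to large registry datasets.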
Diagnostic performance of salivary urea nitrogen dipstick to detect and monitor acute kidney disease in patients with malaria
BACKGROUND: Acute kidney injury (AKI) is a common complication of malaria. In low-resource settings, a lack of diagnostic tools and delayed treatment of malaria-associated AKI lead to significant morbidity and mortality. The aim of this study was to assess the diagnostic performance of the salivary urea nitrogen (SUN) dipstick to detect and monitor kidney disease [KD = AKI or acute kidney disease (AKD) without AKI] in malaria patients in Angola. METHODS: Patients 11–50 years old admitted with malaria at the Josina Machel (Maria-Pia) Hospital, Luanda, Angola, between 2nd March and 10th May 2016 were enrolled in this study. All participants had serum creatinine (sCr), blood urea nitrogen (BUN) and SUN dipstick tested at the time of recruitment and daily for up to 4 days. AKD without AKI refers to acute renal impairment that does not fulfil the main criteria for AKI (an increase from baseline serum creatinine and/or a decrease in urine output) as defined by the Kidney Disease: Improving Global Outcomes (KDIGO) guideline. RESULTS: Eighty-six patients were admitted with a malaria diagnosis (mean age 21.5 ± 9.4 years, 71% male) and 27 (32%) were diagnosed with KD. The mean (± SD) sCr and BUN of the KD group at admission (day 0) were 5.38 (± 5.42) and 99.4 (± 61.9) mg/dL, respectively. Three (3.5%) patients underwent haemodialysis and eight (9.3%) died within the first 4 days of hospital admission [5 (62.5%) with KD; 3 (37.5%) without kidney disease; p = 0.047]. The SUN threshold for KD diagnosis was test pad #5 (SUN > 54 mg/dL). At this threshold, the SUN dipstick had a sensitivity of 67% and specificity of 98% to diagnose KD. The area under the receiver operating characteristic (ROC) curve for KD diagnosis on admission was 0.88 (95% CI 0.79–0.96). The SUN dipstick was most accurate at higher levels of BUN.
CONCLUSION: The SUN dipstick had reasonable sensitivity and excellent specificity when used to diagnose KD in a cohort of patients with malaria in a resource-limited setting. Given the severity of presenting illness and kidney injury, the SUN dipstick diagnostic threshold was high (test pad #5). SUN may be used to detect AKI in patients with malaria in low-resource settings, thus facilitating earlier access to adequate treatment, which may improve survival.
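The reported sensitivity and specificity follow directly from a 2x2 confusion matrix. As a quick sketch, the counts below are back-calculated approximately from the reported proportions (27 KD and 59 non-KD patients); they are illustrative only, not the study's raw data:

```python
def sensitivity(tp, fn):
    # True positive rate: diseased patients correctly flagged by the test
    return tp / (tp + fn)

def specificity(tn, fp):
    # True negative rate: disease-free patients correctly cleared by the test
    return tn / (tn + fp)

# Illustrative counts reconstructed from the reported 67% / 98% figures,
# not taken from the paper's data tables.
sens = sensitivity(tp=18, fn=9)   # 18 of 27 KD patients above pad #5
spec = specificity(tn=58, fp=1)   # 58 of 59 non-KD patients below pad #5
```

Raising the dipstick threshold (here, pad #5) trades sensitivity for specificity, which is why a single cut-off must be chosen to match the clinical purpose: ruling in disease in a triage setting favours high specificity.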
Diagnostic Performance of a Saliva Urea Nitrogen Dipstick to Detect Kidney Disease in Malawi
Introduction: Kidney disease (KD), including acute kidney injury, is common, severe and leads to significant mortality in the developing world. However, simple tools to facilitate diagnosis and guide treatment are lacking. We studied the diagnostic performance of saliva urea nitrogen (SUN) measured by dipstick to diagnose KD in a low-resource setting.
Methods: Medical admissions to a tertiary hospital in Malawi had serum creatinine tested at presentation; SUN was measured using a dipstick. Patients with serum creatinine above the normal range underwent serial measurements of SUN and blood urea nitrogen for up to 7 days. Hospital outcome was recorded in all patients. Results: A total of 742 patients were included (age 41 ± 17.3 years, 56.1% male); 146 (19.7%) had KD, including 114 (15.4%) with acute kidney injury. SUN >14 mg/dl had a sensitivity of 0.72 and a specificity of 0.87 to diagnose KD; specificity increased to 0.97 when SUN levels were combined with self-reported urine output. The diagnostic performance of SUN was comparable with that of blood urea nitrogen (SUN area under curve, 0.82; 95% confidence interval, 0.78–0.87; blood urea nitrogen area under curve, 0.82; 95% confidence interval, 0.59–1.0). SUN >14 mg/dl on admission was an independent predictor of all-cause mortality (hazard ratio = 2.43 [95% confidence interval, 1.63–3.62]). Discussion: SUN measured by dipstick can be used to identify patients with KD in a low-resource setting. SUN is an independent predictor of mortality in this population.
A Distinct Urinary Biomarker Pattern Characteristic of Female Fabry Patients That Mirrors Response to Enzyme Replacement Therapy
Female patients affected by Fabry disease, an X-linked lysosomal storage disorder, exhibit a wide spectrum of symptoms, which renders diagnosis and treatment decisions challenging. No diagnostic test, other than sequencing of the alpha-galactosidase A gene, is available, and no biomarker has been proven useful to screen for the disease, predict disease course and monitor response to enzyme replacement therapy. Here, we used urine proteomic analysis based on capillary electrophoresis coupled to mass spectrometry and identified a biomarker profile in adult female Fabry patients. Urine samples were taken from 35 treatment-naive female Fabry patients and were compared to 89 age-matched healthy controls. We found a diagnostic biomarker pattern that exhibited 88.2% sensitivity and 97.8% specificity when tested in an independent validation cohort consisting of 17 treatment-naive Fabry patients and 45 controls. The model remained highly specific when applied to additional control patients with a variety of other renal, metabolic and cardiovascular diseases. Several of the 64 identified diagnostic biomarkers showed correlations with measures of disease severity. Notably, most biomarkers responded to enzyme replacement therapy, and 8 of 11 treated patients scored negative for Fabry disease in the diagnostic model. In conclusion, we defined a urinary biomarker model that seems to be of diagnostic use for Fabry disease in female patients and may be used to monitor response to enzyme replacement therapy.
Frequency of Fabry disease in male and female haemodialysis patients in Spain
Background: Fabry disease (FD), an X-linked lysosomal storage disorder, is caused by a reduced activity of the lysosomal enzyme α-galactosidase A. The disorder ultimately leads to organ damage (including renal failure) in males and females; however, heterozygous females usually present a milder phenotype with a later onset and a slower progression. Methods: A combined enzymatic and genetic strategy was used, measuring the activity of α-galactosidase A and genotyping the α-galactosidase A gene (GLA) in dried blood samples (DBS) of 911 patients undergoing haemodialysis in centers across Spain. Results: GLA alterations were found in seven unrelated patients (4 males and 3 females). Two novel mutations (p.Gly346AlafsX347 and p.Val199GlyfsX203) were identified, as well as a previously described mutation, R118C. The R118C mutation was present in 60% of unrelated patients with GLA causal mutations. The D313Y alteration, considered by some authors as a pseudo-deficiency allele, was also found in two of the seven patients. Conclusions: Excluding the controversial D313Y alteration, FD presents a frequency of one in 182 individuals (0.55%) within this population of males and females undergoing haemodialysis. Moreover, our findings suggest that a number of patients with unexplained and atypical symptoms of renal disease may have FD. Screening programmes for FD in populations of individuals presenting severe kidney dysfunction, cardiac alterations or cerebrovascular disease may lead to the diagnosis of FD in those patients, the study of their families and, eventually, the implementation of a specific therapy.
Endotoxaemia in Haemodialysis: A Novel Factor in Erythropoietin Resistance?
Background/Objectives
Translocated endotoxin derived from intestinal bacteria is a driver of systemic inflammation and oxidative stress. Severe endotoxaemia is an underappreciated, but characteristic finding in haemodialysis (HD) patients, and appears to be driven by acute repetitive dialysis induced circulatory stress. Resistance to erythropoietin (EPO) has been identified as a predictor of mortality risk, and associated with inflammation and malnutrition. This study aims to explore the potential link between previously unrecognised endotoxaemia and EPO Resistance Index (ERI) in HD patients.
Methodology/Principal Findings
Fifty established HD patients were studied at a routine dialysis session. Data collection included weight, BMI, ultrafiltration volume, weekly EPO dose, and blood sampling pre- and post-HD. ERI was calculated as the ratio of total weekly EPO dose normalised to body weight (U/kg) to haemoglobin level (g/dL). Mean haemoglobin (Hb) was 11.3±1.3 g/dL, with a median EPO dose of 10,000 [IQR 7,500–20,000] U/wk and ERI of 13.7 [IQR 6.9–23.3] (U/kg)/(g/dL). Mean pre-HD serum endotoxin (ET) levels were significantly elevated at 0.69±0.30 EU/ml. The natural logarithm (Ln) of ERI correlated with pre-dialysis ET levels (r = 0.324, p = 0.03), with a trend towards association with hsCRP (r = 0.280, p = 0.07). Ln ERI also correlated with ultrafiltration volume (r = 0.295, p = 0.046), a driver of circulatory stress previously identified as being associated with increased intradialytic endotoxin translocation. Both serum ET and ultrafiltration volume corrected for body weight were independently associated with Ln ERI in multivariable analysis.
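The ERI definition used above is a simple ratio. A minimal sketch of the calculation (function and variable names are ours; the 75 kg body weight is an assumed illustrative figure, not a value reported in the study):

```python
def epo_resistance_index(weekly_epo_units, weight_kg, hb_g_per_dl):
    """EPO Resistance Index: weekly EPO dose normalised to body weight
    (U/kg), divided by haemoglobin (g/dL), giving (U/kg)/(g/dL)."""
    return (weekly_epo_units / weight_kg) / hb_g_per_dl

# Example with values of the same order as the cohort medians reported
# above; the body weight is assumed for illustration.
eri = epo_resistance_index(weekly_epo_units=10_000, weight_kg=75,
                           hb_g_per_dl=11.3)
```

The weight normalisation matters: the same weekly dose represents greater EPO resistance in a smaller patient achieving the same haemoglobin.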
Conclusions
This study suggests that endotoxaemia is a significant factor in setting levels of EPO requirement. It raises the possibility that elevated EPO doses may in part merely identify patients subjected to significant circulatory stress and suffering the myriad negative biological consequences arising from sustained systemic exposure to endotoxin.