Potassium administration increases and potassium deprivation reduces urinary calcium excretion in healthy adults
This study was undertaken to evaluate the effects of dietary K intake, independent of whether the accompanying anion is Cl- or HCO3-, on urinary Ca excretion in healthy adults. The effects of KCl, KHCO3, NaCl and NaHCO3 supplements, 90 mmol/day for four days, were compared in ten subjects fed normal constant diets. Using synthetic diets, the effects of five days of dietary KCl deprivation followed by recovery were assessed in four subjects, and of KHCO3 deprivation followed by recovery in four others. On the fourth day of salt administration, daily urinary Ca excretion and fasting UCaV/GFR were lower during administration of KCl than during NaCl supplements (Δ = -1.11 ± 0.28 SEM mmol/day; P < 0.005 and -0.0077 ± 0.0022 mmol/liter GFR; P < 0.01), and lower during KHCO3 than during control (-1.26 ± 0.29 mmol/day; P < 0.005 and -0.0069 ± 0.0019 mmol/liter GFR; P = 0.005). Both dietary KCl and KHCO3 deprivation (mean reduction in dietary K intake -67 ± 8 mmol/day) were accompanied by an increase in daily urinary Ca excretion and fasting UCaV/GFR that averaged, on the fifth day, +1.31 ± 0.25 mmol/day (P < 0.005) and +0.0069 ± 0.0012 mmol/liter GFR (P < 0.005) above control. Both daily urinary Ca excretion and fasting UCaV/GFR returned toward or to control values at the end of recovery. These observations indicate that: 1) KHCO3 decreases fasting and 24-hour urinary Ca excretion; 2) neither KCl nor NaHCO3, unlike NaCl, increases fasting or 24-hour Ca excretion; and 3) K deprivation increases both fasting and 24-hour urinary Ca excretion whether the accompanying anion is Cl- or HCO3-. This effect of K may be mediated by: 1) alterations in ECF volume, since transient increases in urinary Na and Cl excretion and weight loss accompanied KCl or KHCO3 administration, while persistent reductions in urinary Na and Cl excretion and a trend toward weight gain accompanied K deprivation; and 2) K-mediated alterations in renal tubular phosphate transport and renal synthesis of 1,25-(OH)2-vitamin D, since KCl or KHCO3 administration tended to be accompanied by a rise in fasting serum PO4 and TmPO4, a fall in fasting UPO4V/GFR, a fall in serum 1,25-(OH)2-D and a decrease in fasting UCaV/GFR, while dietary KCl or KHCO3 deprivation was accompanied by the reverse sequence.
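A note on notation: fasting UCaV/GFR expresses calcium excretion per liter of glomerular filtrate (hence the units mmol/liter GFR above). Assuming, as is conventional for this index, that GFR is approximated by creatinine clearance, it reduces to a ratio of spot-urine and serum measurements:

$$\frac{U_{Ca}V}{GFR} \approx \frac{U_{Ca}\,V}{(U_{Cr}\,V)/S_{Cr}} = \frac{U_{Ca}\,S_{Cr}}{U_{Cr}}$$

where $U_{Ca}$ and $U_{Cr}$ are urine calcium and creatinine concentrations, $V$ is the urine flow rate, and $S_{Cr}$ is serum creatinine.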
A prospective evaluation of the predictive value of faecal calprotectin in quiescent Crohn’s disease
Background: The faecal calprotectin (FC) test is a non-invasive marker for gastrointestinal inflammation.
Aim: To determine whether higher FC levels in individuals with quiescent Crohn’s disease are associated with clinical relapse over the ensuing 12 months.
Methods: A single-centre prospective study was undertaken in Crohn’s disease patients in clinical remission attending for routine review. The receiver operating characteristic (ROC) curve for the primary endpoint of clinical relapse by 12 months, based on FC at baseline, was calculated. Kaplan-Meier curves of time to relapse were based on the resulting optimal FC cutoff for predicting relapse.
Results: Of 97 patients recruited, 92 were either followed up for 12 months without relapsing or reached the primary endpoint within that period. Of these, 10 (11%) had relapsed by 12 months. The median FC was lower for non-relapsers, 96 µg/g (IQR 39-237), than for relapsers, 414 µg/g (IQR 259-590) (p=0.005). The area under the ROC curve for predicting relapse using FC was 77.4%. An optimal FC cutoff of 240 µg/g to predict relapse of quiescent Crohn’s disease had a sensitivity of 80.0% and a specificity of 74.4%. The negative predictive value was 96.8% and the positive predictive value was 27.6%. FC ≥ 240 µg/g was associated with a likelihood of relapse 5.7 times higher (95% CI 1.9-17.3) within 2.3 years than lower values (p=0.002).
Conclusions: In this prospective dataset, FC appears to be a useful, non-invasive tool to help identify quiescent Crohn’s disease patients at low risk of relapse over the ensuing 12 months. FC of 240 µg/g was the optimal cutoff in this cohort.
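As a consistency check, the reported predictive values follow directly from the quoted sensitivity, specificity, and relapse counts (10 relapsers, 82 non-relapsers):

$$\mathrm{PPV} = \frac{0.800 \times 10}{0.800 \times 10 + (1-0.744) \times 82} = \frac{8}{8+21} \approx 27.6\%, \qquad \mathrm{NPV} = \frac{0.744 \times 82}{0.744 \times 82 + (1-0.800) \times 10} = \frac{61}{61+2} \approx 96.8\%$$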
Low urine pH and acid excretion do not predict bone fractures or the loss of bone mineral density: a prospective cohort study
Background: The acid-ash hypothesis, the alkaline diet, and related products are marketed to the general public. Websites, lay literature, and direct mail marketing encourage people to measure their urine pH to assess their health status and their risk of osteoporosis. The objectives of this study were to determine whether 1) low urine pH, or 2) acid excretion in fasting morning urine ([sulfate + chloride + 1.8 × phosphate + organic acids] minus [sodium + potassium + 2 × calcium + 2 × magnesium], in mEq) predict: a) fragility fractures; and b) five-year change of bone mineral density (BMD) in adults.
Methods: Design: cohort study (the prospective, population-based Canadian Multicentre Osteoporosis Study). Multiple logistic regression was used to examine associations between acid excretion (urine pH and urine acid excretion) in fasting morning urine and the incidence of fractures (6804 person-years). Multiple linear regression was used to examine associations between acid excretion and changes in BMD over five years at three sites: lumbar spine, femoral neck, and total hip (n = 651). Potential confounders controlled for included age, gender, family history of osteoporosis, physical activity, smoking, calcium intake, vitamin D status, estrogen status, medications, renal function, urine creatinine, body mass index, and change of body mass index.
Results: There were no associations between either urine pH or acid excretion and either the incidence of fractures or the change of BMD after adjustment for confounders.
Conclusion: Urine pH and urine acid excretion do not predict osteoporosis risk.
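Written out, the acid-excretion measure used in this study is a direct transcription of the bracketed expression above (all terms in mEq; the 1.8 coefficient is the customary average valence assigned to urinary phosphate):

$$\mathrm{AE} = \left(\mathrm{SO_4} + \mathrm{Cl} + 1.8\,\mathrm{PO_4} + \text{organic acids}\right) - \left(\mathrm{Na} + \mathrm{K} + 2\,\mathrm{Ca} + 2\,\mathrm{Mg}\right)$$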
History, epidemiology and regional diversities of urolithiasis
Archaeological findings provide ample evidence that humans have suffered from kidney and bladder stones for centuries. Bladder stones were more prevalent in earlier eras, but kidney stones have become more prevalent during the past 100 years, at least in the more developed countries. Treatment options, both conservative measures and 'surgical' interventions, have likewise been known for a long time, and our current preventive measures are certainly comparable to those of our predecessors. Stone removal, first lithotomy for bladder stones and later transurethral methods, was undoubtedly painful and had severe side effects. Then, as now, the incidence of urolithiasis in a given population depended on geographic area, racial distribution, socio-economic status and dietary habits. Changes in the latter factors during the past decades have affected the incidence as well as the site and chemical composition of calculi, with calcium oxalate stones now being the most prevalent. Major differences in the frequency of other constituents, particularly uric acid and struvite, reflect eating habits and infection risk factors specific to certain populations. Extensive epidemiological observations have emphasized the importance of nutritional factors in the pathogenesis of urolithiasis, and specific dietary advice is nowadays often the most appropriate measure for the prevention and treatment of urolithiasis.
Phosphate decreases urine calcium and increases calcium balance: A meta-analysis of the osteoporosis acid-ash diet hypothesis
Background: The acid-ash hypothesis posits that increased excretion of "acidic" ions derived from the diet, such as phosphate, contributes to net acidic ion excretion, urine calcium excretion, demineralization of bone, and osteoporosis. The public is advised by various media to follow an alkaline diet to lower their acidic ion intakes. The objectives of this meta-analysis were to quantify the contribution of phosphate to bone loss in healthy adult subjects; specifically, a) to assess the effect of supplemental dietary phosphate on urine calcium, calcium balance, and markers of bone metabolism, and to assess whether these effects are altered by b) the level of calcium intake and c) the degree of protonation of the phosphate.
Methods: Literature was identified through computerized searches regarding phosphate with surrogate and/or direct markers of bone health, and was assessed for methodological quality. Multiple linear regression analyses, weighted for sample size, were used to combine the study results. Tests of interaction included stratification by calcium intake and by degree of protonation of the phosphate supplement.
Results: Twelve studies including 30 intervention arms manipulated the phosphate intakes of 269 subjects. Three studies reported net acid excretion. All of the meta-analyses demonstrated significant decreases in urine calcium excretion in response to phosphate supplements, whether the calcium intake was high or low and regardless of the degree of protonation of the phosphate supplement. None of the meta-analyses revealed lower calcium balance in response to increased phosphate intakes, regardless of the level of calcium intake or the composition of the phosphate supplement.
Conclusion: All of the findings from this meta-analysis were contrary to the acid-ash hypothesis. Higher phosphate intakes were associated with decreased urine calcium and increased calcium retention. This meta-analysis did not find evidence that phosphate intake contributes to demineralization of bone or to bone calcium excretion in the urine. Dietary advice that dairy products, meats, and grains are detrimental to bone health due to "acidic" phosphate content needs reassessment. There is no evidence that higher phosphate intakes are detrimental to bone health.
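As a rough illustration of the pooling approach described above (regression weighted for sample size), here is a minimal sketch; the per-arm numbers, variable names, and model form are hypothetical, not data from the meta-analysis:

```python
import numpy as np
import statsmodels.api as sm

# Hypothetical per-study-arm summaries (illustrative only): sample size,
# supplemental phosphate dose (mmol/day), and observed change in urine
# calcium excretion (mmol/day).
n = np.array([12.0, 8.0, 20.0, 15.0, 10.0])
phosphate_dose = np.array([20.0, 45.0, 30.0, 60.0, 50.0])
urine_ca_change = np.array([-0.4, -1.1, -0.6, -1.5, -1.2])

# Weighted least squares with each arm weighted by its sample size,
# mirroring the "weighted for sample size" regression in the abstract.
X = sm.add_constant(phosphate_dose)
fit = sm.WLS(urine_ca_change, X, weights=n).fit()
print(fit.params)   # intercept and dose slope
print(fit.pvalues)
```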
Comparative performances of machine learning methods for classifying Crohn Disease patients using genome-wide genotyping data
Abstract: Crohn Disease (CD) is a complex genetic disorder for which more than 140 genes have been identified using genome-wide association studies (GWAS). However, the genetic architecture of the trait remains largely unknown. The recent development of machine learning (ML) approaches prompted us to apply them to classify healthy and diseased people according to their genomic information. The Immunochip dataset, containing 18,227 CD patients and 34,050 healthy controls enrolled and genotyped by the international Inflammatory Bowel Disease genetics consortium (IIBDGC), was re-analyzed using a set of ML methods: penalized logistic regression (LR), gradient boosted trees (GBT) and artificial neural networks (NN). The main score used to compare the methods was the Area Under the ROC Curve (AUC) statistic. The impact of quality control (QC), imputation and coding methods on LR results showed that QC methods and imputation of missing genotypes may artificially increase the scores. In contrast, neither the patient/control ratio nor marker preselection nor coding strategy significantly affected the results. LR methods, including Lasso, Ridge and ElasticNet, provided similar results with a maximum AUC of 0.80. GBT methods such as XGBoost, LightGBM and CatBoost, together with dense NN with one or more hidden layers, provided similar AUC values, suggesting limited epistatic effects in the genetic architecture of the trait. ML methods detected nearly all the genetic variants previously identified by GWAS among the best predictors, plus additional predictors with lower effects. The robustness and complementarity of the different methods were also studied. Compared to LR, non-linear models such as GBT or NN may provide robust complementary approaches to identify and classify genetic markers.
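As a rough sketch of the kind of comparison reported above, the snippet below trains a penalized logistic regression and a gradient-boosted-tree classifier and compares them by AUC. The synthetic stand-in genotype data, effect sizes, and hyperparameters are illustrative assumptions, not the Immunochip pipeline:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import HistGradientBoostingClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

# Synthetic genotype matrix: 0/1/2 minor-allele counts for 2000 subjects
# at 500 markers, with small additive effects on disease liability.
rng = np.random.default_rng(0)
X = rng.integers(0, 3, size=(2000, 500)).astype(float)
w = rng.normal(0.0, 0.05, size=500)
p = 1.0 / (1.0 + np.exp(-(X - 1.0) @ w))
y = rng.binomial(1, p)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# Penalized LR (elastic net), representing the abstract's LR family.
lr = LogisticRegression(penalty="elasticnet", solver="saga",
                        l1_ratio=0.5, C=1.0, max_iter=2000).fit(X_tr, y_tr)

# Gradient boosted trees, a stand-in for XGBoost/LightGBM/CatBoost.
gbt = HistGradientBoostingClassifier(random_state=0).fit(X_tr, y_tr)

for name, clf in [("elastic-net LR", lr), ("GBT", gbt)]:
    auc = roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1])
    print(f"{name}: AUC = {auc:.3f}")
```

With purely additive simulated effects like these, the linear and tree-based models tend to score similarly, which is consistent with the abstract's interpretation that comparable AUCs suggest limited epistasis.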