
    Association between urinary sodium, creatinine, albumin, and long term survival in chronic kidney disease

    Dietary sodium intake is associated with hypertension and cardiovascular risk in the general population. In patients with chronic kidney disease, sodium intake has been associated with progressive renal disease, but not independently of proteinuria. We studied the relationship between urinary sodium excretion, urinary sodium:creatinine ratio, and mortality or requirement for renal replacement therapy in chronic kidney disease. Adults attending a renal clinic who had at least one 24-hour urinary sodium measurement were identified. 24-hour urinary sodium measures were collected and the urinary sodium:creatinine ratio calculated. Time to renal replacement therapy or death was recorded. 423 patients were identified, with a mean estimated glomerular filtration rate of 48 mL/min/1.73 m². 90 patients required renal replacement therapy and 102 patients died. Mean slope of decline in estimated glomerular filtration rate was −2.8 mL/min/1.73 m² per year. Median follow-up was 8.5 years. Patients who died or required renal replacement therapy had significantly higher urinary sodium excretion and urinary sodium:creatinine ratio, but the association between these parameters and poor outcome was not independent of renal function, age, and albuminuria. When stratified by albuminuria, urinary sodium:creatinine ratio conferred a significant cumulative additional risk of mortality, even in patients with low-level albuminuria. There was no association between low urinary sodium and risk, as observed in some studies. This study demonstrates an association between urinary sodium excretion and mortality in chronic kidney disease, with a cumulative relationship between sodium excretion, albuminuria, and reduced survival. These data support reducing dietary sodium intake in chronic kidney disease, but further study is required to determine the target sodium intake.

    Effect of mineralocorticoid receptor antagonists on proteinuria and progression of chronic kidney disease: a systematic review and meta-analysis

    Background: Hypertension and proteinuria are critically involved in the progression of chronic kidney disease. Despite treatment with renin-angiotensin system inhibition, kidney function declines in many patients. Aldosterone excess is a risk factor for progression of kidney disease, but hyperkalaemia is a concern with the use of mineralocorticoid receptor antagonists. We aimed to determine whether the renal protective benefits of mineralocorticoid receptor antagonists outweigh the risk of hyperkalaemia associated with this treatment in patients with chronic kidney disease. Methods: We conducted a meta-analysis investigating renoprotective effects and risk of hyperkalaemia in trials of mineralocorticoid receptor antagonists in chronic kidney disease. Trials were identified from MEDLINE (1966–2014), EMBASE (1947–2014) and the Cochrane Clinical Trials Database. Unpublished summary data were obtained from investigators. We included randomised controlled trials, and the first period of randomised crossover trials, lasting ≥4 weeks in adults. Results: Nineteen trials (21 study groups, 1646 patients) were included. In random-effects meta-analysis, addition of mineralocorticoid receptor antagonists to renin-angiotensin system inhibition resulted in a reduction from baseline in systolic blood pressure (−5.7 [−9.0, −2.3] mmHg), diastolic blood pressure (−1.7 [−3.4, −0.1] mmHg) and glomerular filtration rate (−3.2 [−5.4, −1.0] mL/min/1.73 m2). Mineralocorticoid receptor antagonism reduced weighted mean protein/albumin excretion by 38.7%, but with a threefold higher relative risk of withdrawal from the trial due to hyperkalaemia (3.21 [1.19, 8.71]). Death, cardiovascular events and hard renal end points were not reported in sufficient numbers to analyse. Conclusions: Mineralocorticoid receptor antagonism reduces blood pressure and urinary protein/albumin excretion, with a quantifiable risk of hyperkalaemia above the predefined study upper limit.
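    The pooled blood-pressure changes reported above come from a random-effects model. As an illustrative sketch only (not the review's analysis code), a minimal DerSimonian-Laird pooling of per-trial effect estimates and their within-trial variances could look like this:

```python
def dersimonian_laird(effects, variances):
    """Random-effects pooling of trial-level effects (DerSimonian-Laird).

    effects: per-trial effect estimates (e.g. change in systolic BP, mmHg)
    variances: within-trial variances of those estimates
    Illustrative sketch only; not the analysis code from the review.
    """
    w = [1.0 / v for v in variances]                      # fixed-effect weights
    fixed = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, effects))  # Cochran's Q
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(effects) - 1)) / c)         # between-trial variance
    w_star = [1.0 / (v + tau2) for v in variances]        # random-effects weights
    pooled = sum(wi * e for wi, e in zip(w_star, effects)) / sum(w_star)
    se = (1.0 / sum(w_star)) ** 0.5
    return pooled, se, tau2
```

    When the trials are homogeneous, the between-trial variance estimate collapses to zero and the result matches a fixed-effect (inverse-variance) pool.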

    Quantifying single nucleotide variant detection sensitivity in exome sequencing

    BACKGROUND: The targeted capture and sequencing of genomic regions has rapidly demonstrated its utility in genetic studies. Inherent in this technology is considerable heterogeneity of target coverage, which is expected to systematically impact our sensitivity to detect genuine polymorphisms. To fully interpret the polymorphisms identified in a genetic study, it is often essential both to detect polymorphisms and to understand where, and with what probability, real polymorphisms may have been missed. RESULTS: Using down-sampling of 30 deeply sequenced exomes and a set of gold-standard single nucleotide variant (SNV) genotype calls for each sample, we developed an empirical model relating the read depth at a polymorphic site to the probability of calling the correct genotype at that site. We find that measured sensitivity in SNV detection is substantially worse than that predicted from the naive expectation of sampling from a binomial distribution. This calibrated model allows us to produce single-nucleotide-resolution SNV sensitivity estimates, which can be merged to give summary sensitivity measures for any arbitrary partition of the target sequences (nucleotide, exon, gene, pathway, exome). These metrics are directly comparable between platforms and can be combined between samples to give “power estimates” for an entire study. We estimate a local read depth of 13X is required to detect the alleles and genotype of a heterozygous SNV 95% of the time, but only 3X for a homozygous SNV. At a mean on-target read depth of 20X, commonly used for rare disease exome sequencing studies, we predict 5–15% of heterozygous and 1–4% of homozygous SNVs in the targeted regions will be missed. CONCLUSIONS: Non-reference alleles in the heterozygous state have a high chance of being missed when commonly applied read coverage thresholds are used, despite the widely held assumption that there is good polymorphism detection at these coverage levels. Such alleles are likely to be of functional importance in population-based studies of rare diseases, somatic mutations in cancer, and explaining the “missing heritability” of quantitative traits.
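    The "naive expectation of sampling from a binomial" against which the empirical model is compared can be written down directly. A minimal sketch (illustrative only; the minimum alt-read threshold of 3 is an assumed caller rule, not taken from the paper):

```python
from math import comb

def p_detect_het(depth, min_alt_reads=3, alt_frac=0.5):
    # Naive binomial model: each read samples the non-reference allele
    # with probability alt_frac (0.5 at a heterozygous site); the variant
    # is "detectable" if at least min_alt_reads such reads are observed.
    # min_alt_reads=3 is an assumed caller threshold for illustration.
    return sum(
        comb(depth, k) * alt_frac**k * (1.0 - alt_frac) ** (depth - k)
        for k in range(min_alt_reads, depth + 1)
    )
```

    Under this naive model a 13X heterozygous site is detected with probability about 0.99; the paper's empirical finding that 13X buys only 95% sensitivity shows how much worse real detection is than the binomial expectation.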

    Cognitive decline and quality of life in incident Parkinson's disease: The role of attention.

    INTRODUCTION: Parkinson's disease dementia (PDD) is associated with poorer quality of life (QoL). Prior to the onset of PDD, many patients experience progressive cognitive impairment, but there is a paucity of longitudinal studies investigating the effects of cognitive decline on QoL. This study aimed to determine the longitudinal impact of cognitive change on QoL in an incident PD cohort. METHODS: Recently diagnosed patients with PD (n = 212) completed a schedule of neuropsychological assessments and QoL measures; these were repeated after 18 months (n = 190) and 36 months (n = 158). Mild cognitive impairment (PD-MCI) was classified with reference to the Movement Disorder Society criteria. Principal component analysis was used to reduce 10 neuropsychological tests to three cognitive factors: attention, memory/executive function, and global cognition. RESULTS: Baseline PD-MCI was a significant contributor to QoL (β = 0.2, p < 0.01). For those subjects (9%) who developed dementia, cognitive function had a much greater impact on QoL (β = 10.3, p < 0.05). Multivariate modelling showed attentional deficits had the strongest predictive power (β = -2.3, p < 0.01); brief global tests only modestly predicted decline in QoL (β = -0.4, p < 0.01). CONCLUSIONS: PD-MCI was associated with poorer QoL over three years of follow-up. Cognitive impairment had a greater impact on QoL in individuals who developed dementia over follow-up. Impaired attention was a significant determinant of QoL in PD. Interventions which improve concentration and attention in those with PD could potentially improve QoL.

    Global gene expression analysis of human erythroid progenitors

    This article is available open access through the publisher’s website. Copyright © 2011 American Society of Hematology. This article has an erratum: http://bloodjournal.hematologylibrary.org/content/118/26/6993.3. Understanding the pattern of gene expression during erythropoiesis is crucial for a synthesis of erythroid developmental biology. Here, we isolated 4 distinct populations at successive erythropoietin-dependent stages of erythropoiesis, including the terminal, pyknotic stage. The transcriptome was determined using Affymetrix arrays. First, we demonstrated the importance of using defined cell populations to identify lineage- and temporally specific patterns of gene expression. Cells sorted by surface expression profile not only express significantly fewer genes than unsorted cells but also demonstrate significantly greater differences in the expression levels of particular genes between stages than unsorted cells. Second, using standard software, we identified more than 1000 transcripts not previously observed to be differentially expressed during erythroid maturation, 13 of which are highly significantly terminally regulated, including RFXAP and SMARCA4. Third, using matched filtering, we identified 12 transcripts not previously reported to be continuously up-regulated in maturing human primary erythroblasts. Finally, using transcription factor binding site analysis, we identified potential transcription factors that may regulate gene expression during terminal erythropoiesis. Our stringent lists of differentially regulated and continuously expressed transcripts, containing many genes with undiscovered functions in erythroblasts, are a resource for future functional studies of erythropoiesis. Our Human Erythroid Maturation database is available at https://cellline.molbiol.ox.ac.uk/eryth/index.html. Funding: National Health Service Blood and Transplant, National Institute for Health Research Biomedical Research Center Program, and National Institute for Health Research.

    Screening for pickiness - a validation study

    Picky eating is prevalent in childhood and is associated with negative health outcomes; early detection of pickiness is therefore pertinent. Because no psychometric measure of picky/fussy eating has been validated, we aimed to examine the screening efficiency of the 6-item ‘Food Fussiness’ (FF) scale from the Children’s Eating Behavior Questionnaire against structured psychiatric interviews (the Preschool Age Psychiatric Interview), providing meaningful cut-off values based on a large, representative sample of Norwegian 6-year-olds (n = 752). Screening efficiency was evaluated using receiver operating characteristic curve analysis, revealing excellent discrimination. The cut-point maximizing the sum of sensitivity and specificity was a score of 3.33 for severe cases and 3.00 when both moderate and severe pickiness were included. The results suggest that the FF scale may provide a tool for identification of clinically significant picky eating, although further assessment may be needed to separate moderate from severe cases.
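    The cut-point criterion described, maximizing the sum of sensitivity and specificity (Youden's J), can be sketched as follows; the function, variable names, and toy data are illustrative, not taken from the study:

```python
def best_cutoff(scores, labels):
    # Pick the threshold on the FF-scale score that maximizes
    # sensitivity + specificity - 1 (Youden's J) against interview
    # diagnoses (labels: 1 = picky eater, 0 = not). Illustrative sketch.
    best_t, best_j = None, -1.0
    for t in sorted(set(scores)):
        tp = sum(s >= t and y == 1 for s, y in zip(scores, labels))
        fn = sum(s < t and y == 1 for s, y in zip(scores, labels))
        tn = sum(s < t and y == 0 for s, y in zip(scores, labels))
        fp = sum(s >= t and y == 0 for s, y in zip(scores, labels))
        sens = tp / (tp + fn) if tp + fn else 0.0
        spec = tn / (tn + fp) if tn + fp else 0.0
        j = sens + spec - 1.0
        if j > best_j:
            best_t, best_j = t, j
    return best_t, best_j
```

    Each candidate threshold classifies scores at or above it as screen-positive; the threshold with the highest J is reported as the cut-off.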

    Sex differences in intraorgan fat levels and hepatic lipid metabolism: implications for cardiovascular health and remission of type 2 diabetes after dietary weight loss

    Aims/hypothesis: Type 2 diabetes confers a greater relative increase in CVD risk in women than in men. We examined sex differences in intraorgan fat and hepatic VLDL1-triacylglycerol (VLDL1-TG) export before and after major dietary weight loss. Methods: A group with type 2 diabetes (n = 64, 30 male/34 female) and a group of healthy individuals (n = 25, 13 male/12 female) were studied. Intraorgan and visceral fat were quantified by magnetic resonance and VLDL1-TG export by intralipid infusion techniques. Results: Triacylglycerol content of the liver and pancreas was elevated in people with diabetes, with no sex differences (liver 16.4% [9.3–25.0%] in women vs 11.9% [7.0–23.1%] in men, p = 0.57, and pancreas 8.3 ± 0.5% vs 8.5 ± 0.4%, p = 0.83, respectively). In the absence of diabetes, fat levels in both organs were lower in women than in men (1.0% [0.9–1.7%] vs 4.5% [1.9–8.0%], p = 0.005, and 4.7 ± 0.4% vs 7.6 ± 0.5%, p < 0.0001, respectively). Women with diabetes had higher hepatic VLDL1-TG production rate and plasma VLDL1-TG than healthy women (559.3 ± 32.9 vs 403.2 ± 45.7 mg kg−1 day−1, p = 0.01, and 0.45 [0.26–0.77] vs 0.25 [0.13–0.33] mmol/l, p = 0.02), whereas there were no differences in men (548.8 ± 39.8 vs 506.7 ± 29.2 mg kg−1 day−1, p = 0.34, and 0.72 [0.53–1.15] vs 0.50 [0.32–0.68] mmol/l, p = 0.26). Weight loss decreased intraorgan fat and VLDL1-TG production rates regardless of sex, and these changes were accompanied by similar rates of diabetes remission (65.4% vs 71.0%) and CVD risk reduction (59.8% vs 41.5%) in women and men, respectively. Conclusions/interpretation: In type 2 diabetes, women have liver and pancreas fat levels as high as those of men, associated with raised hepatic VLDL1-TG production rates. The dynamics of triacylglycerol turnover differ between the sexes in type 2 diabetes and following weight loss. These changes may contribute to the disproportionately raised cardiovascular risk of women with diabetes.

    Test-Retest Variability and Discriminatory Power of Measurements From Microperimetry and Dark Adaptation Assessment in People With Intermediate Age-Related Macular Degeneration – A MACUSTAR Study Report

    Purpose: The purpose of this study was to assess the test-retest variability and discriminatory power of measures from the macular integrity assessment device (S-MAIA) and the AdaptDx. // Methods: This is a cross-sectional study of 167 people with intermediate age-related macular degeneration (iAMD), no AMD (controls; n = 54), early AMD (n = 28), and late AMD (n = 41), recruited across 18 European ophthalmology centers. Repeat measures of mesopic and scotopic S-MAIA average (mean) threshold (MMAT [dB] and SMAT [dB]) and rod intercept time (RIT [min]) at 2 visits 14 (±7) days apart were recorded. Repeat measures were assessed by Bland-Altman analysis, intra-class correlation coefficients (ICCs), and variability ratios. Secondary analysis assessed the area under the receiver operating characteristic curve (AUC) to determine the ability to distinguish people as having no AMD, early AMD, or iAMD. // Results: Data were available for 128, 131, and 103 iAMD participants for the mesopic S-MAIA, scotopic S-MAIA, and AdaptDx, respectively. MMAT and SMAT demonstrated similar test-retest variability in iAMD (95% confidence interval [CI] for the ICC of 0.79–0.89 and 0.78–0.89, respectively). ICCs were worse for RIT (95% CI for the ICC of 0.55–0.77). All tests had equivalent AUCs (approximately 70%) for distinguishing between subjects with iAMD and controls, whereas early AMD was indistinguishable from iAMD on all measures (AUC < 55%). No learning effect was seen in these assessments under the operating procedures used. // Conclusions: MMAT, SMAT, and RIT have adequate test-retest variability and are all moderately good at separating people with iAMD from controls. // Translational Relevance: The expected levels of test-retest variability and discriminatory power of the AdaptDx and MAIA devices in a clinical study setting must be considered when designing future trials for people with AMD.
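    A Bland-Altman repeatability summary of two-visit data, as used here, reduces to the mean bias of the paired differences and their 95% limits of agreement. A minimal sketch with made-up numbers (not MACUSTAR data or analysis code):

```python
from statistics import mean, stdev

def bland_altman(visit1, visit2):
    # Test-retest agreement between two visits: mean of the paired
    # differences (bias) and 95% limits of agreement, bias +/- 1.96 SD.
    # Illustrative sketch, not the MACUSTAR analysis code.
    diffs = [a - b for a, b in zip(visit1, visit2)]
    bias = mean(diffs)
    sd = stdev(diffs)  # sample SD of the differences
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)
```

    Narrow limits of agreement relative to the clinically meaningful change in the measure indicate good repeatability; the ICCs reported above summarize the same two-visit data on a correlation scale.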