Proteinuria and albuminuria at point of care
Proteinuria is a key diagnostic and pathophysiological aspect of kidney dysfunction, influencing the progression of kidney and systemic diseases. Because most kidney diseases are not symptomatic until renal function is severely compromised or lost, both general practitioners and specialists should be able to judge the relevance of proteinuria from a urine sample and, where appropriate, refer selected patients to a nephrologist for further diagnostic workup and treatment. As the interpretation of proteinuria depends on the method used to detect it, the aim of this article was to review laboratory and point-of-care diagnostic methods for proteinuria in different settings, such as the prevention and follow-up of common chronic diseases (i.e., hypertension, diabetes, chronic kidney disease). Urine dipsticks remain the most widely used method for detecting proteinuria, although different types of proteinuria, extreme pH values and urine concentration may affect their results. The albumin-to-creatinine ratio and protein-to-creatinine ratio measured on a spot urine sample are reliable tests that can effectively replace 24-hour urine collection analysis in clinical practice.
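For readers unfamiliar with the arithmetic behind these spot-urine tests, the short Python sketch below computes an albumin-to-creatinine ratio (ACR) from hypothetical spot-sample values and maps it onto the widely cited KDIGO albuminuria categories (A1 < 30, A2 30–300, A3 > 300 mg/g); the function names and example figures are illustrative assumptions, not material from the article.

```python
# Illustrative sketch: spot-urine albumin-to-creatinine ratio (ACR).
# Thresholds follow the commonly cited KDIGO albuminuria categories
# (A1 < 30, A2 30-300, A3 > 300 mg/g); input values are hypothetical.

def acr_mg_per_g(urine_albumin_mg_l: float, urine_creatinine_mg_dl: float) -> float:
    """Return ACR in mg of albumin per g of creatinine."""
    creatinine_g_l = urine_creatinine_mg_dl / 100.0  # mg/dL -> g/L
    return urine_albumin_mg_l / creatinine_g_l

def albuminuria_category(acr: float) -> str:
    if acr < 30:
        return "A1 (normal to mildly increased)"
    if acr <= 300:
        return "A2 (moderately increased)"
    return "A3 (severely increased)"

# Example: 45 mg/L albumin and 120 mg/dL creatinine on a spot sample.
acr = acr_mg_per_g(45.0, 120.0)          # = 37.5 mg/g
print(f"ACR = {acr:.1f} mg/g -> {albuminuria_category(acr)}")
```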
Association of Variants in the SPTLC1 Gene With Juvenile Amyotrophic Lateral Sclerosis
Importance: Juvenile amyotrophic lateral sclerosis (ALS) is a rare form of ALS characterized by age of symptom onset less than 25 years and a variable presentation. Objective: To identify the genetic variants associated with juvenile ALS. Design, Setting, and Participants: In this multicenter family-based genetic study, trio whole-exome sequencing was performed to identify the disease-associated gene in a case series of unrelated patients diagnosed with juvenile ALS and severe growth retardation. The patients and their family members were enrolled at academic hospitals and a government research facility between March 1, 2016, and March 13, 2020, and were observed until October 1, 2020. Whole-exome sequencing was also performed in a series of patients with juvenile ALS. A total of 66 patients with juvenile ALS and 6258 adult patients with ALS participated in the study. Patients were selected for the study based on their diagnosis, and all eligible participants were enrolled in the study. None of the participants had a family history of neurological disorders, suggesting de novo variants as the underlying genetic mechanism. Main Outcomes and Measures: De novo variants present only in the index case and not in unaffected family members. Results: Trio whole-exome sequencing was performed in 3 patients diagnosed with juvenile ALS and their parents. An additional 63 patients with juvenile ALS and 6258 adult patients with ALS were subsequently screened for variants in the SPTLC1 gene. De novo variants in SPTLC1 (p.Ala20Ser in 2 patients and p.Ser331Tyr in 1 patient) were identified in 3 unrelated patients diagnosed with juvenile ALS and failure to thrive. A fourth variant (p.Leu39del) was identified in a patient with juvenile ALS where parental DNA was unavailable. Variants in this gene have been previously shown to be associated with autosomal-dominant hereditary sensory autonomic neuropathy, type 1A, by disrupting an essential enzyme complex in the sphingolipid synthesis pathway. Conclusions and Relevance: These data broaden the phenotype associated with SPTLC1 and suggest that patients presenting with juvenile ALS should be screened for variants in this gene.
ALS-associated missense and nonsense TBK1 mutations can both cause loss of kinase function
Mutations in TBK1 have been linked to amyotrophic lateral sclerosis (ALS). Some TBK1 variants are nonsense and are predicted to cause disease through haploinsufficiency; however, many other mutations are missense with unknown functional effect. We exome sequenced 699 familial ALS patients and identified 16 novel or extremely rare protein-changing TBK1 variants. We characterised a subset of these: p.G217R, p.R357X and p.C471Y. Here we show that p.R357X and p.G217R both abolish the ability of TBK1 to phosphorylate two of its kinase targets, IRF3 and OPTN, and to undergo phosphorylation itself. Both variants inhibit binding to OPTN, and p.G217R, which lies within the TBK1 kinase domain, also reduces homodimerisation, which is essential for TBK1 activation and function. Lastly, we show that the proportion of TBK1 that is active (phosphorylated) is reduced in five lymphoblastoid cell lines derived from patients harbouring heterozygous missense or in-frame deletion TBK1 mutations. We conclude that missense mutations in functional domains of TBK1 impair the binding and phosphorylation of its normal targets, implicating a common loss-of-function mechanism analogous to that of truncation mutations.
Measuring total factor productivity on Irish dairy farms: a Fisher index approach using farm-level data
This paper presents a Fisher index measure of the total factor productivity (TFP) performance of Irish dairy farms over the period 2006–2016 using the Teagasc National Farm Survey (NFS) data. The removal of milk quotas in 2015 has led to an increase of over 30% in dairy cow numbers since 2010, and although suckler cow numbers have dropped slightly, the total number of cows in Ireland reached an all-time high of 2.5 million head in 2016. This large increase adds to the environmental pressures attributed to agricultural output and puts the focus firmly on how efficiently the additional agricultural output associated with higher cow numbers is produced. The primary purpose of this paper is to identify a standardised measure of the TFP performance of Irish dairy farms that can be routinely updated using Teagasc NFS data. We found that, relative to 2010, the TFP of Irish dairy farms has increased by almost 18%; in the 2015 production year alone, when the milk quota was removed, the TFP measure increased by 7%, and TFP continued to grow by 2.5% in the 2016 production year. It would seem, therefore, that the removal of the European dairy quota system has resulted in a windfall gain for Irish dairy farmers but that productivity gains are continuing. Future data will be required to investigate the longer-term TFP performance of Irish dairy farms in the post-milk quota era.
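To make the index-number arithmetic concrete, the Python sketch below shows the standard Fisher quantity index (the geometric mean of the Laspeyres and Paasche indices) and a TFP ratio built from output and input quantity indices; the prices, quantities and variable names are hypothetical and are not drawn from the Teagasc NFS data.

```python
import numpy as np

# Illustrative sketch of Fisher quantity indices and a TFP ratio.
# p0/q0 are base-period prices/quantities, p1/q1 the comparison period;
# all figures below are hypothetical, not Teagasc NFS values.

def fisher_quantity_index(p0, q0, p1, q1):
    """Geometric mean of the Laspeyres and Paasche quantity indices."""
    p0, q0, p1, q1 = map(np.asarray, (p0, q0, p1, q1))
    laspeyres = (p0 @ q1) / (p0 @ q0)   # quantities valued at base-period prices
    paasche   = (p1 @ q1) / (p1 @ q0)   # quantities valued at current prices
    return np.sqrt(laspeyres * paasche)

# Outputs (e.g. milk, livestock) and inputs (e.g. feed, fertiliser, labour)
out_idx = fisher_quantity_index(p0=[0.30, 2.0], q0=[500000, 120],
                                p1=[0.34, 2.1], q1=[560000, 125])
in_idx  = fisher_quantity_index(p0=[250, 320, 15], q0=[900, 60, 2000],
                                p1=[260, 340, 16], q1=[950, 62, 2050])

tfp_change = out_idx / in_idx  # TFP index: output growth relative to input growth
print(f"Output index {out_idx:.3f}, input index {in_idx:.3f}, TFP {tfp_change:.3f}")
```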
Hyperoxemia and excess oxygen use in early acute respiratory distress syndrome: Insights from the LUNG SAFE study
Background: Concerns exist regarding the prevalence and impact of unnecessary oxygen use in patients with acute respiratory distress syndrome (ARDS). We examined this issue in patients with ARDS enrolled in the Large observational study to UNderstand the Global impact of Severe Acute respiratory FailurE (LUNG SAFE) study. Methods: In this secondary analysis of the LUNG SAFE study, we wished to determine the prevalence of, and the outcomes associated with, hyperoxemia on day 1, sustained hyperoxemia, and excessive oxygen use in patients with early ARDS. Patients who fulfilled criteria of ARDS on day 1 and day 2 of acute hypoxemic respiratory failure were categorized based on the presence of hyperoxemia (PaO2 > 100 mmHg) on day 1, sustained (i.e., present on day 1 and day 2) hyperoxemia, or excessive oxygen use (FIO2 ≥ 0.60 during hyperoxemia). Results: Of the 2005 patients who met the inclusion criteria, 131 (6.5%) were hypoxemic (PaO2 < 55 mmHg), 607 (30%) had hyperoxemia on day 1, and 250 (12%) had sustained hyperoxemia. Excess FIO2 use occurred in 400 (66%) of the 607 patients with hyperoxemia. Excess FIO2 use decreased from day 1 to day 2 of ARDS, with most hyperoxemic patients on day 2 receiving relatively low FIO2. Multivariate analyses found no independent relationship between day 1 hyperoxemia, sustained hyperoxemia, or excess FIO2 use and adverse clinical outcomes. Mortality was 42% in patients with excess FIO2 use, compared to 39% in a propensity-matched sample of normoxemic (PaO2 55–100 mmHg) patients (P = 0.47). Conclusions: Hyperoxemia and excess oxygen use are both prevalent in early ARDS but are most often non-sustained. No relationship was found between hyperoxemia or excessive oxygen use and patient outcome in this cohort. Trial registration: LUNG SAFE is registered with ClinicalTrials.gov, NCT02010073.
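As a minimal sketch of the categorisation rules stated in the abstract (hyperoxemia PaO2 > 100 mmHg, hypoxemia PaO2 < 55 mmHg, excess oxygen use FIO2 ≥ 0.60 during hyperoxemia, sustained hyperoxemia present on both day 1 and day 2), the Python snippet below applies them to a hypothetical patient record; the data structure and example values are assumptions, not part of the LUNG SAFE dataset.

```python
from dataclasses import dataclass

# Sketch of the day-1/day-2 categorisation described above.
# Thresholds are taken from the abstract (PaO2 in mmHg, FIO2 as a fraction);
# the record structure and example values are hypothetical.

@dataclass
class ArdsDay:
    pao2: float   # arterial oxygen tension, mmHg
    fio2: float   # inspired oxygen fraction, 0.21-1.0

def hyperoxemic(day: ArdsDay) -> bool:
    return day.pao2 > 100.0

def hypoxemic(day: ArdsDay) -> bool:
    return day.pao2 < 55.0

def excess_oxygen(day: ArdsDay) -> bool:
    # Excess use: FIO2 >= 0.60 while the patient is hyperoxemic.
    return hyperoxemic(day) and day.fio2 >= 0.60

def sustained_hyperoxemia(day1: ArdsDay, day2: ArdsDay) -> bool:
    return hyperoxemic(day1) and hyperoxemic(day2)

# Example patient: hyperoxemic on both days, excess FIO2 on day 1 only.
d1, d2 = ArdsDay(pao2=130, fio2=0.70), ArdsDay(pao2=110, fio2=0.40)
print(excess_oxygen(d1), sustained_hyperoxemia(d1, d2))  # True True
```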
A model for predicting the risk of developing mild anemia (MA) in patients with lymphoid malignancy. A study of the Gruppo Italiano Studio Linfomi (GISL)
Background: So far, no predictive model has been defined to predict the risk of developing MA in cancer patients. Methods and objectives: All cases registered from 1991 to 2005 in one of the GISL clinical trials, with a confirmed diagnosis of aggressive lymphoma (AL), follicular lymphoma (FL) or Hodgkin's lymphoma (HL), were selected for the study. Patients had to have available data on clinical features at diagnosis, treatment details and hematological toxicity. MA was defined as the presence of baseline Hb levels below 11 g/dl or grade 1–4 hemoglobin toxicity defined according to WHO criteria. The aim of the study was to develop a model to predict the risk of developing MA for patients with lymphoid malignancy (LM). Results: One thousand eight hundred and seventy-four patients were included in the study: AL, FL and HL were the diagnoses in 830, 218 and 687 cases, respectively; median age was 54, 57 and 32 years for AL, FL and HL, respectively, and the frequency of female patients was 43%, 48% and 49% for the three groups. Median Hb level at diagnosis was 12.9 g/dl, 13.2 g/dl and 12.3 g/dl for AL, FL and HL, respectively, with 20% of patients showing MA. All patients received an adequate anthracycline-based CT regimen. Overall, 38% of patients with baseline Hb levels above 11 g/dl developed MA during chemotherapy. CT regimens were divided into three groups based on MA risk (group 1: MA risk 0 to 25%; group 2: MA risk 26% to 50%; group 3: MA risk above 50%). In the logit multivariate analysis, age (above 55 yrs), female gender, CT group (2–3 versus 1), Hb levels and bulky disease were factors independently correlated with the risk of developing MA. A prediction model was defined using these parameters, which allowed the identification of groups of patients with different risks of MA. Patients at low, intermediate-low, intermediate-high and high risk had a probability of MA of 5%, 25%, 38% and 52%, respectively (P < 0.0001). Conclusion: The risk of developing MA in patients with LM undergoing chemotherapy can be predicted with a simple assessment of clinical and treatment features. Patients at high risk of MA could be considered for anemia prevention programs.
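The abstract names the predictors (age above 55 years, female gender, CT risk group 2–3 versus 1, baseline Hb and bulky disease) but not the fitted coefficients, so the Python sketch below only illustrates how such a logit model could be fitted and used to stratify patients into risk groups; all data, cut-offs and column names are randomly generated placeholders, not GISL results.

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

# Sketch of the kind of logit model described above: the predictors follow the
# abstract (age > 55 y, female gender, CT risk group 2-3 vs 1, baseline Hb,
# bulky disease), but the data here are randomly generated placeholders --
# no coefficients or patient records from the GISL study are reproduced.

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "age_over_55":   rng.integers(0, 2, n),
    "female":        rng.integers(0, 2, n),
    "ct_group_2_3":  rng.integers(0, 2, n),
    "baseline_hb":   rng.normal(12.8, 1.5, n),   # g/dL
    "bulky_disease": rng.integers(0, 2, n),
})
y = rng.integers(0, 2, n)  # placeholder outcome: mild anemia during chemotherapy

model = LogisticRegression().fit(df, y)
risk = model.predict_proba(df)[:, 1]

# Stratify patients by predicted probability into illustrative risk groups.
df["risk_group"] = pd.cut(risk, bins=[0, 0.25, 0.40, 0.55, 1.0],
                          labels=["low", "int-low", "int-high", "high"])
print(df["risk_group"].value_counts())
```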
REST-Dependent Presynaptic Homeostasis Induced by Chronic Neuronal Hyperactivity
Homeostatic plasticity is a regulatory feedback response in which either synaptic strength or intrinsic excitability can be adjusted up or down to offset sustained changes in neuronal activity. Although a growing body of evidence continues to provide new insights into these two apparently distinct homeostatic processes, a unified molecular model is still lacking. We recently demonstrated that REST is a transcriptional repressor critical for the downscaling of intrinsic excitability in cultured hippocampal neurons subjected to prolonged elevation of electrical activity. Here, we report that, in the same experimental system, REST also participates in synaptic homeostasis by reducing the strength of excitatory synapses, acting specifically at the presynaptic level. Indeed, chronic hyperactivity triggers a REST-dependent decrease in the size of synaptic vesicle pools through the transcriptional and translational repression of specific presynaptic REST target genes. Together with our previous report, these data identify REST as a fundamental molecular player in neuronal homeostasis, able to downscale simultaneously both intrinsic excitability and presynaptic efficiency in response to elevated neuronal activity. This experimental evidence adds new insights into the complex activity-dependent transcriptional regulation of the homeostatic plasticity processes mediated by REST.