Acute kidney disease and renal recovery: consensus report of the Acute Disease Quality Initiative (ADQI) 16 Workgroup
Consensus definitions have been reached for both acute kidney injury (AKI) and chronic kidney disease (CKD), and these definitions are now routinely used in research and clinical practice. The KDIGO guideline defines AKI as an abrupt decrease in kidney function occurring over 7 days or less, whereas CKD is defined by the persistence of kidney disease for a period of >90 days. AKI and CKD are increasingly recognized as related entities and in some instances probably represent a continuum of the disease process. For patients in whom pathophysiologic processes are ongoing, the term acute kidney disease (AKD) has been proposed to define the course of disease after AKI; however, definitions of AKD and strategies for the management of patients with AKD are not currently available. In this consensus statement, the Acute Disease Quality Initiative (ADQI) proposes definitions, staging criteria for AKD, and strategies for the management of affected patients. We also make recommendations for areas of future research that aim to improve understanding of the underlying processes and improve outcomes for patients with AKD.
Autoantibodies to Agrin in Myasthenia Gravis Patients
To determine whether patients with myasthenia gravis (MG) have antibodies to agrin, a proteoglycan that is released by motor neurons and is critical for neuromuscular junction (NMJ) formation, we collected serum samples from 93 patients with MG with known status of antibodies to acetylcholine receptor (AChR), muscle-specific kinase (MuSK) and low-density lipoprotein receptor-related protein 4 (LRP4), and samples from control subjects (healthy individuals and individuals with other diseases). Sera were assayed for antibodies to agrin. We found antibodies to agrin in 7 serum samples from MG patients. None of the 25 healthy controls and none of the 55 control neurological patients had agrin antibodies. Two of the four triple-negative MG patients (i.e., no detectable AChR, MuSK or LRP4 antibodies; AChR-/MuSK-/LRP4-) had antibodies against agrin. In addition, agrin antibodies were detected in 5 of 83 AChR+/MuSK-/LRP4- patients but were not found in the 6 patients with MuSK antibodies (AChR-/MuSK+/LRP4-). Sera from MG patients with agrin antibodies recognized recombinant agrin in conditioned media and in transfected HEK293 cells. These sera also inhibited agrin-induced MuSK phosphorylation and AChR clustering in muscle cells. Together, these observations indicate that agrin is another autoantigen in patients with MG and that agrin autoantibodies may be pathogenic through inhibition of agrin/LRP4/MuSK signaling at the NMJ.
Regulation of immunity during visceral Leishmania infection
Unicellular eukaryotes of the genus Leishmania are collectively responsible for a heterogeneous group of diseases known as leishmaniasis. The visceral form of leishmaniasis, caused by L. donovani or L. infantum, is a devastating condition, claiming 20,000 to 40,000 lives annually, with particular incidence in some of the poorest regions of the world. Immunity to Leishmania depends on the development of protective type I immune responses capable of activating infected phagocytes to kill intracellular amastigotes. However, despite the induction of protective responses, disease progresses due to a multitude of factors that impede an optimal response. These include the action of suppressive cytokines, exhaustion of specific T cells, loss of lymphoid tissue architecture and a defective humoral response. We will review how these responses are orchestrated during the course of infection, including both early and chronic stages, focusing on the spleen and the liver, which are the main target organs of visceral Leishmania in the host. A comprehensive understanding of the immune events that occur during visceral Leishmania infection is crucial for the implementation of immunotherapeutic approaches that complement the current anti-Leishmania chemotherapy and the development of effective vaccines to prevent disease.

The research leading to these results has received funding from the European Community's Seventh Framework Programme under grant agreement No. 602773 (Project KINDRED). VR is supported by a post-doctoral fellowship granted by the KINDReD consortium. RS thanks the Foundation for Science and Technology (FCT) for an Investigator Grant (IF/00021/2014). This work was supported by grants to JE from ANR (LEISH-APO, France), Partenariat Hubert Curien (PHC) (program Volubilis, MA/11/262). JE acknowledges the support of the Canada Research Chair Program.
Risk of chronic kidney disease after cancer nephrectomy.
The incidence of early-stage renal cell carcinoma (RCC) is increasing, and observational studies have shown equivalent oncological outcomes of partial versus radical nephrectomy for stage I tumours. Population studies suggest that, compared with radical nephrectomy, partial nephrectomy is associated with decreased mortality and a lower rate of postoperative decline in kidney function. However, rates of chronic kidney disease (CKD) in patients who have undergone nephrectomy might be higher than in the general population. The risks of new-onset or accelerated CKD and worsened survival after nephrectomy might be linked, as kidney insufficiency is a risk factor for cardiovascular disease and mortality. Nephron-sparing approaches have, therefore, been proposed as the standard of care for patients with T1a tumours and as a viable option for those with T1b tumours. However, prospective data on the incidence of de novo and accelerated CKD after cancer nephrectomy are lacking, and the only randomized trial to date was closed prematurely. Intrinsic abnormalities in non-neoplastic kidney parenchyma and comorbid conditions (including diabetes mellitus and hypertension) might increase the risks of both CKD and RCC. More research is needed to better understand the risk of CKD after nephrectomy, to develop and validate predictive scores for risk stratification, and to optimize patient management.
Nutraceutical therapies for atherosclerosis
Atherosclerosis is a chronic inflammatory disease affecting large and medium arteries and is considered to be a major underlying cause of cardiovascular disease (CVD). Although the development of pharmacotherapies to treat CVD has contributed to a decline in cardiac mortality in the past few decades, CVD is estimated to be the cause of one-third of deaths globally. Nutraceuticals are natural nutritional compounds that are beneficial for the prevention or treatment of disease and, therefore, are a possible therapeutic avenue for the treatment of atherosclerosis. The purpose of this Review is to highlight potential nutraceuticals for use as antiatherogenic therapies, with evidence from in vitro and in vivo studies. Furthermore, the current evidence from observational and randomized clinical studies on the role of nutraceuticals in preventing atherosclerosis in humans will also be discussed.
A review of inorganic contaminants in Australian marine mammals, birds and turtles
Environmental context: Metal concentrations can build up to potentially harmful levels in marine mammals, as they are at the top of the food chain. This review summarises the information available on metal concentrations in marine mammals, birds and turtles from around Australia. Despite large data gaps, the available data suggest that metal concentrations are similar to those encountered in other regions of the world. Abstract: A comprehensive compilation of the published data for trace element concentrations (metals and metalloids) in Australian marine mammals, birds and turtles is presented. The majority of studies have relied on the utilisation of opportunistically collected samples (animal strandings and bycatch). This has resulted in large gaps in geographical, temporal and species coverage. For instance, little or no data are available for cetaceans in New South Wales or the Northern Territory, and out of 14 endemic species of dolphins, data only exist for seven species. The aforementioned data gaps make it hard to identify statistically significant trends, a problem compounded by data being reported in the form of ranges without raw data. Trace element concentrations measured in various marine species and their tissue types are extremely variable, with ranges typically spanning several orders of magnitude, but are generally comparable with international data. Trends in contaminant concentrations with tissue type follow generally accepted patterns of behaviour for higher organisms, with the highest mercury concentrations in liver and cadmium in kidney tissues. Herbivores have lower contaminant loadings than carnivores, reflecting the importance of diet, and there are identifiable age-related trends for elements such as mercury. The lack of supporting pathology on dead and stranded animals, and of data on specimens from uncontaminated locations, restricts conclusions on organism health impacts. There have been some attempts to use non-invasive sampling of indicator tissues such as fur bristles
Accelerated immune ageing is associated with COVID-19 disease severity
Background: The striking increase in COVID-19 severity in older adults provides a clear example of immunesenescence, the age-related remodelling of the immune system. To better characterise the association between convalescent immunesenescence and acute disease severity, we determined the immune phenotype of COVID-19 survivors and non-infected controls. / Results: We performed detailed immune phenotyping of peripheral blood mononuclear cells isolated from 103 COVID-19 survivors 3–5 months post recovery who were classified as having had severe (n = 56; age 53.12 ± 11.30 years), moderate (n = 32; age 52.28 ± 11.43 years) or mild (n = 15; age 49.67 ± 7.30 years) disease and compared with age and sex-matched healthy adults (n = 59; age 50.49 ± 10.68 years). We assessed a broad range of immune cell phenotypes to generate a composite score, IMM-AGE, to determine the degree of immune senescence. We found increased immunesenescence features in severe COVID-19 survivors compared to controls including: a reduced frequency and number of naïve CD4 and CD8 T cells (p < 0.0001); increased frequency of EMRA CD4 (p < 0.003) and CD8 T cells (p < 0.001); a higher frequency (p < 0.0001) and absolute numbers (p < 0.001) of CD28−ve CD57+ve senescent CD4 and CD8 T cells; higher frequency (p < 0.003) and absolute numbers (p < 0.02) of PD-1 expressing exhausted CD8 T cells; a two-fold increase in Th17 polarisation (p < 0.0001); higher frequency of memory B cells (p < 0.001) and increased frequency (p < 0.0001) and numbers (p < 0.001) of CD57+ve senescent NK cells. As a result, the IMM-AGE score was significantly higher in severe COVID-19 survivors than in controls (p < 0.001). Few differences were seen for those with moderate disease and none for mild disease. Regression analysis revealed the only pre-existing variable influencing the IMM-AGE score was South Asian ethnicity (β = 0.174, p = 0.043), with a major influence being disease severity (β = 0.188, p = 0.01). 
/ Conclusions: Our analyses reveal a state of enhanced immune ageing in survivors of severe COVID-19 and suggest that this could be related to SARS-CoV-2 infection. Our data support the rationale for trials of anti-immune-ageing interventions to improve clinical outcomes in these patients with severe disease.
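The abstract describes IMM-AGE only as a composite score built from many immune cell phenotypes; it does not give the formula. As a rough illustration of how such a composite can be constructed, the sketch below averages direction-signed z-scores across markers. The marker names, direction signs, and all values are hypothetical, not the study's actual IMM-AGE definition.

```python
import statistics

# Hypothetical markers; the sign says whether higher values reflect an
# older immune phenotype (+1, e.g. senescent cells accumulate) or a
# younger one (-1, e.g. naive T cells decline with immune age).
MARKER_DIRECTION = {
    "naive_cd8_freq": -1,
    "emra_cd8_freq": +1,
    "cd28neg_cd57pos_freq": +1,
    "senescent_nk_freq": +1,
}

def imm_age_score(subject, cohort):
    """Average of direction-signed z-scores across markers.

    `subject` maps marker name -> measured value; `cohort` maps marker
    name -> list of reference values used for standardisation.
    """
    zs = []
    for marker, sign in MARKER_DIRECTION.items():
        ref = cohort[marker]
        mu = statistics.fmean(ref)
        sd = statistics.stdev(ref)
        zs.append(sign * (subject[marker] - mu) / sd)
    return sum(zs) / len(zs)
```

With this construction, a subject with fewer naive T cells and more senescent cells than the reference cohort gets a positive (older) score, matching the direction of the severe-COVID-19 findings above.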
Blister Score: A Novel, Externally Validated Tool for Predicting Cardiac Implantable Electronic Device Infections, and Its Cost-Utility Implications for Antimicrobial Envelope Use
BACKGROUND: Antimicrobial envelopes reduce the incidence of cardiac implantable electronic device infections, but their cost restricts routine use in the United Kingdom. Risk scoring could help to identify which patients would most benefit from this technology. METHODS: A novel risk score (BLISTER) was derived from multivariate analysis of factors associated with cardiac implantable electronic device infection. Diagnostic utility was assessed against the existing PADIT score in both standard and high-risk external validation cohorts, and cost-utility models examined different BLISTER and PADIT score thresholds for TYRX antimicrobial envelope allocation. RESULTS: In a derivation cohort (n=7383), cardiac implantable electronic device infection occurred in 59 individuals within 12 months of a procedure (event rate, 0.8%). In addition to the PADIT score constituents, lead extraction (hazard ratio, 3.3 [95% CI, 1.9-6.1]; P<0.0001), C-reactive protein >50 mg/L (hazard ratio, 3.0 [95% CI, 1.4-6.4]; P=0.005), reintervention within 2 years (hazard ratio, 10.1 [95% CI, 5.6-17.9]; P<0.0001), and top-quartile procedure duration (hazard ratio, 2.6 [95% CI, 1.6-4.1]; P=0.001) were independent predictors of infection. The BLISTER score demonstrated superior discriminative performance versus PADIT in the standard-risk cohort (n=2854; event rate, 0.8%; area under the curve, 0.82 versus 0.71; P=0.001), the high-risk validation cohort (n=1961; event rate, 2.0%; area under the curve, 0.77 versus 0.69; P=0.001), and all patients (n=12 198; event rate, 1%; area under the curve, 0.80 versus 0.75; P=0.002). In decision-analytic modeling, the optimum scenario assigned antimicrobial envelopes to patients with BLISTER scores ≥6 (10.8%), delivering a significant reduction in infections (relative risk reduction, 30%; P=0.036) within the National Institute for Health and Care Excellence cost-utility thresholds (ICER, £18 446).
CONCLUSIONS: The BLISTER score (https://qxmd.com/calculate/calculator_876/the-blister-score-for-cied-infection) was a valid predictor of cardiac implantable electronic device infection and could facilitate cost-effective antimicrobial envelope allocation to high-risk patients.
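The cost-utility logic in the abstract, allocating envelopes only above a score threshold and judging the strategy by its incremental cost-effectiveness ratio (ICER) against the NICE threshold, can be sketched as below. The per-patient costs and QALY figures are invented for illustration and are not the paper's model inputs; only the ≥6 threshold and the general ICER arithmetic come from the abstract.

```python
NICE_THRESHOLD_GBP_PER_QALY = 20_000  # lower bound of the usual NICE range

def allocate_envelope(blister_score: int, threshold: int = 6) -> bool:
    """Targeted strategy: fit an antimicrobial envelope only when the
    BLISTER score meets the threshold (>=6 in the optimal scenario)."""
    return blister_score >= threshold

def icer(cost_new: float, qaly_new: float,
         cost_old: float, qaly_old: float) -> float:
    """Incremental cost-effectiveness ratio: extra cost per extra QALY
    of the new strategy versus the comparator."""
    return (cost_new - cost_old) / (qaly_new - qaly_old)

# Hypothetical per-patient figures: targeted envelope use costs more up
# front but averts some infection-related costs and QALY losses.
ratio = icer(cost_new=1200.0, qaly_new=8.012,
             cost_old=1000.0, qaly_old=8.001)
cost_effective = ratio < NICE_THRESHOLD_GBP_PER_QALY
```

A strategy is deemed cost-effective when the ICER falls below the willingness-to-pay threshold, which is how the paper's £18 446 figure sits "within" the NICE thresholds.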
Time spent on work-related activities, social activities and time pressure as intermediary determinants of health disparities among elderly women and men in 5 European countries: a structural equation model
Background: Psychosocial factors shape the health of older adults through complex inter-relating pathways. Besides socioeconomic factors, time use activities may explain gender inequality in self-reported health. This study investigated the role of work-related and social time use activities as determinants of health in old age. Specifically, we analysed whether the impact of stress, in terms of time pressure, on health mediated the relationship between work-related time use activities (i.e. housework and paid work) and self-reported health. Methods: We applied structural equation models and a maximum-likelihood function to estimate the direct and indirect effects of psychosocial factors on health, using pooled data from the Multinational Time Use Study on 11,168 men and 14,295 women aged 65+ from Italy, Spain, the UK, France and the Netherlands. Results: The fit indices for the conceptual model indicated an acceptable fit for both men and women. The results showed that socioeconomic status (SES), demographic factors, stress and work-related time use activities after retirement had a significant direct influence on self-reported health among the elderly, but the magnitude of the effects varied by gender. Social activities had a positive impact on self-reported health but had no significant impact on stress among older men and women. The indirect standardized effect of work-related activities on self-reported health was statistically significant for housework (β = −0.006; P < 0.05, but P > 0.05 among women), which implied that the paths from paid work and housework to self-reported health via stress (the mediator) were very weak, because their indirect effects were close to zero. Conclusions: Our findings suggest that although stress in terms of time pressure has a direct negative effect on health, it does not indirectly influence the positive effects of work-related time use activities on self-reported health among elderly men and women.
The results support the time availability hypothesis: after retirement, the elderly may no longer face the same time pressure as younger adults.
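In standardized form, the mediation structure tested here (work-related activity → time pressure → health) reduces to the product-of-coefficients rule: the indirect effect is the product of the two mediated paths, and the total effect adds the direct path. The sketch below illustrates this; the path values are invented, chosen only so the indirect effect matches the reported order of magnitude (≈ −0.006), and are not the study's estimates.

```python
def mediation_effects(a: float, b: float, c_prime: float):
    """Product-of-coefficients mediation decomposition.

    a       -- path from predictor (e.g. housework) to mediator (time pressure)
    b       -- path from mediator to outcome (self-reported health)
    c_prime -- direct path from predictor to outcome
    Returns (indirect, direct, total) standardized effects.
    """
    indirect = a * b
    return indirect, c_prime, c_prime + indirect

# Illustrative values: a weak positive effect of housework on time
# pressure (a) times a small negative effect of time pressure on
# health (b) yields a near-zero indirect effect, as the study found.
indirect, direct, total = mediation_effects(a=0.05, b=-0.12, c_prime=0.20)
```

An indirect effect this close to zero is why the authors conclude that time pressure does not meaningfully mediate the health benefit of work-related activities, even though its direct effect on health is negative.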
