Differences in Proinflammatory Cytokines and Monocyte Subtypes in Older as Compared With Younger Kidney Transplant Recipients.
Background: The number of elderly patients with end-stage kidney disease requiring kidney transplantation continues to grow. Evaluation of healthy older adults has revealed proinflammatory changes in the immune system, which are posited to contribute to age-associated illnesses via "inflamm-aging." Immunologic dysfunction is also associated with impaired control of infections. Whether these immunologic changes are found in older kidney transplant recipients is not currently known, but may have important implications for the risk of adverse clinical outcomes. Methods: Three months after transplant, innate immune phenotype was evaluated by flow cytometry in 60 kidney transplant recipients (22 older [≥60 years] and 38 younger [<60 years]). Multiplex cytokine testing was used to evaluate plasma cytokine levels. Younger patients were matched to older patients based on transplant type and induction immune suppression. Results: Older kidney transplant recipients demonstrated a decreased frequency of intermediate monocytes (CD14++CD16+) compared with younger patients (1.2% vs 3.3%, P = 0.007), and a trend toward an increased frequency of proinflammatory classical monocytes (CD14++CD16-) (94.5% vs 92.1%, P = 0.065). Increased levels of interferon-gamma (IFN-γ) were seen in older patients. Conclusions: In this pilot study of kidney transplant recipients, we identified differences in the innate immune system in older as compared with younger patients, including increased levels of IFN-γ. This suggests that age-associated nonspecific inflammation persists despite immune suppression. The ability to apply noninvasive testing to transplant recipients will provide tools for patient risk stratification and individualization of immune suppression regimens to improve outcomes after transplantation.
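As an illustration of the kind of between-group comparison described above, the sketch below compares per-patient intermediate monocyte frequencies between older and younger recipients. The abstract does not state which statistical test produced the reported P values, so the nonparametric Mann-Whitney U test and the example values here are assumptions, not the study's analysis.

```python
# Hypothetical sketch: compare intermediate monocyte frequencies (% of monocytes)
# between older (>=60 y) and younger (<60 y) kidney transplant recipients.
# The test choice (Mann-Whitney U) and the values are assumptions for illustration.
import numpy as np
from scipy.stats import mannwhitneyu

older = np.array([0.8, 1.1, 1.5, 0.9, 1.3, 1.6, 1.0])     # made-up per-patient values
younger = np.array([2.9, 3.5, 2.7, 4.1, 3.0, 3.8, 2.6])   # made-up per-patient values

stat, p_value = mannwhitneyu(older, younger, alternative="two-sided")
print(f"median older: {np.median(older):.1f}%  "
      f"median younger: {np.median(younger):.1f}%  P = {p_value:.3f}")
```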
Antimicrobial resistance among migrants in Europe: a systematic review and meta-analysis
BACKGROUND: Rates of antimicrobial resistance (AMR) are rising globally and there is concern that increased migration is contributing to the burden of antibiotic resistance in Europe. However, the effect of migration on the burden of AMR in Europe has not yet been comprehensively examined. Therefore, we did a systematic review and meta-analysis to identify and synthesise data for AMR carriage or infection in migrants to Europe to examine differences in patterns of AMR across migrant groups and in different settings. METHODS: For this systematic review and meta-analysis, we searched MEDLINE, Embase, PubMed, and Scopus with no language restrictions from Jan 1, 2000, to Jan 18, 2017, for primary data from observational studies reporting antibacterial resistance in common bacterial pathogens among migrants to 21 European Union-15 and European Economic Area countries. To be eligible for inclusion, studies had to report data on carriage or infection with laboratory-confirmed antibiotic-resistant organisms in migrant populations. We extracted data from eligible studies and assessed quality using piloted, standardised forms. We did not examine drug resistance in tuberculosis and excluded articles solely reporting on this parameter. We also excluded articles in which migrant status was determined by ethnicity, country of birth of participants' parents, or was not defined, and articles in which data were not disaggregated by migrant status. Outcomes were carriage of or infection with antibiotic-resistant organisms. We used random-effects models to calculate the pooled prevalence of each outcome. The study protocol is registered with PROSPERO, number CRD42016043681. FINDINGS: We identified 2274 articles, of which 23 observational studies reporting on antibiotic resistance in 2319 migrants were included. The pooled prevalence of any AMR carriage or AMR infection in migrants was 25·4% (95% CI 19·1-31·8; I2 =98%), including meticillin-resistant Staphylococcus aureus (7·8%, 4·8-10·7; I2 =92%) and antibiotic-resistant Gram-negative bacteria (27·2%, 17·6-36·8; I2 =94%). The pooled prevalence of any AMR carriage or infection was higher in refugees and asylum seekers (33·0%, 18·3-47·6; I2 =98%) than in other migrant groups (6·6%, 1·8-11·3; I2 =92%). The pooled prevalence of antibiotic-resistant organisms was slightly higher in high-migrant community settings (33·1%, 11·1-55·1; I2 =96%) than in migrants in hospitals (24·3%, 16·1-32·6; I2 =98%). We did not find evidence of high rates of transmission of AMR from migrant to host populations. INTERPRETATION: Migrants are exposed to conditions favouring the emergence of drug resistance during transit and in host countries in Europe. Increased antibiotic resistance among refugees and asylum seekers and in high-migrant community settings (such as refugee camps and detention facilities) highlights the need for improved living conditions, access to health care, and initiatives to facilitate detection of and appropriate high-quality treatment for antibiotic-resistant infections during transit and in host countries. Protocols for the prevention and control of infection and for antibiotic surveillance need to be integrated in all aspects of health care, which should be accessible for all migrant groups, and should target determinants of AMR before, during, and after migration. 
FUNDING: UK National Institute for Health Research Imperial Biomedical Research Centre, Imperial College Healthcare Charity, the Wellcome Trust, and UK National Institute for Health Research Health Protection Research Unit in Healthcare-associated Infections and Antimicrobial Resistance at Imperial College London.
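The pooled prevalence estimates above come from random-effects models. As an illustration only, the sketch below implements DerSimonian-Laird random-effects pooling of study-level proportions in plain NumPy; the study counts are hypothetical, and the published meta-analysis may have used a different transformation or software.

```python
# Illustrative DerSimonian-Laird random-effects pooling of prevalence estimates.
# Study counts are hypothetical; the published analysis may differ in detail
# (e.g. logit or Freeman-Tukey transformation, different software).
import numpy as np

events = np.array([12, 30, 8, 45, 20])     # AMR carriage/infection per study (made up)
totals = np.array([60, 110, 40, 150, 95])  # migrants sampled per study (made up)

p = events / totals
v = p * (1 - p) / totals                   # within-study variance of a raw proportion
w = 1 / v                                  # fixed-effect weights

p_fe = np.sum(w * p) / np.sum(w)
Q = np.sum(w * (p - p_fe) ** 2)            # Cochran's Q
k = len(p)
C = np.sum(w) - np.sum(w ** 2) / np.sum(w)
tau2 = max(0.0, (Q - (k - 1)) / C)         # between-study variance

w_re = 1 / (v + tau2)                      # random-effects weights
p_re = np.sum(w_re * p) / np.sum(w_re)
se = np.sqrt(1 / np.sum(w_re))
i2 = max(0.0, (Q - (k - 1)) / Q) * 100     # heterogeneity statistic I2

print(f"pooled prevalence: {100*p_re:.1f}% "
      f"(95% CI {100*(p_re - 1.96*se):.1f}-{100*(p_re + 1.96*se):.1f}), I2 = {i2:.0f}%")
```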
Surgical site infection after gastrointestinal surgery in high-income, middle-income, and low-income countries: a prospective, international, multicentre cohort study
Background: Surgical site infection (SSI) is one of the most common infections associated with health care, but its importance as a global health priority is not fully understood. We quantified the burden of SSI after gastrointestinal surgery in countries in all parts of the world.
Methods: This international, prospective, multicentre cohort study included consecutive patients undergoing elective or emergency gastrointestinal resection within 2-week time periods at any health-care facility in any country. Countries with participating centres were stratified into high-income, middle-income, and low-income groups according to the UN's Human Development Index (HDI). Data variables from the GlobalSurg 1 study and other studies that have been found to affect the likelihood of SSI were entered into risk adjustment models. The primary outcome measure was the 30-day SSI incidence (defined by US Centers for Disease Control and Prevention criteria for superficial and deep incisional SSI). Relationships with explanatory variables were examined using Bayesian multilevel logistic regression models. This trial is registered with ClinicalTrials.gov, number NCT02662231.
Findings: Between Jan 4, 2016, and July 31, 2016, 13 265 records were submitted for analysis. 12 539 patients from 343 hospitals in 66 countries were included. 7339 (58·5%) patients were from high-HDI countries (193 hospitals in 30 countries), 3918 (31·2%) patients were from middle-HDI countries (82 hospitals in 18 countries), and 1282 (10·2%) patients were from low-HDI countries (68 hospitals in 18 countries). In total, 1538 (12·3%) patients had SSI within 30 days of surgery. The incidence of SSI varied between countries with high (691 [9·4%] of 7339 patients), middle (549 [14·0%] of 3918 patients), and low (298 [23·2%] of 1282 patients) HDI (p < 0·001). The highest SSI incidence in each HDI group was after dirty surgery (102 [17·8%] of 574 patients in high-HDI countries; 74 [31·4%] of 236 patients in middle-HDI countries; 72 [39·8%] of 181 patients in low-HDI countries). Following risk factor adjustment, patients in low-HDI countries were at greatest risk of SSI (adjusted odds ratio 1·60, 95% credible interval 1·05–2·37; p=0·030). 132 (21·6%) of 610 patients with an SSI and a microbiology culture result had an infection that was resistant to the prophylactic antibiotic used. Resistant infections were detected in 49 (16·6%) of 295 patients in high-HDI countries, in 37 (19·8%) of 187 patients in middle-HDI countries, and in 46 (35·9%) of 128 patients in low-HDI countries (p < 0·001).
Interpretation: Countries with a low HDI carry a disproportionately greater burden of SSI than countries with a middle or high HDI and might have higher rates of antibiotic resistance. In view of WHO recommendations on SSI prevention that highlight the absence of high-quality interventional research, urgent, pragmatic, randomised trials based in low-income and middle-income countries (LMICs) are needed to assess measures aiming to reduce this preventable complication.
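The risk-adjusted comparison above was performed with Bayesian multilevel logistic regression. The sketch below shows that general approach using PyMC, with a random intercept per hospital and indicator terms for middle- and low-HDI strata; the simulated data, priors, and covariate set are placeholders and not the study's actual model.

```python
# Minimal sketch of a Bayesian multilevel logistic regression for 30-day SSI,
# with a random intercept per hospital and HDI-group indicators.
# All data below are simulated placeholders; priors and covariates are assumptions.
import numpy as np
import pymc as pm

rng = np.random.default_rng(42)
n_patients, n_hospitals = 2000, 60
hospital = rng.integers(0, n_hospitals, size=n_patients)
hdi = rng.integers(0, 3, size=n_patients)          # 0 = high, 1 = middle, 2 = low HDI
ssi = rng.binomial(1, 0.10 + 0.05 * hdi)           # simulated 30-day SSI outcome

with pm.Model() as model:
    intercept = pm.Normal("intercept", 0.0, 2.0)
    b_middle = pm.Normal("b_middle", 0.0, 1.0)     # log-odds vs high-HDI reference
    b_low = pm.Normal("b_low", 0.0, 1.0)
    sigma_hosp = pm.HalfNormal("sigma_hosp", 1.0)  # between-hospital variation
    hosp = pm.Normal("hosp", 0.0, sigma_hosp, shape=n_hospitals)

    logit_p = (intercept
               + b_middle * (hdi == 1)
               + b_low * (hdi == 2)
               + hosp[hospital])
    pm.Bernoulli("obs", logit_p=logit_p, observed=ssi)

    idata = pm.sample(1000, tune=1000, target_accept=0.9)

# Posterior mean odds ratio for low- vs high-HDI countries
# (analogous in spirit to the adjusted odds ratio reported above).
print(float(np.exp(idata.posterior["b_low"]).mean()))
```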
Pre-emptive IVIG for Donor Specific Antibody Positive Living Donor Kidney Transplant Recipients
Background: Intravenous immunoglobulin (IVIG) is used alone and in combination with other therapies in desensitization regimens to modulate anti-human leukocyte antigen (HLA) donor-specific antibody (DSA) and facilitate transplantation of sensitized patients; however, the risk of acute antibody-mediated rejection (ABMR) remains high (40-60%) and long-term graft survival remains unclear. Objective: The aim of the study was to evaluate the impact of a single high-dose IVIG protocol on the incidence of acute rejection and long-term graft survival in living donor kidney transplant (LDKT) recipients with preformed HLA DSA. Methods: We retrospectively evaluated 663 adult LDKT recipients transplanted at our institution with and without preformed DSA between 2005 and 2013, with a median follow-up of 5 years. We analyzed recipients with preformed DSA who received a single high dose of IVIG (2 g/kg) at the time of transplant according to DSA class (I, II, or both) and compared them to each other, to DSA-negative patients, and to patients with historic DSA (detected 6 months or more before transplantation). Rates of acute rejection and renal allograft survival were compared between groups. Results: In a cohort of 663 LDKT recipients, 72 (11%) had preformed HLA DSA to Class I, Class II, or both HLA molecules, with mean fluorescence intensity (MFI) < 6000, detected within 6 months of transplantation (DSA+), while another 9 (1.4%) patients had detectable DSA on a sample that preceded the transplant date by 6 months or more (historic DSA+). Any type of rejection occurred in 150/663 (23%) of the cohort, with a statistically significant difference between DSA-negative and historic DSA-positive patients (20%) and DSA-positive patients (39%) (p < 0.01). Similarly, ABMR on for-cause biopsy was found in 7% of DSA-negative and historic DSA-positive patients and 29% of DSA-positive patients (p < 0.01). The incidence of acute cellular rejection (ACR) alone was not statistically different between the groups. In a Cox regression analysis, only age (which had a minimal effect) and DSA positivity were associated with an increased hazard of acute rejection and graft failure. Conclusion: Single-dose IVIG given at the time of transplant can facilitate living donation in recipients with preformed DSA with acceptable acute rejection and graft failure rates. However, the risk is not completely mitigated by this regimen.
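As a simplified illustration of the Cox regression described in the Results, the sketch below fits a proportional hazards model with the lifelines library on simulated data, using recipient age and preformed-DSA status as covariates; the study's actual dataset, covariates, and coding are not reproduced here.

```python
# Illustrative Cox proportional hazards model for time to acute rejection,
# with recipient age and preformed-DSA status as covariates.
# The dataframe below is simulated; it is not the study's data or full covariate set.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(7)
n = 663
df = pd.DataFrame({
    "years_to_event": rng.exponential(5.0, n).clip(0.1, 10.0),  # follow-up time (years)
    "rejection": rng.binomial(1, 0.23, n),                      # 1 = acute rejection observed
    "age": rng.normal(50, 12, n).round(),
    "dsa_positive": rng.binomial(1, 0.11, n),                   # preformed HLA DSA at transplant
})

cph = CoxPHFitter()
cph.fit(df, duration_col="years_to_event", event_col="rejection")
cph.print_summary()   # hazard ratios with 95% CIs for age and DSA positivity
```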
Frailty and Age-Associated Assessments Associated with Chronic Kidney Disease and Transplantation Outcomes
Background. Frailty is often defined as a decrease in physiological reserve and has been shown to be correlated with adverse health outcomes and mortality in the general population. This condition is highly prevalent in the chronic kidney disease (CKD) patient population as well as in kidney transplant (KT) recipients. Other age-associated changes include sarcopenia, malnutrition, cognitive impairment, and depression. Assessing the prevalence of these components in the CKD and KT population and their contributions to patient outcomes can clarify how they relate to frailty and the extent to which they drive the adverse outcomes an individual may experience. Objectives. We sought to perform a systematic review of published data on frailty and related age-associated syndromes in CKD and KT patients. Results. Over 80 references pertinent to frailty, sarcopenia, nutrition, cognition, or depression in patients with CKD or KT were identified. Systematic review was performed to evaluate the data supporting the use of the following approaches: Fried Frailty, Short Physical Performance Battery, Frailty Index, Sarcopenia Index, CT scan quantification of muscle mass, health-related quality of life, and assessment tools for nutrition, cognition, and depression. Conclusion. This report represents a comprehensive review of previously published research articles on this topic. The interplay of these components in shaping a patient's clinical status suggests a need for a multifaceted approach to developing comprehensive care and treatment for the CKD and KT population to improve outcomes before and after transplantation.
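Several of the instruments reviewed reduce to simple scoring rules. For example, the Fried frailty phenotype counts five criteria (unintentional weight loss, exhaustion, weakness, slowness, and low physical activity) and classifies a score of 3 or more as frail and 1-2 as pre-frail. The sketch below shows only that scoring logic; the cutoffs used to operationalise each criterion vary by study and are deliberately left outside this example.

```python
# Sketch of Fried frailty phenotype scoring: a count of five binary criteria,
# classified as robust (0), pre-frail (1-2), or frail (>=3).
# How each criterion is operationalised (grip-strength cutoffs, gait speed, etc.)
# varies by study and is assumed to be decided before this function is called.
from dataclasses import dataclass

@dataclass
class FriedCriteria:
    weight_loss: bool    # unintentional weight loss
    exhaustion: bool     # self-reported exhaustion
    weakness: bool       # low grip strength
    slowness: bool       # slow gait speed
    low_activity: bool   # low physical activity

def fried_category(c: FriedCriteria) -> str:
    score = sum([c.weight_loss, c.exhaustion, c.weakness, c.slowness, c.low_activity])
    if score >= 3:
        return "frail"
    return "pre-frail" if score >= 1 else "robust"

print(fried_category(FriedCriteria(True, True, False, True, False)))  # -> "frail"
```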
Cold Ischemia Time, Kidney Donor Profile Index, and Kidney Transplant Outcomes: A Cohort Study
Rationale & Objective: An average of 3,280 recovered deceased donor kidneys are discarded annually in the United States. Increased cold ischemia time is associated with an increased rate of organ decline and subsequent discard. Here we examined the effect of prolonged cold ischemia time on kidney transplant outcomes. Study Design: Retrospective observational study. Setting & Participants: Recipients of deceased donor kidney transplants in the United States from 2000 to 2018. Exposure: Recipients of deceased donor kidneys were divided based on documented cold ischemia time: ≤16, 16-24, 24-32, 32-40, and >40 hours. Outcomes: The incidence of delayed graft function, primary nonfunction, and 10-year death-censored graft survival. Analytical Approach: The Kaplan-Meier method was used to generate survival curves, and the log-rank test was used to compare graft survival. Results: The rate of observed delayed graft function increased with cold ischemia time (20.9%, 28.1%, 32.4%, 37.5%, and 35.8%). Primary nonfunction also showed a similar increase with cold ischemia time (0.6%, 0.9%, 1.3%, 2.1%, and 2.3%). During a median follow-up time of 4.6 years, 37,301 recipients experienced death-censored graft failure. Analysis based on kidney donor profile index (KDPI) demonstrated significant differences in 10-year death-censored graft survival, with a death-censored graft survival in recipients of a kidney with a KDPI <85% of 71.0% (95% CI, 70.5%-71.5%), 70.5% (95% CI, 69.9%-71.0%), 69.6% (95% CI, 68.7%-70.4%), 65.5% (95% CI, 63.7%-67.3%), and 67.2% (95% CI, 64.6%-69.6%), compared to 53.5% (95% CI, 51.1%-55.8%), 50.7% (95% CI, 48.3%-53.1%), 50.3% (95% CI, 46.6%-53.8%), 50.7% (95% CI, 45.1%-56.1%), and 48.3% (95% CI, 40.0%-56.1%) for recipients of a kidney with a KDPI >85%. Limitations: Heterogeneity of acceptance patterns among transplant centers; presence of confounding variables leading to acceptance of kidneys with prolonged cold ischemia times. Conclusions: Cold ischemia time was associated with an increased risk of delayed graft function and primary nonfunction. However, the effect of increased cold ischemia time is modest and has less impact than the KDPI. Transplant programs should not consider prolonged cold ischemia time alone as a predominant reason to decline an organ, especially with a KDPI <85%.
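A minimal sketch of the survival analysis named in the Analytical Approach (Kaplan-Meier curves compared across cold ischemia time strata with a log-rank test) is shown below using lifelines. The grouping mirrors the study's cold ischemia time bins, but the data, censoring structure, and event rates are simulated placeholders.

```python
# Illustrative Kaplan-Meier estimates and log-rank comparison of death-censored
# graft survival across cold ischemia time (CIT) strata. Data are simulated.
import numpy as np
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import multivariate_logrank_test

rng = np.random.default_rng(1)
bins = ["<=16 h", "16-24 h", "24-32 h", "32-40 h", ">40 h"]
frames = []
for i, label in enumerate(bins):
    frames.append(pd.DataFrame({
        "years": rng.exponential(20 - 1.5 * i, 500).clip(0.1, 10.0),   # follow-up (years)
        "graft_failure": rng.binomial(1, 0.25 + 0.02 * i, 500),        # death-censored failure
        "cit_group": label,
    }))
df = pd.concat(frames, ignore_index=True)

# Kaplan-Meier estimate per CIT stratum
for label, grp in df.groupby("cit_group"):
    kmf = KaplanMeierFitter()
    kmf.fit(grp["years"], event_observed=grp["graft_failure"], label=label)
    print(label, f"estimated 10-year survival: {kmf.predict(10.0):.2f}")

# Log-rank test across all strata
result = multivariate_logrank_test(df["years"], df["cit_group"], df["graft_failure"])
print("log-rank p-value:", result.p_value)
```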