
    Case Report: Necrotizing fasciitis caused by Staphylococcus aureus positive for a new sequence variant of exfoliative toxin E

    Objectives: Necrotizing fasciitis (NF) caused by S. aureus is a rare, aggressive and rapidly progressing infection of the superficial fascia with a high mortality rate. The aim of this study was to identify virulence-related genes from a complete genome sequence of a methicillin-susceptible S. aureus (MSSA) isolate recovered from a monomicrobial case of NF. Materials and methods: The MSSA isolate UMCG579 was cultured from a pus collection from the subcutis of a patient with NF. The genome of isolate UMCG579 was sequenced using the MinION (Oxford Nanopore) and MiSeq (Illumina) platforms. Results: The genome of the UMCG579 isolate was composed of a 2,741,379 bp chromosome and did not harbor any plasmids. Virulence factor profiling identified multiple pore-forming toxin genes in the UMCG579 chromosome, including the Panton-Valentine leukocidin (PVL) genes, and none of the superantigen genes. The UMCG579 isolate harbored a new sequence variant of the recently described ete gene encoding exfoliative toxin type E. A search in the GenBank database revealed that the new sequence variant (ete2) was found exclusively among isolates (n = 115) belonging to MLST CC152. While the majority of S. aureus ete-positive isolates were recovered from animal sources, S. aureus ete2-positive isolates originated from human carriers and human infections. Comparative genome analysis revealed that the ete2 gene was located on an 8777 bp genomic island. Conclusion: The combination of two heterogeneously distributed potent toxins, ETE2 and PVL, is likely to enhance the pathogenic ability of S. aureus isolates. Since anti-virulence therapies for the treatment of S. aureus infections continue to be explored, understanding of specific pathogenetic mechanisms may have important prophylactic and therapeutic value. Nevertheless, the exact contribution of ETE sequence variants to S. aureus virulence in NF infections remains to be determined.

    Lymphatic clearance of the brain: perivascular, paravascular and significance for neurodegenerative diseases

    The lymphatic clearance pathways of the brain differ from those of the other organs of the body and have been the subject of heated debate. Drainage of brain extracellular fluids, particularly interstitial fluid (ISF) and cerebrospinal fluid (CSF), is important not only for volume regulation, but also for removal of waste products such as amyloid beta (Aβ). CSF plays a special role in clinical medicine, as it is available for analysis of biomarkers for Alzheimer’s disease. Despite the lack of a complete anatomical and physiological picture of the communications between the subarachnoid space (SAS) and the brain parenchyma, it is often assumed that Aβ is cleared from the cerebral ISF into the CSF. Recent work suggests that clearance of the brain mainly occurs during sleep, with a specific role for peri- and para-vascular spaces as drainage pathways from the brain parenchyma. However, the direction of flow, the anatomical structures involved and the driving forces remain elusive, with partially conflicting data in the literature. The presence of Aβ in the glia limitans in Alzheimer’s disease suggests a direct communication of ISF with CSF. Nonetheless, there is also the well-described pathology of cerebral amyloid angiopathy associated with the failure of perivascular drainage of Aβ. Herein, we review the role of the vasculature and the impact of vascular pathology on the peri- and para-vascular clearance pathways of the brain. The different views on the possible routes for ISF drainage of the brain are discussed in the context of their pathological significance.

    Flow-Dependent Remodeling of Small Arteries in Mice Deficient for Tissue-Type Transglutaminase

    Chronic changes in blood flow induce an adaptation of vascular caliber. Thus, arteries show inward remodeling after a reduction in blood flow. We hypothesized that this remodeling depends on the crosslinking enzyme tissue-type transglutaminase (tTG). Flow-dependent remodeling was studied in wild-type (WT) and tTG-null mice using a surgically imposed change in blood flow in small mesenteric arteries. WT mice showed inward remodeling after 2 days of low blood flow, which was absent in arteries from tTG-null mice. Yet, after continued low blood flow for 7 days, inward remodeling was similar in arteries from WT and tTG-null mice. Studying the alternative pathways of remodeling, we identified a relatively high expression of the plasma transglutaminase factor XIII in arteries of WT and tTG-null mice. In addition, vessels from both WT and tTG-null mice showed the presence of transglutaminase-specific crosslinks. An accumulation of adventitial monocytes/macrophages was found in vessels exposed to low blood flow in tTG-null mice. Because monocytes/macrophages may represent a source of factor XIII, tTG-null mice were treated with liposome-encapsulated clodronate. Elimination of monocytes/macrophages with liposome-encapsulated clodronate reduced both the expression of factor XIII and inward remodeling in tTG-null mice. In conclusion, tTG plays an important role in the inward remodeling of small arteries associated with decreased blood flow. Adventitial monocytes/macrophages are a source of factor XIII in tTG-null mice and contribute to an alternative, delayed mechanism of inward remodeling when tTG is absent.

    A Joint Pharmacokinetic Model for the Simultaneous Description of Plasma and Whole Blood Tacrolimus Concentrations in Kidney and Lung Transplant Recipients

    BACKGROUND AND OBJECTIVE: Historically, dosing of tacrolimus is guided by therapeutic drug monitoring (TDM) of the whole blood concentration, which is strongly influenced by haematocrit. The therapeutic and adverse effects are, however, expected to be driven by the unbound exposure, which could be better represented by measuring plasma concentrations. OBJECTIVE: We aimed to establish plasma concentration ranges reflecting whole blood concentrations within currently used target ranges. METHODS: Plasma and whole blood tacrolimus concentrations were determined in samples of transplant recipients included in the TransplantLines Biobank and Cohort Study. Targeted whole blood trough concentrations are 4-6 ng/mL and 7-10 ng/mL for kidney and lung transplant recipients, respectively. A population pharmacokinetic model was developed using non-linear mixed-effects modelling. Simulations were performed to infer plasma concentration ranges corresponding to whole blood target ranges. RESULTS: Plasma (n = 1973) and whole blood (n = 1961) tacrolimus concentrations were determined in 1060 transplant recipients. A one-compartment model with fixed first-order absorption and estimated first-order elimination characterised the observed plasma concentrations. Plasma was linked to whole blood using a saturable binding equation (maximum binding 35.7 ng/mL, 95% confidence interval (CI) 31.0-40.4 ng/mL; dissociation constant 0.24 ng/mL, 95% CI 0.19-0.29 ng/mL). Model simulations indicate that patients within the whole blood target range are expected to have plasma concentrations (95% prediction interval) of 0.06-0.26 ng/mL and 0.10-0.93 ng/mL for kidney and lung transplant recipients, respectively. CONCLUSION: Whole blood tacrolimus target ranges, currently used to guide TDM, were translated to plasma concentration ranges of 0.06-0.26 ng/mL and 0.10-0.93 ng/mL for kidney and lung transplant recipients, respectively.
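    The saturable binding relationship quoted in this abstract can be illustrated with a short sketch. The functional form used below (bound = Bmax·plasma/(Kd + plasma), whole blood = plasma + bound) is an assumed, textbook-style binding equation, not the authors' published model code; only the Bmax and Kd point estimates are taken from the abstract.

```python
# Hedged sketch of a saturable binding equation linking plasma and
# whole blood tacrolimus. The functional form is assumed; parameter
# values are the point estimates quoted in the abstract.
B_MAX = 35.7  # ng/mL, maximum binding (95% CI 31.0-40.4)
K_D = 0.24    # ng/mL, dissociation constant (95% CI 0.19-0.29)

def bound_concentration(plasma: float) -> float:
    """Bound tacrolimus concentration at a given plasma concentration."""
    return B_MAX * plasma / (K_D + plasma)

def whole_blood(plasma: float) -> float:
    """Assumed whole blood concentration: plasma plus bound drug."""
    return plasma + bound_concentration(plasma)

# Binding saturates: at plasma = Kd the bound concentration is half of
# B_MAX, and at high plasma concentrations it approaches B_MAX.
print(round(bound_concentration(K_D), 2))    # 17.85, half of B_MAX
print(round(bound_concentration(100.0), 1))  # 35.6, approaching B_MAX
```

    Because the quoted Kd lies far below typical whole blood targets, most circulating drug is bound, which is consistent with the inferred plasma ranges sitting one to two orders of magnitude below the whole blood target ranges.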

    The origin of the legumes is a complex paleopolyploid phylogenomic tangle closely associated with the Cretaceous-Paleogene (K-Pg) mass extinction event

    This is the final version. Available from Oxford University Press via the DOI in this record. The consequences of the Cretaceous-Paleogene (K-Pg) boundary (KPB) mass extinction for the evolution of plant diversity remain poorly understood, even though evolutionary turnover of plant lineages at the KPB is central to understanding assembly of the Cenozoic biota. The apparent concentration of whole genome duplication (WGD) events around the KPB may have played a role in survival and subsequent diversification of plant lineages. To gain new insights into the origins of Cenozoic biodiversity, we examine the origin and early evolution of the globally diverse legume family (Leguminosae or Fabaceae). Legumes are ecologically (co-)dominant across many vegetation types, and the fossil record suggests that they rose to such prominence after the KPB in parallel with several well-studied animal clades including Placentalia and Neoaves. Furthermore, multiple WGD events are hypothesized to have occurred early in legume evolution. Using a recently inferred phylogenomic framework, we investigate the placement of WGDs during early legume evolution using gene tree reconciliation methods, gene count data and phylogenetic supernetwork reconstruction. Using 20 fossil calibrations we estimate a revised timeline of legume evolution based on 36 nuclear genes selected as informative and evolving in an approximately clock-like fashion. To establish the timing of WGDs we also date duplication nodes in gene trees. Results suggest either a pan-legume WGD event on the stem lineage of the family, or an allopolyploid event involving (some of) the earliest lineages within the crown group, with additional nested WGDs subtending subfamilies Papilionoideae and Detarioideae. 
Gene tree reconciliation methods that do not account for allopolyploidy may be misled into inferring an earlier WGD event at the time of divergence of the two parental lineages of the polyploid, suggesting that the allopolyploid scenario is more likely. We show that the crown age of the legumes dates to the Maastrichtian or early Paleocene and that, apart from the Detarioideae WGD, paleopolyploidy occurred close to the KPB. We conclude that the early evolution of the legumes followed a complex history, in which multiple auto- and/or allopolyploidy events coincided with rapid diversification in association with the mass extinction event at the KPB, ultimately underpinning the evolutionary success of the Leguminosae in the Cenozoic. Funding: Swiss National Science Foundation; University of Zurich; Natural Sciences and Engineering Research Council of Canada; Natural Environment Research Council; Fonds de la Recherche Scientifique of Belgium

    Using a novel concept to measure outcomes in solid organ recipients provided promising results

    Objectives: Efforts to evaluate the health of solid organ transplant recipients are hampered by the lack of adequate patient-reported outcome measures (PROMs) targeting this group. We developed the Transplant ePROM (TXP), which is based on a novel measurement model and administered through a mobile application, to fill this gap. The main objective of this article is to elucidate how we derived the weights for the different items, and to report initial empirical results. Study design and setting: The nine health items in the TXP were fatigue, skin, worry, self-reliance, activities, weight, sexuality, stooling, and memory. Via an online survey, solid organ recipients participating in the TransplantLines Biobank and Cohort study (NCT03272841) were asked to describe and then compare their own health state with six other health states. Coefficients for item levels were obtained using a conditional logit model. Results: A total of 232 solid organ transplant recipients (mean age: 54 years) participated. The majority (106) were kidney recipients, followed by lung, liver, and heart recipients. Fatigue was the most frequent complaint (54%). The strongest negative coefficients were found for activities and worry, followed by self-reliance and memory. Conclusion: A set of coefficients and values was developed for the TXP. The TXP score approximated an optimal health state for the majority of respondents, and recipients of different organs reported comparable health states. © 2021 The Authors. Published by Elsevier Inc. This is an open access article under the CC BY license (http://creativecommons.org/licenses/by/4.0/).

    Torquetenovirus Serum Load and Long-Term Outcomes in Renal Transplant Recipients

    Following transplantation, patients must take immunosuppressive medication for life. Torquetenovirus (TTV) is thought to be a marker of immunosuppression, and TTV DNA levels after organ transplantation have been investigated: high TTV levels are associated with an increased risk of infections, and low TTV levels with an increased risk of rejection. However, this has been investigated in studies with relatively short follow-up periods. We hypothesized that TTV levels can be used to assess long-term outcomes after renal transplantation. Serum samples of 666 renal transplant recipients were tested for TTV DNA. Samples were taken at least one year after renal transplantation, when TTV levels are thought to be relatively stable. Patient data were reviewed for graft failure, all-cause mortality and death due to infectious causes. Our data indicate that high TTV levels, sampled more than one year post-transplantation, are associated with all-cause mortality, with a hazard ratio (HR) of 1.12 (95% CI 1.02-1.23) per log10 increase in TTV viral load (p = 0.02). Additionally, high TTV levels were also associated with death due to infectious causes (HR 1.20 (95% CI 1.01-1.43), p = 0.04). TTV levels decrease in the years following renal transplantation, but remain elevated longer than previously thought. This study shows that TTV levels may aid in predicting long-term outcomes, namely all-cause mortality and death due to an infectious cause, in renal transplant patients sampled over one year post-transplantation.
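    Under a log-linear Cox model such as the one reported in this abstract, the per-log10 hazard ratio composes multiplicatively across differences in viral load. A minimal sketch of that arithmetic (illustrative only, not the study's code; the function name is hypothetical):

```python
import math

# All-cause mortality hazard ratio per log10 increase in TTV viral load,
# as reported in the abstract (point estimate; 95% CI 1.02-1.23).
HR_PER_LOG10 = 1.12

def relative_hazard(load_a: float, load_b: float) -> float:
    """Relative all-cause mortality hazard of viral load_a vs. load_b,
    assuming the reported log-linear relationship."""
    delta_log10 = math.log10(load_a) - math.log10(load_b)
    return HR_PER_LOG10 ** delta_log10

# Example: a 100-fold (2 log10) higher TTV load.
print(round(relative_hazard(1e6, 1e4), 2))  # 1.25 (= 1.12 ** 2)
```

    This makes the magnitude of the reported effect concrete: each 10-fold increase in viral load multiplies the estimated hazard by 1.12.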

    High plasma guanidinoacetate-to-homoarginine ratio is associated with high all-cause and cardiovascular mortality rate in adult renal transplant recipients

    L-Arginine:glycine amidinotransferase (AGAT) is the main producer of the creatine precursor guanidinoacetate (GAA) and of L-homoarginine (hArg). We and others previously reported lower levels of circulating and urinary hArg in renal transplant recipients (RTR) compared to healthy subjects. In adults, hArg has emerged as a novel risk factor for adverse renal and cardiovascular outcomes. Urinary GAA was found to be lower in children and adolescents with kidney transplants compared to healthy controls. Whether GAA is also a risk factor in the renal and cardiovascular systems of adults is not yet known. In the present study, we aimed to investigate the significance of circulating GAA and the GAA-to-hArg molar ratio (GAA/hArg) in adult RTR. We hypothesized that GAA/hArg represents a measure of the balanced state of AGAT activity in the kidneys, and would prospectively allow assessment of a potential association between GAA/hArg and long-term outcome in RTR. The median follow-up period was 5.4 years. Confounders and potential mediators of GAA/hArg associations were evaluated with multivariate linear regression analyses, and the association with all-cause and cardiovascular mortality or death-censored graft loss was studied with Cox regression analyses. The study cohort consisted of 686 stable RTR and 140 healthy kidney donors. Median plasma GAA concentration was significantly lower in the RTR than in the kidney donors before kidney donation: 2.19 [1.77-2.70] μM vs. 2.78 [2.89-3.35] μM (P < 0.001). In cross-sectional multivariable analyses in RTR, HDL cholesterol showed the strongest association with GAA/hArg. In prospective analyses in RTR, GAA/hArg was associated with a higher risk for all-cause mortality (hazard ratio (HR): 1.35 [95% CI 1.19-1.53]) and cardiovascular mortality (HR: 1.46 [95% CI 1.24-1.73]), independent of potential confounders. GAA, but not GAA/hArg, was associated with death-censored graft loss in crude survival and Cox regression analyses. The association of GAA with death-censored graft loss was lost after adjustment for eGFR. Our study suggests that in the kidneys of RTR, the AGAT-catalyzed biosynthesis of GAA is decreased. The finding that high GAA/hArg is associated with a higher risk for all-cause and cardiovascular mortality may suggest that low plasma hArg contributes more strongly than GAA to these adverse outcomes in RTR.

    A Scoping Review of Key Health Items in Self-Report Instruments Used Among Solid Organ Transplant Recipients

    The overall aim of this scoping review of the literature is twofold: (1) to provide an overview of all instruments that have been used to assess health-related quality of life (HRQoL) after solid organ transplantation and (2) to provide a list of health items they include to support future studies on the development of a new-generation HRQoL instrument. All studies that administered any form of HRQoL instrument to post-transplant solid organ recipients were identified in a comprehensive search of PubMed (MEDLINE), Embase, and Web of Science, with a cut-off date of May 2018. The search used various combinations of the following keywords: lung, heart, liver, kidney, or pancreas transplantation; quality of life; well-being; patient-reported outcome; instrument; questionnaire; and health survey. In total, 8013 distinct publications were identified and 1218 of these were selected for review. Among the instruments applied, 53 measured generic, 51 organ-specific, 271 domain-specific, and 43 transplant-specific HRQoL. A total of 78 distinct health items grouped into 16 sub-domains were identified and depicted graphically. The majority of publications did not report a logical rationale for the choice of a specific HRQoL instrument. The most commonly used types of instruments were generic health instruments, followed by domain-specific instruments. Despite the availability of transplant-specific instruments, few studies applied these types of instruments. Based on the 78 items, further research is planned to develop a patient-centered, transplant-specific HRQoL instrument that is concise, easy to apply (mobile application), and specifically related to the health issues of solid organ recipients.

    Phenotypic characterization of patients with deletions in the 3’-flanking SHOX region

    Context. Leri–Weill dyschondrosteosis is a clinically variable skeletal dysplasia, caused by SHOX deletions or mutations, or by a deletion of enhancer sequences in the 3’-flanking region. Recently, a 47.5 kb recurrent PAR1 deletion downstream of SHOX was reported, but its frequency and clinical importance are still unknown. Objective. This study aims to compare the clinical features of different sizes of deletions in the 3’-flanking SHOX region in order to determine the relevance of the regulatory sequences in this region. Design. We collected DNA from 28 families with deletions in the 3’-PAR1 region. Clinical data were available from 23 index patients and 21 relatives. Results. In 9 families (20 individuals) a large deletion (∼200–900 kb) was found, and in 19 families (35 individuals) a small deletion was demonstrated, equal to the recently described 47.5 kb PAR1 deletion. Median height SDS, sitting height/height ratio SDS and the presence of Madelung deformity in patients with the 47.5 kb deletion were not significantly different from those in patients with larger deletions. The index patients had a median height SDS that was slightly lower than that of their affected family members (p = 0.08). No significant differences were observed between male and female patients. Conclusions. The phenotype of patients with deletions in the 3’-PAR1 region is remarkably variable. Height, sitting height/height ratio and the presence of Madelung deformity were not significantly different between patients with the 47.5 kb recurrent PAR1 deletion and those with larger deletions, suggesting that this enhancer plays an important role in SHOX expression.