41 research outputs found

    Transfusion of fresh frozen plasma in non-bleeding ICU patients -TOPIC TRIAL: study protocol for a randomized controlled trial

    Abstract
    Background: Fresh frozen plasma (FFP) is an effective therapy to correct a deficiency of multiple coagulation factors during bleeding. In recent years, use of FFP has increased, particularly in patients in the Intensive Care Unit (ICU), and has expanded to include prophylactic use in patients with a coagulopathy prior to undergoing an invasive procedure. Retrospective studies suggest that prophylactic use of FFP does not prevent bleeding but carries the risk of transfusion-related morbidity. Nevertheless, up to 50% of FFP is administered to non-bleeding ICU patients. To investigate whether prophylactic FFP transfusions to critically ill patients can be safely omitted, a multi-center randomized clinical trial is being conducted in ICU patients with a coagulopathy undergoing an invasive procedure.
    Methods: A non-inferiority, prospective, multicenter, randomized, open-label, blinded end-point evaluation (PROBE) trial. In the intervention group, prophylactic transfusion of FFP prior to an invasive procedure is omitted; the control group receives a fixed dose of 12 ml/kg. The primary outcome measure is relevant bleeding. Secondary outcome measures are minor bleeding, correction of the International Normalized Ratio, onset of acute lung injury, duration of mechanical ventilation, and length of Intensive Care Unit stay.
    Discussion: The Transfusion of Fresh Frozen Plasma in non-bleeding ICU patients (TOPIC) trial is the first multi-center randomized controlled trial powered to investigate whether it is safe to withhold FFP transfusion from coagulopathic critically ill patients undergoing an invasive procedure.
    Trial Registration: Dutch Trial Register NTR2262 and ClinicalTrials.gov NCT01143909 (http://www.clinicaltrials.gov/ct2/show/NCT01143909)

    Optimally timing primaquine treatment to reduce Plasmodium falciparum transmission in low endemicity Thai-Myanmar border populations

    Abstract
    Background: Effective malaria control has successfully reduced the malaria burden in many countries, but to eliminate malaria, these countries will need to further improve their control efforts. Here, a malaria control programme was critically evaluated in a very low-endemicity Thai-Myanmar border population, where early detection and prompt treatment have substantially reduced, though not ended, Plasmodium falciparum transmission, in part because of the carriage of late-maturing gametocytes that remain post-treatment. To counter this effect, the WHO recommends a single oral dose of primaquine along with an effective blood schizonticide. However, while the effectiveness of primaquine as a gametocytocidal agent is widely documented, the mismatch between primaquine's short half-life, the long delay before gametocyte maturation, and the proper timing of primaquine administration has not been studied.
    Methods: Mathematical models were constructed to simulate 8-year surveillance data, from 1999 to 2006, of seven villages along the Thai-Myanmar border. A simple model was developed to consider primaquine pharmacokinetics and pharmacodynamics, gametocyte carriage, and infectivity.
    Results: In these populations, transmission intensity is very low, so the P. falciparum parasite rate is strongly linked to imported malaria and to the fraction of cases not treated. Given a 3.6-day gametocyte half-life, the estimated duration of infectiousness would be reduced by 10 days for every 10-fold reduction in initial gametocyte densities. Infectiousness from mature gametocytes would last two to four weeks and sustain some transmission, depending on the initial parasite densities, but the residual mature gametocytes could be eliminated by primaquine. Because of the short half-life of primaquine (approximately eight hours), early administration (within three days after an acute attack) means that primaquine would no longer be present when mature gametocytes emerged eight days after the appearance of asexual blood-stage parasites. A model of optimal timing suggests that a primaquine follow-up approximately eight days after a clinical episode could further reduce the duration of infectiousness from two to four weeks down to a few days. The prospects of malaria elimination would be substantially improved by changing the timing of primaquine administration and combining this with effective detection and management of imported malaria cases. The value of using primaquine to reduce residual gametocyte densities and malaria transmission was considered in the context of a malaria transmission model; the added benefit of the primaquine follow-up treatment would be relatively large only if a high fraction of patients (>95%) is initially treated with schizonticidal agents.
    Conclusion: Mathematical models have previously identified the long duration of P. falciparum asexual blood-stage infections as a critical point in maintaining malaria transmission, but infectiousness can persist for two to four weeks because of residual populations of mature gametocytes. Simulations from new models suggest that, in areas where a large fraction of malaria cases are treated, curing the asexual parasitaemia in a primary infection and curing mature gametocyte infections with an eight-day follow-up treatment with primaquine have approximately the same proportional effects on reducing the infectious period. Changing the timing of primaquine administration would, in all likelihood, interrupt transmission in this area, which has very good health systems and very low endemicity.
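The half-life arithmetic in the Results above can be sketched directly. A minimal Python illustration, assuming pure exponential decay and a hypothetical infectiousness threshold (the study's full transmission model includes additional dynamics, and the density values here are made up for illustration):

```python
from math import log2

# Half-lives cited in the abstract above.
GAMETOCYTE_HALF_LIFE_DAYS = 3.6
PRIMAQUINE_HALF_LIFE_HOURS = 8.0

def days_infectious(initial_density, threshold):
    """Days until a density decaying as D0 * 0.5**(t / half_life)
    falls to a hypothetical infectiousness threshold."""
    if initial_density <= threshold:
        return 0.0
    return GAMETOCYTE_HALF_LIFE_DAYS * log2(initial_density / threshold)

# Each 10-fold drop in initial density shortens the infectious period by a
# fixed amount, half_life * log2(10), about 12 days under pure exponential
# decay (the abstract's fuller model estimates about 10 days).
delta = days_infectious(1e4, 1.0) - days_infectious(1e3, 1.0)
print(round(delta, 1))  # → 12.0

# Primaquine given on day 0 is essentially gone before mature gametocytes
# emerge around day 8: the fraction left after just 3 days is
# 0.5**(72 / 8), roughly 0.002 -- hence the case for an ~8-day follow-up.
print(0.5 ** (72 / PRIMAQUINE_HALF_LIFE_HOURS))  # → 0.001953125
```

The absolute day counts depend on the assumed threshold; the qualitative point, that each 10-fold reduction in starting density buys a fixed number of infectious days, does not.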

    Epigenetic abnormalities in myeloproliferative neoplasms: a target for novel therapeutic strategies

    The myeloproliferative neoplasms (MPNs) are a group of clonal hematological malignancies characterized by a hypercellular bone marrow and a tendency to develop thrombotic complications and to evolve to myelofibrosis and acute leukemia. Unlike chronic myelogenous leukemia, where a single disease-initiating genetic event has been identified, a more complicated series of genetic mutations appears to be responsible for the BCR-ABL1-negative MPNs, which include polycythemia vera, essential thrombocythemia, and primary myelofibrosis. Recent studies have revealed a number of epigenetic alterations that also likely contribute to disease pathogenesis and determine clinical outcome. Increasing evidence indicates that alterations in DNA methylation, histone modification, and microRNA expression patterns can collectively influence gene expression and potentially contribute to MPN pathogenesis. Examples include mutations in genes encoding proteins that modify chromatin structure (EZH2, ASXL1, IDH1/2, JAK2V617F, and IKZF1) as well as epigenetic modification of genes critical for cell proliferation and survival (suppressors of cytokine signaling, polycythemia rubra vera-1, CXC chemokine receptor 4, and histone deacetylase (HDAC)). These epigenetic lesions serve as novel targets for experimental therapeutic interventions. Clinical trials are currently underway evaluating HDAC inhibitors and DNA methyltransferase inhibitors for the treatment of patients with MPNs.

    Laparoscopy in management of appendicitis in high-, middle-, and low-income countries: a multicenter, prospective, cohort study.

    BACKGROUND: Appendicitis is the most common abdominal surgical emergency worldwide. Differences between high- and low-income settings in the availability of laparoscopic appendectomy, alternative management choices, and outcomes are poorly described. The aim was to identify variation in surgical management and outcomes of appendicitis within low-, middle-, and high-Human Development Index (HDI) countries worldwide. METHODS: This is a multicenter, international prospective cohort study. Consecutive sampling of patients undergoing emergency appendectomy over 6 months was conducted. Follow-up lasted 30 days. RESULTS: 4546 patients from 52 countries underwent appendectomy (2499 in high-, 1540 in middle-, and 507 in low-HDI groups). Surgical site infection (SSI) rates were higher in low-HDI countries (OR 2.57, 95% CI 1.33-4.99, p = 0.005) but not middle-HDI countries (OR 1.38, 95% CI 0.76-2.52, p = 0.291), compared with high-HDI countries after adjustment. A laparoscopic approach was common in high-HDI countries (1693/2499, 67.7%), but infrequent in low-HDI (41/507, 8.1%) and middle-HDI (132/1540, 8.6%) groups. After accounting for case mix, laparoscopy was still associated with fewer overall complications (OR 0.55, 95% CI 0.42-0.71, p < 0.001) and SSIs (OR 0.22, 95% CI 0.14-0.33, p < 0.001). In propensity-score-matched groups within low-/middle-HDI countries, laparoscopy was still associated with fewer overall complications (OR 0.23, 95% CI 0.11-0.44) and SSI (OR 0.21, 95% CI 0.09-0.45). CONCLUSION: A laparoscopic approach is associated with better outcomes, and availability appears to differ by country HDI. Despite the profound clinical, operational, and financial barriers to its widespread introduction, laparoscopy could significantly improve outcomes for patients in low-resource environments. TRIAL REGISTRATION: NCT02179112
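The adjusted odds ratios above come from regression models, but the basic construction of an odds ratio and its Wald confidence interval from a 2x2 table can be sketched as follows; the counts here are hypothetical, chosen only for illustration, not taken from the study:

```python
from math import exp, log, sqrt

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Unadjusted odds ratio with a Wald 95% CI from a 2x2 table:
    a = exposed with outcome,   b = exposed without outcome,
    c = unexposed with outcome, d = unexposed without outcome."""
    or_ = (a * d) / (b * c)
    se_log_or = sqrt(1/a + 1/b + 1/c + 1/d)  # standard error of log(OR)
    lo = exp(log(or_) - z * se_log_or)
    hi = exp(log(or_) + z * se_log_or)
    return or_, lo, hi

# Hypothetical SSI counts (NOT the study's data): 30/507 events in one
# group versus 60/2499 in another.
or_, lo, hi = odds_ratio_ci(30, 477, 60, 2439)
print(f"OR {or_:.2f} (95% CI {lo:.2f} to {hi:.2f})")
```

The published estimates additionally adjust for case mix, so a raw 2x2 calculation like this would not reproduce them exactly.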

    Pooled analysis of WHO Surgical Safety Checklist use and mortality after emergency laparotomy

    Background The World Health Organization (WHO) Surgical Safety Checklist has fostered safe practice for 10 years, yet its place in emergency surgery has not been assessed on a global scale. The aim of this study was to evaluate reported checklist use in emergency settings and examine the relationship with perioperative mortality in patients who had emergency laparotomy. Methods In two multinational cohort studies, adults undergoing emergency laparotomy were compared with those having elective gastrointestinal surgery. Relationships between reported checklist use and mortality were determined using multivariable logistic regression and bootstrapped simulation. Results Of 12 296 patients included from 76 countries, 4843 underwent emergency laparotomy. After adjusting for patient and disease factors, checklist use before emergency laparotomy was more common in countries with a high Human Development Index (HDI) (2455 of 2741, 89.6 per cent) than in countries with a middle (753 of 1242, 60.6 per cent; odds ratio (OR) 0.17, 95 per cent c.i. 0.14 to 0.21, P < 0.001) or low (363 of 860, 42.2 per cent; OR 0.08, 0.07 to 0.10, P < 0.001) HDI. Checklist use was less common in elective surgery than for emergency laparotomy in high-HDI countries (risk difference -9.4 (95 per cent c.i. -11.9 to -6.9) per cent; P < 0.001), but the relationship was reversed in low-HDI countries (+12.1 (+7.0 to +17.3) per cent; P < 0.001). In multivariable models, checklist use was associated with a lower 30-day perioperative mortality (OR 0.60, 0.50 to 0.73; P < 0.001). The greatest absolute benefit was seen for emergency surgery in low- and middle-HDI countries. Conclusion Checklist use in emergency laparotomy was associated with a significantly lower perioperative mortality rate. Checklist use in low-HDI countries was half that in high-HDI countries. Peer reviewed.
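The Methods above pair multivariable regression with bootstrapped simulation. As a rough sketch of the bootstrap idea only, using made-up mortality counts rather than the study's cohorts, a percentile bootstrap for a difference in proportions looks like this:

```python
import random

random.seed(0)  # reproducible resamples

def bootstrap_risk_diff_ci(deaths_a, n_a, deaths_b, n_b,
                           reps=2000, alpha=0.05):
    """Percentile-bootstrap CI for the difference in mortality
    proportions between two groups (illustrative, unadjusted)."""
    group_a = [1] * deaths_a + [0] * (n_a - deaths_a)
    group_b = [1] * deaths_b + [0] * (n_b - deaths_b)
    diffs = []
    for _ in range(reps):
        # Resample each group with replacement and record the difference.
        resample_a = [random.choice(group_a) for _ in range(n_a)]
        resample_b = [random.choice(group_b) for _ in range(n_b)]
        diffs.append(sum(resample_a) / n_a - sum(resample_b) / n_b)
    diffs.sort()
    return diffs[int(reps * alpha / 2)], diffs[int(reps * (1 - alpha / 2)) - 1]

# Hypothetical counts (NOT the study's data): 90/1000 deaths without
# checklist use versus 60/1000 with it.
lo, hi = bootstrap_risk_diff_ci(90, 1000, 60, 1000)
print(f"risk difference 0.030, 95% CI {lo:.3f} to {hi:.3f}")
```

The study's actual estimates come from multivariable models over the real cohorts; this sketch only shows the resampling mechanism.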

    IFNγ and IL-12 restrict Th2 responses during Helminth/Plasmodium co-infection and promote IFNγ from Th2 cells

    Parasitic helminths establish chronic infections in mammalian hosts. Helminth/Plasmodium co-infections occur frequently in endemic areas. However, it is unclear whether Plasmodium infections compromise anti-helminth immunity, contributing to the chronicity of infection. Immunity to Plasmodium or helminths requires divergent CD4+ T cell-driven responses, dominated by IFNγ or IL-4, respectively. Recent literature has indicated that Th cells, including Th2 cells, have phenotypic plasticity with the ability to produce non-lineage-associated cytokines. Whether such plasticity occurs during co-infection is unclear. In this study, we observed reduced anti-helminth Th2 cell responses and compromised anti-helminth immunity during Heligmosomoides polygyrus and Plasmodium chabaudi co-infection. Using newly established triple cytokine reporter mice (Il4gfp/Ifngyfp/Il17aFP635), we demonstrated that Il4gfp+ Th2 cells purified from in vitro cultures or isolated ex vivo from helminth-infected mice up-regulated IFNγ following adoptive transfer into Rag1-/- mice infected with P. chabaudi. Functionally, Th2 cells that up-regulated IFNγ were transcriptionally re-wired and protected recipient mice from high parasitemia. Mechanistically, TCR stimulation and responsiveness to IL-12 and IFNγ, but not type I IFN, were required for optimal IFNγ production by Th2 cells. Finally, blockade of IL-12 and IFNγ during co-infection partially preserved anti-helminth Th2 responses. In summary, this study demonstrates that Th2 cells retain substantial plasticity with the ability to produce IFNγ during Plasmodium infection. Consequently, co-infection with Plasmodium spp. may contribute to the chronicity of helminth infection by reducing anti-helminth Th2 cells and converting them into IFNγ-secreting cells.

    TET proteins and the control of cytosine demethylation in cancer


    Driver mutations of cancer epigenomes


    The evolving SARS-CoV-2 epidemic in Africa: Insights from rapidly expanding genomic surveillance

    INTRODUCTION Investment in Africa over the past year with regard to severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) sequencing has led to a massive increase in the number of sequences, which, to date, exceeds 100,000 sequences generated to track the pandemic on the continent. These sequences have profoundly affected how public health officials in Africa have navigated the COVID-19 pandemic. RATIONALE We demonstrate how the first 100,000 SARS-CoV-2 sequences from Africa have helped monitor the epidemic on the continent, how genomic surveillance expanded over the course of the pandemic, and how we adapted our sequencing methods to deal with an evolving virus. Finally, we also examine how viral lineages have spread across the continent in a phylogeographic framework to gain insights into the underlying temporal and spatial transmission dynamics for several variants of concern (VOCs). RESULTS Our results indicate that the number of countries in Africa that can sequence the virus within their own borders is growing and that this is coupled with a shorter turnaround time from the time of sampling to sequence submission. Ongoing evolution necessitated the continual updating of primer sets, and, as a result, eight primer sets were designed in tandem with viral evolution and used to ensure effective sequencing of the virus. The pandemic unfolded through multiple waves of infection that were each driven by distinct genetic lineages, with B.1-like ancestral strains associated with the first pandemic wave of infections in 2020. Successive waves on the continent were fueled by different VOCs, with Alpha and Beta cocirculating in distinct spatial patterns during the second wave and Delta and Omicron affecting the whole continent during the third and fourth waves, respectively. 
Phylogeographic reconstruction points toward distinct differences in viral importation and exportation patterns associated with the Alpha, Beta, Delta, and Omicron variants and subvariants, when considering both Africa versus the rest of the world and viral dissemination within the continent. Our epidemiological and phylogenetic inferences therefore underscore the heterogeneous nature of the pandemic on the continent and highlight key insights and challenges, for instance, recognizing the limitations of low testing proportions. We also highlight the early warning capacity that genomic surveillance in Africa has had for the rest of the world with the detection of new lineages and variants, the most recent being the characterization of various Omicron subvariants. CONCLUSION Sustained investment for diagnostics and genomic surveillance in Africa is needed as the virus continues to evolve. This is important not only to help combat SARS-CoV-2 on the continent but also because it can be used as a platform to help address the many emerging and reemerging infectious disease threats in Africa. In particular, capacity building for local sequencing within countries or within the continent should be prioritized because this is generally associated with shorter turnaround times, providing the most benefit to local public health authorities tasked with pandemic response and mitigation and allowing for the fastest reaction to localized outbreaks. These investments are crucial for pandemic preparedness and response and will serve the health of the continent well into the 21st century.

    Global variation in anastomosis and end colostomy formation following left-sided colorectal resection

    Background End colostomy rates following colorectal resection vary across institutions in high-income settings, being influenced by patient, disease, surgeon and system factors. This study aimed to assess global variation in end colostomy rates after left-sided colorectal resection. Methods This study comprised an analysis of the GlobalSurg-1 and -2 international, prospective, observational cohort studies (2014, 2016), including consecutive adult patients undergoing elective or emergency left-sided colorectal resection within discrete 2-week windows. Countries were grouped into high-, middle- and low-income tertiles according to the United Nations Human Development Index (HDI). Factors associated with colostomy formation versus primary anastomosis were explored using a multilevel, multivariable logistic regression model. Results In total, 1635 patients from 242 hospitals in 57 countries undergoing left-sided colorectal resection were included: 113 (6·9 per cent) from low-HDI, 254 (15·5 per cent) from middle-HDI and 1268 (77·6 per cent) from high-HDI countries. There was a higher proportion of patients with perforated disease (57·5, 40·9 and 35·4 per cent; P < 0·001) and subsequent use of end colostomy (52·2, 24·8 and 18·9 per cent; P < 0·001) in low- compared with middle- and high-HDI settings. The association with colostomy use in low-HDI settings persisted (odds ratio (OR) 3·20, 95 per cent c.i. 1·35 to 7·57; P = 0·008) after risk adjustment for malignant disease (OR 2·34, 1·65 to 3·32; P < 0·001), emergency surgery (OR 4·08, 2·73 to 6·10; P < 0·001), time to operation of at least 48 h (OR 1·99, 1·28 to 3·09; P = 0·002) and disease perforation (OR 4·00, 2·81 to 5·69; P < 0·001). Conclusion Global differences existed in the proportion of patients receiving end stomas after left-sided colorectal resection based on income, which went beyond case mix alone.