
    ECLS in Pediatric Cardiac Patients

    Extracorporeal life support (ECLS) is an important tool in the management of children with severe refractory cardiac and/or pulmonary failure. Currently, two forms of ECLS are available for neonates and children: extracorporeal membrane oxygenation (ECMO) and the ventricular assist device (VAD). Both techniques have their own advantages and disadvantages. The intra-aortic balloon pump is another ECLS device that has been used successfully in larger children, adolescents, and adults, but has found limited applicability in smaller children. In this review, we present the state of the art of ECMO in neonates and children with heart failure. ECMO is commonly used in a variety of settings to support critically ill patients with cardiac disease. However, strict patient selection and careful timing of intervention are required to avoid increasing the mortality and morbidity of these patients. Therefore, every attempt should be made to start ECLS "urgently" rather than "emergently," before end-organ dysfunction or circulatory collapse occurs. Even though exciting progress is being made in the development of VADs for long-term mechanical support in children, ECMO remains the mainstay of mechanical circulatory support in children with complex anatomy, particularly those needing rapid resuscitation and those with a functionally univentricular circulation. With increasing familiarity with ECMO, new indications have been added, such as extracorporeal cardiopulmonary resuscitation (ECPR). The literature supporting ECPR in children is growing, and reasonable survival rates have been achieved when support is initiated during active chest compressions following in-hospital cardiac arrest. Contraindications to ECLS have decreased in the last 5 years, and many centers now support patients with functionally univentricular circulations; improved results have recently been achieved in this complex subset of patients.

    Timely HAART initiation may pave the way for a better viral control

    Background: When to initiate antiretroviral therapy in HIV-infected patients is a difficult clinical decision. It is still a matter of discussion whether early highly active antiretroviral therapy (HAART) during primary HIV infection may influence the dynamics of the viral rebound after treatment interruption and, more generally, the overall course of the disease. Methods: In this article we use a computational model together with clinical data to identify the role of HAART timing in the residual capability to control HIV rebound after treatment suspension. Analyses of clinical data from three groups of patients initiating HAART before seroconversion (very early), during the acute phase (early), and in the chronic phase (late) reveal differences arising from the very early events of the viral infection. Results: The computational model allows a fine-grained assessment of the impact of HAART timing on the disease outcome, from acute to chronic HIV-1 infection. Both patients' data and computer simulations indicate that HAART timing may indeed affect the capability to control HIV after treatment discontinuation. In particular, we find a median time to viral rebound that is significantly longer in very early than in late patients. Conclusions: A timing threshold is identified, corresponding to approximately three weeks post-infection, after which the capability to control HIV replication is lost. Conversely, HAART initiated within three weeks of infection could preserve a significant control capability. This window may be related to the global triggering of uncontrolled immune activation, which affects both the preservation of residual immune competence and the establishment of the HIV reservoir.
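    The viral rebound dynamics discussed above can be illustrated with a generic target-cell-limited viral dynamics model. This is a minimal sketch, not the authors' computational model, and every parameter value below is invented for illustration:

```python
# Generic target-cell-limited viral dynamics (illustrative sketch only):
#   dT/dt = lam - d*T - (1 - eps)*beta*T*V   (uninfected target cells)
#   dI/dt = (1 - eps)*beta*T*V - delta*I     (productively infected cells)
#   dV/dt = p*I - c*V                        (free virus)
# eps is antiretroviral efficacy (eps = 0 means no therapy).

def simulate_viral_load(eps, days=100.0, dt=0.01):
    """Forward-Euler integration; returns viral load V at the final day."""
    lam, d, beta = 100.0, 0.1, 2e-5        # target-cell supply, death, infectivity
    delta, p, c = 0.5, 1000.0, 10.0        # infected-cell death, burst, clearance
    T, I, V = 1000.0, 0.0, 1000.0          # initial cells/mL and virions/mL
    for _ in range(int(days / dt)):
        dT = lam - d * T - (1 - eps) * beta * T * V
        dI = (1 - eps) * beta * T * V - delta * I
        dV = p * I - c * V
        T, I, V = T + dT * dt, I + dI * dt, V + dV * dt
    return V

# Effective therapy (eps = 0.9) drives viremia toward zero, while without
# therapy (eps = 0) the virus settles at a positive set point.
print(simulate_viral_load(0.9), simulate_viral_load(0.0))
```

    The sketch captures only the qualitative point of the abstract: once therapy is interrupted (eps back to 0), the system relaxes toward the untreated set point, which is why timing effects must come from mechanisms outside this minimal model (immune activation, reservoir seeding).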

    Real life turnaround time of blood cultures in the clinical microbiology laboratory: results of the first Italian survey, May 2015

    Background and aims: Blood culture (BC) results are essential to guide antimicrobial chemotherapy for patients with sepsis. However, BC is a time-consuming exam that can take several days. Reducing BC turnaround time (TAT) could improve multiple outcome parameters, and TAT monitoring is an important tool for measuring microbiology laboratory performance. The aim of this study was to provide an overview of BC TATs among Italian microbiology laboratories. Materials and methods: Five laboratories collected and recorded, over a one-month period, the date and time of BC processing events. Cumulative TATs were analysed using GraphPad software. Results: Participating laboratories reported data from 302 sepsis episodes. The median time from when the BC system produced a positive signal until Gram-stain results were reported was 7.6 hours. Rapid molecular identification and antimicrobial susceptibility testing (AST) were performed in 26.5% of BCs. Mean TAT for the identification report was significantly lower when a molecular approach was adopted (12 vs. 28.7 hours, P<0.001). Similarly, results of molecular AST were obtained more than 24 hours earlier than those of phenotypic AST (mean 13.2 vs. 47.6 hours, P<0.001). TATs from BC positivity in laboratories open 7 days/week were not significantly lower than in laboratories open 6 days/week. Conclusions: BC is a time-consuming exam; however, molecular identification and AST methods can drastically reduce time to results. The lack of difference between the TATs of laboratories working 7 days/week and 6 days/week, coupled with a high rate of BCs turning positive during the night, leads to the conclusion that the most urgent measure to reduce TATs is the expansion of regular laboratory duty hours.
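    A TAT of this kind reduces to simple timestamp arithmetic over paired processing events. A minimal sketch, with invented event times rather than survey data, might look like:

```python
from datetime import datetime
from statistics import median

# Hypothetical BC episodes: (positivity-signal time, Gram-stain report time).
# These timestamps are invented for illustration, not taken from the survey.
episodes = [
    ("2015-05-01 02:10", "2015-05-01 09:05"),
    ("2015-05-02 23:40", "2015-05-03 08:15"),
    ("2015-05-03 11:20", "2015-05-03 14:55"),
]

FMT = "%Y-%m-%d %H:%M"

def tat_hours(start: str, end: str) -> float:
    """Turnaround time in hours between two processing events."""
    delta = datetime.strptime(end, FMT) - datetime.strptime(start, FMT)
    return delta.total_seconds() / 3600.0

tats = [tat_hours(start, end) for start, end in episodes]
print(f"median positivity-to-Gram-stain TAT: {median(tats):.1f} h")
```

    Note that timestamps crossing midnight (a common case, given the high rate of BCs turning positive overnight) are handled transparently because the subtraction is done on full datetimes, not clock times.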

    Modeling lymphocyte homing and encounters in lymph nodes

    Background: The efficiency of lymph nodes depends on tissue structure and organization, which allow the coordination of lymphocyte traffic. Despite their essential role, our understanding of lymph-node-specific mechanisms is still incomplete and currently a topic of intense research. Results: In this paper, we present a hybrid discrete/continuous model of the lymph node that accounts for differences in cell velocity and chemotactic response, influenced by the spatial compartmentalization of the lymph node and by the regulation of cell migration, encounter, and antigen presentation during the inflammation process. Conclusion: Our model reproduces the correct timing of an immune response, including the observed time delay between the duplication of T helper cells and the duplication of B cells in response to antigen exposure. Furthermore, we investigate the consequences of the absence of dendritic cells at different times during infection, and the dependence of system dynamics on the regulation of lymphocyte exit from lymph nodes. In both cases, the model predicts the emergence of an impaired immune response, i.e., a response significantly reduced in magnitude. Dendritic cell removal is also shown to delay the response time with respect to normal conditions.
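    The discrete side of such a model can be caricatured as biased random walkers on a grid. The toy below is not the paper's hybrid model; grid size, bias strength, and the single dendritic cell position are all invented. It only illustrates how a chemotactic bias raises the T-cell/DC encounter rate over pure diffusion:

```python
import random

SIZE = 21        # toy grid (hypothetical lymph node cross-section)
DC = (10, 10)    # single static dendritic cell at the centre (hypothetical)

def step(pos, bias):
    """Move a T cell one grid step: with probability `bias`, step toward the
    DC (crude chemotaxis); otherwise step in a uniformly random direction."""
    x, y = pos
    if random.random() < bias:
        dx = (DC[0] > x) - (DC[0] < x)
        dy = (DC[1] > y) - (DC[1] < y)
    else:
        dx, dy = random.choice([(1, 0), (-1, 0), (0, 1), (0, -1)])
    # Clamp to the grid boundary.
    return (min(max(x + dx, 0), SIZE - 1), min(max(y + dy, 0), SIZE - 1))

def encounters(bias, n_cells=50, n_steps=200):
    """Total T-cell/DC co-location events over the simulation."""
    random.seed(0)  # reproducible toy run
    cells = [(random.randrange(SIZE), random.randrange(SIZE))
             for _ in range(n_cells)]
    hits = 0
    for _ in range(n_steps):
        cells = [step(c, bias) for c in cells]
        hits += sum(1 for c in cells if c == DC)
    return hits

# Chemotaxis (bias > 0) should yield many more encounters than diffusion alone.
print(encounters(bias=0.3), encounters(bias=0.0))
```

    In the real hybrid scheme, the bias would come from a continuous chemokine field solved on the same domain rather than from a fixed probability, and compartment boundaries would modulate speed and direction.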

    A proof-of-concept study on the genomic evolution of SARS-CoV-2 in molnupiravir-treated, Paxlovid-treated and drug-naïve patients

    Little is known about SARS-CoV-2 evolution under molnupiravir and Paxlovid, the only antivirals approved for COVID-19 treatment. By investigating SARS-CoV-2 variability in 8 molnupiravir-treated, 7 Paxlovid-treated and 5 drug-naive individuals at 4 time-points (Days 0, 2, 5 and 7), a higher genetic distance is found under molnupiravir pressure than under Paxlovid or no-drug pressure (nucleotide substitutions/site, mean ± standard error: 18.7 × 10⁻⁴ ± 2.1 × 10⁻⁴ vs. 3.3 × 10⁻⁴ ± 0.8 × 10⁻⁴ vs. 3.1 × 10⁻⁴ ± 0.8 × 10⁻⁴, P = 0.0003), peaking between Day 2 and Day 5. Molnupiravir drives the emergence of more G-to-A and C-to-T transitions than other mutations (P = 0.031). SARS-CoV-2 selective evolution under molnupiravir pressure does not differ from that under Paxlovid or no-drug pressure, except for orf8 (dN > dS, P = 0.001); few amino acid mutations are enriched at specific sites. No RNA-dependent RNA polymerase (RdRp) or main protease (Mpro) mutations conferring resistance to molnupiravir or Paxlovid are found. This proof-of-concept study defines SARS-CoV-2 within-host evolution during antiviral treatment, confirming the higher in vivo variability induced by molnupiravir compared with Paxlovid and no treatment, albeit not resulting in apparent mutation selection.
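    The two quantities reported above, substitutions per site and the transition spectrum, can be illustrated with a toy p-distance computation on invented sequences. This is a simplification for intuition only; the study's distances come from deep-sequencing data, not a pair of short strings:

```python
def p_distance(ref: str, qry: str) -> float:
    """Proportion of differing sites between two equal-length aligned sequences
    (a crude stand-in for nucleotide substitutions per site)."""
    if len(ref) != len(qry):
        raise ValueError("sequences must be aligned to equal length")
    diffs = sum(1 for a, b in zip(ref, qry) if a != b)
    return diffs / len(ref)

def transition_counts(ref: str, qry: str) -> dict:
    """Tally G->A and C->T changes, the transitions enriched under
    molnupiravir, versus everything else."""
    counts = {"G->A": 0, "C->T": 0, "other": 0}
    for a, b in zip(ref, qry):
        if a == b:
            continue
        key = f"{a}->{b}"
        counts[key if key in counts else "other"] += 1
    return counts

# Invented Day-0 and Day-5 consensus fragments (18 nt, 3 substitutions).
day0 = "ATGGCGTACGTTACGGAC"
day5 = "ATAGCGTATGTTACAGAC"
print(p_distance(day0, day5))        # fraction of sites changed
print(transition_counts(day0, day5)) # mutation spectrum of the changes
```

    Real analyses would additionally correct for multiple substitutions at a site and weigh intra-host variant frequencies, but the per-site normalization shown here is why distances in the abstract are quoted in units of 10⁻⁴ substitutions/site.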

    Immune control of HIV-1 infection after therapy interruption: immediate versus deferred antiretroviral therapy

    Background: The optimal stage at which to initiate antiretroviral therapy in HIV-1-infected patients is still a matter of debate. Methods: We present computer simulations of HIV-1 infection aimed at identifying the pros and cons of immediate as compared to deferred Highly Active Antiretroviral Therapy (HAART). Results: Our simulations highlight that a prompt specific CD8+ cytotoxic T lymphocyte response is detected when therapy is delayed. Compared to very early initiation of HAART, in deferred-treatment patients CD8+ T cells manage to mediate the decline of viremia in a shorter time and, at interruption of therapy, the virus experiences a stronger immune pressure. We also observe, however, that the immunological effects of the therapy fade with time in both therapeutic regimens. Thus, within one year from discontinuation, the viral burden recovers to the value at which it would level off in the absence of therapy. In summary, the simulations show that immediate therapy does not prolong the disease-free period and does not confer a survival benefit when compared to treatment started during the chronic infection phase. Conclusion: Since no therapy to date guarantees life-long protection, deferral of therapy should be preferred in order to minimize the risk of adverse effects, the occurrence of drug resistance, and the costs of treatment.

    Neonatal invasive candidiasis in low-and-middle-income countries: data from the NeoOBS study

    Neonatal invasive candidiasis (NIC) carries significant morbidity and mortality. Reports have shown a different profile of the neonates affected by NIC, and of fluconazole-resistant Candida spp. isolates, in low- and middle-income countries (LMICs) compared to high-income countries (HICs). We describe the epidemiology, Candida spp. distribution, treatment and outcomes of neonates with NIC from LMICs enrolled in a global, prospective, longitudinal, observational cohort study (NeoOBS) of hospitalised infants < 60 days postnatal age with sepsis (August 2018-February 2021). 127 neonates from 14 hospitals in 8 countries with Candida spp. isolated from blood culture were included. The median gestational age of affected neonates was 30 weeks (IQR: 28-34) and the median birth weight was 1270 g (IQR: 990-1692). Only a minority met high-risk criteria such as birth at < 28 weeks [19% (24/127)] or birth weight < 1000 g [27% (34/127)]. The most common Candida species were C. albicans (n = 45, 35%), C. parapsilosis (n = 38, 30%) and Candida auris (n = 18, 14%). The majority of C. albicans isolates were fluconazole susceptible, whereas 59% of C. parapsilosis isolates were fluconazole resistant. Amphotericin B was the most common antifungal used [74% (78/105)], followed by fluconazole [22% (23/105)]. Death by day 28 post-enrolment occurred in 22% (28/127). To our knowledge, this is the largest multi-country cohort of NIC in LMICs. Most of these neonates would not have been considered at high risk for NIC in HICs. A substantial proportion of isolates was resistant to fluconazole, the first-choice agent. Understanding the burden of NIC in LMICs is essential to guide future research and treatment guidelines.