    Successful aspiration thrombectomy in a patient with submassive, intermediate-risk pulmonary embolism following COVID-19 pneumonia

    A 64-year-old female patient presented with severe dyspnea shortly after apparent recovery from COVID-19. Chest computed tomography revealed central pulmonary embolism, and ultrasonography showed a deep vein thrombosis of her right leg. The patient was tachycardic, with evidence of right ventricular strain on echocardiography. An interdisciplinary decision for interventional therapy was made. Angiographic aspiration thrombectomy achieved a significant reduction of thrombus material and improved flow in the pulmonary arteries, with immediate, marked clinical improvement and subsequent normalization of functional echocardiographic parameters. This case adds to the emerging evidence of severe thromboembolic complications following COVID-19 and suggests that aspiration thrombectomy can be considered in intermediate-risk pulmonary embolism.

    Effect of angiotensin-converting enzyme inhibitor and angiotensin receptor blocker initiation on organ support-free days in patients hospitalized with COVID-19

    IMPORTANCE Overactivation of the renin-angiotensin system (RAS) may contribute to poor clinical outcomes in patients with COVID-19. OBJECTIVE To determine whether angiotensin-converting enzyme (ACE) inhibitor or angiotensin receptor blocker (ARB) initiation improves outcomes in patients hospitalized for COVID-19. DESIGN, SETTING, AND PARTICIPANTS In an ongoing, adaptive platform randomized clinical trial, 721 critically ill and 58 non–critically ill hospitalized adults were randomized to receive an RAS inhibitor or control between March 16, 2021, and February 25, 2022, at 69 sites in 7 countries (final follow-up on June 1, 2022). INTERVENTIONS Patients were randomized to receive open-label initiation of an ACE inhibitor (n = 257), ARB (n = 248), ARB in combination with DMX-200 (a chemokine receptor-2 inhibitor; n = 10), or no RAS inhibitor (control; n = 264) for up to 10 days. MAIN OUTCOMES AND MEASURES The primary outcome was organ support–free days, a composite of hospital survival and days alive without cardiovascular or respiratory organ support through 21 days. The primary analysis was a bayesian cumulative logistic model. Odds ratios (ORs) greater than 1 represent improved outcomes. RESULTS On February 25, 2022, enrollment was discontinued due to safety concerns. Among 679 critically ill patients with available primary outcome data, the median age was 56 years and 239 participants (35.2%) were women. Median (IQR) organ support–free days among critically ill patients was 10 (–1 to 16) in the ACE inhibitor group (n = 231), 8 (–1 to 17) in the ARB group (n = 217), and 12 (0 to 17) in the control group (n = 231) (median adjusted odds ratios of 0.77 [95% bayesian credible interval, 0.58-1.06] for improvement for ACE inhibitor and 0.76 [95% credible interval, 0.56-1.05] for ARB compared with control). The posterior probabilities that ACE inhibitors and ARBs worsened organ support–free days compared with control were 94.9% and 95.4%, respectively. Hospital survival occurred in 166 of 231 critically ill participants (71.9%) in the ACE inhibitor group, 152 of 217 (70.0%) in the ARB group, and 182 of 231 (78.8%) in the control group (posterior probabilities that ACE inhibitor and ARB worsened hospital survival compared with control were 95.3% and 98.1%, respectively). CONCLUSIONS AND RELEVANCE In this trial, among critically ill adults with COVID-19, initiation of an ACE inhibitor or ARB did not improve, and likely worsened, clinical outcomes. TRIAL REGISTRATION ClinicalTrials.gov Identifier: NCT0273570
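
    As a rough illustrative check only (not the trial's actual bayesian cumulative logistic analysis), the reported posterior probabilities of harm can be approximated from the published median odds ratios and 95% credible intervals by treating the posterior of the log odds ratio as roughly normal. The Python sketch below assumes scipy is available; the function name is hypothetical.

    import math
    from scipy.stats import norm

    def prob_or_below_1(or_median, ci_low, ci_high):
        """Posterior P(OR < 1), i.e. probability of worse organ support-free days,
        under a normal approximation of the log odds ratio posterior."""
        mu = math.log(or_median)                                      # posterior median on log scale
        sigma = (math.log(ci_high) - math.log(ci_low)) / (2 * 1.96)   # from the 95% credible interval
        return norm.cdf((0.0 - mu) / sigma)

    # Reported values versus control (taken from the abstract above)
    print(prob_or_below_1(0.77, 0.58, 1.06))  # ACE inhibitor: ~0.95, close to the reported 94.9%
    print(prob_or_below_1(0.76, 0.56, 1.05))  # ARB: ~0.96, close to the reported 95.4%

    The closeness of these approximations to the reported values reflects the credible intervals being roughly symmetric on the log-odds scale; it is not a substitute for the trial's model-based probabilities.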

    The evolving SARS-CoV-2 epidemic in Africa: Insights from rapidly expanding genomic surveillance

    INTRODUCTION Investment in Africa over the past year with regard to severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) sequencing has led to a massive increase in the number of sequences generated to track the pandemic on the continent, which, to date, exceeds 100,000. These sequences have profoundly affected how public health officials in Africa have navigated the COVID-19 pandemic. RATIONALE We demonstrate how the first 100,000 SARS-CoV-2 sequences from Africa have helped monitor the epidemic on the continent, how genomic surveillance expanded over the course of the pandemic, and how we adapted our sequencing methods to deal with an evolving virus. Finally, we also examine how viral lineages have spread across the continent in a phylogeographic framework to gain insights into the underlying temporal and spatial transmission dynamics for several variants of concern (VOCs). RESULTS Our results indicate that the number of countries in Africa that can sequence the virus within their own borders is growing and that this is coupled with a shorter turnaround time from the time of sampling to sequence submission. Ongoing evolution necessitated the continual updating of primer sets, and, as a result, eight primer sets were designed in tandem with viral evolution and used to ensure effective sequencing of the virus. The pandemic unfolded through multiple waves of infection that were each driven by distinct genetic lineages, with B.1-like ancestral strains associated with the first pandemic wave of infections in 2020. Successive waves on the continent were fueled by different VOCs, with Alpha and Beta cocirculating in distinct spatial patterns during the second wave and Delta and Omicron affecting the whole continent during the third and fourth waves, respectively. Phylogeographic reconstruction points toward distinct differences in viral importation and exportation patterns associated with the Alpha, Beta, Delta, and Omicron variants and subvariants, when considering both Africa versus the rest of the world and viral dissemination within the continent. Our epidemiological and phylogenetic inferences therefore underscore the heterogeneous nature of the pandemic on the continent and highlight key insights and challenges, for instance, recognizing the limitations of low testing proportions. We also highlight the early warning capacity that genomic surveillance in Africa has had for the rest of the world with the detection of new lineages and variants, the most recent being the characterization of various Omicron subvariants. CONCLUSION Sustained investment in diagnostics and genomic surveillance in Africa is needed as the virus continues to evolve. This is important not only to help combat SARS-CoV-2 on the continent but also because it can be used as a platform to help address the many emerging and reemerging infectious disease threats in Africa. In particular, capacity building for local sequencing within countries or within the continent should be prioritized because this is generally associated with shorter turnaround times, providing the most benefit to local public health authorities tasked with pandemic response and mitigation and allowing for the fastest reaction to localized outbreaks. These investments are crucial for pandemic preparedness and response and will serve the health of the continent well into the 21st century.

    Correction: Revisiting the overlooked role of recycled sewage water in high-income countries in adenoviral outbreaks such as the “2022 pediatric hepatitis” outbreak

    BACKGROUND: On 5 April 2022, cases of adenovirus-induced hepatitis were reported in Scotland and subsequently in multiple parts of the world. While adenovirus infection normally presents with diarrhea, vomiting, and fever, these novel cases also resulted in fulminant hepatitis in non-immunocompromised patients. MAIN BODY: The responsible pathogen, adenovirus 41, is an enteric adenovirus. Enteric viruses are spread by the fecal-oral route and are resistant to drying; as such, they predominate in sewage water. Hepatitis outbreaks of this kind are normally restricted to poorer countries, yet this new wave seems to be confined mostly to high-income countries in Europe and the USA, which treat and recycle a higher percentage of sewage water. We also propose that the fulminant nature of this strain could be due to either a cross-species mutation or the general decrease in trained immunity following COVID-19 lockdowns. SHORT CONCLUSION: Evidence strongly suggests that the common link among these new hepatitis cases is recycled sewage water. This warrants further investigation of the origin of this outbreak by revisiting the role of recycled sewage water in causing such outbreaks.

    Prognostic value of high-sensitivity measurable residual disease assessment after front-line chemoimmunotherapy in chronic lymphocytic leukemia

    Measurable residual disease (MRD) status is widely adopted in clinical trials in patients with chronic lymphocytic leukemia (CLL). Findings from FILO group trials (CLL2007FMP, CLL2007SA, CLL2010FMP) enabled investigation of the prognostic value of high-sensitivity (0.7 × 10⁻⁵) MRD assessment using flow cytometry, in blood (N = 401) and bone marrow (N = 339), after fludarabine, cyclophosphamide, and rituximab (FCR)-based chemoimmunotherapy in a homogeneous population with long follow-up (median 49.5 months). Addition of low-level positive MRD < 0.01% to MRD ≥ 0.01% increased the proportion of cases with positive MRD in blood by 39% and in bone marrow by 27%. Compared to low-level positive MRD < 0.01%, undetectable MRD was associated with significantly longer progression-free survival (PFS) when using blood (72.2 versus 42.7 months; hazard ratio 0.40, p = 0.0003), but not when using bone marrow. Upon further stratification, positive blood MRD at any level, compared to undetectable blood MRD, was associated with shorter PFS irrespective of clinical complete or partial remission, and a lower 5-year PFS rate irrespective of IGHV-mutated or -unmutated status (all p < 0.05). In conclusion, high-sensitivity (0.0007%) MRD assessment in blood yielded additional prognostic information beyond the current standard sensitivity (0.01%). Our approach provides a model for future determination of the optimal MRD investigative strategy for any regimen.
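
    As a minimal sketch (an illustration of the stratification described above, not the FILO analysis code), the three MRD categories can be expressed as simple thresholds: undetectable below the high-sensitivity limit of 0.0007%, low-level positive between that limit and the conventional 0.01% cutoff, and positive at or above 0.01%. Values are fractions of leukocytes; the function name is hypothetical.

    DETECTION_LIMIT = 0.7e-5   # 0.0007%, high-sensitivity flow cytometry limit
    STANDARD_CUTOFF = 1e-4     # 0.01%, conventional MRD positivity threshold

    def classify_mrd(fraction: float) -> str:
        """Assign an MRD measurement to one of the strata used in the abstract."""
        if fraction < DETECTION_LIMIT:
            return "undetectable MRD"
        if fraction < STANDARD_CUTOFF:
            return "low-level positive MRD (< 0.01%)"
        return "positive MRD (>= 0.01%)"

    print(classify_mrd(5e-6))   # undetectable MRD
    print(classify_mrd(5e-5))   # low-level positive MRD (< 0.01%)
    print(classify_mrd(2e-4))   # positive MRD (>= 0.01%)

    Counting the middle category as positive, rather than only values at or above 0.01%, is what increased the proportion of MRD-positive cases by 39% in blood and 27% in bone marrow in the trials above.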

    Molecular Detection and Genotyping of Noroviruses

    Noroviruses (NoVs) are a major cause of gastroenteritis worldwide in humans and animals and are highly infectious viral agents. They are spread through feces and vomit via several transmission routes involving person-to-person contact, food, and water. Investigation of these transmission routes requires sensitive methods for detection of NoVs. As NoVs cannot be cultivated to date, detection of these viruses relies on the use of molecular methods such as (real-time) reverse transcriptase polymerase chain reaction (RT-PCR). Regardless of the matrix, detection of NoVs generally requires three successive steps: a virus extraction step, RNA purification, and molecular detection of the purified RNA, occasionally followed by molecular genotyping. This review focuses mainly on the molecular detection and genotyping of NoVs. The most conserved region in the genome of human-infective NoVs is the ORF1/ORF2 junction, and it has been used as a preferred target region for molecular detection of NoVs by methods such as (real-time) RT-PCR, NASBA, and LAMP. In the case of animal NoVs, broad-range molecular assays have most frequently been applied for molecular detection. Regarding genotyping of NoVs, five regions situated in the polymerase and capsid genes have been used for conventional RT-PCR amplification and sequencing. As the expected levels of NoVs on food and in water are very low and inhibition of molecular methods can occur in these matrices, quality control, including adequate positive and negative controls, is an essential part of NoV detection. Although the development of molecular methods for NoV detection has certainly aided in the understanding of NoV transmission, it has also led to new problems, such as the question of whether low levels of human NoV detected on fresh produce and shellfish could pose a threat to public health. © 2012 Springer Science+Business Media New York

    Effect of Antiplatelet Therapy on Survival and Organ Support–Free Days in Critically Ill Patients With COVID-19


    Clinical and genetic characteristics of late-onset Huntington's disease

    Background: The frequency of late-onset Huntington's disease (>59 years) is assumed to be low and the clinical course milder. However, previous literature on late-onset disease is scarce and inconclusive. Objective: Our aim is to study clinical characteristics of late-onset compared to common-onset HD patients in a large cohort of HD patients from the Registry database. Methods: Participants with late-onset and common-onset (30–50 years) disease were compared for first clinical symptoms, disease progression, CAG repeat size, and family history. Participants with a missing CAG repeat size, a repeat size of ≤35, or a UHDRS motor score of ≤5 were excluded. Results: Of 6007 eligible participants, 687 (11.4%) had late-onset and 3216 (53.5%) had common-onset HD. Late-onset patients (n = 577) had significantly more gait and balance problems as first symptom compared to common-onset patients (n = 2408) (P < .001). Overall motor and cognitive performance were worse (P < .001); however, only motor disease progression was slower (coefficient −0.58; SE 0.16; P < .001) compared to the common-onset group. Repeat size was significantly lower in the late-onset group (mean 40.8; SD 1.6) than in the common-onset group (mean 44.4; SD 2.8) (P < .001). Fewer late-onset patients (n = 451) had a positive family history compared to common-onset patients (n = 2940) (P < .001). Conclusions: Late-onset patients present more frequently with gait and balance problems as first symptom, and disease progression is not milder compared to common-onset HD patients apart from motor progression. The family history is likely to be negative, which might make diagnosing HD more difficult in this population. However, balance and gait problems might be helpful in diagnosing HD in elderly patients.