
    A Systematic Approach to Assess the Activity and Classification of PCSK9 Variants

    Background: Gain-of-function (GOF) mutations of PCSK9 cause autosomal dominant familial hypercholesterolemia because they reduce the abundance of the LDL receptor (LDLR) more efficiently than wild-type PCSK9. In contrast, PCSK9 loss-of-function (LOF) variants are associated with a hypocholesterolemic phenotype. Dozens of PCSK9 variants have been reported, but most remain of unknown significance because they have not been functionally characterized. Objective: Our aim was to perform the most comprehensive assessment of PCSK9 variants to date and to determine the simplest approach for classifying them. Methods: The expression, maturation, secretion, and activity of nine well-established PCSK9 variants were assessed in transiently transfected HEK293 cells by Western blot and flow cytometry. Their extracellular activities were determined in HepG2 cells incubated with the purified recombinant PCSK9 variants. Their binding affinities toward the LDLR were determined by solid-phase immunoassay. Results: Compared with wild-type PCSK9, LDLR expression increased when cells were transfected with LOF variants and decreased when cells were transfected with GOF variants. Measurements of extracellular activity yielded concordant results. GOF variants had increased, and LOF variants reduced, affinity for the LDLR compared with wild-type PCSK9, with the exception of one GOF variant (R218S), which instead showed complete resistance to inactivation by furin. All variants were expressed at similar levels and underwent normal maturation and secretion except for two LOF and two GOF mutants. Conclusions: We propose that transient transfection of HEK293 cells with a plasmid encoding a PCSK9 variant, followed by assessment of LDLR expression by flow cytometry, is sufficient to reliably determine its GOF or LOF status. More refined experiments should be reserved for determining the underlying mechanism(s). Funding: This work was supported by the Basque Government (Grupos Consolidados IT-1264-19). GL is supported by the Agence Nationale de la Recherche (Paris, France) Program Grant CHOPIN (CHolesterol Personalized Innovation) ANR-16-RHUS-0007 and Project Grant KRINGLE2 ANR-20-CE14-0009, as well as by La Fondation De France (FDF-00096274). U.G.-G. was supported by Fundación Biofísica Bizkaia. A.B.-V. was supported by Programa de especialización de Personal Investigador Doctor en la UPV/EHU (2019) 2019-2020. A.L.-S. was supported by a PIF grant (2019–2020), Gobierno Vasco, and partially supported by Fundación Biofísica Bizkaia. KC and AKJ received a scholarship from the European Union (European Regional Development Fund INTERREG V) and the Région Réunion (Saint-Denis, Réunion, France)
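
    A minimal sketch of the proposed classification step, assuming hypothetical flow-cytometry readouts: the median LDLR signal of HEK293 cells transfected with each variant is normalized to wild-type PCSK9, and a variant is called GOF or LOF when it falls outside an assumed ±20% window. The window, the readout values, and this exact decision rule are illustrative assumptions, not the study's thresholds.

    ```python
    # Classify PCSK9 variants as GOF/LOF from flow-cytometry LDLR readouts.
    # The values and the ±20% calling window below are illustrative
    # assumptions, not thresholds taken from the study.

    # Median LDLR fluorescence of transfected HEK293 cells (arbitrary units)
    mfi = {
        "WT": 1000.0,
        "variant_A": 620.0,   # hypothetical GOF-like readout
        "variant_B": 1350.0,  # hypothetical LOF-like readout
    }

    def classify(variant_mfi: float, wt_mfi: float, window: float = 0.20) -> str:
        """Call a variant by LDLR abundance relative to wild-type PCSK9.

        Lower LDLR than WT  -> the variant degrades LDLR more efficiently (GOF).
        Higher LDLR than WT -> the variant is less active (LOF).
        """
        ratio = variant_mfi / wt_mfi
        if ratio < 1.0 - window:
            return "GOF"
        if ratio > 1.0 + window:
            return "LOF"
        return "neutral / indeterminate"

    for name, value in mfi.items():
        if name != "WT":
            print(name, classify(value, mfi["WT"]))
    ```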

    Impact of primary kidney disease on the effects of empagliflozin in patients with chronic kidney disease: secondary analyses of the EMPA-KIDNEY trial

    Background: The EMPA-KIDNEY trial showed that empagliflozin reduced the risk of the primary composite outcome of kidney disease progression or cardiovascular death in patients with chronic kidney disease, mainly through slowing progression. We aimed to assess how the effects of empagliflozin might differ by primary kidney disease across its broad population. Methods: EMPA-KIDNEY, a randomised, controlled, phase 3 trial, was conducted at 241 centres in eight countries (Canada, China, Germany, Italy, Japan, Malaysia, the UK, and the USA). Patients were eligible if their estimated glomerular filtration rate (eGFR) was 20 to less than 45 mL/min per 1·73 m2, or 45 to less than 90 mL/min per 1·73 m2 with a urinary albumin-to-creatinine ratio (uACR) of 200 mg/g or higher at screening. They were randomly assigned (1:1) to 10 mg oral empagliflozin once daily or matching placebo. Effects on kidney disease progression (defined as a sustained ≥40% eGFR decline from randomisation, end-stage kidney disease, a sustained eGFR below 10 mL/min per 1·73 m2, or death from kidney failure) were assessed using prespecified Cox models, and eGFR slope analyses used shared parameter models. Subgroup comparisons were performed by including relevant interaction terms in models. EMPA-KIDNEY is registered with ClinicalTrials.gov, NCT03594110. Findings: Between May 15, 2019, and April 16, 2021, 6609 participants were randomly assigned and followed up for a median of 2·0 years (IQR 1·5–2·4). Prespecified subgroupings by primary kidney disease included 2057 (31·1%) participants with diabetic kidney disease, 1669 (25·3%) with glomerular disease, 1445 (21·9%) with hypertensive or renovascular disease, and 1438 (21·8%) with other or unknown causes. Kidney disease progression occurred in 384 (11·6%) of 3304 patients in the empagliflozin group and 504 (15·2%) of 3305 patients in the placebo group (hazard ratio 0·71 [95% CI 0·62–0·81]), with no evidence that the relative effect size varied significantly by primary kidney disease (p for heterogeneity=0·62). The between-group difference in chronic eGFR slopes (ie, from 2 months to final follow-up) was 1·37 mL/min per 1·73 m2 per year (95% CI 1·16–1·59), representing a 50% (42–58) reduction in the rate of chronic eGFR decline. This relative effect of empagliflozin on chronic eGFR slope was similar in analyses by different primary kidney diseases, including in explorations by type of glomerular disease and diabetes (p values for heterogeneity all >0·1). Interpretation: In a broad range of patients with chronic kidney disease at risk of progression, including a wide range of non-diabetic causes of chronic kidney disease, empagliflozin reduced risk of kidney disease progression. Relative effect sizes were broadly similar irrespective of the cause of primary kidney disease, suggesting that SGLT2 inhibitors should be part of a standard of care to minimise risk of kidney failure in chronic kidney disease. Funding: Boehringer Ingelheim, Eli Lilly, and UK Medical Research Council
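
    The subgroup comparisons described above rest on treatment-by-subgroup interaction terms in a Cox model. A minimal sketch of that idea, assuming a hypothetical data frame layout; the file name and column names are illustrative, and this is not the trial's analysis code.

    ```python
    # Cox model with a treatment x primary-kidney-disease interaction,
    # as a sketch of the heterogeneity testing described in the Methods.
    import pandas as pd
    from lifelines import CoxPHFitter

    # Hypothetical file with columns: time (years), event (0/1),
    # empagliflozin (0/1), disease_group (categorical)
    df = pd.read_csv("empa_kidney_like_data.csv")

    # One-hot encode the disease group and build interaction columns by hand
    dummies = pd.get_dummies(df["disease_group"], prefix="grp",
                             drop_first=True).astype(int)
    for col in dummies:
        df[f"empa_x_{col}"] = df["empagliflozin"] * dummies[col]
    df = pd.concat([df, dummies], axis=1).drop(columns=["disease_group"])

    cph = CoxPHFitter()
    cph.fit(df, duration_col="time", event_col="event")
    cph.print_summary()  # interaction terms test effect modification by disease
    ```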

    Geoeconomic variations in epidemiology, ventilation management, and outcomes in invasively ventilated intensive care unit patients without acute respiratory distress syndrome: a pooled analysis of four observational studies

    Background: Geoeconomic variations in epidemiology, the practice of ventilation, and outcome in invasively ventilated intensive care unit (ICU) patients without acute respiratory distress syndrome (ARDS) remain unexplored. In this analysis we aim to address these gaps using individual patient data from four large observational studies. Methods: In this pooled analysis we harmonised individual patient data from the ERICC, LUNG SAFE, PRoVENT, and PRoVENT-iMiC prospective observational studies, which were conducted from June, 2011, to December, 2018, in 534 ICUs in 54 countries. We used the 2016 World Bank classification to define two geoeconomic regions: middle-income countries (MICs) and high-income countries (HICs). ARDS was defined according to the Berlin criteria. Descriptive statistics were used to compare patients in MICs versus HICs. The primary outcome was the use of low tidal volume ventilation (LTVV) for the first 3 days of mechanical ventilation. Secondary outcomes were key ventilation parameters (tidal volume size, positive end-expiratory pressure, fraction of inspired oxygen, peak pressure, plateau pressure, driving pressure, and respiratory rate), patient characteristics, the risk for and actual development of acute respiratory distress syndrome after the first day of ventilation, duration of ventilation, ICU length of stay, and ICU mortality. Findings: Of the 7608 patients included in the original studies, this analysis included 3852 patients without ARDS, of whom 2345 were from MICs and 1507 were from HICs. Patients in MICs were younger and shorter, with a slightly lower body-mass index; they more often had diabetes and active cancer, but less often chronic obstructive pulmonary disease and heart failure, than patients from HICs. Sequential organ failure assessment scores were similar in MICs and HICs. Use of LTVV in MICs and HICs was comparable (42·4% vs 44·2%; absolute difference –1·69 [–9·58 to 6·11]; p=0·67; data available in 3174 [82%] of 3852 patients). The median applied positive end-expiratory pressure was lower in MICs than in HICs (5 [IQR 5–8] vs 6 [5–8] cm H2O; p=0·0011). ICU mortality was higher in MICs than in HICs (30·5% vs 19·9%; p=0·0004; adjusted effect 16·41% [95% CI 9·52–23·52]; p<0·0001) and was inversely associated with gross domestic product (adjusted odds ratio for a US$10 000 increase per capita 0·80 [95% CI 0·75–0·86]; p<0·0001). Interpretation: Despite similar disease severity and ventilation management, ICU mortality in patients without ARDS is higher in MICs than in HICs, with a strong association with country-level economic status. Funding: No funding
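
    As an illustration of the primary outcome, a short sketch flagging low tidal volume ventilation. It uses the standard ARDSNet predicted-body-weight formulas; the 8 mL/kg cutoff is an assumption here, as the abstract does not state the exact definition used in the pooled analysis.

    ```python
    # Flag low tidal volume ventilation (LTVV) from tidal volume and
    # predicted body weight (PBW). The PBW formulas are the standard
    # ARDSNet ones; the 8 mL/kg cutoff is an illustrative assumption.

    def predicted_body_weight(height_cm: float, sex: str) -> float:
        """ARDSNet predicted body weight in kg."""
        base = 50.0 if sex == "male" else 45.5
        return base + 0.91 * (height_cm - 152.4)

    def is_ltvv(tidal_volume_ml: float, height_cm: float, sex: str,
                cutoff_ml_per_kg: float = 8.0) -> bool:
        """True if tidal volume is at or below the cutoff per kg of PBW."""
        pbw = predicted_body_weight(height_cm, sex)
        return tidal_volume_ml / pbw <= cutoff_ml_per_kg

    # Example: a 170 cm male ventilated at 450 mL -> ~6.8 mL/kg PBW -> LTVV
    print(is_ltvv(450, 170, "male"))  # True
    ```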

    Hyperoxemia and excess oxygen use in early acute respiratory distress syndrome : Insights from the LUNG SAFE study

    Background: Concerns exist regarding the prevalence and impact of unnecessary oxygen use in patients with acute respiratory distress syndrome (ARDS). We examined this issue in patients with ARDS enrolled in the Large observational study to UNderstand the Global impact of Severe Acute respiratory FailurE (LUNG SAFE) study. Methods: In this secondary analysis of the LUNG SAFE study, we aimed to determine the prevalence of, and the outcomes associated with, hyperoxemia on day 1, sustained hyperoxemia, and excessive oxygen use in patients with early ARDS. Patients who fulfilled criteria of ARDS on day 1 and day 2 of acute hypoxemic respiratory failure were categorized based on the presence of hyperoxemia (PaO2 > 100 mmHg) on day 1, sustained (i.e., present on day 1 and day 2) hyperoxemia, or excessive oxygen use (FIO2 ≥ 0.60 during hyperoxemia). Results: Of 2005 patients who met the inclusion criteria, 131 (6.5%) were hypoxemic (PaO2 < 55 mmHg), 607 (30%) had hyperoxemia on day 1, and 250 (12%) had sustained hyperoxemia. Excess FIO2 use occurred in 400 (66%) of 607 patients with hyperoxemia. Excess FIO2 use decreased from day 1 to day 2 of ARDS, with most hyperoxemic patients on day 2 receiving relatively low FIO2. Multivariate analyses found no independent relationship between day 1 hyperoxemia, sustained hyperoxemia, or excess FIO2 use and adverse clinical outcomes. Mortality was 42% in patients with excess FIO2 use, compared to 39% in a propensity-matched sample of normoxemic (PaO2 55-100 mmHg) patients (P = 0.47). Conclusions: Hyperoxemia and excess oxygen use are both prevalent in early ARDS but are most often non-sustained. No relationship was found between hyperoxemia or excessive oxygen use and patient outcome in this cohort. Trial registration: LUNG SAFE is registered with ClinicalTrials.gov, NCT02010073
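
    The categorisation rules stated in the Methods translate directly into code. A minimal sketch using only the thresholds given above (PaO2 < 55 mmHg for hypoxemia, PaO2 > 100 mmHg for hyperoxemia, FIO2 ≥ 0.60 during hyperoxemia for excess oxygen use); the record layout is a hypothetical assumption.

    ```python
    # Categorize a patient's first two ARDS days using the thresholds
    # stated in the abstract. The function signature / record layout is
    # an illustrative assumption, not the study's actual data model.

    def categorize(day1_pao2: float, day2_pao2: float, day1_fio2: float) -> set:
        """Return the oxygenation labels that apply to this patient."""
        labels = set()
        if day1_pao2 < 55:
            labels.add("hypoxemia")                 # PaO2 < 55 mmHg
        if day1_pao2 > 100:
            labels.add("hyperoxemia_day1")          # PaO2 > 100 mmHg
            if day1_fio2 >= 0.60:
                labels.add("excess_oxygen_use")     # high FIO2 despite hyperoxemia
            if day2_pao2 > 100:
                labels.add("sustained_hyperoxemia") # present on day 1 and day 2
        return labels

    print(categorize(day1_pao2=120, day2_pao2=105, day1_fio2=0.7))
    # -> hyperoxemia_day1, excess_oxygen_use, sustained_hyperoxemia
    ```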

    The evolving SARS-CoV-2 epidemic in Africa: Insights from rapidly expanding genomic surveillance

    INTRODUCTION Investment in Africa over the past year with regard to severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) sequencing has led to a massive increase in the number of sequences, which, to date, exceeds 100,000 sequences generated to track the pandemic on the continent. These sequences have profoundly affected how public health officials in Africa have navigated the COVID-19 pandemic. RATIONALE We demonstrate how the first 100,000 SARS-CoV-2 sequences from Africa have helped monitor the epidemic on the continent, how genomic surveillance expanded over the course of the pandemic, and how we adapted our sequencing methods to deal with an evolving virus. Finally, we also examine how viral lineages have spread across the continent in a phylogeographic framework to gain insights into the underlying temporal and spatial transmission dynamics for several variants of concern (VOCs). RESULTS Our results indicate that the number of countries in Africa that can sequence the virus within their own borders is growing and that this is coupled with a shorter turnaround time from the time of sampling to sequence submission. Ongoing evolution necessitated the continual updating of primer sets, and, as a result, eight primer sets were designed in tandem with viral evolution and used to ensure effective sequencing of the virus. The pandemic unfolded through multiple waves of infection that were each driven by distinct genetic lineages, with B.1-like ancestral strains associated with the first pandemic wave of infections in 2020. Successive waves on the continent were fueled by different VOCs, with Alpha and Beta cocirculating in distinct spatial patterns during the second wave and Delta and Omicron affecting the whole continent during the third and fourth waves, respectively. Phylogeographic reconstruction points toward distinct differences in viral importation and exportation patterns associated with the Alpha, Beta, Delta, and Omicron variants and subvariants, when considering both Africa versus the rest of the world and viral dissemination within the continent. Our epidemiological and phylogenetic inferences therefore underscore the heterogeneous nature of the pandemic on the continent and highlight key insights and challenges, for instance, recognizing the limitations of low testing proportions. We also highlight the early warning capacity that genomic surveillance in Africa has had for the rest of the world with the detection of new lineages and variants, the most recent being the characterization of various Omicron subvariants. CONCLUSION Sustained investment for diagnostics and genomic surveillance in Africa is needed as the virus continues to evolve. This is important not only to help combat SARS-CoV-2 on the continent but also because it can be used as a platform to help address the many emerging and reemerging infectious disease threats in Africa. In particular, capacity building for local sequencing within countries or within the continent should be prioritized because this is generally associated with shorter turnaround times, providing the most benefit to local public health authorities tasked with pandemic response and mitigation and allowing for the fastest reaction to localized outbreaks. These investments are crucial for pandemic preparedness and response and will serve the health of the continent well into the 21st century
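
    One of the metrics discussed above, turnaround time from sampling to sequence submission, is straightforward to compute from sequence metadata. A minimal sketch, assuming a hypothetical metadata file and column names.

    ```python
    # Median turnaround time (collection to submission) per country and
    # calendar quarter, to track whether local sequencing capacity is
    # shortening delays over time. File and column names are assumptions.
    import pandas as pd

    meta = pd.read_csv("africa_sequences_metadata.csv",
                       parse_dates=["collection_date", "submission_date"])
    meta["turnaround_days"] = (meta["submission_date"]
                               - meta["collection_date"]).dt.days

    summary = (meta
               .assign(quarter=meta["collection_date"].dt.to_period("Q"))
               .groupby(["country", "quarter"])["turnaround_days"]
               .median()
               .reset_index())
    print(summary.head())
    ```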

    ConVnet BiLSTM for ASD Classification on EEG Brain Signal

    Autism Spectrum Disorder (ASD) is a neurodevelopmental disability. An automated system for classifying ASD traits would be of significant value to paediatricians, as it would support the diagnosis of ASD in children with a quantifiable method. In this paper, we propose a novel autism diagnosis method based on a hybrid of deep learning algorithms: a convolutional neural network (ConVnet) architecture merged with two LSTM blocks propagating in opposite directions (BiLSTM). The model classifies electroencephalogram (EEG) brain signals from typically developing (TD) and autistic (ASD) individuals, obtained from the Simons Foundation Autism Research Initiative (SFARI) database. With a 70:30 train-test split, an accuracy of 97.7 percent was achieved. The proposed method outperformed the current state of the art in autism classification efficiency and, as the results demonstrate, has the potential to make a significant contribution to neuroscience research
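
    A minimal sketch of a ConvNet + BiLSTM classifier of the kind described above, for binary TD-versus-ASD classification of EEG windows. Layer sizes, window length, and channel count are illustrative assumptions, not the authors' exact architecture.

    ```python
    # Conv1D front end + bidirectional LSTM for EEG window classification.
    # All hyperparameters below are illustrative assumptions.
    import tensorflow as tf
    from tensorflow.keras import layers, models

    n_timesteps, n_channels = 256, 64  # assumed EEG window shape

    model = models.Sequential([
        layers.Input(shape=(n_timesteps, n_channels)),
        # Convolutional layers extract local temporal features
        layers.Conv1D(64, kernel_size=5, activation="relu"),
        layers.MaxPooling1D(pool_size=2),
        layers.Conv1D(128, kernel_size=5, activation="relu"),
        layers.MaxPooling1D(pool_size=2),
        # Bidirectional LSTM models longer-range dynamics in both directions
        layers.Bidirectional(layers.LSTM(64)),
        layers.Dropout(0.5),
        layers.Dense(1, activation="sigmoid"),  # TD vs ASD
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy",
                  metrics=["accuracy"])
    model.summary()
    ```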