    Immune-checkpoint proteins, cytokines, and microbiome impact on patients with cervical insufficiency and preterm birth

    Background: Microenvironmental factors, including microbe-induced inflammation and immune-checkpoint proteins that modulate immune cells, have been associated with both cervical insufficiency and preterm delivery, but these factors remain incompletely understood. This study aimed to explore and compare interactions among the microbiome and inflammatory factors, such as cytokines and immune-checkpoint proteins, in patients with cervical insufficiency and preterm birth. In particular, factors predictive of preterm birth were identified and the performance of their combination was evaluated. Methods: A total of 220 swab samples from 110 pregnant women, prospectively recruited at the High-Risk Maternal Neonatal Intensive Care Center, were collected between February 2020 and March 2021. The study included 63 patients with cervical insufficiency receiving cerclage and 47 control participants. Endo- and exocervical swabs and fluids were collected simultaneously. Shotgun metagenomic sequencing of the microbiome and measurement of 34 immune-checkpoint proteins and inflammatory cytokines were performed. Results: First, we demonstrated that immune-checkpoint proteins, key immune-regulatory molecules, could be measured in endocervical and exocervical samples. Second, we identified significantly different microenvironments in cervical insufficiency and preterm birth at precise cervical locations, providing information on practically useful sampling sites in clinical settings. Finally, the presence of Moraxella osloensis (odds ratio = 14.785; P = 0.037) and chemokine CC motif ligand 2 (CCL2) levels higher than 73 pg/mL (odds ratio = 40.049; P = 0.005) in endocervical samples were associated with preterm birth. Combining M. osloensis and CCL2 yielded excellent performance for predicting preterm birth (area under the receiver operating characteristic curve = 0.846, 95% confidence interval = 0.733-0.925). Conclusion: Multiple relationships between the microbiome, immune-checkpoint proteins, and inflammatory cytokines in the cervical microenvironment were identified. Focusing on these factors should aid the comprehensive understanding and therapeutic modulation of local microbial and immunologic compositions in the management of cervical insufficiency and preterm birth.
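
    The abstract above reports a two-predictor model (M. osloensis detection plus CCL2 above 73 pg/mL) evaluated by the area under the ROC curve. The sketch below is a minimal, hypothetical illustration of that kind of analysis using synthetic data and scikit-learn; the cohort, coefficients, and outcome here are simulated and are not the study's data or code.
```python
# Minimal sketch (synthetic data): combining a binary microbe indicator with a
# dichotomized cytokine level in a logistic regression and evaluating ROC AUC,
# analogous to the M. osloensis + CCL2 (>73 pg/mL) model described above.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 110  # cohort size used here only for illustration

# Hypothetical predictors: M. osloensis detected (0/1) and CCL2 concentration (pg/mL)
m_osloensis = rng.integers(0, 2, size=n)
ccl2 = rng.gamma(shape=2.0, scale=40.0, size=n)
ccl2_high = (ccl2 > 73).astype(int)  # threshold reported in the abstract

# Synthetic outcome loosely driven by both predictors (for demonstration only)
logit = -2.0 + 1.5 * m_osloensis + 2.0 * ccl2_high
preterm = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X = np.column_stack([m_osloensis, ccl2_high])
model = LogisticRegression().fit(X, preterm)

auc = roc_auc_score(preterm, model.predict_proba(X)[:, 1])
odds_ratios = np.exp(model.coef_[0])  # per-predictor odds ratios
print(f"AUC = {auc:.3f}, odds ratios = {odds_ratios}")
```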

    Impact of COVID-19 on Antimicrobial Consumption and Spread of Multidrug-Resistance in Bacterial Infections

    The COVID-19 pandemic may have affected antibiotic consumption patterns and the prevalence of colonization or infection by multidrug-resistant (MDR) bacteria. We investigated differences in the consumption of antibiotics prone to resistance and the prevalence of MDR bacteria during the COVID-19 pandemic (March 2020 to September 2021) compared with the pre-pandemic period (March 2018 to September 2019). Data on antibiotic usage and on infections caused by methicillin-resistant Staphylococcus aureus (MRSA), vancomycin-resistant Enterococcus (VRE), carbapenem-resistant Enterobacteriaceae (CRE), carbapenem-resistant Acinetobacter baumannii (CRAB), and carbapenem-resistant Pseudomonas aeruginosa (CRPA) were obtained from hospitalized patients in four university hospitals. The consumption of penicillins with β-lactamase inhibitors (3.4% in wards, 5.8% in intensive care units (ICUs)) and of carbapenems (25.9% in wards, 12.1% in ICUs) increased during the pandemic period. The prevalence of MRSA (4.7%), VRE (49.0%), CRE (22.4%), and CRPA (20.1%) isolated from clinical samples in wards, and of VRE (26.7%) and CRE (36.4%) isolated from clinical samples in ICUs, increased significantly. Meanwhile, only the prevalence of CRE (38.7%) isolated from surveillance samples in wards increased. The COVID-19 pandemic was associated with increased consumption of antibiotics and influenced the prevalence of infections caused by MDR isolates.
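
    As a worked illustration of how a before/after prevalence comparison like those above might be tested, the sketch below applies a chi-square test to a hypothetical 2x2 table of MDR-positive versus MDR-negative isolates in the two periods; the counts are invented, since the abstract reports only percentages.
```python
# Illustrative sketch: comparing the prevalence of an MDR organism between the
# pre-pandemic and pandemic periods with a chi-square test. Counts are made up;
# the study's underlying denominators are not given in the abstract.
from scipy.stats import chi2_contingency

# rows: [MDR-positive, MDR-negative]; columns: pre-pandemic, pandemic (hypothetical)
table = [[120, 180],
         [880, 820]]

chi2, p_value, dof, expected = chi2_contingency(table)
prev_pre = table[0][0] / (table[0][0] + table[1][0]) * 100
prev_pan = table[0][1] / (table[0][1] + table[1][1]) * 100
print(f"prevalence: {prev_pre:.1f}% -> {prev_pan:.1f}%, chi2 = {chi2:.2f}, p = {p_value:.4f}")
```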

    Currently Used Laboratory Methodologies for Assays Detecting PD-1, PD-L1, PD-L2 and Soluble PD-L1 in Patients with Metastatic Breast Cancer

    Approximately 20% of breast cancer (BC) patients suffer from distant metastasis, and the incidence and prevalence of metastatic BC have increased annually. Immune checkpoint inhibitors are an emerging area of treatment, especially for metastatic patients with poor outcomes. Several antibody drugs targeting the programmed death protein-1 (PD-1) axis have been developed and approved together with companion diagnostic testing. We reviewed currently used laboratory methodologies for assays determining the PD-1 axis to provide a comprehensive understanding of the principles, advantages, and drawbacks involved in their implementation. The most commonly used method is immunohistochemistry (92.9%) for PD-L1 expression using tissue samples (96.4%). The commonly used anti-PD-L1 antibody clones were the commercially available 22C3 (30.8%), SP142 (19.2%), SP263 (15.4%), and E1L3N (11.5%). Enzyme-linked immunosorbent assays and electrochemiluminescent immunoassays targeting soluble PD-ligand 1 (PD-L1) were developed and gained popularity in 2019–2021, in contrast to 2016–2018. Easy accessibility and non-invasiveness owing to the use of blood samples, quantitative outputs, and relatively rapid turnaround times make these assays increasingly preferred. Regarding scoring methods, a combination of tumor and immune cells (45.5% in 2016–2018 to 57.1% in 2019–2021), rather than either cell type alone, became more popular. Reporting of PD-L1 expression should include information about antibody clones, platforms, scoring methods, and related companion drugs.
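
    For the scoring methods mentioned above that combine tumor and immune cells, one widely used formulation is the combined positive score (CPS), alongside the tumor-cell-only tumor proportion score (TPS). The sketch below shows these standard formulas with hypothetical cell counts; it is illustrative only and not tied to any specific assay in the review.
```python
# Sketch of two common PD-L1 scoring formulas referenced when combining tumor and
# immune cells: the tumor proportion score (TPS) and the combined positive score
# (CPS, conventionally capped at 100). Cell counts below are hypothetical.
def tumor_proportion_score(pos_tumor_cells: int, viable_tumor_cells: int) -> float:
    """TPS (%) = PD-L1 positive tumor cells / total viable tumor cells * 100."""
    return 100.0 * pos_tumor_cells / viable_tumor_cells

def combined_positive_score(pos_tumor_cells: int, pos_immune_cells: int,
                            viable_tumor_cells: int) -> float:
    """CPS = (PD-L1 positive tumor + immune cells) / total viable tumor cells * 100."""
    cps = 100.0 * (pos_tumor_cells + pos_immune_cells) / viable_tumor_cells
    return min(cps, 100.0)

# Hypothetical counts from one field of view
print(tumor_proportion_score(pos_tumor_cells=55, viable_tumor_cells=500))      # 11.0
print(combined_positive_score(pos_tumor_cells=55, pos_immune_cells=30,
                              viable_tumor_cells=500))                          # 17.0
```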

    Currently Applied Molecular Assays for Identifying ESR1 Mutations in Patients with Advanced Breast Cancer

    Approximately 70% of breast cancers, the leading cause of cancer-related mortality in women worldwide, are positive for the estrogen receptor (ER). Treatment of patients with luminal subtypes is based mainly on endocrine therapy. However, reduced ER positivity and ESR1 mutations play an important role in resistance to endocrine therapy, leading to advanced breast cancer. Various methodologies for detecting ESR1 mutations have been developed; the most commonly used are next-generation sequencing (NGS)-based assays (50.0%), followed by droplet digital PCR (ddPCR) (45.5%). Regarding sample type, tissue (50.0%) has been used more frequently than plasma (27.3%) overall. However, plasma (46.2%) became the most commonly used sample type in 2016–2019, in contrast to 2012–2015 (22.2%). Likewise, ddPCR (61.5%), rather than NGS (30.8%), became more popular in 2016–2019 than it was in 2012–2015. The easy accessibility, non-invasiveness, and demonstrated usefulness, with high sensitivity, of ddPCR using plasma have driven these shifts. When using these assays, a comprehensive understanding of their principles, advantages, vulnerabilities, and precautions for interpretation is required. In the future, advanced NGS platforms and modified ddPCR assays will benefit patients by facilitating efficient treatment decisions based on information regarding ESR1 mutations.
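
    ddPCR quantification, as referenced above, conventionally applies a Poisson correction to droplet counts before estimating concentration and variant allele fraction. The sketch below shows that standard calculation with hypothetical droplet counts and an assumed partition volume; it is not specific to any ESR1 assay discussed in the review.
```python
# Sketch of the standard Poisson correction used to quantify droplet digital PCR
# results, e.g. for an ESR1 mutant vs. wild-type duplex assay. Droplet counts and
# the droplet volume are hypothetical; consult the instrument documentation for
# the exact partition volume of a given platform.
import math

def copies_per_droplet(positive: int, total: int) -> float:
    """Poisson-corrected mean copies per droplet: lambda = -ln(1 - positive/total)."""
    return -math.log(1.0 - positive / total)

total_droplets = 15000
mutant_positive = 90      # droplets positive for the mutant probe (hypothetical)
wildtype_positive = 9000  # droplets positive for the wild-type probe (hypothetical)

lam_mut = copies_per_droplet(mutant_positive, total_droplets)
lam_wt = copies_per_droplet(wildtype_positive, total_droplets)

droplet_volume_nl = 0.85  # assumed partition volume in nanoliters
conc_mut = lam_mut / (droplet_volume_nl * 1e-3)  # copies per microliter of reaction
vaf = lam_mut / (lam_mut + lam_wt)

print(f"mutant ~{conc_mut:.1f} copies/uL, variant allele fraction ~{vaf:.3%}")
```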

    Prevalence and Clinical Impact of Coinfection in Patients with Coronavirus Disease 2019 in Korea

    Coinfection rates with other pathogens in coronavirus disease 2019 (COVID-19) varied during the pandemic. We assessed the recent prevalence of coinfection with viruses, bacteria, and fungi in COVID-19 patients over more than one year and its impact on mortality. A total of 436 samples were collected between August 2020 and October 2021. Multiplex real-time PCR, culture, and antimicrobial susceptibility testing were performed to detect pathogens. The coinfection rate with respiratory viruses in COVID-19 patients was 1.4%, whereas the rates for bacteria and fungi in hospitalized COVID-19 patients were 52.6% and 10.5%, respectively. Respiratory syncytial virus, rhinovirus, Acinetobacter baumannii, Escherichia coli, Pseudomonas aeruginosa, and Candida albicans were the most commonly detected pathogens. Ninety percent of the A. baumannii isolates were non-susceptible to carbapenems. In multivariate analysis, coinfection (odds ratio [OR] = 6.095), older age (OR = 1.089), and elevated lactate dehydrogenase (OR = 1.006) were risk factors for mortality, the critical outcome. In particular, coinfection with bacteria (OR = 11.250), resistant pathogens (OR = 11.667), and infection with multiple pathogens (OR = 10.667) were significantly associated with death. Screening for and monitoring of coinfection in COVID-19 patients, especially hospitalized patients, during the pandemic are beneficial for better management and survival.
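
    The adjusted odds ratios above come from a multivariate logistic regression. As a hedged illustration, the sketch below fits such a model on simulated data with coinfection, age, and lactate dehydrogenase as covariates using statsmodels; the simulated coefficients and data are hypothetical and do not reproduce the study's results.
```python
# Sketch (synthetic data): multivariable logistic regression for mortality with
# coinfection, age, and LDH as covariates, reporting adjusted odds ratios as in
# the analysis summarized above. The data below are simulated, not the study's.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 436
df = pd.DataFrame({
    "coinfection": rng.integers(0, 2, n),
    "age": rng.normal(65, 12, n),
    "ldh": rng.normal(350, 120, n),
})
logit = -8 + 1.8 * df["coinfection"] + 0.08 * df["age"] + 0.005 * df["ldh"]
df["death"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X = sm.add_constant(df[["coinfection", "age", "ldh"]])
fit = sm.Logit(df["death"], X).fit(disp=False)
print(np.exp(fit.params))      # adjusted odds ratios
print(np.exp(fit.conf_int()))  # 95% confidence intervals on the OR scale
```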

    Performance of a Machine Learning-Based Methicillin Resistance of Staphylococcus aureus Identification System Using MALDI-TOF MS and Comparison of the Accuracy according to SCCmec Types

    The prompt presumptive identification of methicillin-resistant Staphylococcus aureus (MRSA) using matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF MS) can aid early clinical management and infection control during routine bacterial identification procedures. This study applied a machine learning approach to MALDI-TOF peaks for the presumptive identification of MRSA and compared accuracy according to staphylococcal cassette chromosome mec (SCCmec) types. We analyzed 194 S. aureus clinical isolates to evaluate a machine learning-based identification system (AMRQuest software, v.2.1; ASTA, Suwon, Korea) whose learning dataset was constructed with 359 S. aureus clinical isolates. The system showed a sensitivity of 91.8%, specificity of 83.3%, and accuracy of 87.6% in distinguishing MRSA. For SCCmec II and IVA types, the MRSA types common in a hospital context, the accuracy was 95.4% and 96.1%, respectively, whereas for the SCCmec IV type it was 21.4%. The accuracy for methicillin-susceptible S. aureus was 90.9%. This presumptive MRSA identification system may be helpful for patient management before routine antimicrobial resistance testing is completed. Further optimization of the machine learning model with larger datasets could enable rapid identification of MRSA with less effort in routine clinical procedures using MALDI-TOF MS.
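
    The sensitivity, specificity, and accuracy reported above are standard confusion-matrix metrics. The sketch below shows how they are derived from true/false positive and negative counts; the counts used are hypothetical (chosen only to land near the reported rates) and are not the study's actual tallies.
```python
# Sketch: how sensitivity, specificity, and accuracy of a binary MRSA/MSSA
# classifier are derived from a confusion matrix. The counts below are
# hypothetical and chosen only to illustrate the calculation, not to reproduce
# the 194-isolate evaluation set.
def classification_metrics(tp: int, fn: int, tn: int, fp: int) -> dict:
    return {
        "sensitivity": tp / (tp + fn),          # MRSA correctly called MRSA
        "specificity": tn / (tn + fp),          # MSSA correctly called MSSA
        "accuracy": (tp + tn) / (tp + fn + tn + fp),
    }

print(classification_metrics(tp=45, fn=4, tn=40, fp=8))
```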

    Epidemiologic Trends of Thalassemia, 2006–2018: A Nationwide Population-Based Study

    Thalassemia is the most common form of hereditary anemia. Here, we aimed to investigate the 13-year trends in the epidemiologic profile and risk of comorbidities in thalassemia using a nationwide population-based registry in Korea. Diagnoses of thalassemia, comorbidities, and transfusion events in patients with thalassemia were identified in the Korean National Health Insurance database, which covers the entire population. The prevalence of thalassemia increased from 0.74/100,000 in 2006 to 2.76/100,000 in 2018. Notably, the incidence rate nearly doubled in the last two years of the study period, from 0.22/100,000 in 2016 to 0.41/100,000 in 2018. The annual transfusion rate gradually decreased from 34.7% in 2006 to 20.6% in 2018. Transfusion events in patients with thalassemia were significantly associated with the risk of comorbidities (diabetes: odds ratio [OR] = 3.68, 95% confidence interval [CI] = 2.59–5.22; hypertension: OR = 3.06, 95% CI = 2.35–4.00; dyslipidemia: OR = 1.72, 95% CI = 1.22–2.43; atrial fibrillation: OR = 3.52, 95% CI = 1.69–7.32; myocardial infarction: OR = 3.02, 95% CI = 1.09–8.38; stroke: OR = 3.32, 95% CI = 2.05–5.36; congestive heart failure: OR = 2.83, 95% CI = 1.62–4.97; end-stage renal disease: OR = 3.25, 95% CI = 1.96–5.37). Early detection of comorbidities and timely intervention are required for the management of thalassemia.
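
    Two of the quantities reported above, prevalence per 100,000 and an odds ratio with a 95% confidence interval, can be computed as in the sketch below. The case counts, population denominator, and 2x2 table are hypothetical; the registry's actual denominators are not given in the abstract.
```python
# Sketch: computing a prevalence per 100,000 and an odds ratio with a 95% CI from
# a 2x2 table, the kind of summary statistics reported above. All counts are
# hypothetical.
import math

def prevalence_per_100k(cases: int, population: int) -> float:
    return 1e5 * cases / population

def odds_ratio_ci(a: int, b: int, c: int, d: int, z: float = 1.96):
    """a/b = exposed cases/controls, c/d = unexposed cases/controls (Woolf CI)."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    return or_, or_ * math.exp(-z * se), or_ * math.exp(z * se)

# Hypothetical case count over a population roughly the size of Korea's
print(prevalence_per_100k(cases=1420, population=51_450_000))   # ~2.76 per 100,000
print(odds_ratio_ci(a=60, b=40, c=120, d=300))                  # OR with 95% CI
```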