8 research outputs found

    Endotoxin as a Marker for Water Quality

    This article belongs to the Special Issue Environmental Microbiology: Perspectives for Medicine and Public Health. Background: Water quality testing is vital to protect human health. Current testing relies mainly on culture-based detection of faecal indicator organisms such as Escherichia coli (E. coli). However, bacterial culture is a slow process, taking 24–48 h and requiring specialised laboratories and trained personnel. Access to such laboratories is often sparse in developing countries, and poor water quality causes many fatalities. Endotoxin is a molecular component of Gram-negative bacterial cell walls and can be used to detect their presence in drinking water. Method: The current study used a novel assay (BacterisK) to rapidly detect endotoxin in various water samples and correlated the results with E. coli content measured by culture methods. The data generated by the BacterisK assay are presented as an 'endotoxin risk' (ER). Results: The ER values correlate with E. coli counts, so endotoxin can be used as a marker of faecal contamination in water. Moreover, the BacterisK assay provides data in near real time and can be used in situ, allowing water quality testing at different spatial and temporal locations. Conclusion: We suggest that BacterisK can be used as a convenient risk assessment tool for water quality where results are required quickly or access to laboratories is lacking.
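
    The core analysis here is a correlation between the assay's ER readout and culture counts from the same samples. A minimal sketch of that comparison follows; all values and variable names are illustrative, not data from the study, and counts are log-transformed because they span orders of magnitude.

```python
# Illustrative sketch: correlating the BacterisK 'endotoxin risk' (ER)
# readout with culture-based E. coli counts from the same water samples.
# All values below are made up for illustration.
import numpy as np
from scipy.stats import pearsonr

er = np.array([0.2, 0.4, 0.8, 1.1, 2.5, 3.0])    # ER, arbitrary units
ecoli = np.array([5, 12, 90, 180, 1200, 2400])   # CFU per 100 mL

# Counts span orders of magnitude, so correlate ER with log10(CFU)
r, p = pearsonr(er, np.log10(ecoli))
print(f"Pearson r = {r:.2f}, p = {p:.3g}")
```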

    Evaluation of recombinant factor C assay for the detection of divergent lipopolysaccharide structural species and comparison with Limulus amebocyte lysate-based assays and a human monocyte activity assay

    Purpose. The Limulus amebocyte lysate (LAL) assay is widely used for screening for lipopolysaccharide (LPS) in parenteral pharmaceuticals. However, correlating LPS with Gram-negative bacterial infections by LAL assay has been problematic, partly because of the variable reactivity of different LPS structures. Recombinant factor C (rFC) has allowed the development of a new simple, specific and sensitive LPS detection system (PyroGene). Methodology. The activity of various LPS structures was investigated with the PyroGene assay and compared with two LAL-based assays and a human monocyte activity assay. Results. The rFC assay detected most LPS structures in picogram quantities, and the potency of E. coli, B. cepacia, Salmonella smooth and Salmonella R345 LPS did not differ between the PyroGene and LAL assays. However, the reactivity of K. pneumoniae, S. marcescens, B. pertussis and P. aeruginosa LPS differed significantly between these assays. Importantly, pairwise correlation analysis revealed that only the PyroGene assay produced a significant positive correlation with the release of IL-6 from a monocytic cell line. Conclusion. We conclude that the rFC-based assay is a good replacement for conventional LAL assays, and because it correlates significantly with IL-6 production by a human monocyte cell line it could be more useful for detecting LPS in a clinical setting.
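
    One way to picture the per-structure comparisons reported above is a paired test of whether replicate readings of a single LPS structure differ between the rFC and a LAL assay. The sketch below uses a paired t-test as a stand-in for the paper's unspecified statistics; the numbers are hypothetical, not the paper's measurements.

```python
# Hypothetical sketch: do replicate readings of one LPS structure differ
# between the rFC (PyroGene) and a LAL assay? Values are illustrative.
import numpy as np
from scipy.stats import ttest_rel

pyrogene = np.array([102.0, 98.0, 110.0, 95.0, 105.0])  # % reactivity
lal = np.array([60.0, 55.0, 72.0, 58.0, 66.0])          # same replicates

t, p = ttest_rel(pyrogene, lal)
print(f"paired t = {t:.2f}, p = {p:.3g}")  # small p -> assays disagree
```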

    Outcomes and Contemporary Trends in Surgical vs Transcatheter Aortic Valve Replacement in Patients with Chronic Obstructive Pulmonary Disease

    Background: Chronic obstructive pulmonary disease (COPD) is a common comorbidity among patients referred for aortic valve replacement. The objective of the present study is to assess trends and outcomes of COPD patients undergoing either transcatheter aortic valve replacement (TAVR) or surgical aortic valve replacement (SAVR) for severe aortic stenosis. Methods: We analyzed the National Inpatient Sample database from January 2012 to December 2017 using International Classification of Diseases, 9th and 10th Revision, Clinical Modification codes to identify all patients with COPD aged ≥50 years who underwent either TAVR or SAVR for aortic stenosis. To account for potential bias, 1:1 propensity-matched analysis was performed. Logistic regression was used to identify predictors of mortality in the cohort, and linear regression was used for trend analysis. Results: Of the total of 95,555 cases, 40,080 underwent TAVR whereas 49,985 underwent SAVR. In-hospital mortality for the propensity-matched cohorts was higher in the SAVR cohort than in the TAVR group (4.6% vs. 2.5%; p < 0.001). Respiratory complications were also higher in the SAVR group (7.5% vs. 3.7%; p < 0.001), although SAVR patients were less likely to receive a permanent pacemaker (5.3% vs. 10.8%; p < 0.001). Length of stay (11.8 days [standard deviation (SD), 8.8] vs. 6.4 days [SD, 6.8]) and cost of stay ($244,657 [SD, $183,333] vs. $229,524 [SD, $146,994]) favored TAVR over SAVR. In-hospital mortality declined over the study period in the TAVR group from 4.8% to 1.5%. Conclusion: TAVR has more favorable in-hospital outcomes than SAVR in patients with COPD.
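
    The matching step described above pairs each TAVR patient with a clinically similar SAVR patient. A minimal sketch of greedy 1:1 nearest-neighbour matching on the propensity score follows; the data frame and column names are hypothetical, matching is with replacement for brevity, and the study's exact matching algorithm may differ.

```python
# Minimal sketch of 1:1 propensity-score matching (illustrative only).
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

def match_1to1(df, treat_col, covariates):
    """Greedy nearest-neighbour 1:1 match on the logit propensity score."""
    X, t = df[covariates].to_numpy(), df[treat_col].to_numpy()
    # Propensity score: probability of treatment (e.g. TAVR) given covariates
    ps = LogisticRegression(max_iter=1000).fit(X, t).predict_proba(X)[:, 1]
    p = np.clip(ps, 1e-6, 1 - 1e-6)
    logit = np.log(p / (1 - p)).reshape(-1, 1)
    treated, control = np.flatnonzero(t == 1), np.flatnonzero(t == 0)
    # For each treated patient, take the closest control on the logit scale
    nn = NearestNeighbors(n_neighbors=1).fit(logit[control])
    _, idx = nn.kneighbors(logit[treated])
    return df.iloc[np.concatenate([treated, control[idx.ravel()]])]

# Hypothetical usage:
# matched = match_1to1(nis, "tavr", ["age", "female", "diabetes", "ckd"])
```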

    Temporal Trends of Infant Mortality Secondary to Congenital Heart Disease: National CDC Cohort Analysis (1999–2020)

    Background: Infant mortality continues to be a significant problem for patients with congenital heart disease (CHD). Limited data exist on recent trends of mortality in infants with CHD. Methods: The CDC WONDER database (Centers for Disease Control and Prevention Wide-Ranging Online Data for Epidemiologic Research) was queried to identify deaths occurring within the United States with CHD listed as one of the causes of death between 1999 and 2020. Trends were then calculated using the Joinpoint regression program (version 4.9.1.0; National Cancer Institute). Results: A total of 47,015 infant deaths due to CHD occurred at the national level from 1999 to 2020. The overall proportional infant mortality (compared to all deaths) declined (47.3% to 37.1%, average annual percent change [AAPC]: -1.1 [95% CI -1.6 to -0.6, p < 0.001]). There was a significant decline in proportional mortality in both Black (45.3% to 34.3%, AAPC: -0.5 [-0.8 to -0.2, p = 0.002]) and White patients (55.6% to 48.6%, AAPC: -1.2 [-1.7 to -0.7, p = 0.001]), with a steeper decline among White than Black patients. A statistically significant decline in proportional infant mortality was observed in both non-Hispanic (43.3% to 33.0%, AAPC: -1.3% [95% CI -1.9 to -0.7, p < 0.001]) and Hispanic (67.6% to 57.7%, AAPC: -0.7 [95% CI -0.9 to -0.4, p < 0.001]) patients, with a steeper decline in the non-Hispanic infant population. Proportional infant mortality also decreased in males (47.5% to 53.1%, AAPC: -1.4% [-1.9 to -0.9, p < 0.001]) and females (47.1% to 39.6%, AAPC: -0.9 [-1.9 to 0.0, p = 0.05]), with a steady decline for both. Conclusion: Our study showed a significant decrease in CHD-related infant mortality and age-adjusted mortality rate (AAMR) between 1999 and 2020. However, sex-based and racial/ethnic disparities were noted, with female, Black, and Hispanic patients showing a lesser decline than male, White, and non-Hispanic patients.
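
    The AAPC figures above come from Joinpoint regression, which fits piecewise-linear slopes to log rates and averages them. A minimal sketch of that final averaging step is below; the slopes and segment lengths are illustrative, not the study's estimates.

```python
# AAPC as Joinpoint reports it: with segment slopes b_i fit to log(rate)
# and segment lengths w_i in years,
#   AAPC = (exp(sum(w_i * b_i) / sum(w_i)) - 1) * 100.
import numpy as np

def aapc(slopes, segment_years):
    b, w = np.asarray(slopes), np.asarray(segment_years)
    return (np.exp(np.sum(w * b) / np.sum(w)) - 1) * 100

# e.g. a steeper early decline followed by a near-plateau (made-up values)
print(f"AAPC = {aapc([-0.020, -0.004], [12, 10]):.2f}% per year")  # -1.26%
```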

    Cardiovascular Outcomes of Transulnar Versus Transradial Percutaneous Coronary Angiography and Intervention: A Regression Matched Meta-Analysis

    Transradial access (TRA) and transulnar access (TUA) are in close anatomical vicinity, but TRA is the preferred intervention route, and the cardiovascular outcomes and access-site complications of TUA versus TRA are understudied. Databases including MEDLINE and the Cochrane Central registry were queried for studies comparing the safety outcomes of the two approaches. The outcomes of interest were in-hospital mortality and access-site bleeding. Secondary outcomes were all-cause major adverse cardiovascular events, crossover rate, artery spasm, access-site large hematoma, and other access-site complications between TUA and TRA. A random-effects model with regression was used to report unadjusted odds ratios (ORs) while limiting confounders and effect modifiers, using Stata V.17. A total of 4,796 patients in 8 studies were included in our analysis (TUA = 2,420 [50.4%] and TRA = 2,376 [49.6%]). The average age was 61.3 and 60.1 years, and the patients were predominantly male (69.2% vs 68.4%), for TUA and TRA, respectively. TUA had a lower rate of local access-site bleeding (OR 0.58, 95% confidence interval 0.34 to 0.97, I2 = 1.89%, p = 0.04) but a higher crossover rate (OR 1.80, 95% confidence interval 1.04 to 3.11, I2 = 75.37%, p = 0.04) than TRA. There was no difference in in-hospital mortality, all-cause major adverse cardiovascular events, arterial spasm, or large hematoma between the cohorts, nor in procedural time, fluoroscopy time, or contrast volume used. TUA is a safe approach, associated with lower access-site bleeding but higher crossover rates than TRA. Further prospective studies are needed to evaluate the safety and long-term outcomes of both procedures.
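
    The pooled ORs above come from a random-effects model. A minimal DerSimonian-Laird sketch of that pooling is below; the authors used Stata with regression adjustment, so this is the general family of model rather than their exact procedure, and the study-level inputs are hypothetical.

```python
# Minimal DerSimonian-Laird random-effects pooling of study-level odds
# ratios (illustrative inputs, not the meta-analysis data).
import numpy as np

def pool_or(or_vals, ci_low, ci_high):
    y = np.log(or_vals)                                  # per-study log-OR
    se = (np.log(ci_high) - np.log(ci_low)) / (2 * 1.96) # SE from 95% CI
    w = 1 / se**2                                        # inverse-variance weights
    mu_fe = np.sum(w * y) / np.sum(w)
    q = np.sum(w * (y - mu_fe)**2)                       # Cochran's Q
    c = np.sum(w) - np.sum(w**2) / np.sum(w)
    tau2 = max(0.0, (q - (len(y) - 1)) / c)              # between-study variance
    w_re = 1 / (se**2 + tau2)                            # random-effects weights
    mu = np.sum(w_re * y) / np.sum(w_re)
    half = 1.96 * np.sqrt(1 / np.sum(w_re))
    return np.exp([mu, mu - half, mu + half])            # pooled OR, 95% CI

# e.g. three hypothetical studies of access-site bleeding, TUA vs TRA
print(pool_or([0.50, 0.70, 0.55], [0.30, 0.40, 0.30], [0.85, 1.20, 1.00]))
```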

    SARS-CoV-2 vaccination modelling for safe surgery to save lives: data from an international prospective cohort study

    Background: Preoperative SARS-CoV-2 vaccination could support safer elective surgery. Vaccine numbers are limited, so this study aimed to inform their prioritization by modelling. Methods: The primary outcome was the number needed to vaccinate (NNV) to prevent one COVID-19-related death in 1 year. NNVs were based on postoperative SARS-CoV-2 rates and mortality in an international cohort study (surgical patients), and on community SARS-CoV-2 incidence and case fatality data (general population). NNV estimates were stratified by age (18-49, 50-69, 70 or more years) and type of surgery. Best- and worst-case scenarios were used to describe uncertainty. Results: NNVs were more favourable in surgical patients than in the general population. The most favourable NNVs were in patients aged 70 years or more needing cancer surgery (351; best case 196, worst case 816) or non-cancer surgery (733; best case 407, worst case 1664), both more favourable than the NNV in the general population (1840; best case 1196, worst case 3066). NNVs for surgical patients remained favourable across a range of SARS-CoV-2 incidence rates in sensitivity analysis modelling. Globally, prioritizing preoperative vaccination of patients needing elective surgery ahead of the general population could prevent an additional 58 687 (best case 115 007, worst case 20 177) COVID-19-related deaths in 1 year. Conclusion: As global roll-out of SARS-CoV-2 vaccination proceeds, patients needing elective surgery should be prioritized ahead of the general population.
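
    The NNV arithmetic the model rests on is simple: one vaccination averts, in expectation, the product of infection risk, case fatality, and vaccine efficacy against death, and NNV is the reciprocal. The sketch below uses purely illustrative inputs, not the study's estimates.

```python
# Back-of-envelope NNV: 1 / (infection risk * case fatality * efficacy).
# All inputs are illustrative assumptions, not the study's values.

def nnv(infection_risk, case_fatality, efficacy):
    deaths_averted_per_vaccinee = infection_risk * case_fatality * efficacy
    return 1 / deaths_averted_per_vaccinee

# e.g. 2% postoperative SARS-CoV-2 risk, 25% case fatality in older
# surgical patients, 90% vaccine efficacy against death
print(f"NNV = {nnv(0.02, 0.25, 0.90):.0f}")  # -> 222
```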