
    Biomarkers of coagulation, endothelial function, and fibrinolysis in critically ill patients with COVID-19: A single-center prospective longitudinal study

    Background: Immunothrombosis and coagulopathy in the lung microvasculature may lead to lung injury and disease progression in coronavirus disease 2019 (COVID-19). We aim to identify biomarkers of coagulation, endothelial function, and fibrinolysis that are associated with disease severity and may have prognostic potential. Methods: We performed a single-center prospective study of 14 adult COVID-19(+) intensive care unit patients who were age- and sex-matched to 14 COVID-19(−) intensive care unit patients, and healthy controls. Daily blood draws, clinical data, and patient characteristics were collected. Baseline values for 10 biomarkers of interest were compared between the three groups, and visualized using Fisher's linear discriminant function. Linear repeated-measures mixed models were used to screen biomarkers for associations with mortality. Selected biomarkers were further explored and entered into an unsupervised longitudinal clustering machine learning algorithm to identify trends and targets that may be used for future predictive modelling efforts. Results: Elevated D-dimer was the strongest contributor in distinguishing COVID-19 status; however, D-dimer was not associated with survival. Variable selection identified clot lysis time, and antigen levels of soluble thrombomodulin (sTM), plasminogen activator inhibitor-1 (PAI-1), and plasminogen as biomarkers associated with death. Longitudinal multivariate k-means clustering on these biomarkers alone identified two clusters of COVID-19(+) patients: low (30%) and high (100%) mortality groups. Biomarker trajectories that characterized the high mortality cluster were higher clot lysis times (inhibited fibrinolysis), higher sTM and PAI-1 levels, and lower plasminogen levels. Conclusions: Longitudinal trajectories of clot lysis time, sTM, PAI-1, and plasminogen may have predictive ability for mortality in COVID-19.
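    The unsupervised longitudinal clustering step can be illustrated with a short Python sketch. This is not the authors' implementation: the column names, the seven-day window, the mean imputation, and the use of scikit-learn's KMeans on flattened, z-scored trajectories are all assumptions made for illustration.

    # Minimal sketch: cluster ICU patients by their daily biomarker trajectories.
    # Assumes a long-format table with one row per patient per ICU day and
    # hypothetical columns for the four biomarkers highlighted in the abstract.
    import numpy as np
    import pandas as pd
    from sklearn.cluster import KMeans

    BIOMARKERS = ["clot_lysis_time", "sTM", "PAI1", "plasminogen"]  # assumed names

    def cluster_trajectories(df, n_days=7, n_clusters=2, seed=0):
        # Keep the first n_days of follow-up and pivot to one row per patient.
        wide = df[df["day"] < n_days].pivot_table(index="patient_id", columns="day", values=BIOMARKERS)
        X = wide.to_numpy(dtype=float)
        # Impute missing days with the column mean, then z-score each column so
        # no single biomarker dominates the Euclidean distance used by k-means.
        col_mean = np.nanmean(X, axis=0)
        X = np.where(np.isnan(X), col_mean, X)
        X = (X - X.mean(axis=0)) / (X.std(axis=0) + 1e-9)
        labels = KMeans(n_clusters=n_clusters, n_init=10, random_state=seed).fit_predict(X)
        return pd.Series(labels, index=wide.index, name="cluster")

    Cluster labels obtained this way can then be cross-tabulated against mortality, which is the comparison the abstract reports (30% versus 100% mortality across the two clusters).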

    Antimicrobial resistance among migrants in Europe: a systematic review and meta-analysis

    BACKGROUND: Rates of antimicrobial resistance (AMR) are rising globally and there is concern that increased migration is contributing to the burden of antibiotic resistance in Europe. However, the effect of migration on the burden of AMR in Europe has not yet been comprehensively examined. Therefore, we did a systematic review and meta-analysis to identify and synthesise data for AMR carriage or infection in migrants to Europe to examine differences in patterns of AMR across migrant groups and in different settings. METHODS: For this systematic review and meta-analysis, we searched MEDLINE, Embase, PubMed, and Scopus with no language restrictions from Jan 1, 2000, to Jan 18, 2017, for primary data from observational studies reporting antibacterial resistance in common bacterial pathogens among migrants to 21 European Union-15 and European Economic Area countries. To be eligible for inclusion, studies had to report data on carriage or infection with laboratory-confirmed antibiotic-resistant organisms in migrant populations. We extracted data from eligible studies and assessed quality using piloted, standardised forms. We did not examine drug resistance in tuberculosis and excluded articles solely reporting on this parameter. We also excluded articles in which migrant status was determined by ethnicity, country of birth of participants' parents, or was not defined, and articles in which data were not disaggregated by migrant status. Outcomes were carriage of or infection with antibiotic-resistant organisms. We used random-effects models to calculate the pooled prevalence of each outcome. The study protocol is registered with PROSPERO, number CRD42016043681. FINDINGS: We identified 2274 articles, of which 23 observational studies reporting on antibiotic resistance in 2319 migrants were included. The pooled prevalence of any AMR carriage or AMR infection in migrants was 25·4% (95% CI 19·1-31·8; I²=98%), including meticillin-resistant Staphylococcus aureus (7·8%, 4·8-10·7; I²=92%) and antibiotic-resistant Gram-negative bacteria (27·2%, 17·6-36·8; I²=94%). The pooled prevalence of any AMR carriage or infection was higher in refugees and asylum seekers (33·0%, 18·3-47·6; I²=98%) than in other migrant groups (6·6%, 1·8-11·3; I²=92%). The pooled prevalence of antibiotic-resistant organisms was slightly higher in high-migrant community settings (33·1%, 11·1-55·1; I²=96%) than in migrants in hospitals (24·3%, 16·1-32·6; I²=98%). We did not find evidence of high rates of transmission of AMR from migrant to host populations. INTERPRETATION: Migrants are exposed to conditions favouring the emergence of drug resistance during transit and in host countries in Europe. Increased antibiotic resistance among refugees and asylum seekers and in high-migrant community settings (such as refugee camps and detention facilities) highlights the need for improved living conditions, access to health care, and initiatives to facilitate detection of and appropriate high-quality treatment for antibiotic-resistant infections during transit and in host countries. Protocols for the prevention and control of infection and for antibiotic surveillance need to be integrated in all aspects of health care, which should be accessible for all migrant groups, and should target determinants of AMR before, during, and after migration.
    FUNDING: UK National Institute for Health Research Imperial Biomedical Research Centre, Imperial College Healthcare Charity, the Wellcome Trust, and UK National Institute for Health Research Health Protection Research Unit in Healthcare-associated Infections and Antimicrobial Resistance at Imperial College London.
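    The random-effects pooling used for these prevalence estimates can be sketched with a generic DerSimonian–Laird estimator on logit-transformed proportions. This is an illustration of the technique only, not the authors' code; the continuity correction and the normal-approximation confidence interval are simplifying assumptions.

    # Sketch: DerSimonian-Laird random-effects pooled prevalence with 95% CI and I^2.
    # `events` and `totals` are per-study counts of resistant carriage/infection.
    import numpy as np

    def pooled_prevalence(events, totals):
        events = np.asarray(events, dtype=float)
        totals = np.asarray(totals, dtype=float)
        # Logit-transformed study prevalences and approximate variances
        # (0.5 continuity correction guards against 0% or 100% studies).
        p = (events + 0.5) / (totals + 1.0)
        y = np.log(p / (1.0 - p))
        v = 1.0 / (events + 0.5) + 1.0 / (totals - events + 0.5)
        # Between-study variance (tau^2) and I^2 heterogeneity from Cochran's Q.
        w = 1.0 / v
        y_fixed = np.sum(w * y) / np.sum(w)
        Q = np.sum(w * (y - y_fixed) ** 2)
        dof = len(y) - 1
        C = np.sum(w) - np.sum(w ** 2) / np.sum(w)
        tau2 = max(0.0, (Q - dof) / C)
        I2 = max(0.0, (Q - dof) / Q) if Q > 0 else 0.0
        # Random-effects pooled estimate, back-transformed to a proportion.
        w_re = 1.0 / (v + tau2)
        mu = np.sum(w_re * y) / np.sum(w_re)
        se = np.sqrt(1.0 / np.sum(w_re))
        expit = lambda x: 1.0 / (1.0 + np.exp(-x))
        return expit(mu), (expit(mu - 1.96 * se), expit(mu + 1.96 * se)), I2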

    Multidisciplinary investigations of the diets of two post-medieval populations from London using stable isotopes and microdebris analysis

    This paper presents the first multi-tissue study of diet in post-medieval London using both the stable light isotope analysis of carbon and nitrogen and analysis of microdebris in dental calculus. Dietary intake was explored over short and long timescales. Bulk bone collagen was analysed from humans from the Queen’s Chapel of the Savoy (QCS) (n = 66) and the St Barnabas/St Mary Abbots (SB) (n = 25). Incremental dentine analysis was performed on the second molar of individual QCS1123 to explore childhood dietary intake. Bulk hair samples (n = 4) were taken from adults from QCS, and dental calculus was analysed from four other individuals using microscopy. In addition, bone collagen from a total of 46 animals from QCS (n = 11) and the additional site of Prescot Street (n = 35) was analysed, providing the first animal dietary baseline for post-medieval London. Overall, isotopic results suggest a largely C3-based terrestrial diet for both populations, with the exception of QCS1123, who exhibited values consistent with the consumption of C4 food sources throughout childhood and adulthood. The differences exhibited in δ15Ncoll across both populations likely reflect variations in diet due to social class and occupation, with individuals from SB probably representing wealthier individuals consuming larger quantities of animal and marine fish protein. Microdebris analysis results were limited but indicate the consumption of domestic cereals. This paper demonstrates the utility of a multidisciplinary approach to investigate diet across long and short timescales to further our understanding of variations in social status and mobility.

    Children must be protected from the tobacco industry's marketing tactics.


    Surgical site infection after gastrointestinal surgery in high-income, middle-income, and low-income countries: a prospective, international, multicentre cohort study

    Background: Surgical site infection (SSI) is one of the most common infections associated with health care, but its importance as a global health priority is not fully understood. We quantified the burden of SSI after gastrointestinal surgery in countries in all parts of the world. Methods: This international, prospective, multicentre cohort study included consecutive patients undergoing elective or emergency gastrointestinal resection within 2-week time periods at any health-care facility in any country. Countries with participating centres were stratified into high-income, middle-income, and low-income groups according to the UN's Human Development Index (HDI). Data variables from the GlobalSurg 1 study and other studies that have been found to affect the likelihood of SSI were entered into risk adjustment models. The primary outcome measure was the 30-day SSI incidence (defined by US Centers for Disease Control and Prevention criteria for superficial and deep incisional SSI). Relationships with explanatory variables were examined using Bayesian multilevel logistic regression models. This trial is registered with ClinicalTrials.gov, number NCT02662231. Findings: Between Jan 4, 2016, and July 31, 2016, 13 265 records were submitted for analysis. 12 539 patients from 343 hospitals in 66 countries were included. 7339 (58·5%) patients were from high-HDI countries (193 hospitals in 30 countries), 3918 (31·2%) patients were from middle-HDI countries (82 hospitals in 18 countries), and 1282 (10·2%) patients were from low-HDI countries (68 hospitals in 18 countries). In total, 1538 (12·3%) patients had SSI within 30 days of surgery. The incidence of SSI varied between countries with high (691 [9·4%] of 7339 patients), middle (549 [14·0%] of 3918 patients), and low (298 [23·2%] of 1282 patients) HDI (p<0·001). The highest SSI incidence in each HDI group was after dirty surgery (102 [17·8%] of 574 patients in high-HDI countries; 74 [31·4%] of 236 patients in middle-HDI countries; 72 [39·8%] of 181 patients in low-HDI countries). Following risk factor adjustment, patients in low-HDI countries were at greatest risk of SSI (adjusted odds ratio 1·60, 95% credible interval 1·05–2·37; p=0·030). 132 (21·6%) of 610 patients with an SSI and a microbiology culture result had an infection that was resistant to the prophylactic antibiotic used. Resistant infections were detected in 49 (16·6%) of 295 patients in high-HDI countries, in 37 (19·8%) of 187 patients in middle-HDI countries, and in 46 (35·9%) of 128 patients in low-HDI countries (p<0·001). Interpretation: Countries with a low HDI carry a disproportionately greater burden of SSI than countries with a middle or high HDI and might have higher rates of antibiotic resistance. In view of WHO recommendations on SSI prevention that highlight the absence of high-quality interventional research, urgent, pragmatic, randomised trials based in LMICs are needed to assess measures aiming to reduce this preventable complication.
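    A minimal sketch of the kind of Bayesian multilevel logistic regression described, with random intercepts for country and hospital and a fixed effect for HDI group, is shown below using PyMC. The variable names, coding, and priors are illustrative assumptions and do not reproduce the study's risk-adjustment model.

    # Sketch: 30-day SSI (0/1) with hierarchical intercepts for country and hospital.
    import numpy as np
    import pymc as pm

    def fit_ssi_model(ssi, hdi_group, country_idx, hospital_idx, n_countries, n_hospitals):
        # hdi_group is assumed coded 0 = high, 1 = middle, 2 = low HDI.
        x_mid = (hdi_group == 1).astype(float)
        x_low = (hdi_group == 2).astype(float)
        with pm.Model():
            alpha = pm.Normal("alpha", 0.0, 2.0)
            beta = pm.Normal("beta_hdi", 0.0, 1.0, shape=2)  # middle, low vs high
            sigma_country = pm.HalfNormal("sigma_country", 1.0)
            sigma_hospital = pm.HalfNormal("sigma_hospital", 1.0)
            u_country = pm.Normal("u_country", 0.0, sigma_country, shape=n_countries)
            u_hospital = pm.Normal("u_hospital", 0.0, sigma_hospital, shape=n_hospitals)
            logit_p = (alpha + beta[0] * x_mid + beta[1] * x_low
                       + u_country[country_idx] + u_hospital[hospital_idx])
            pm.Bernoulli("ssi_obs", logit_p=logit_p, observed=ssi)
            trace = pm.sample(1000, tune=1000, target_accept=0.9)
        return trace

    In such a model, the posterior of exp(beta_hdi[1]) plays the role of an adjusted odds ratio for low- versus high-HDI countries, analogous to the 1·60 reported above.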

    Systematic Review of Potential Health Risks Posed by Pharmaceutical, Occupational and Consumer Exposures to Metallic and Nanoscale Aluminum, Aluminum Oxides, Aluminum Hydroxide and Its Soluble Salts

    Aluminum (Al) is a ubiquitous substance encountered both naturally (as the third most abundant element) and intentionally (used in water, foods, pharmaceuticals, and vaccines); it is also present in ambient and occupational airborne particulates. Existing data underscore the importance of Al physical and chemical forms in relation to its uptake, accumulation, and systemic bioavailability. The present review represents a systematic examination of the peer-reviewed literature on the adverse health effects of Al materials published since a previous critical evaluation compiled by Krewski et al. (2007). Challenges encountered in carrying out the present review reflected the experimental use of different physical and chemical Al forms, different routes of administration, and different target organs in relation to the magnitude, frequency, and duration of exposure. Wide variations in diet can result in Al intakes that are often higher than the World Health Organization provisional tolerable weekly intake (PTWI), which is based on studies with Al citrate. Comparing daily dietary Al exposures on the basis of “total Al” assumes that gastrointestinal bioavailability for all dietary Al forms is equivalent to that for Al citrate, an approach that requires validation. Current occupational exposure limits (OELs) for identical Al substances vary as much as 15-fold. The toxicity of different Al forms depends in large measure on their physical behavior and relative solubility in water. The toxicity of soluble Al forms depends upon the delivered dose of Al3+ to target tissues. Trivalent Al reacts with water to produce bidentate superoxide coordination spheres [Al(O2)(H2O)4]2+ and [Al(H2O)6]3+ that, after complexation with O2•−, generate Al superoxides [Al(O2•)(H2O)5]2+. Semireduced AlO2• radicals deplete mitochondrial Fe and promote generation of H2O2, O2•−, and OH•. Thus, it is the Al3+-induced formation of oxygen radicals that accounts for the oxidative damage that leads to intrinsic apoptosis. In contrast, the toxicity of the insoluble Al oxides depends primarily on their behavior as particulates. Aluminum has been held responsible for human morbidity and mortality, but there is no consistent and convincing evidence to associate the Al found in food and drinking water at the doses and chemical forms presently consumed by people living in North America and Western Europe with increased risk for Alzheimer's disease (AD). Neither is there clear evidence to show that use of Al-containing underarm antiperspirants or cosmetics increases the risk of AD or breast cancer. Metallic Al, its oxides, and common Al salts have not been shown to be either genotoxic or carcinogenic. Aluminum exposures during neonatal and pediatric parenteral nutrition (PN) can impair bone mineralization and delay neurological development. Adverse reactions to vaccines with Al adjuvants have occurred; however, recent controlled trials found that the immunologic response to certain vaccines with Al adjuvants was no greater than, and in some cases less than, that after identical vaccination without Al adjuvants. The scientific literature on the adverse health effects of Al is extensive. Health risk assessments for Al must take into account individual co-factors (e.g., age, renal function, diet, gastric pH). Conclusions from the current review point to the need for refinement of the PTWI, reduction of Al contamination in PN solutions, justification for routine addition of Al to vaccines, and harmonization of OELs for Al substances.

    Impact of opioid-free analgesia on pain severity and patient satisfaction after discharge from surgery: multispecialty, prospective cohort study in 25 countries

    Background: Balancing opioid stewardship and the need for adequate analgesia following discharge after surgery is challenging. This study aimed to compare the outcomes for patients discharged with opioid versus opioid-free analgesia after common surgical procedures. Methods: This international, multicentre, prospective cohort study collected data from patients undergoing common acute and elective general surgical, urological, gynaecological, and orthopaedic procedures. The primary outcomes were patient-reported time in severe pain measured on a numerical analogue scale from 0 to 100% and patient-reported satisfaction with pain relief during the first week following discharge. Data were collected by in-hospital chart review and patient telephone interview 1 week after discharge. Results: The study recruited 4273 patients from 144 centres in 25 countries; 1311 patients (30.7%) were prescribed opioid analgesia at discharge. Patients reported being in severe pain for 10 (i.q.r. 1-30)% of the first week after discharge and rated satisfaction with analgesia as 90 (i.q.r. 80-100) of 100. After adjustment for confounders, opioid analgesia on discharge was independently associated with increased pain severity (risk ratio 1.52, 95% c.i. 1.31 to 1.76; P < 0.001) and re-presentation to healthcare providers owing to side-effects of medication (OR 2.38, 95% c.i. 1.36 to 4.17; P = 0.004), but not with satisfaction with analgesia (beta coefficient 0.92, 95% c.i. -1.52 to 3.36; P = 0.468) compared with opioid-free analgesia. Although opioid prescribing varied greatly between high-income and low- and middle-income countries, patient-reported outcomes did not. Conclusion: Opioid analgesia prescription on surgical discharge is associated with a higher risk of re-presentation owing to side-effects of medication and increased patient-reported pain, but not with changes in patient-reported satisfaction. Opioid-free discharge analgesia should be adopted routinely.
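    Confounder-adjusted effect estimates of this kind can be approximated with standard regression tools. As a hedged illustration only (not the study's actual model, and with invented column names), a modified Poisson regression with robust standard errors gives an adjusted risk ratio for a binary outcome such as re-presentation to healthcare.

    # Sketch: adjusted risk ratio comparing opioid vs opioid-free discharge analgesia.
    # `opioid_discharge` is assumed coded 0/1; the other covariates are placeholders.
    import numpy as np
    import statsmodels.api as sm
    import statsmodels.formula.api as smf

    def adjusted_risk_ratio(df):
        model = smf.glm(
            "outcome ~ opioid_discharge + age + sex + procedure_type + country_income",
            data=df,
            family=sm.families.Poisson(),
        ).fit(cov_type="HC1")  # robust (sandwich) standard errors
        rr = float(np.exp(model.params["opioid_discharge"]))
        lower, upper = np.exp(model.conf_int().loc["opioid_discharge"])
        return rr, (float(lower), float(upper))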

    The gonadotropins: Tissue-specific angiogenic factors?


    Retrospective evaluation of whole exome and genome mutation calls in 746 cancer samples

    Funder: NCI U24CA211006. Abstract: The Cancer Genome Atlas (TCGA) and International Cancer Genome Consortium (ICGC) curated consensus somatic mutation calls using whole exome sequencing (WES) and whole genome sequencing (WGS), respectively. Here, as part of the ICGC/TCGA Pan-Cancer Analysis of Whole Genomes (PCAWG) Consortium, which aggregated whole genome sequencing data from 2,658 cancers across 38 tumour types, we compare WES and WGS side-by-side from 746 TCGA samples, finding that ~80% of mutations overlap in covered exonic regions. We estimate that low variant allele fraction (VAF < 15%) and clonal heterogeneity contribute up to 68% of private WGS mutations and 71% of private WES mutations. We observe that ~30% of private WGS mutations trace to mutations identified by a single variant caller in WES consensus efforts. WGS captures both ~50% more variation in exonic regions and un-observed mutations in loci with variable GC-content. Together, our analysis highlights technological divergences between two reproducible somatic variant detection efforts.
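    The side-by-side comparison of call sets can be illustrated with a toy Python sketch. This is not the PCAWG pipeline: variants are keyed naively by chromosome, position, and alleles, and a hypothetical per-call VAF field is used for the low-VAF breakdown.

    # Sketch: overlap of WES and WGS somatic calls in covered exonic regions, plus the
    # share of WGS-private calls below a 15% variant allele fraction (VAF).
    # Each call is a dict like {"chrom": "1", "pos": 12345, "ref": "C", "alt": "T", "vaf": 0.08}.
    def compare_calls(wes_calls, wgs_calls, exonic_regions):
        def in_exome(call):
            return any(call["chrom"] == chrom and start <= call["pos"] <= end
                       for chrom, start, end in exonic_regions)

        def key(call):
            return (call["chrom"], call["pos"], call["ref"], call["alt"])

        wes = {key(c): c for c in wes_calls if in_exome(c)}
        wgs = {key(c): c for c in wgs_calls if in_exome(c)}
        shared = wes.keys() & wgs.keys()
        wgs_private = [wgs[k] for k in wgs.keys() - wes.keys()]
        overlap = len(shared) / max(len(wes.keys() | wgs.keys()), 1)
        low_vaf = sum(c["vaf"] < 0.15 for c in wgs_private) / max(len(wgs_private), 1)
        return {"overlap_fraction": overlap, "wgs_private_low_vaf_fraction": low_vaf}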

    Reducing the environmental impact of surgery on a global scale: systematic review and co-prioritization with healthcare workers in 132 countries

    Background: Healthcare cannot achieve net-zero carbon without addressing operating theatres. The aim of this study was to prioritize feasible interventions to reduce the environmental impact of operating theatres. Methods: This study adopted a four-phase Delphi consensus co-prioritization methodology. In phase 1, a systematic review of published interventions and global consultation of perioperative healthcare professionals were used to longlist interventions. In phase 2, iterative thematic analysis consolidated comparable interventions into a shortlist. In phase 3, the shortlist was co-prioritized based on patient and clinician views on acceptability, feasibility, and safety. In phase 4, ranked lists of interventions were presented by their relevance to high-income countries and low–middle-income countries. Results: In phase 1, 43 interventions were identified, which had low uptake in practice according to 3042 professionals globally. In phase 2, a shortlist of 15 intervention domains was generated. In phase 3, interventions were deemed acceptable for more than 90 per cent of patients except for reducing general anaesthesia (84 per cent) and re-sterilization of ‘single-use’ consumables (86 per cent). In phase 4, the top three shortlisted interventions for high-income countries were: introducing recycling; reducing use of anaesthetic gases; and appropriate clinical waste processing. In phase 4, the top three shortlisted interventions for low–middle-income countries were: introducing reusable surgical devices; reducing use of consumables; and reducing the use of general anaesthesia. Conclusion: This is a step toward environmentally sustainable operating environments with actionable interventions applicable to both high- and low–middle-income countries.