
    Haemophilia patients aged 0 - 18 years in the Western Cape

    Objectives. To record the number of haemophiliacs aged 0 - 18 years in the Western Cape (WC), what event led to the diagnosis, the level of clotting factor, treatment, the functional status of their joints and the impact of the disease on the family. Design. A prospective study of patients registered with the South African National Haemophilia Registry and new patients, utilising the patients' paediatricians, hospital records, patient and guardian interviews, physical examination and provincial nurse haemophilia co-ordinators. Setting. Haemophilia care centres at the three WC academic hospitals, regional hospitals and homes of patients. Two elective medical students, MHH and JJH, collected the information. Subjects. All boys with confirmed haemophilia A or B in the WC. Outcome measures. Events that led to diagnosis, degree of haemophilia, use of clotting factor, functional status, and effect on the family. Results. Of 78 patients (59 haemophilia A, 19 haemophilia B) identified, 49 could be studied. Forty-three per cent had severe, 29% moderate and 22% mild disease (6% unknown). A family history was present in 49%, but led to diagnosis in only 12%. The most common first symptoms were subcutaneous and mucosal bleeding. Delay in diagnosis varied from 0 to 9 months. Twenty-nine per cent of guardians had been suspected of child abuse. RSA-produced clotting factor was used 'on demand' in 73% of patients, for periodic prophylaxis in 20% and as continuous prophylaxis in 7%. Joints were functionally restricted in 43% of patients. The majority of guardians (59%) said the disease had a major impact on the family. Conclusions. The diagnosis of haemophilia in children with a positive family history was often delayed. Haemophilia causes significant morbidity in our patients and their families.

    Stellar dynamics in young clusters: the formation of massive runaways and very massive runaway mergers

    In the present paper we combine an N-body code that simulates the dynamics of young dense stellar systems with a massive star evolution handler that accounts in a realistic way for the effects of stellar wind mass loss. We discuss two topics: 1. The formation and evolution of very massive stars (with a mass >120 M⊙) is followed in detail. These very massive stars are formed in the cluster core as a consequence of the successive (physical) collisions of the 10-20 most massive stars of the cluster (a process known as runaway merging). The further evolution is governed by stellar wind mass loss during core hydrogen burning and during core helium burning (the WR phase of very massive stars). Our simulations reveal that, as a consequence of runaway merging in clusters with solar and supersolar metallicities, massive black holes can be formed, but with a maximum mass of 70 M⊙. In low-metallicity clusters, however, it cannot be excluded that the runaway merging process is responsible for pair-instability supernovae or for the formation of intermediate-mass black holes with masses of several hundred M⊙. 2. Massive runaways can be formed via the supernova explosion of one of the components in a binary (the Blaauw scenario) or via dynamical interaction between a single star and a binary, or between two binaries, in a star cluster. We explore the possibility that the most massive runaways (e.g., zeta Pup, lambda Cep, BD+43 3654) are the product of the collision and merger of 2 or 3 massive stars.

    Dynamics of the bacterial gut microbiota during controlled human infection with Necator americanus larvae

    Hookworms are soil-transmitted helminths that use immune-evasive strategies to persist in the human duodenum, where they are responsible for anemia and protein loss. Given their location and immune-regulatory effects, hookworms likely impact the bacterial microbiota. However, microbiota studies struggle to deconvolute the effect of hookworms from confounders such as coinfections and malnutrition. We thus used an experimental human hookworm infection model to explore temporal changes in the gut microbiota before and during hookworm infection. Volunteers were dermally exposed to cumulative dosages of 50, 100 or 150 L3 Necator americanus larvae. Fecal samples were collected for microbiota profiling through 16S rRNA gene amplicon sequencing at weeks zero, four, eight, fourteen and twenty. During the acute infection phase (trial weeks zero to eight) no changes in bacterial diversity were detected. During the established infection phase (trial weeks eight to twenty), bacterial richness (Chao1, p = .0174) increased significantly over all volunteers. No relation was found between larval dosage and the diversity, stability or relative abundance of individual bacterial taxa. GI symptoms were associated with an unstable microbiota during the first eight weeks and rapid recovery at week twenty. Barnesiella, amongst other taxa, was more abundant in volunteers with more GI symptoms throughout the study. In conclusion, this study showed that clinical GI symptoms following N. americanus infection are associated with temporary microbiota instability and the relative abundance of specific bacterial taxa. These results suggest a possible role of hookworm-induced enteritis in microbiota stability.
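    The Chao1 richness metric reported above estimates the number of taxa in a sample, including unseen ones, from the counts of rare taxa. A minimal sketch of the standard estimator (the per-taxon read counts below are invented for illustration, not data from the trial):

```python
def chao1(counts):
    """Chao1 richness estimate: S_obs + F1^2 / (2 * F2), where F1 and F2
    are the numbers of singleton and doubleton taxa in the sample."""
    counts = [c for c in counts if c > 0]
    s_obs = len(counts)                        # observed taxa
    f1 = sum(1 for c in counts if c == 1)      # singletons
    f2 = sum(1 for c in counts if c == 2)      # doubletons
    if f2 == 0:
        # bias-corrected variant, used when no doubletons are observed
        return s_obs + f1 * (f1 - 1) / 2.0
    return s_obs + f1 * f1 / (2.0 * f2)

# Hypothetical per-taxon read counts for one fecal sample
sample = [120, 34, 7, 2, 2, 1, 1, 1]
print(chao1(sample))  # → 10.25
```

    Because the estimate is driven by singletons and doubletons, it is sensitive to sequencing depth, which is one reason such comparisons are normally made on rarefied or otherwise depth-normalized count tables.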

    Quality indicators for appropriate antimicrobial therapy in the emergency department: a pragmatic Delphi procedure

    OBJECTIVES: Antimicrobial stewardship (AMS) has established its importance for inpatient care. AMS is, however, also urgently needed in emergency departments (EDs), where many antimicrobial prescriptions are initiated. It is currently unclear what metrics stewardship teams can use to measure and improve the appropriateness of antimicrobial prescription in the ED. In this study we develop quality indicators (QIs) for antimicrobial use in the ED. METHODS: A RAND-modified Delphi procedure was used to develop a set of QIs applicable to adult patients who present at the ED with a potential infection. First, potential ED-specific QIs for appropriate antimicrobial use were pragmatically retrieved from two recent papers by the international expert group DRIVE-AB. Thereafter, an international multidisciplinary expert panel appraised these QIs during two questionnaire rounds with a meeting in between. RESULTS: Thirty-three potential QIs were extracted from the DRIVE-AB papers. After appraisal by 13 experts, 22 QIs describing appropriate antimicrobial use in the ED were selected. These indicators provide recommendations within five domains: stewardship prerequisites (six QIs); diagnostics (one QI); empirical treatment (ten QIs); documentation of information (four QIs); and patient discharge (one QI). CONCLUSIONS: We pragmatically developed a set of 22 QIs that stewardship teams can use to measure the appropriateness of antimicrobial prescription in the ED. There is probably room for additional QI development to cover all key aspects of AMS in the ED. Measuring QIs can be a first step for stewardship teams, in collaboration with ED professionals, to choose targets for improvement and optimize antimicrobial use.

    Risk factors for in-hospital mortality in laboratory-confirmed COVID-19 patients in the Netherlands: A competing risk survival analysis

    BACKGROUND: To date, survival data on risk factors for COVID-19 mortality in western Europe is limited, and none of the published survival studies have used a competing risk approach. This study aims to identify risk factors for in-hospital mortality in COVID-19 patients in the Netherlands, considering recovery as a competing risk. METHODS: In this observational multicenter cohort study we included adults with PCR-confirmed SARS-CoV-2 infection who were admitted to one of five hospitals in the Netherlands (March to May 2020). We performed a competing risk survival analysis, presenting cause-specific hazard ratios (HRCS) for the effect of preselected factors on the absolute risk of death and recovery. RESULTS: 1,006 patients were included (63.9% male; median age 69 years, IQR: 58-77). Patients were hospitalized for a median duration of 6 days (IQR: 3-13); 243 (24.6%) of them died, 689 (69.9%) recovered, and 74 (7.4%) were censored. Patients with higher age (HRCS 1.10, 95% CI 1.08-1.12), immunocompromised state (HRCS 1.46, 95% CI 1.08-1.98), who used anticoagulants or antiplatelet medication (HRCS 1.38, 95% CI 1.01-1.88), with higher modified early warning score (MEWS) (HRCS 1.09, 95% CI 1.01-1.18), and higher blood LDH at time of admission (HRCS 6.68, 95% CI 1.95-22.8) had an increased risk of death, whereas fever (HRCS 0.70, 95% CI 0.52-0.95) decreased the risk of death. We found no increased mortality risk associated with male sex, high BMI or diabetes. CONCLUSION: Our competing risk survival analysis confirms specific risk factors for COVID-19 mortality in the Netherlands, which can be used for prediction research, more intense in-hospital monitoring or prioritizing particular patients for new treatments or vaccination.
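    In a competing-risk analysis like the one above, the absolute risk of death is estimated by the cumulative incidence function rather than 1 − Kaplan-Meier, because recovered patients are no longer at risk of in-hospital death. A minimal Aalen-Johansen sketch; the event times below are invented for illustration (the study's HRCS values come from cause-specific Cox models, which this sketch does not implement):

```python
def cumulative_incidence(times, events, cause):
    """Aalen-Johansen estimate of the cumulative incidence of `cause`
    in the presence of competing events. Event code 0 = censored."""
    data = sorted(zip(times, events))
    n = len(data)
    surv = 1.0   # overall event-free survival just before each time
    cif = 0.0
    out = []
    i = 0
    while i < n:
        t = data[i][0]
        at_risk = n - i
        d_cause = d_all = 0
        while i < n and data[i][0] == t:   # handle tied event times
            if data[i][1] != 0:
                d_all += 1
                if data[i][1] == cause:
                    d_cause += 1
            i += 1
        cif += surv * d_cause / at_risk    # increment for this cause
        surv *= 1 - d_all / at_risk        # update overall survival
        out.append((t, cif))
    return out

# Hypothetical follow-up (days) and outcomes: 1 = death, 2 = recovery, 0 = censored
times  = [3, 5, 6, 6, 8, 10, 12, 14]
events = [2, 1, 2, 2, 1,  0,  2,  1]
death_cif = cumulative_incidence(times, events, cause=1)
print(round(death_cif[-1][1], 4))  # → 0.4375
```

    A useful sanity check of the method is that the cause-specific cumulative incidences plus the remaining event-free probability always sum to one, unlike naive 1 − Kaplan-Meier curves computed per cause, which can overstate each risk.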

    Few bacterial co-infections but frequent empiric antibiotic use in the early phase of hospitalized patients with COVID-19: results from a multicentre retrospective cohort study in The Netherlands

    BACKGROUND: Knowledge on bacterial co-infections in COVID-19 is crucial to use antibiotics appropriately. Therefore, we aimed to determine the incidence of bacterial co-infections, antibiotic use and application of antimicrobial stewardship principles in hospitalized patients with COVID-19. METHODS: We performed a retrospective observational study in four hospitals (1 university, 2 non-university teaching, 1 non-teaching hospital) in the Netherlands from March to May 2020, including consecutive patients with PCR-confirmed COVID-19. Data on first microbiological investigations obtained at the discretion of the physician and on antibiotic use in the first week of hospital admission were collected. RESULTS: Twelve (1.2%) of the 925 patients included had a documented bacterial co-infection (75.0% pneumonia) within the first week. Microbiological testing was performed in 749 (81%) patients: sputum cultures in 105 (11.4%), blood cultures in 711 (76.9%), pneumococcal urinary antigen testing in 202 (21.8%), and Legionella urinary antigen testing in 199 (21.5%) patients, with clear variation between hospitals. On presentation 556 (60.1%; range 33.3-73.4%) patients received antibiotics for a median duration of 2 days (IQR 1-4). Intravenous to oral switch was performed in 41 of 413 (9.9%) patients who received intravenous treatment >48 h. Adherence to the local guideline on empiric antibiotic therapy on day 1 averaged 60.3% (range 45.3%-74.7%). CONCLUSIONS: On presentation to the hospital bacterial co-infections are rare, while empiric antibiotic use is abundant. This implies that empiric antibiotics should be withheld in patients with COVID-19. This has the potential to dramatically reduce the current overuse of antibiotics in the COVID-19 pandemic.

    The impact of circadian rhythm on Bacillus Calmette-Guérin vaccination effects on SARS-CoV-2 infections.

    BACKGROUND AND OBJECTIVE: A recent study has suggested that circadian rhythm has an important impact on the immunological effects induced by Bacillus Calmette-Guérin (BCG) vaccination. The objective of this study was to evaluate whether the timing of BCG vaccination (morning or afternoon) affects its impact on severe acute respiratory syndrome-coronavirus-2 (SARS-CoV-2) infections and clinically relevant respiratory tract infections (RTIs). METHODS: This is a post-hoc analysis of the BCG-CORONA-ELDERLY (NCT04417335) multicenter, placebo-controlled trial, in which participants aged 60 years and older were randomly assigned to vaccination with BCG or placebo and followed for 12 months. The primary endpoint was the cumulative incidence of SARS-CoV-2 infection. To assess the impact of circadian rhythm on the BCG effects, participants were divided into four groups: vaccinated with either BCG or placebo in the morning (between 9:00h and 11:30h) or in the afternoon (between 14:30h and 18:00h). RESULTS: The subdistribution hazard ratio of SARS-CoV-2 infection in the first six months after vaccination was 2.394 (95% confidence interval [CI], 0.856-6.696) for the morning BCG group and 0.284 (95% CI, 0.055-1.480) for the afternoon BCG group. When comparing those two groups, the interaction hazard ratio was 8.966 (95% CI, 1.366-58.836). From six to 12 months after vaccination, the cumulative incidences of SARS-CoV-2 infection were comparable between groups, as were the cumulative incidences of clinically relevant RTIs in both periods. CONCLUSION: Vaccination with BCG in the afternoon offered better protection against SARS-CoV-2 infections than BCG vaccination in the morning in the first six months after vaccination.