The role of the humoral immune response to Clostridium difficile toxins A and B in susceptibility to C. difficile infection: a case-control study
Antibody levels to Clostridium difficile toxin A (TcdA), but not toxin B (TcdB), have been found to determine risk of C. difficile infection (CDI). Historically, TcdA was thought to be the key virulence factor; however, the importance of TcdB in disease is now established. We re-evaluated the role of antibodies to TcdA and TcdB in determining patient susceptibility to CDI in two separate patient cohorts. In contrast to earlier studies, we find that CDI patients have lower pre-existing IgA titres to TcdB, but not TcdA, when compared to control patients. Our findings suggest that mucosal immunity to TcdB may be important in the early stages of infection and identify a possible target for preventing CDI progression.
Toxigenic Clostridium difficile colonization among hospitalised adults: risk factors and impact on survival
Objectives: To establish risk factors for Clostridium difficile colonization among hospitalized patients in England.
Methods: Patients admitted to elderly medicine wards at three acute hospitals in England were recruited to a prospective observational study. Participants were asked to provide a stool sample as soon as possible after enrolment and then weekly during their hospital stay. Samples were cultured for C. difficile before ribotyping and toxin detection by PCR. A multivariable logistic regression model of risk factors for C. difficile colonization was fitted from univariable risk factors significant at the p < 0.05 level.
Results: 410/727 participants submitted ≥1 stool sample, and 40 (9.8%) carried toxigenic C. difficile in the first sample taken. Ribotype 106 was identified three times and seven other ribotypes twice. No ribotype 027 strains were identified. Independent predictors of colonization were previous C. difficile infection (OR 4.53, 95% CI 1.33–15.48) and malnutrition (MUST score ≥2) (OR 3.29, 95% CI 1.47–7.35). Although C. difficile-colonized patients experienced higher 90-day mortality, colonization was not an independent risk factor for death.
Conclusions: In a non-epidemic setting, patients who have previously had CDI and have a MUST score of ≥2 are at increased risk of C. difficile colonization and could be targeted for active surveillance to prevent C. difficile transmission.
Evaluation of appendicitis risk prediction models in adults with suspected appendicitis
Background
Appendicitis is the most common general surgical emergency worldwide, but its diagnosis remains challenging. The aim of this study was to determine whether existing risk prediction models can reliably identify patients presenting to hospital in the UK with acute right iliac fossa (RIF) pain who are at low risk of appendicitis.
Methods
A systematic search was completed to identify all existing appendicitis risk prediction models. Models were validated using UK data from an international prospective cohort study that captured consecutive patients aged 16–45 years presenting to hospital with acute RIF pain between March and June 2017. The main outcome was best achievable model specificity (proportion of patients who did not have appendicitis correctly classified as low risk) whilst maintaining a failure rate below 5 per cent (proportion of patients identified as low risk who actually had appendicitis).
Results
Some 5345 patients across 154 UK hospitals were identified, of whom two-thirds (3613 of 5345, 67·6 per cent) were women. Women were more than twice as likely to undergo surgery with removal of a histologically normal appendix (272 of 964, 28·2 per cent) as men (120 of 993, 12·1 per cent) (relative risk 2·33, 95 per cent c.i. 1·92 to 2·84; P < 0·001). Of 15 validated risk prediction models, the Adult Appendicitis Score performed best overall (cut-off score 8 or less, specificity 63·1 per cent, failure rate 3·7 per cent). The Appendicitis Inflammatory Response Score performed best for men (cut-off score 2 or less, specificity 24·7 per cent, failure rate 2·4 per cent).
Conclusion
Women in the UK had a disproportionate risk of admission without surgical intervention and a high rate of normal appendicectomy. Risk prediction models were identified that could support shared decision-making by identifying adults in the UK at low risk of appendicitis.
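As a sanity check, the relative risk and confidence interval reported above can be reproduced from the published counts using the standard log-scale (Katz) interval for a risk ratio; the script below is an illustrative recalculation, not part of the original analysis:

```python
import math

# Normal-appendicectomy counts from the abstract:
# women 272 of 964 operated, men 120 of 993 operated.
a, n1 = 272, 964   # women: events, total
b, n2 = 120, 993   # men: events, total

rr = (a / n1) / (b / n2)

# Log-scale 95% CI for a relative risk (Katz method).
se_log = math.sqrt(1/a - 1/n1 + 1/b - 1/n2)
lo = math.exp(math.log(rr) - 1.96 * se_log)
hi = math.exp(math.log(rr) + 1.96 * se_log)

print(f"RR {rr:.2f}, 95% CI {lo:.2f} to {hi:.2f}")
# prints: RR 2.33, 95% CI 1.92 to 2.84
```

The recomputed values match the published figures (RR 2·33, 95 per cent c.i. 1·92 to 2·84).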
Effect of Regulatory Requirement for Patient-Specific Prescriptions for Off-Label Medications on the Use of Intravitreal Bevacizumab
Switching to brolucizumab: injection intervals and visual, anatomical and safety outcomes at 12 and 18 months in real-world eyes with neovascular age-related macular degeneration
Abstract
Background
The anti-vascular endothelial growth factor (anti-VEGF) injection interval influences treatment burden and compliance in neovascular age-related macular degeneration (nAMD). This real-world study investigates visual acuity (VA), injection-interval extension, central macular thickness (CMT) and safety in nAMD eyes switched to the anti-VEGF agent brolucizumab and followed for up to 18 months.
Methods
This retrospective study included patients with nAMD who were switched from other anti-VEGF agents to brolucizumab only. Patient eyes were grouped into three nested cohorts: the overall cohort received ≥1 brolucizumab injection, the second received ≥3 brolucizumab injections with a follow-up of ≥12 months, and the third received ≥3 brolucizumab injections with a follow-up of ≥18 months. Study endpoints included changes from baseline at 12 or 18 months in VA, injection intervals, and CMT. Sub-group analyses were conducted using baseline injection-interval length or baseline VA as qualifiers.
Results
Overall, 482 eyes received ≥1 brolucizumab injection; 174 eyes received ≥3 brolucizumab injections with ≥12 months of follow-up, and 95 eyes received ≥3 brolucizumab injections with ≥18 months of follow-up. VA (mean [95% confidence interval]) remained stable relative to baseline after 12 months (−1.1 [−3.7, 1.6] letters; p = 0.42) and 18 months (0.0 [−3.1, 3.1] letters; p = 0.98) of brolucizumab treatment, and pre-switch injection intervals or baseline VA had no notable effect. Following the switch to brolucizumab, injection intervals were extended from baseline to month 12 by 26.9 (19.7, 34.0) days (p < 0.0001), and eyes with pre-switch injection intervals <8 weeks had their intervals extended by 23.6 days more than eyes with pre-switch intervals ≥8 weeks. At 18 months, injection intervals were extended by 36.3 (25.6, 46.9) days (p < 0.0001) compared with baseline. Following the switch to brolucizumab, CMT was reduced at both 12 and 18 months (12 months: −35.2 (−51.7, −18.8) µm, p < 0.0001; 18 months: −38.9 (−54.3, −22.0) µm, p < 0.0001). Intraocular inflammation-related adverse events were reported in 4.6% of brolucizumab-treated eyes.
Conclusions
This real-world study demonstrates that injection intervals may be significantly extended with maintained vision and reduced CMT in nAMD eyes switching to brolucizumab therapy from other anti-VEGFs.
Transfusion-transmissible infections and transfusion-related immunomodulation
The risk of acquiring a transfusion-transmitted infection has declined in recent years. However, now that transmission of human immunodeficiency virus and hepatitis B and C viruses has been successfully reduced, new pathogens are threatening the safety of the blood supply, especially in the face of rising numbers of immunocompromised transfusion recipients. Despite new standards in the manufacture and storage of blood products, bacterial contamination remains a considerable cause of transfusion-related morbidity and mortality. Better allograft survival in kidney transplant patients and higher cancer recurrence rates in surgical oncology patients after allogeneic blood transfusion highlighted a previously underestimated side-effect: transfusion-related immunomodulation (TRIM). The precise pathomechanism remains uncertain; however, its mostly deleterious effects, such as a higher incidence of postoperative or nosocomial infections, are increasingly accepted. Although TRIM is thought to be mediated mainly by donor white blood cells, the benefit of leukoreduction on overall mortality and infectious complications remains highly debatable.
