
    Finding a solution: Heparinised saline versus normal saline in the maintenance of invasive arterial lines in intensive care

    Background We assessed the impact of heparinised saline versus 0.9% normal saline on arterial line patency. Maintaining the patency of arterial lines is essential for obtaining accurate physiological measurements, enabling blood sampling and minimising line replacement. Use of heparinised saline is associated with risks such as thrombocytopenia, haemorrhage and mis-selection. Historical studies draw variable conclusions but suggest that normal saline is at least as effective at maintaining line patency, although recent evidence has questioned this. Methods We conducted a prospective analysis of the use of heparinised saline versus normal saline in unselected patients in the intensive care unit of our hospital. Data concerning duration of insertion and reason for removal were collected for 471 lines. Results We found a higher risk of blockage for lines flushed with normal saline compared with heparinised saline (RR = 2.15, 95% CI 1.392–3.32, p ≤ 0.001). Of the 56 lines that blocked initially (19 heparinised saline and 37 normal saline), 16 were replaced with new lines (5 heparinised saline and 11 normal saline); 5 of these replacement lines subsequently blocked again, 3 of which were flushed with normal saline. Conclusions Our study demonstrates a clinically important reduction in arterial line longevity due to blockages when lines are flushed with normal saline rather than heparinised saline. These excess blockages have a significant clinical impact, with further lines being inserted after blockage, resulting in increased risk to patients, wasted time and additional cost. Our findings suggest that the current UK guidance favouring normal saline flushes should be reviewed.
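    As a worked illustration of the headline statistic, the Python sketch below computes a relative risk and its 95% confidence interval on the log scale (the Katz method). The abstract reports 37 versus 19 blockages but not the per-arm denominators, so the totals used here are purely hypothetical.

```python
import math

# Hypothetical 2x2 counts: the abstract reports 37 vs 19 blockages,
# but the per-arm denominators below are assumed for illustration only.
blocked_ns, total_ns = 37, 230   # normal saline arm (denominator assumed)
blocked_hs, total_hs = 19, 241   # heparinised saline arm (denominator assumed)

risk_ns = blocked_ns / total_ns
risk_hs = blocked_hs / total_hs
rr = risk_ns / risk_hs

# 95% CI for the relative risk, computed on the log scale (Katz method)
se_log_rr = math.sqrt(1/blocked_ns - 1/total_ns + 1/blocked_hs - 1/total_hs)
ci_low = math.exp(math.log(rr) - 1.96 * se_log_rr)
ci_high = math.exp(math.log(rr) + 1.96 * se_log_rr)

print(f"RR = {rr:.2f}, 95% CI {ci_low:.2f}-{ci_high:.2f}")
```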

    Cardiac implantable electronic device lead extraction in patients with underlying infection using open thoracotomy or percutaneous techniques

    Background: Explanting infected cardiac implantable electronic devices (CIEDs) and extracting their associated leads can be performed percutaneously (EP) or via an open-thoracotomy (OR) approach. In this study, we examined the characteristics and outcomes of infected CIED patients undergoing EP vs. OR extraction procedures. Methods: All patients (EP: n = 329 and OR: n = 24) who received lead extraction in the presence of an infected CIED from 2005 to 2010 at the University of Pittsburgh Medical Center were included in this study. Demographic and clinical characteristics were obtained from the electronic medical records. The Charlson comorbidity index (CCI) was used to adjust for severity of co-morbid conditions. Results: Compared to the EP group, OR patients were more likely to have positive blood cultures, larger vegetations and worse CCI scores. They also had higher total mortality rates at 1 (p = 0.036), 6 (p = 0.020) and 12 months (p = 0.012) after the procedure. One-year survival after lead extraction was significantly better for the EP than for the OR group (p = 0.002), even after adjusting for other comorbid illnesses (HR = 2.6, p = 0.010) in a Cox regression model. Conclusions: Infected CIED patients undergoing open-chest lead extraction are sicker and have higher mortality rates compared to those undergoing percutaneous extraction. Randomized, prospective data are needed to determine whether the procedural strategy for lead extraction accounts in part for the difference in outcome.
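    The comorbidity-adjusted survival comparison reported here is the kind of result a Cox proportional hazards model produces. Below is a minimal sketch using the lifelines library; the column names and the ten-patient toy dataset are invented for illustration and are not the study data.

```python
import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical one-row-per-patient dataset; values and column names are illustrative only.
df = pd.DataFrame({
    "months_followed": [1, 12, 6, 12, 3, 12, 2, 12, 12, 4],
    "died":            [1, 0,  1, 0,  1, 0,  1, 0,  0,  1],
    "open_extraction": [1, 0,  1, 1,  0, 0,  1, 0,  1,  0],  # 1 = OR, 0 = EP
    "charlson_index":  [5, 2,  6, 5,  2, 3,  4, 4,  3,  6],
})

# Cox proportional hazards model: extraction route adjusted for comorbidity burden
cph = CoxPHFitter()
cph.fit(df, duration_col="months_followed", event_col="died")
cph.print_summary()  # hazard ratios (exp(coef)) with 95% confidence intervals
```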

    High frequency oscillatory ventilation compared with conventional mechanical ventilation in adult respiratory distress syndrome: a randomized controlled trial [ISRCTN24242669]

    INTRODUCTION: To compare the safety and efficacy of high frequency oscillatory ventilation (HFOV) with conventional mechanical ventilation (CV) for early intervention in adult respiratory distress syndrome (ARDS), a multi-centre randomized trial in four intensive care units was conducted. METHODS: Patients with ARDS were randomized to receive either HFOV or CV. In both treatment arms, priority was given to maintaining lung volume while minimizing peak pressures. The CV strategy aimed to reduce tidal volumes. In the HFOV group, an open lung strategy was used. Respiratory and circulatory parameters were recorded and clinical outcome was determined at 30 days of follow-up. RESULTS: The study was prematurely stopped. Thirty-seven patients received HFOV and 24 received CV (average APACHE II score 21 and 20, oxygenation index 25 and 18, and duration of mechanical ventilation prior to randomization 2.1 and 1.5 days, respectively). There were no statistically significant differences in survival without supplemental oxygen or on ventilator, mortality, therapy failure, or crossover. Adjustment by a priori defined baseline characteristics showed an odds ratio of 0.80 (95% CI 0.22–2.97) for survival without oxygen or on ventilator, and an odds ratio for mortality of 1.15 (95% CI 0.43–3.10) for HFOV compared with CV. The response of the oxygenation index (OI) to treatment did not differentiate between survival and death. In the HFOV group the OI response was significantly higher than in the CV group between the first and the second day. A post hoc analysis suggested that there was a relatively better treatment effect of HFOV compared with CV in patients with a higher baseline OI. CONCLUSION: No significant differences were observed, but this trial only had power to detect major differences in survival without oxygen or on ventilator. In patients with ARDS and higher baseline OI, however, there might be a treatment benefit of HFOV over CV. More research is needed to establish the efficacy of HFOV in the treatment of ARDS. We suggest that future studies be designed to allow for informative analysis in patients with higher OI.
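    The covariate-adjusted odds ratios quoted above are the kind of quantity a logistic regression yields. The sketch below uses simulated data with invented column names (hfov, apache2, oi_baseline) to show how a treatment odds ratio adjusted for baseline characteristics is typically obtained with statsmodels; it is not the trial's own analysis.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
n = 61  # the trial randomized 61 patients; all values below are simulated

# Simulated, purely illustrative baseline data
df = pd.DataFrame({
    "hfov": rng.integers(0, 2, n),        # 1 = HFOV, 0 = conventional ventilation
    "apache2": rng.normal(20, 5, n),      # baseline APACHE II score
    "oi_baseline": rng.normal(22, 8, n),  # baseline oxygenation index
})
# Simulated 30-day mortality with modest covariate effects
logit_p = -0.6 + 0.15 * df["hfov"] + 0.05 * (df["apache2"] - 20) + 0.04 * (df["oi_baseline"] - 22)
df["died"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

# Treatment odds ratio for mortality, adjusted for the baseline characteristics
fit = smf.logit("died ~ hfov + apache2 + oi_baseline", data=df).fit(disp=False)
print(np.exp(fit.params))      # adjusted odds ratios (exp of the coefficients)
print(np.exp(fit.conf_int()))  # 95% confidence intervals on the odds-ratio scale
```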

    Pre-admission interventions to improve outcome after elective surgery - protocol for a systematic review

    BACKGROUND: Poor physical health and fitness increases the risk of death and complications after major elective surgery. Pre-admission interventions to improve patients’ health and fitness (referred to as prehabilitation) may reduce postoperative complications, decrease the length of hospital stay and facilitate the patient’s recovery. We will conduct a systematic review of RCTs to examine the effectiveness of different types of prehabilitation interventions in improving the surgical outcomes of patients undergoing elective surgery. METHODS: This review will be conducted and reported according to the Cochrane and PRISMA reporting guidelines. MEDLINE, EMBASE, CENTRAL, CINAHL, PsycINFO, ISI Web of Science and clinical trial registers will be searched for any intervention administered before any elective surgery (including physical activity, nutritional, educational, psychological, clinical or multicomponent), which aims to improve postoperative outcomes. Reference lists of included studies will be searched, and grey literature including conference proceedings, theses, dissertations and preoperative assessment protocols will be examined. Study quality will be assessed using Cochrane’s risk of bias tool, and meta-analyses for trials that use similar interventions and report similar outcomes will be undertaken where possible. DISCUSSION: This systematic review will determine whether different types of interventions administered before elective surgery are effective in improving postoperative outcomes. It will also determine which components or combinations of components would form the most effective prehabilitation intervention. SYSTEMATIC REVIEW REGISTRATION: PROSPERO CRD4201501919
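    Where included trials report a comparable outcome, the planned meta-analyses would typically pool effect estimates with a random-effects model. The sketch below implements DerSimonian-Laird pooling of log risk ratios in plain NumPy on invented trial results; the model choice and all numbers are assumptions for illustration, not data or methods from the review.

```python
import numpy as np

# Invented per-trial effect estimates: log risk ratios and their variances
log_rr = np.log(np.array([0.80, 0.65, 0.95, 0.70]))
var = np.array([0.04, 0.09, 0.02, 0.06])

# Fixed-effect weights and Cochran's Q statistic
w = 1.0 / var
pooled_fixed = np.sum(w * log_rr) / np.sum(w)
q = np.sum(w * (log_rr - pooled_fixed) ** 2)

# DerSimonian-Laird estimate of between-trial variance (tau^2)
k = len(log_rr)
tau2 = max(0.0, (q - (k - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))

# Random-effects pooling
w_re = 1.0 / (var + tau2)
pooled = np.sum(w_re * log_rr) / np.sum(w_re)
se = np.sqrt(1.0 / np.sum(w_re))

print(f"Pooled RR = {np.exp(pooled):.2f} "
      f"(95% CI {np.exp(pooled - 1.96*se):.2f}-{np.exp(pooled + 1.96*se):.2f}), "
      f"tau^2 = {tau2:.3f}")
```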

    Research prioritisation on prevention and management of preterm birth in low and middle-income countries (LMICs) with a special focus on Bangladesh using the Child Health and Nutrition Research Initiative (CHNRI) method

    Background Fifteen million babies are born preterm globally each year, with 81% occurring in low- and middle-income countries (LMICs). Preterm birth complications are the leading cause of newborn deaths and significantly impact health, quality of life, and costs of health services. Improving outcomes for newborns and their families requires prioritising research for developing practical, scalable solutions, especially in low-resource settings such as Bangladesh. We aimed to identify research priorities related to preventing and managing preterm birth in LMICs for 2021-2030, with a special focus on Bangladesh. Methods We adopted the Child Health and Nutrition Research Initiative (CHNRI) method to set research priorities for preventing and managing preterm birth. Seventy-six experts submitted 490 research questions online, which we collated into 95 unique questions and sent for scoring to all experts. One hundred and nine experts scored the questions using five pre-selected criteria: answerability, effectiveness, deliverability, maximum potential for burden reduction, and effect on equity. We calculated weighted and unweighted research priority scores and average expert agreement to generate a list of top-ranked research questions for LMICs and Bangladesh. Results Health systems and policy research dominated the top 20 identified priorities for LMICs, such as understanding and improving uptake of facility- and community-based Kangaroo Mother Care (KMC), promoting breastfeeding, improving referral and transport networks, evaluating the impact of the use of skilled attendants, quality improvement activities, and exploring barriers to antenatal steroid use. Several of the top 20 questions also focused on screening high-risk women or the general population of women, understanding the causes of preterm birth, or managing preterm babies with illnesses (jaundice, sepsis and retinopathy of prematurity). There was a high overlap between research priorities in LMICs and Bangladesh. Conclusions This exercise, aimed at identifying priorities for preterm birth prevention and management research in LMICs, especially in Bangladesh, found research on improving the care of preterm babies to be more important for reducing the burden of preterm birth and accelerating attainment of the Sustainable Development Goal 3 target for newborn deaths by 2030.
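    The weighted and unweighted research priority scores (RPS) and the average expert agreement (AEA) used in the CHNRI method are simple aggregations of the expert scores. The sketch below shows one common formulation on a tiny invented scoring matrix (experts score each criterion 1, 0.5 or 0, with NaN where they skip a question); the stakeholder weights are made up for illustration and are not those used in this exercise.

```python
import numpy as np

# Invented scores for ONE research question: rows = experts, columns = the five
# CHNRI criteria (answerability, effectiveness, deliverability,
# maximum potential for burden reduction, effect on equity).
scores = np.array([
    [1.0, 1.0, 0.5, 1.0, 0.0],
    [1.0, 0.5, 0.5, 1.0, 0.5],
    [0.5, 1.0, 1.0, np.nan, 0.0],
    [1.0, 1.0, 0.5, 1.0, 1.0],
    [1.0, 0.0, 0.5, 0.5, np.nan],
])

# Intermediate score per criterion = mean of the non-missing expert answers
criterion_scores = np.nanmean(scores, axis=0)

# Unweighted RPS = mean of the five criterion scores
rps_unweighted = criterion_scores.mean()

# Weighted RPS = weighted mean using (hypothetical) stakeholder weights
weights = np.array([1.0, 1.2, 0.8, 1.5, 0.5])
rps_weighted = np.sum(criterion_scores * weights) / np.sum(weights)

# AEA = mean, over criteria, of the fraction of scorers giving the modal answer
def modal_fraction(col):
    col = col[~np.isnan(col)]
    _, counts = np.unique(col, return_counts=True)
    return counts.max() / counts.sum()

aea = np.mean([modal_fraction(scores[:, j]) for j in range(scores.shape[1])])

print(f"RPS (unweighted) = {rps_unweighted:.3f}, "
      f"RPS (weighted) = {rps_weighted:.3f}, AEA = {aea:.2f}")
```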

    The ‘analysis of gene expression and biomarkers for point-of-care decision support in Sepsis’ study: temporal clinical parameter analysis and validation of early diagnostic biomarker signatures for severe inflammation and sepsis-SIRS discrimination

    Introduction: Early diagnosis of sepsis and discrimination from SIRS is crucial for clinicians to provide appropriate care, management and treatment to critically ill patients. We describe the identification of mRNA biomarkers from peripheral blood leukocytes able to identify severe, systemic inflammation (irrespective of origin) and to differentiate sepsis from SIRS in adult patients within a multi-centre clinical study. Methods: Participants were recruited in Intensive Care Units (ICUs) from multiple UK hospitals, including fifty-nine patients with abdominal sepsis, eighty-four patients with pulmonary sepsis and forty-two SIRS patients with Out-of-Hospital Cardiac Arrest (OOHCA), sampled at four time points, in addition to thirty healthy control donors. Multiple clinical parameters were measured, including SOFA score, with many differences observed between SIRS and sepsis groups. Differential gene expression analyses were performed using microarray hybridization, and data were analyzed using a combination of parametric and non-parametric statistical tools. Results: Nineteen high-performance, differentially expressed mRNA biomarkers were identified between control and combined SIRS/sepsis groups (FC > 20.0, p < 0.05), termed ‘indicators of inflammation’ (I°I), including CD177, FAM20A and OLAH. Best-performing minimal signatures, e.g. FAM20A/OLAH, showed good accuracy for determination of severe, systemic inflammation (AUC > 0.99). Twenty entities, termed ‘SIRS or Sepsis’ (S°S) biomarkers, were differentially expressed between sepsis and SIRS (FC > 2.0, p < 0.05). Discussion: The best-performing signature for discriminating sepsis from SIRS was CMTM5/CETP/PLA2G7/MIA/MPP3 (AUC = 0.9758). The I°I and S°S signatures performed variably in other independent gene expression datasets; this may be due to technical variation in the study/assay platforms.
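    The accuracy figures quoted for the minimal signatures (e.g. AUC > 0.99 for FAM20A/OLAH) come from treating a small set of expression values as a classifier. The sketch below, on simulated expression data in which the real gene names serve only as labels, shows one typical way to estimate a two-gene signature's cross-validated AUC with scikit-learn; it is not the study's pipeline.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import StratifiedKFold, cross_val_predict
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)
n_control, n_inflamed = 30, 100  # mirrors control vs combined SIRS/sepsis group sizes

# Simulated log2 expression for a two-gene signature (values are invented)
fam20a = np.concatenate([rng.normal(5, 1, n_control), rng.normal(9, 1.5, n_inflamed)])
olah   = np.concatenate([rng.normal(4, 1, n_control), rng.normal(8, 1.5, n_inflamed)])
X = np.column_stack([fam20a, olah])
y = np.concatenate([np.zeros(n_control), np.ones(n_inflamed)])  # 1 = severe inflammation

# Cross-validated probability estimates from a simple logistic-regression signature
cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
probs = cross_val_predict(LogisticRegression(), X, y, cv=cv, method="predict_proba")[:, 1]

print(f"Cross-validated AUC for the two-gene signature: {roc_auc_score(y, probs):.3f}")
```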

    TLR activation enhances C5a-induced pro-inflammatory responses by negatively modulating the second C5a receptor, C5L2

    TLR and complement activation ensures efficient clearance of infection. Previous studies documented synergism between TLRs and the receptor for the pro-inflammatory complement peptide C5a (C5aR/CD88), and regulation of TLR-induced pro-inflammatory responses by C5aR, suggesting crosstalk between TLRs and C5aR. However, it is unclear whether and how TLRs modulate C5a-induced pro-inflammatory responses. We demonstrate a marked positive modulatory effect of TLR activation on cell sensitivity to C5a in vitro and ex vivo and identify an underlying mechanistic target. Pre-exposure of PBMCs and whole blood to diverse TLR ligands or bacteria enhanced C5a-induced pro-inflammatory responses. This effect was not observed in TLR4 signalling-deficient mice. TLR-induced hypersensitivity to C5a did not result from C5aR upregulation or modulation of C5a-induced Ca2+ mobilization. Rather, TLRs targeted another C5a receptor, C5L2 (acting as a negative modulator of C5aR), by reducing C5L2 activity. TLR-induced hypersensitivity to C5a was mimicked by blocking C5L2 and was not observed in C5L2 KO mice. Furthermore, TLR activation inhibited C5L2 expression upon C5a stimulation. These findings identify a novel pathway of crosstalk within the innate immune system that amplifies innate host defense at the TLR-complement interface. Unravelling the mutually regulated activities of TLRs and complement may reveal new therapeutic avenues to control inflammation.

    Rheumatoid arthritis - treatment: 180. Utility of Body Weight-Classified Low-Dose Leflunomide in Japanese Rheumatoid Arthritis

    Background: In Japan, more than 20 deaths of rheumatoid arthritis (RA) patients from interstitial pneumonia (IP) attributed to leflunomide (LEF) have been reported, although many of these cases are now considered to have been opportunistic infections. In this paper, the efficacy and safety of low-dose LEF classified by body weight (BW) were studied. Methods: Fifty-nine RA patients started LEF between July 2007 and July 2009. Of these, 25 were excluded because of combination therapy with tacrolimus or medication changes within 3 months before starting LEF. The remaining 34 RA patients, who received 20 to 50 mg/week of LEF, were followed up for 1 year and enrolled in this study. The dose of LEF was classified by BW (50 mg/week for patients over 50 kg, 40 mg/week for 40 to 50 kg, and 20 to 30 mg/week for under 40 kg). The average age and RA duration of enrolled patients were 55.5 years and 10.2 years, respectively. Prednisolone (PSL), methotrexate (MTX) and etanercept were used in 23, 28 and 2 patients, respectively. In cases of insufficient response or adverse effects, dose adjustment or discontinuation of LEF was considered. Failure was defined as an increased dose of PSL or MTX, or a reduced dose or discontinuation of LEF. The last-observation-carried-forward method was used to evaluate failed patients at 1 year. Results: At 1 year after starting LEF, good/moderate/no response, assessed by the European League Against Rheumatism (EULAR) response criteria using the 28-joint Disease Activity Score based on C-reactive protein (DAS28-CRP), was seen in 14/10/10 patients, respectively. LEF dose changes at 1 year were: dose increased in 10, unchanged in 5, decreased in 8, and LEF discontinued in 11 patients. The failure-free survival rate at 1 year was 23.5% (24 patients met the failure definition), although the actual LEF continuation rate was 67.6% (11 patients discontinued). The major reason for failure was liver dysfunction; pneumocystis pneumonia occurred in 1 patient, who recovered fully. One patient died of sepsis caused by an infected decubitus ulcer. The DAS28-CRP score decreased significantly, from 3.9 to 2.7. CRP decreased from 1.50 to 0.93 mg/dl, but this was not significant. Matrix metalloproteinase (MMP)-3 decreased significantly, from 220.0 to 174.2 ng/ml. Glutamate pyruvate transaminase (GPT) increased from 19 to 35 U/l and the leukocyte count decreased from 7832 to 6271, both significantly. DAS28-CRP, CRP and MMP-3 improved significantly in patients receiving concomitant MTX, but not in those without MTX; likewise, significant GPT elevation and leukopenia were seen only with MTX. Conclusions: Reported risk factors for LEF-induced IP in Japanese RA patients are a history of IP, loading-dose administration and low BW. Adding low-dose LEF is a potent and safe alternative for patients with an unsatisfactory response to current medicines, but attention must be paid to liver function and to infection related to leukopenia, especially with concomitant MTX. Disclosure statement: The authors have declared no conflicts of interest.
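    The body-weight-classified dosing rule described in the Methods can be written as a small lookup function. The sketch below simply encodes the bands stated in the abstract (50 mg/week above 50 kg, 40 mg/week for 40 to 50 kg, 20 to 30 mg/week below 40 kg); the handling of the exact 40 kg and 50 kg boundaries is an assumption, as the abstract does not specify it.

```python
def weekly_leflunomide_dose_mg(body_weight_kg: float) -> str:
    """Return the body-weight-classified weekly LEF dose band from the abstract.

    Boundary handling at exactly 40 kg and 50 kg is not stated in the abstract,
    so the cut-offs used here are assumptions.
    """
    if body_weight_kg > 50:
        return "50 mg/week"
    if body_weight_kg >= 40:
        return "40 mg/week"
    return "20-30 mg/week"


for bw in (62, 45, 38):
    print(f"{bw} kg -> {weekly_leflunomide_dose_mg(bw)}")
```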

    The development and validation of a scoring tool to predict the operative duration of elective laparoscopic cholecystectomy

    Background: The ability to accurately predict operative duration has the potential to optimise theatre efficiency and utilisation, thus reducing costs and increasing staff and patient satisfaction. With laparoscopic cholecystectomy being one of the most commonly performed procedures worldwide, a tool to predict operative duration could be extremely beneficial to healthcare organisations. Methods: Data collected from the CholeS study on patients undergoing cholecystectomy in UK and Irish hospitals between 04/2014 and 05/2014 were used to study operative duration. A multivariable binary logistic regression model was produced in order to identify significant independent predictors of long (> 90 min) operations. The resulting model was converted to a risk score, which was subsequently validated on a second cohort of patients using ROC curves. Results: After exclusions, data were available for 7227 patients in the derivation (CholeS) cohort. The median operative duration was 60 min (interquartile range 45–85), with 17.7% of operations lasting longer than 90 min. Ten factors were found to be significant independent predictors of operative durations > 90 min, including ASA grade, age, previous surgical admissions, BMI, gallbladder wall thickness and common bile duct (CBD) diameter. A risk score was then produced from these factors and applied to a cohort of 2405 patients from a tertiary centre for external validation. This returned an area under the ROC curve of 0.708 (SE = 0.013), with the proportion of operations lasting > 90 min increasing more than eightfold, from 5.1% to 41.8%, between the extremes of the score. Conclusion: The scoring tool produced in this study was found to be significantly predictive of long operative durations on validation in an external cohort. As such, the tool may have the potential to enable organisations to better organise theatre lists and deliver greater efficiencies in care.
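    A typical way to turn a multivariable logistic model into a bedside risk score and then check it on an external cohort with an ROC curve is sketched below, using simulated data and invented predictor names; the coefficients, point assignments and the CholeS variables themselves are not reproduced here.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(7)

def simulate(n):
    """Simulated cohort with invented binary predictors of a >90 min operation."""
    X = np.column_stack([
        rng.integers(0, 2, n),      # e.g. thick-walled gallbladder (yes/no)
        rng.integers(0, 2, n),      # e.g. previous surgical admission (yes/no)
        rng.normal(28, 5, n) > 32,  # e.g. BMI above an illustrative threshold
    ]).astype(float)
    logit_p = -2.0 + 0.9 * X[:, 0] + 0.6 * X[:, 1] + 0.7 * X[:, 2]
    y = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))
    return X, y

# Derivation cohort: fit the multivariable logistic model
X_dev, y_dev = simulate(7000)
model = LogisticRegression().fit(X_dev, y_dev)

# Convert coefficients to integer points (each beta scaled by the smallest beta)
betas = model.coef_[0]
points = np.round(betas / np.abs(betas).min()).astype(int)

# External validation cohort: sum the points and assess discrimination with ROC AUC
X_val, y_val = simulate(2400)
risk_score = X_val @ points
print("Points per predictor:", points)
print(f"Validation AUC of the integer risk score: {roc_auc_score(y_val, risk_score):.3f}")
```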