68 research outputs found
Risk scoring systems for adults admitted to the emergency department: a systematic review
<p>Abstract</p> <p>Background</p> <p>Patients referred to a medical admission unit (MAU) represent a broad spectrum of disease severity. In the interest of allocating resources to those who might potentially benefit most from clinical interventions, several scoring systems have been proposed as a triaging tool.</p> <p>Even though most scoring systems are not meant to be used on an individual level, they can support the more inexperienced doctors and nurses in assessing the risk of deterioration of their patients.</p> <p>We therefore performed a systematic review on the level of evidence of literature on scoring systems developed or validated in the MAU. We hypothesized that existing scoring systems would have a low level of evidence and only few systems would have been externally validated.</p> <p>Methods</p> <p>We conducted a systematic search using Medline, EMBASE and the Cochrane Library, according to the PRISMA guidelines, on scoring systems developed to assess medical patients at admission.</p> <p>The primary endpoints were in-hospital mortality or transfer to the intensive care unit. Studies derived for only a single or few diagnoses were excluded.</p> <p>The ability to identify patients at risk (discriminatory power) and agreement between observed and predicted outcome (calibration) along with the method of derivation and validation (application on a new cohort) were extracted.</p> <p>Results</p> <p>We identified 1,655 articles. Thirty were selected for further review and 10 were included in this review.</p> <p>Eight systems used vital signs as variables and two relied mostly on blood tests.</p> <p>Nine systems were derived using regression analysis and eight included patients admitted to a MAU. Six systems used in-hospital mortality as their primary endpoint.</p> <p>Discriminatory power was specified for eight of the scoring systems and was acceptable or better in five of these. The calibration was only specified for four scoring systems. 
Neither impact analysis nor inter-observer reliability was assessed in any of the studies.</p> <p>None of the systems reached the highest level of evidence.</p> <p>Conclusions</p> <p>None of the 10 scoring systems presented in this article is perfect; all have weaknesses. More research is needed before scoring systems can be fully implemented in the risk assessment of acutely admitted medical patients.</p>
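The two properties extracted in the review, discriminatory power and calibration, can be illustrated with a minimal sketch. All scores and outcomes below are invented for illustration, and the helper names are hypothetical; nothing here comes from the reviewed studies:

```python
# Minimal sketch of discriminatory power (AUC) and calibration for a
# risk score. All numbers are invented for illustration.

def auc(scores, outcomes):
    """Probability that a randomly chosen event (outcome 1) scores higher
    than a randomly chosen non-event; this equals the area under the ROC
    curve (ties count as 0.5)."""
    events = [s for s, y in zip(scores, outcomes) if y == 1]
    non_events = [s for s, y in zip(scores, outcomes) if y == 0]
    wins = [(e > n) + 0.5 * (e == n) for e in events for n in non_events]
    return sum(wins) / len(wins)

def calibration(predicted_risks, outcomes):
    """Mean predicted risk vs. observed event rate; agreement between the
    two is the crudest check of calibration."""
    return (sum(predicted_risks) / len(predicted_risks),
            sum(outcomes) / len(outcomes))

risks = [0.1, 0.3, 0.2, 0.8, 0.7, 0.4]
deaths = [0, 0, 0, 1, 1, 0]
print(auc(risks, deaths))          # 1.0 here: the score separates perfectly
print(calibration(risks, deaths))  # mean predicted risk vs. observed rate
```

A score can discriminate perfectly yet still be poorly calibrated (predicting risks that are systematically too high or too low), which is why the review extracted both properties separately.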
Hypotension and hypocapnia during general anesthesia in piglets: study of S100b as an acute biomarker for cerebral tissue injury
BACKGROUND
Hypotension and/or hypocapnia might increase general anesthesia (GA)-related neuromorbidity in infants, but safe levels of perioperative blood pressure are poorly defined. Serum protein S100b has been used as a screening, monitoring, and prediction tool in the management of patients with traumatic brain injury. Using an animal model, we investigated serum S100b as an acute biomarker of cerebral hypoperfusion and cerebral cell dysfunction during hypotension, hypocapnia, or combined hypotension/hypocapnia during GA.
METHODS
Fifty-seven sevoflurane-midazolam anesthetized piglets aged 4 to 6 weeks were randomly allocated to control (n=9), hypotension (n=18), hypocapnia (n=20), or combined hypotension and hypocapnia (n=10). Hypotension (target mean arterial blood pressure: 35 to 38 or 27 to 30 mm Hg) was induced by blood withdrawal and nitroprusside infusion, and hypocapnia by hyperventilation (target PaCO2: 28 to 30 and 23 to 25 mm Hg). Serum S100b and albumin were measured at baseline, before and 60 minutes after the interventions, and following 60-minute recovery.
RESULTS
Serum S100b concentrations decreased over time (P=0.001), but there was no difference in S100b between control piglets and those exposed to hypotension, hypocapnia, or a combination of both (P=0.105). Albumin decreased in all 4 groups (P=0.001).
CONCLUSION
S100b did not increase following 60 minutes of systemic hypotension and/or hypocapnia during GA in piglets. In this setting, the use of S100b as a biomarker of cerebral cell dysfunction cannot be supported.
Heart Rate and Use of Beta-Blockers in Stable Outpatients with Coronary Artery Disease
<p><b>Background:</b> Heart rate (HR) is an emerging risk factor in coronary artery disease (CAD). However, there is little contemporary data regarding HR and the use of HR-lowering medications, particularly beta-blockers, among patients with stable CAD in routine clinical practice. The goal of the present analysis was to describe HR in such patients, overall and in relation to beta-blocker use, and to describe the determinants of HR.</p>
<p><b>Methods and Findings:</b> CLARIFY is an international, prospective, observational, longitudinal registry of outpatients with stable CAD, defined as prior myocardial infarction or revascularization procedure, evidence of coronary stenosis of >50%, or chest pain associated with proven myocardial ischemia. A total of 33,438 patients from 45 countries in Europe, the Americas, Africa, the Middle East, and Asia/Pacific were enrolled between November 2009 and July 2010. Most of the 33,177 patients included in this analysis were men (77.5%). Mean (SD) age was 64.2 (10.5) years, HR by pulse was 68.3 (10.6) bpm, and by electrocardiogram was 67.2 (11.4) bpm. Overall, 44.0% had HR≥70 bpm. Beta-blockers were used in 75.1% of patients and another 14.4% had intolerance or contraindications to beta-blocker therapy. Among 24,910 patients on beta-blockers, 41.1% had HR≥70 bpm. HR≥70 bpm was independently associated with higher prevalence and severity of angina, more frequent evidence of myocardial ischemia, and lack of use of HR-lowering agents.</p>
<p><b>Conclusions:</b> Despite a high rate of use of beta-blockers, stable CAD patients often have resting HR≥70 bpm, which is associated with worse overall health status and more frequent angina and ischemia. Further HR lowering is possible in many patients with CAD. Whether it will improve symptoms and outcomes is being tested.</p>
Impact of prior JAK-inhibitor therapy with ruxolitinib on outcome after allogeneic hematopoietic stem cell transplantation for myelofibrosis: a study of the CMWP of EBMT.
The JAK1/2 inhibitor ruxolitinib (RUX) is approved in patients with myelofibrosis, but the impact of pretreatment with RUX on outcome after allogeneic hematopoietic stem cell transplantation (HSCT) remains to be determined. We evaluated the impact of RUX on outcome in 551 myelofibrosis patients who received HSCT without (n = 274) or with (n = 277) RUX pretreatment. Overall leukocyte engraftment on day 45 was 92% and significantly higher in RUX-responsive patients than in those who had no response or had lost response to RUX (94% vs. 85%, p = 0.05). The 1-year non-relapse mortality was 22%, without significant difference between the arms. In a multivariate analysis (MVA), RUX-pretreated patients with ongoing spleen response at transplant had a significantly lower risk of relapse (8.1% vs. 19.1%; p = 0.04) and better 2-year event-free survival (68.9% vs. 53.7%; p = 0.02) in comparison to patients without RUX pretreatment. For overall survival, the only significant factors were age > 58 years (p = 0.03) and an HLA-mismatched donor (p = 0.001). RUX prior to HSCT did not negatively impact outcome after transplantation, and patients with an ongoing spleen response at the time of transplantation had the best outcome.
Global monitoring of antimicrobial resistance based on metagenomics analyses of urban sewage
Antimicrobial resistance (AMR) is a serious threat to global public health, but obtaining representative data on AMR for healthy human populations is difficult. Here, we use metagenomic analysis of untreated sewage to characterize the bacterial resistome from 79 sites in 60 countries. We find systematic differences in the abundance and diversity of AMR genes between Europe/North America/Oceania and Africa/Asia/South America. Antimicrobial use data and bacterial taxonomy explain only a minor part of the AMR variation that we observe. We find no evidence for cross-selection between antimicrobial classes, or for an effect of air travel between sites. However, AMR gene abundance strongly correlates with socio-economic, health, and environmental factors, which we use to predict AMR gene abundances in all countries in the world. Our findings suggest that global AMR gene diversity and abundance vary by region, and that improving sanitation and health could potentially limit the global burden of AMR. We propose metagenomic analysis of sewage as an ethically acceptable and economically feasible approach for continuous global surveillance and prediction of AMR.
Mid-life psychosocial work environment as a predictor of work exit by age 50.
OBJECTIVES: To examine whether psychosocial work characteristics at age 45 years predict exit from the labour market by the age of 50 years in data from the 1958 British Birth Cohort. METHODS: Psychosocial work characteristics (decision latitude, job demands, job strain, and work social support at 45 years and job insecurity at 42 years) measured by questionnaire were linked to employment outcomes (unemployment, retirement, permanent sickness, homemaking) at 50 years in 6510 male and female participants. RESULTS: Low decision latitude (RR = 2.01, 95% CI 1.06, 3.79), low work social support (RR = 1.96, 95% CI 1.12, 3.44), and high job insecurity (RR = 2.27, 95% CI 1.41, 3.67) predicted unemployment at 50, adjusting for sex, housing tenure, socioeconomic status, marital status, and education. High demands were associated with a lower risk of unemployment (RR = 0.50, 95% CI 0.29, 0.88) but a higher risk of permanent sickness (RR = 2.14, 95% CI 1.09, 4.21). CONCLUSIONS: Keeping people in the workforce beyond 50 years may contribute to both personal and national prosperity. Employers may wish to improve working conditions for older workers, in particular by increasing control over work, increasing support, and reducing demands, to retain older employees in the workforce.
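Relative risks like those reported above reduce to simple 2×2-table arithmetic. The counts below are invented to illustrate the calculation (with a Wald-type interval on the log scale); they are not cohort data:

```python
import math

def relative_risk(a, b, c, d):
    """RR with a Wald 95% CI from a 2x2 table:
    a = exposed with outcome,     b = exposed without outcome,
    c = non-exposed with outcome, d = non-exposed without outcome."""
    rr = (a / (a + b)) / (c / (c + d))
    # Standard error of log(RR) for the Wald interval
    se_log_rr = math.sqrt(1 / a - 1 / (a + b) + 1 / c - 1 / (c + d))
    lo = math.exp(math.log(rr) - 1.96 * se_log_rr)
    hi = math.exp(math.log(rr) + 1.96 * se_log_rr)
    return rr, lo, hi

# Hypothetical counts: low decision latitude vs. unemployment at 50
rr, lo, hi = relative_risk(40, 960, 20, 980)
print(f"RR = {rr:.2f} (95% CI {lo:.2f}, {hi:.2f})")  # RR = 2.00
```

The adjusted RRs in the abstract come from regression models rather than raw tables, but the interpretation of the ratio and its confidence interval is the same.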
Effectiveness of tip rotation in fibreoptic bronchoscopy under different experimental conditions: an in vitro crossover study
BACKGROUND:Proper manipulation of fibreoptic bronchoscopes is essential for successful tracheal intubation or diagnostic bronchoscopy. Failure of proper navigation and rotation of the fibrescope may lead to difficulties in advancing the fibrescope and might also be responsible for (unnecessary) difficulties and delays in fibreoptic tracheal intubation, with subsequent hypoxaemia. The present study, therefore, aimed to assess the effectiveness of tip rotation in flexible bronchoscopes in different experimental conditions.
METHODS: Five differently sized pairs of fibrescopes (outer diameters of 2.2, 2.4, 3.5, 4.2, and 5.2 mm) were inserted into paediatric airway manikins via an appropriately sized laryngeal mask and were turned clockwise or anticlockwise at the fibrescope body or cord to 45, 90, and 180°, with the cord held either straight or bent. The primary outcome measure was the ratio of rotation measured at the tip over the rotation performed with the fibrescope body or cord.
RESULTS: Overall, the 'body' turn was significantly less effective when a bent cord was present (mean difference ranging from 29.8% (95% confidence interval 8.8-50.9) to 117.4% (93.6-141.2)). This difference was diminished when the 'cord' turn was performed. Smaller fibrescopes, with outer diameters of 2.2 and 2.4 mm, were inferior with respect to the transmission of 'body' rotation to the tip.
CONCLUSIONS: 'Cord' turning of the fibrescope appears to be more effective in rotating the tip than a turn of the fibrescope 'body' only. Straightening the fibrescope cord and combined 'body' and 'cord' turning are recommended.
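The primary outcome above, the rotation measured at the tip over the rotation performed at the body or cord, is a simple quotient. The angle readings below are invented for illustration, not measurements from the study:

```python
def transmission_ratio(tip_deg, performed_deg):
    """Rotation measured at the tip divided by rotation performed at the
    body or cord; 1.0 means the turn is transmitted in full."""
    return tip_deg / performed_deg

# Hypothetical readings for a 90-degree body turn
straight_cord = transmission_ratio(85, 90)  # cord held straight
bent_cord = transmission_ratio(55, 90)      # cord bent
loss_pct = (straight_cord - bent_cord) * 100
print(f"straight {straight_cord:.2f}, bent {bent_cord:.2f}, "
      f"loss {loss_pct:.1f} percentage points")
```

A ratio above 1.0 is also possible when the tip overshoots the performed turn, which is how mean differences above 100% can arise.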
The impact of analogue and digital visualisation tools on quality improvement in healthcare
No description supplied
Effects of hypothermia and hypothermia combined with hypocapnia on cerebral tissue oxygenation in piglets
Background: Hypothermia, alone or in combination with hypocapnia, frequently occurs in association with anesthesia.
Aims: The goal was to investigate the effects of hypothermia and hypothermia combined with hypocapnia (hypothermia-hypocapnia) on cerebral tissue oxygenation in anesthetized piglets.
Methods: Twenty anesthetized piglets were randomly allocated to hypothermia (n = 10) or hypothermia-hypocapnia (n = 10). Cerebral monitoring comprised a tissue oxygen partial pressure (PtO2) probe, a laser Doppler probe, and a near-infrared spectroscopy sensor measuring regional oxygen saturation (rSO2). After baseline recordings, hypothermia (35.5-36.0°C) with or without hypocapnia (target PaCO2: 28-30 mm Hg) was induced. Once treatment goals were achieved (Tr0), they were maintained for 30 minutes (Tr30).
Results: No changes in PtO2, but a significant increase in rSO2 (Tr0: mean difference 8.9 [95% CI 3.99 to 13.81], P < .001; Tr30: 10.8 [6.20 to 15.40], P < .001), were detected during hypothermia. With hypothermia-hypocapnia, a decrease in PtO2 (Tr0: -3.2 [-6.01 to -0.39], P = .021; Tr30: -3.3 [-5.80 to -0.80], P = .006) and no significant changes in rSO2 occurred. Cerebral blood flow decreased significantly from baseline to Tr0 independently of treatment (-0.89 [-0.18 to -0.002], P = .042), but this was more consistently observed with hypothermia-hypocapnia.
Conclusions: The hypothermia-induced reduction in oxygen delivery was compensated by lowered metabolic demand. However, hypothermia was not able to compensate for an additional reduction in oxygen delivery caused by simultaneous hypocapnia. This resulted in a PtO2 drop, which was not reflected by a downshift in rSO2.