15 research outputs found

    Pooled analysis of WHO Surgical Safety Checklist use and mortality after emergency laparotomy

    Background The World Health Organization (WHO) Surgical Safety Checklist has fostered safe practice for 10 years, yet its place in emergency surgery has not been assessed on a global scale. The aim of this study was to evaluate reported checklist use in emergency settings and examine the relationship with perioperative mortality in patients who had emergency laparotomy. Methods In two multinational cohort studies, adults undergoing emergency laparotomy were compared with those having elective gastrointestinal surgery. Relationships between reported checklist use and mortality were determined using multivariable logistic regression and bootstrapped simulation. Results Of 12 296 patients included from 76 countries, 4843 underwent emergency laparotomy. After adjusting for patient and disease factors, checklist use before emergency laparotomy was more common in countries with a high Human Development Index (HDI) (2455 of 2741, 89.6 per cent) compared with that in countries with a middle (753 of 1242, 60.6 per cent; odds ratio (OR) 0.17, 95 per cent c.i. 0.14 to 0.21, P < 0.001) or low (363 of 860, 42.2 per cent; OR 0.08, 0.07 to 0.10, P < 0.001) HDI. Checklist use was less common in elective surgery than for emergency laparotomy in high-HDI countries (risk difference -9.4 (95 per cent c.i. -11.9 to -6.9) per cent; P < 0.001), but the relationship was reversed in low-HDI countries (+12.1 (+7.0 to +17.3) per cent; P < 0.001). In multivariable models, checklist use was associated with a lower 30-day perioperative mortality (OR 0.60, 0.50 to 0.73; P < 0.001). The greatest absolute benefit was seen for emergency surgery in low- and middle-HDI countries. Conclusion Checklist use in emergency laparotomy was associated with a significantly lower perioperative mortality rate. Checklist use in low-HDI countries was half that in high-HDI countries.
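The Methods above pair multivariable logistic regression with bootstrapped simulation. As a minimal sketch of the bootstrap step alone, the snippet below computes an unadjusted odds ratio with a percentile confidence interval from a hypothetical 2×2 table; the counts are invented for illustration and are not the study's data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 30-day mortality counts; NOT the study's data.
deaths_ckl, n_ckl = 120, 3000   # checklist used
deaths_no, n_no = 110, 1600     # checklist not used

def odds_ratio(d1, n1, d0, n0):
    """Odds of death with exposure vs without."""
    return (d1 / (n1 - d1)) / (d0 / (n0 - d0))

or_hat = odds_ratio(deaths_ckl, n_ckl, deaths_no, n_no)

# Non-parametric bootstrap: resample each arm's binary outcomes with
# replacement and recompute the odds ratio each time.
arm1 = np.concatenate([np.ones(deaths_ckl), np.zeros(n_ckl - deaths_ckl)])
arm0 = np.concatenate([np.ones(deaths_no), np.zeros(n_no - deaths_no)])
boot = [
    odds_ratio(rng.choice(arm1, arm1.size).sum(), n_ckl,
               rng.choice(arm0, arm0.size).sum(), n_no)
    for _ in range(1000)
]
ci_lo, ci_hi = np.percentile(boot, [2.5, 97.5])
print(f"OR {or_hat:.2f} (95% CI {ci_lo:.2f} to {ci_hi:.2f})")
```

The study's analysis additionally adjusts for patient and disease factors inside a regression model; this sketch shows only how resampling yields an empirical confidence interval around the odds ratio.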

    Surviving Sepsis Campaign: international guidelines for management of severe sepsis and septic shock, 2012

    OBJECTIVE: To provide an update to the "Surviving Sepsis Campaign Guidelines for Management of Severe Sepsis and Septic Shock," last published in 2008. DESIGN: A consensus committee of 68 international experts representing 30 international organizations was convened. Nominal groups were assembled at key international meetings (for those committee members attending the conference). A formal conflict of interest policy was developed at the onset of the process and enforced throughout. The entire guidelines process was conducted independent of any industry funding. A stand-alone meeting was held for all subgroup heads, co- and vice-chairs, and selected individuals. Teleconferences and electronic-based discussion among subgroups and among the entire committee served as an integral part of the development. METHODS: The authors were advised to follow the principles of the Grading of Recommendations Assessment, Development and Evaluation (GRADE) system to guide assessment of quality of evidence from high (A) to very low (D) and to determine the strength of recommendations as strong (1) or weak (2). The potential drawbacks of making strong recommendations in the presence of low-quality evidence were emphasized. Recommendations were classified into three groups: (1) those directly targeting severe sepsis; (2) those targeting general care of the critically ill patient and considered high priority in severe sepsis; and (3) pediatric considerations. 
RESULTS: Key recommendations and suggestions, listed by category, include: early quantitative resuscitation of the septic patient during the first 6 h after recognition (1C); blood cultures before antibiotic therapy (1C); imaging studies performed promptly to confirm a potential source of infection (UG); administration of broad-spectrum antimicrobial therapy within 1 h of the recognition of septic shock (1B) and severe sepsis without septic shock (1C) as the goal of therapy; reassessment of antimicrobial therapy daily for de-escalation, when appropriate (1B); infection source control with attention to the balance of risks and benefits of the chosen method within 12 h of diagnosis (1C); initial fluid resuscitation with crystalloid (1B) and consideration of the addition of albumin in patients who continue to require substantial amounts of crystalloid to maintain adequate mean arterial pressure (2C) and the avoidance of hetastarch formulations (1B); initial fluid challenge in patients with sepsis-induced tissue hypoperfusion and suspicion of hypovolemia to achieve a minimum of 30 mL/kg of crystalloids (more rapid administration and greater amounts of fluid may be needed in some patients) (1C); fluid challenge technique continued as long as there is hemodynamic improvement, based on either dynamic or static variables (UG); norepinephrine as the first-choice vasopressor to maintain mean arterial pressure ≥65 mmHg (1B); epinephrine when an additional agent is needed to maintain adequate blood pressure (2B); vasopressin (0.03 U/min) can be added to norepinephrine to either raise mean arterial pressure to target or to decrease norepinephrine dose but should not be used as the initial vasopressor (UG); dopamine is not recommended except in highly selected circumstances (2C); dobutamine infusion administered or added to vasopressor in the presence of (a) myocardial dysfunction as suggested by elevated cardiac filling pressures and low cardiac output, or (b) ongoing signs of
hypoperfusion despite achieving adequate intravascular volume and adequate mean arterial pressure (1C); avoiding use of intravenous hydrocortisone in adult septic shock patients if adequate fluid resuscitation and vasopressor therapy are able to restore hemodynamic stability (2C); hemoglobin target of 7-9 g/dL in the absence of tissue hypoperfusion, ischemic coronary artery disease, or acute hemorrhage (1B); low tidal volume (1A) and limitation of inspiratory plateau pressure (1B) for acute respiratory distress syndrome (ARDS); application of at least a minimal amount of positive end-expiratory pressure (PEEP) in ARDS (1B); higher rather than lower level of PEEP for patients with sepsis-induced moderate or severe ARDS (2C); recruitment maneuvers in sepsis patients with severe refractory hypoxemia due to ARDS (2C); prone positioning in sepsis-induced ARDS patients with a PaO2/FiO2 ratio of ≤100 mm Hg in facilities that have experience with such practices (2C); head-of-bed elevation in mechanically ventilated patients unless contraindicated (1B); a conservative fluid strategy for patients with established ARDS who do not have evidence of tissue hypoperfusion (1C); protocols for weaning and sedation (1A); minimizing use of either intermittent bolus sedation or continuous infusion sedation targeting specific titration endpoints (1B); avoidance of neuromuscular blockers if possible in the septic patient without ARDS (1C); a short course of neuromuscular blocker (no longer than 48 h) for patients with early ARDS and a PaO2/FiO2 < 150 mm Hg (2C); a protocolized approach to blood glucose management, commencing insulin dosing when two consecutive blood glucose levels are > 180 mg/dL and targeting an upper blood glucose ≤180 mg/dL (1A); equivalency of continuous veno-venous hemofiltration or intermittent hemodialysis (2B); prophylaxis for deep vein thrombosis (1B); use of stress ulcer prophylaxis to prevent upper gastrointestinal bleeding in patients with bleeding risk factors (1B); oral or enteral (if necessary) feedings, as tolerated, rather than either complete fasting or provision of only intravenous
glucose within the first 48 h after a diagnosis of severe sepsis/septic shock (2C); and addressing goals of care, including treatment plans and end-of-life planning (as appropriate) (1B), as early as feasible, but within 72 h of intensive care unit admission (2C). Recommendations specific to pediatric severe sepsis include: therapy with face mask oxygen, high flow nasal cannula oxygen, or nasopharyngeal continuous PEEP in the presence of respiratory distress and hypoxemia (2C); use of physical examination therapeutic endpoints such as capillary refill (2C); for septic shock associated with hypovolemia, the use of crystalloids or albumin to deliver a bolus of 20 mL/kg of crystalloids (or albumin equivalent) over 5-10 min (2C); more common use of inotropes and vasodilators for low cardiac output septic shock associated with elevated systemic vascular resistance (2C); and use of hydrocortisone only in children with suspected or proven "absolute" adrenal insufficiency (2C). CONCLUSIONS: Strong agreement existed among a large cohort of international experts regarding many level 1 recommendations for the best care of patients with severe sepsis. Although a significant number of aspects of care have relatively weak support, evidence-based recommendations regarding the acute management of sepsis and septic shock are the foundation of improved outcomes for this important group of critically ill patients.
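The weight-based fluid targets above (a 30 mL/kg crystalloid challenge in adults, a 20 mL/kg bolus in children) are simple weight arithmetic; a minimal illustrative helper follows (the function name and example weights are ours, not from the guideline):

```python
def fluid_bolus_ml(weight_kg: float, ml_per_kg: float) -> float:
    """Weight-based fluid volume in mL (e.g. a 30 mL/kg crystalloid challenge)."""
    if weight_kg <= 0 or ml_per_kg <= 0:
        raise ValueError("weight and dose must be positive")
    return weight_kg * ml_per_kg

# Adult with sepsis-induced hypoperfusion: 30 mL/kg crystalloid challenge
adult_ml = fluid_bolus_ml(80, 30)   # 80 kg -> 2400 mL
# Child in hypovolemic septic shock: 20 mL/kg bolus over 5-10 min
child_ml = fluid_bolus_ml(25, 20)   # 25 kg -> 500 mL
print(adult_ml, child_ml)
```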

    Global variation in anastomosis and end colostomy formation following left-sided colorectal resection

    Background End colostomy rates following colorectal resection vary across institutions in high-income settings, being influenced by patient, disease, surgeon and system factors. This study aimed to assess global variation in end colostomy rates after left-sided colorectal resection. Methods This study comprised an analysis of GlobalSurg-1 and -2 international, prospective, observational cohort studies (2014, 2016), including consecutive adult patients undergoing elective or emergency left-sided colorectal resection within discrete 2-week windows. Countries were grouped into high-, middle- and low-income tertiles according to the United Nations Human Development Index (HDI). Factors associated with colostomy formation versus primary anastomosis were explored using a multilevel, multivariable logistic regression model. Results In total, 1635 patients from 242 hospitals in 57 countries undergoing left-sided colorectal resection were included: 113 (6·9 per cent) from low-HDI, 254 (15·5 per cent) from middle-HDI and 1268 (77·6 per cent) from high-HDI countries. There was a higher proportion of patients with perforated disease (57·5, 40·9 and 35·4 per cent; P < 0·001) and subsequent use of end colostomy (52·2, 24·8 and 18·9 per cent; P < 0·001) in low- compared with middle- and high-HDI settings. The association with colostomy use in low-HDI settings persisted (odds ratio (OR) 3·20, 95 per cent c.i. 1·35 to 7·57; P = 0·008) after risk adjustment for malignant disease (OR 2·34, 1·65 to 3·32; P < 0·001), emergency surgery (OR 4·08, 2·73 to 6·10; P < 0·001), time to operation at least 48 h (OR 1·99, 1·28 to 3·09; P = 0·002) and disease perforation (OR 4·00, 2·81 to 5·69; P < 0·001). Conclusion Global differences existed in the proportion of patients receiving end stomas after left-sided colorectal resection based on income, which went beyond case mix alone

    Patients with Crohn's disease have longer post-operative in-hospital stay than patients with colon cancer but no difference in complications' rate

    BACKGROUND: Right hemicolectomy or ileocecal resection is used to treat benign conditions like Crohn's disease (CD) and malignant ones like colon cancer (CC). AIM: To investigate differences in pre- and peri-operative factors and their impact on post-operative outcome in patients with CC and CD. METHODS: This is a sub-group analysis of the European Society of Coloproctology's prospective, multi-centre snapshot audit. Adult patients with CC and CD undergoing right hemicolectomy or ileocecal resection were included. The primary outcome measure was 30-d post-operative complications. Secondary outcome measures were post-operative length of stay (LOS) and readmission. RESULTS: Three hundred and seventy-five patients with CD and 2,515 patients with CC were included. Patients with CD were younger (median 37 years for CD versus 71 years for CC; P < 0.01), had a lower American Society of Anesthesiologists (ASA) grade (P < 0.01) and less comorbidity (P < 0.01), but were more likely to be current smokers (P < 0.01). Patients with CD were more frequently operated on by colorectal surgeons (P < 0.01) and more frequently underwent ileocecal resection (P < 0.01), with a higher rate of de-functioning/primary stoma construction (P < 0.01). Thirty-day post-operative mortality occurred exclusively in the CC group (66/2515, 2.3%). In multivariate analyses, the risk of post-operative complications was similar in the two groups (OR 0.80, 95%CI: 0.54-1.17; P = 0.25). Patients with CD had a significantly longer LOS (geometric mean 0.87, 95%CI: 0.79-0.95; P < 0.01). There was no difference in re-admission rates.
The audit did not collect data on the post-operative enhanced recovery protocols implemented in the different participating centers. CONCLUSION: Patients with CD were younger, with lower ASA grade and less comorbidity, were operated on by experienced surgeons and underwent less radical resection, but had a longer LOS than patients with CC, although the complication rate did not differ between the two groups.

    Polysaccharide Containing Gels for Pharmaceutical Applications

    Bio-derived polymers meet the needs of pharmaceutical formulations for topical application owing to their gelling ability. In topical delivery, an alternative route for local and systemic administration of active substances, formulations in gel form are generally preferred because they offer multiple advantages: they minimize systemic side effects, avoid gastrointestinal irritation and prevent metabolism of the active substance in the liver. The present chapter reviews bio-based polymers, with special reference to polysaccharide-based hydrogels, with respect to their pharmaceutical applications.

    Circus and Theatre, A tense relationship around the turn of the 20th century

    Background: Measuring disease and injury burden in populations requires a composite metric that captures both premature mortality and the prevalence and severity of ill-health. The 1990 Global Burden of Disease study proposed disability-adjusted life years (DALYs) to measure disease burden. No comprehensive update of disease burden worldwide incorporating a systematic reassessment of disease and injury-specific epidemiology has been done since the 1990 study. We aimed to calculate disease burden worldwide and for 21 regions for 1990, 2005, and 2010 with methods to enable meaningful comparisons over time. Methods: We calculated DALYs as the sum of years of life lost (YLLs) and years lived with disability (YLDs). DALYs were calculated for 291 causes, 20 age groups, both sexes, and for 187 countries, and aggregated to regional and global estimates of disease burden for three points in time with strictly comparable definitions and methods. YLLs were calculated from age-sex-country-time-specific estimates of mortality by cause, with each death weighted by the standardised lost life expectancy at that age. YLDs were calculated as prevalence of 1160 disabling sequelae, by age, sex, and cause, and weighted by new disability weights for each health state. Neither YLLs nor YLDs were age-weighted or discounted. Uncertainty around cause-specific DALYs was calculated incorporating uncertainty in levels of all-cause mortality, cause-specific mortality, prevalence, and disability weights. Findings: Global DALYs remained stable from 1990 (2·503 billion) to 2010 (2·490 billion). Crude DALYs per 1000 decreased by 23% (472 per 1000 to 361 per 1000). An important shift has occurred in DALY composition with the contribution of deaths and disability among children (younger than 5 years of age) declining from 41% of global DALYs in 1990 to 25% in 2010.
YLLs typically account for about half of disease burden in more developed regions (high-income Asia Pacific, western Europe, high-income North America, and Australasia), rising to over 80% of DALYs in sub-Saharan Africa. In 1990, 47% of DALYs worldwide were from communicable, maternal, neonatal, and nutritional disorders, 43% from non-communicable diseases, and 10% from injuries. By 2010, this had shifted to 35%, 54%, and 11%, respectively. Ischaemic heart disease was the leading cause of DALYs worldwide in 2010 (up from fourth rank in 1990, increasing by 29%), followed by lower respiratory infections (top rank in 1990; 44% decline in DALYs), stroke (fifth in 1990; 19% increase), diarrhoeal diseases (second in 1990; 51% decrease), and HIV/AIDS (33rd in 1990; 351% increase). Major depressive disorder increased from 15th to 11th rank (37% increase) and road injury from 12th to 10th rank (34% increase). Substantial heterogeneity exists in rankings of leading causes of disease burden among regions. Interpretation: Global disease burden has continued to shift away from communicable to non-communicable diseases and from premature death to years lived with disability. In sub-Saharan Africa, however, many communicable, maternal, neonatal, and nutritional disorders remain the dominant causes of disease burden. The rising burden from mental and behavioural disorders, musculoskeletal disorders, and diabetes will impose new challenges on health systems. Regional heterogeneity highlights the importance of understanding local burden of disease and setting goals and targets for the post-2015 agenda taking such patterns into account. Because of improved definitions, methods, and data, these results for 1990 and 2010 supersede all previously published Global Burden of Disease results
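The DALY arithmetic described in the Methods above (DALYs = YLLs + YLDs; YLLs from deaths weighted by standardised lost life expectancy; YLDs from sequela prevalence weighted by disability weights, with no age-weighting or discounting) can be sketched with made-up numbers:

```python
# All counts, life expectancies and disability weights below are invented
# for illustration; they are not GBD 2010 estimates.
deaths = [(1000, 40.0), (500, 10.0)]          # (deaths, standard lost life expectancy, years)
sequelae = [(200_000, 0.05), (50_000, 0.30)]  # (prevalent cases, disability weight)

ylls = sum(n * lost_le for n, lost_le in deaths)   # years of life lost
ylds = sum(prev * dw for prev, dw in sequelae)     # years lived with disability
dalys = ylls + ylds                                # no age-weighting or discounting
print(ylls, ylds, dalys)    # 45000.0 25000.0 70000.0
```

The study aggregates exactly this kind of sum over 291 causes, 20 age groups, both sexes and 187 countries, propagating uncertainty through each input.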

    The impact of stapling technique and surgeon specialism on anastomotic failure after right-sided colorectal resection: an international multicentre, prospective audit

    Aim There is little evidence to support choice of technique and configuration for stapled anastomoses after right hemicolectomy and ileocaecal resection. This study aimed to determine the relationship between stapling technique and anastomotic failure. Method Any unit performing gastrointestinal surgery was invited to contribute data on consecutive adult patients undergoing right hemicolectomy or ileocolic resection to this prospective, observational, international, multicentre study. Patients undergoing stapled, side-to-side ileocolic anastomoses were identified and multilevel, multivariable logistic regression analyses were performed to explore factors associated with anastomotic leak. Results One thousand three hundred and forty-seven patients were included from 200 centres in 32 countries. The overall anastomotic leak rate was 8.3%. Upon multivariate analysis there was no difference in leak rate with use of a cutting stapler for apical closure compared with a noncutting stapler (8.4% vs 8.0%, OR 0.91, 95% CI 0.54–1.53, P = 0.72). Oversewing of the apical staple line, whether in the cutting group (7.9% vs 9.7%, OR 0.87, 95% CI 0.52–1.46, P = 0.60) or noncutting group (8.9% vs 5.7%, OR 1.40, 95% CI 0.46–4.23, P = 0.55), also conferred no benefit in terms of reducing leak rates. Surgeons reporting to be general surgeons had a significantly higher leak rate than those reporting to be colorectal surgeons (12.1% vs 7.3%, OR 1.65, 95% CI 1.04–2.64, P = 0.04). Conclusion This study did not identify any difference in anastomotic leak rates according to the type of stapling device used to close the apical aspect. In addition, oversewing of the anastomotic staple lines appears to confer no benefit in terms of reducing leak rates. Although general surgeons operated on patients with more high-risk characteristics than colorectal surgeons, a higher leak rate for general surgeons which remained after risk adjustment needs further exploration.