Hospital variation in transfusion and infection after cardiac surgery: a cohort study
Background: Transfusion practices in hospitalised patients are being re-evaluated, in part due to studies indicating adverse effects in patients receiving large quantities of stored blood. Concomitant with this re-examination have been reports showing variability in the use of specific blood components. This investigation was designed to assess hospital variation in blood use and outcomes in cardiac surgery patients. Methods: We evaluated outcomes in 24,789 Medicare beneficiaries in the state of Michigan, USA who received coronary artery bypass graft surgery from 2003 to 2006. Using a cohort design, patients were followed from hospital admission to assess transfusions, in-hospital infection and mortality, as well as hospital readmission and mortality 30 days after discharge. Multilevel mixed-effects logistic regression was used to calculate the intrahospital correlation coefficient (for 40 hospitals) and compare outcomes by transfusion status. Results: Overall, 30% (95% CI, 20% to 42%) of the variance in transfusion practices was attributable to hospital site. Allogeneic blood use by hospital ranged from 72.5% to 100% in women and 49.7% to 100% in men. Allogeneic, but not autologous, blood transfusion increased the odds of in-hospital infection 2.0-fold (95% CI 1.6 to 2.5), in-hospital mortality 4.7-fold (95% CI 2.4 to 9.2), 30-day readmission 1.4-fold (95% CI 1.2 to 1.6), and 30-day mortality 2.9-fold (95% CI 1.4 to 6.0) in elective surgeries. Allogeneic transfusion was associated with infections of the genitourinary system, respiratory tract, bloodstream, digestive tract and skin, as well as infection with Clostridium difficile. For each 1% increase in hospital transfusion rates, there was a 0.13% increase in predicted infection rates. Conclusion: Allogeneic blood transfusion was associated with an increased risk of infection at multiple sites, suggesting a system-wide immune response. Hospital variation in transfusion practices after coronary artery bypass grafting was considerable, indicating that quality efforts may be able to influence practice and improve outcomes.
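The "30% of variance attributable to hospital site" reported above is an intraclass (intrahospital) correlation from a random-intercept logistic model. As a minimal sketch (not the authors' code), the latent-scale ICC is commonly computed by fixing the residual variance of a logistic model at pi^2/3; the variance value below is a hypothetical illustration chosen to reproduce roughly 30%:

```python
import math

def latent_scale_icc(hospital_variance: float) -> float:
    """Intraclass correlation for a random-intercept logistic model.

    On the latent (logistic) scale the residual variance is fixed at
    pi^2 / 3, so the share of variance attributable to hospital site is
    sigma_b^2 / (sigma_b^2 + pi^2 / 3).
    """
    residual_variance = math.pi ** 2 / 3  # variance of the logistic distribution
    return hospital_variance / (hospital_variance + residual_variance)

# Illustrative only: a hospital random-intercept variance of ~1.41 on the
# logit scale corresponds to an ICC of roughly 0.30, as in the abstract.
print(f"ICC = {latent_scale_icc(1.41):.2f}")
```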
Hereditary angioedema: beyond international consensus - circa December 2010 - The Canadian Society of Allergy and Clinical Immunology Dr. David McCourtie Lecture
Background: The 2010 International Consensus Algorithm for the Diagnosis, Therapy and Management of Hereditary Angioedema was published earlier this year in this Journal (Bowen et al. Allergy, Asthma & Clinical Immunology 2010, 6:24 - http://www.aacijournal.com/content/6/1/24). Since that publication, multiple phase III clinical trials on either prophylaxis or therapy of hereditary angioedema have been published, and some of these products have changed approval status in various countries. This manuscript was prepared to review and update the management of hereditary angioedema. Objective: To review approaches for the diagnosis and management of hereditary angioedema (HAE) circa December 2010, and to present thoughts on moving HAE management from international evidence-based consensus toward local health unit considerations that balance costs, treatment efficacy, and risk-benefit. These thoughts reflect Canadian and international experiences. Methods: PubMed searches combining hereditary angioedema with diagnosis, therapy, management and consensus were reviewed, as well as press releases from various pharmaceutical companies, to early December 2010. Results: The 2010 International Consensus Algorithms for the Diagnosis, Therapy and Management of Hereditary Angioedema is reviewed in light of the newly published phase III clinical trials for prevention and therapy of HAE. Management approaches and models are discussed. Conclusions: Consensus approaches and double-blind placebo-controlled trials are only interim guides to a complex disorder such as HAE and should be replaced as soon as possible with large phase IV clinical trials, meta-analyses, database registry validation of approaches (including quality-of-life and cost-benefit analyses and safety), and head-to-head clinical trials investigating superiority or non-inferiority of available approaches. Since not all therapeutic products are available in all jurisdictions, and since health care delivery approaches and philosophies vary between countries, each health care delivery sector will likely devise its own algorithms based on local practicalities for implementing evidence-based guidelines and standards for HAE disease management. Quality-of-life and cost-affordability conclusions will likely vary between countries and health care units. Database registries for rare disorders like HAE should be used to detect early adverse events for new therapies, to facilitate phase IV clinical trials, and to encourage superiority and non-inferiority comparisons of HAE management approaches.
Relationship between cardiac deformation parameters measured by cardiovascular magnetic resonance and aerobic fitness in endurance athletes
Background: Athletic training leads to remodelling of both left and right ventricles, with increased myocardial mass and cavity dilatation. Whether changes in cardiac strain parameters occur in response to training is less well established. In this study we investigated the relationship in trained athletes between cardiovascular magnetic resonance (CMR) derived strain parameters of cardiac function and fitness. Methods: 35 endurance athletes and 35 age- and sex-matched controls underwent CMR at 3.0T, including cine imaging in multiple planes and tissue tagging by spatial modulation of magnetization (SPAMM). CMR data were analysed quantitatively, reporting circumferential strain and torsion from tagged images and left and right ventricular longitudinal strain from feature tracking of cine images. Athletes performed a maximal ramp-incremental exercise test to determine the lactate threshold (LT) and maximal oxygen uptake (V̇O2max). Results: LV circumferential strain at all levels, LV twist and torsion, LV late diastolic longitudinal strain rate, RV peak longitudinal strain, and RV early and late diastolic longitudinal strain rate were all lower in athletes than in controls. On multivariable linear regression, only LV torsion (beta=-0.37, P=0.03) had a significant association with LT, and only RV longitudinal late diastolic strain rate (beta=-0.35, P=0.03) had a significant association with V̇O2max. Conclusions: This cohort of endurance athletes had lower LV circumferential strain, LV torsion and biventricular diastolic strain rates than controls. Increased LT, which is a major determinant of performance in endurance athletes, was associated with decreased LV torsion. Further work is needed to understand the mechanisms by which this occurs.
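For readers unfamiliar with the twist and torsion measures reported above: one common convention (there are several, and the study's exact definition is not stated in the abstract) defines twist as the net difference between apical and basal rotation, and torsion as twist normalised by base-apex length. A sketch under that assumption, with purely illustrative values:

```python
def lv_twist_deg(apical_rotation_deg: float, basal_rotation_deg: float) -> float:
    """LV twist: net difference between apical and basal rotation.

    By convention, counterclockwise apical rotation (viewed from the apex)
    is positive and clockwise basal rotation is negative, so healthy twist
    values are positive.
    """
    return apical_rotation_deg - basal_rotation_deg

def lv_torsion_deg_per_cm(twist_deg: float, base_apex_distance_cm: float) -> float:
    """LV torsion: twist normalised by the base-apex distance (deg/cm)."""
    return twist_deg / base_apex_distance_cm

# Hypothetical rotations, not values from the study:
twist = lv_twist_deg(apical_rotation_deg=7.0, basal_rotation_deg=-4.0)
print(lv_torsion_deg_per_cm(twist, base_apex_distance_cm=8.0))
```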
Regulators of genetic risk of breast cancer identified by integrative network analysis.
Genetic risk for breast cancer is conferred by a combination of multiple variants of small effect. To better understand how risk loci might combine, we examined whether risk-associated genes share regulatory mechanisms. We created a breast cancer gene regulatory network comprising transcription factors and groups of putative target genes (regulons) and asked whether specific regulons are enriched for genes associated with risk loci via expression quantitative trait loci (eQTLs). We identified 36 overlapping regulons that were enriched for risk loci and formed a distinct cluster within the network, suggesting shared biology. The risk transcription factors driving these regulons are frequently mutated in cancer and lie in two opposing subgroups, which relate to estrogen receptor (ER)-positive luminal A or luminal B cancers and ER-negative basal-like cancers, and to different luminal epithelial cell populations in the adult mammary gland. Our network approach provides a foundation for determining the regulatory circuits governing breast cancer and identifying targets for intervention, and is transferable to other disease settings. This work was funded by Cancer Research UK and the Breast Cancer Research Foundation. MAAC is funded by the National Research Council (CNPq) of Brazil. TEH held a fellowship from the US DOD Breast Cancer Research Program (W81XWH-11-1-0592) and is currently supported by an RAH Career Development Fellowship (Australia). TEH and WDT are funded by the NHMRC of Australia (NHMRC) (ID: 1008349 WDT; 1084416 WDT, TEH) and Cancer Australia/National Breast Cancer Foundation (ID 627229; WDT, TEH). BAJP is a Gibb Fellow of Cancer Research UK. We would like to acknowledge the support of The University of Cambridge, Cancer Research UK and Hutchison Whampoa Limited.
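Testing whether a regulon is enriched for risk-associated genes, as described above, is typically done with an over-representation test such as the hypergeometric test. A hedged sketch of that generic approach (the paper's exact statistic may differ; all numbers below are hypothetical):

```python
from scipy.stats import hypergeom

def regulon_enrichment_p(n_universe: int, n_risk_genes: int,
                         n_regulon: int, n_overlap: int) -> float:
    """One-sided hypergeometric enrichment test.

    Probability of observing at least `n_overlap` risk-associated genes in
    a regulon of size `n_regulon`, drawn from a universe of `n_universe`
    genes that contains `n_risk_genes` risk genes.
    """
    # sf(k - 1) gives P(X >= k) for the hypergeometric distribution
    return hypergeom.sf(n_overlap - 1, n_universe, n_risk_genes, n_regulon)

# Illustrative numbers only (expected overlap here is 3, so 12 is enriched):
print(regulon_enrichment_p(n_universe=20000, n_risk_genes=150,
                           n_regulon=400, n_overlap=12))
```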
The gut microbiota: a major player in the toxicity of environmental pollutants?
Exposure to environmental chemicals has been linked to various health disorders, including obesity, type 2 diabetes, cancer and dysregulation of the immune and reproductive systems, whereas the gastrointestinal microbiota critically contributes to a variety of host metabolic and immune functions. We aimed to evaluate the bidirectional relationship between gut bacteria and environmental pollutants and to assess the toxicological relevance of the bacteria-xenobiotic interplay for the host. We examined studies using isolated bacteria, faecal or caecal suspensions, germ-free or antibiotic-treated animals, as well as animals reassociated with a microbiota exposed to environmental chemicals. The literature indicates that gut microbes have an extensive capacity to metabolise environmental chemicals, which can be classified into five core enzymatic families (azoreductases, nitroreductases, β-glucuronidases, sulfatases and β-lyases) unequivocally involved in the metabolism of >30 environmental contaminants. There is clear evidence that bacteria-dependent metabolism of pollutants modulates their toxicity for the host. Conversely, environmental contaminants from various chemical families have been shown to alter the composition and/or the metabolic activity of the gastrointestinal bacteria, which may be an important factor shaping an individual's microbiotype. The physiological consequences of these alterations have not been studied in detail, but pollutant-induced alterations of the gut bacteria are likely to contribute to their toxicity. In conclusion, a body of evidence suggests that the gut microbiota are a major, yet underestimated, element that must be considered to fully evaluate the toxicity of environmental contaminants.
Myocardial tagging by Cardiovascular Magnetic Resonance: evolution of techniques--pulse sequences, analysis algorithms, and applications
Cardiovascular magnetic resonance (CMR) tagging has been established as an essential technique for measuring regional myocardial function. It allows quantification of local intramyocardial motion measures, e.g. strain and strain rate. CMR tagging was invented in the late 1980s, when the technique allowed, for the first time, visualization of transmural myocardial movement without having to implant physical markers. This new idea opened the door for a series of developments and improvements that continue up to the present time. Different tagging techniques are currently available that are more extensive, improved, and sophisticated than they were twenty years ago. Each of these techniques has different versions for improved resolution, signal-to-noise ratio (SNR), scan time, anatomical coverage, three-dimensional capability, and image quality. The tagging techniques covered in this article can be broadly divided into two main categories: 1) basic techniques, which include magnetization saturation, spatial modulation of magnetization (SPAMM), delay alternating with nutations for tailored excitation (DANTE), and complementary SPAMM (CSPAMM); and 2) advanced techniques, which include harmonic phase (HARP), displacement encoding with stimulated echoes (DENSE), and strain encoding (SENC). Although most of these techniques were developed by separate groups and evolved from different backgrounds, they are in fact closely related to each other, and they can be interpreted from more than one perspective. Some of these techniques even followed parallel paths of development, as illustrated in the article. As each technique has its own advantages, some efforts have been made to combine different techniques together for improved image quality or composite information acquisition. In this review, different developments in pulse sequences and related image processing techniques are described along with the necessities that led to their invention, which makes the article easy to read and the covered techniques easy to follow. Major studies that applied CMR tagging for studying myocardial mechanics are also summarized. Finally, the current article includes a plethora of ideas and techniques, with over 300 references that motivate the reader to think about the future of CMR tagging.
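To make the SPAMM idea concrete: the sequence sinusoidally modulates longitudinal magnetization, which appears in the image as dark tag lines that deform with the tissue. A deliberately simplified numpy sketch of what that modulation looks like on a magnitude image (an illustration of the tag pattern only, not a pulse-sequence or Bloch simulation):

```python
import numpy as np

def apply_spamm_grid(image: np.ndarray, tag_spacing_px: float) -> np.ndarray:
    """Simulate a 1-1 SPAMM-style tag grid on a magnitude image.

    SPAMM sinusoidally modulates longitudinal magnetization; multiplying
    the image by a cos^2 pattern along two orthogonal directions mimics
    the resulting dark grid of tag lines (spacing = tag_spacing_px).
    """
    y, x = np.indices(image.shape)
    k = 2 * np.pi / tag_spacing_px  # spatial tag frequency
    modulation = (np.cos(k * x / 2) ** 2) * (np.cos(k * y / 2) ** 2)
    return image * modulation

# Illustrative use on a uniform synthetic image:
phantom = np.ones((128, 128))
tagged = apply_spamm_grid(phantom, tag_spacing_px=8.0)
```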
Mapping geographical inequalities in oral rehydration therapy coverage in low-income and middle-income countries, 2000-17
Background: Oral rehydration solution (ORS) is a form of oral rehydration therapy (ORT) for diarrhoea that has the potential to drastically reduce child mortality; yet, according to UNICEF estimates, less than half of children younger than 5 years with diarrhoea in low-income and middle-income countries (LMICs) received ORS in 2016. A variety of recommended home fluids (RHF) exist as alternative forms of ORT; however, it is unclear whether RHF prevent child mortality. Previous studies have shown considerable variation between countries in ORS and RHF use, but subnational variation is unknown. This study aims to produce high-resolution geospatial estimates of relative and absolute coverage of ORS, RHF, and ORT (use of either ORS or RHF) in LMICs. Methods: We used a Bayesian geostatistical model including 15 spatial covariates and data from 385 household surveys across 94 LMICs to estimate annual proportions of children younger than 5 years of age with diarrhoea who received ORS or RHF (or both) on continuous continent-wide surfaces in 2000-17, and aggregated results to policy-relevant administrative units. Additionally, we analysed geographical inequality in coverage across administrative units and estimated the number of diarrhoeal deaths averted by increased coverage over the study period. Uncertainty in the mean coverage estimates was calculated by taking 250 draws from the posterior joint distribution of the model and creating uncertainty intervals (UIs) with the 2.5th and 97.5th percentiles of those 250 draws. Findings: While ORS use among children with diarrhoea increased in some countries from 2000 to 2017, coverage remained below 50% in the majority (62.6%; 12 417 of 19 823) of second administrative-level units, and an estimated 6 519 000 children (95% UI 5 254 000-7 733 000) with diarrhoea were not treated with any form of ORT in 2017. Increases in ORS use corresponded with declines in RHF in many locations, resulting in relatively constant overall ORT coverage from 2000 to 2017. Although ORS was uniformly distributed subnationally in some countries, within-country geographical inequalities persisted in others; 11 countries had at least a 50% difference in one of their units compared with the country mean. Increases in ORS use over time were correlated with declines in RHF use and in diarrhoeal mortality in many locations, and an estimated 52 230 diarrhoeal deaths (36 910-68 860) were averted by scaling up of ORS coverage between 2000 and 2017. Finally, we identified key subnational areas in Colombia, Nigeria, and Sudan as examples of where diarrhoeal mortality remains higher than average, while ORS coverage remains lower than average. Interpretation: To our knowledge, this study is the first to produce and map subnational estimates of ORS, RHF, and ORT coverage and attributable child diarrhoeal deaths across LMICs from 2000 to 2017, allowing for tracking progress over time. Our novel results, combined with detailed subnational estimates of diarrhoeal morbidity and mortality, can support subnational needs assessments aimed at furthering policy makers' understanding of within-country disparities. Over 50 years after the discovery that led to this simple, cheap, and life-saving therapy, large gains in reducing mortality could still be made by reducing geographical inequalities in ORS coverage.
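The uncertainty-interval construction described in the Methods (2.5th and 97.5th percentiles of 250 posterior draws) is a direct percentile computation. A minimal numpy sketch, using randomly generated placeholder draws in place of the study's posterior samples and an assumed (draws x units) layout:

```python
import numpy as np

# Placeholder for the model's posterior draws of coverage:
# one row per draw (250), one column per administrative unit.
rng = np.random.default_rng(0)
draws = rng.beta(5, 6, size=(250, 3))

mean_coverage = draws.mean(axis=0)
lower, upper = np.percentile(draws, [2.5, 97.5], axis=0)  # 95% UI bounds
for m, lo, hi in zip(mean_coverage, lower, upper):
    print(f"coverage {m:.2f} (95% UI {lo:.2f}-{hi:.2f})")
```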
Laparoscopy in management of appendicitis in high-, middle-, and low-income countries: a multicenter, prospective, cohort study.
BACKGROUND: Appendicitis is the most common abdominal surgical emergency worldwide. Differences between high- and low-income settings in the availability of laparoscopic appendectomy, alternative management choices, and outcomes are poorly described. The aim was to identify variation in surgical management and outcomes of appendicitis within low-, middle-, and high-Human Development Index (HDI) countries worldwide. METHODS: This is a multicenter, international prospective cohort study. Consecutive sampling of patients undergoing emergency appendectomy over 6 months was conducted. Follow-up lasted 30 days. RESULTS: 4546 patients from 52 countries underwent appendectomy (2499 high-, 1540 middle-, and 507 low-HDI groups). Surgical site infection (SSI) rates were higher in low-HDI (OR 2.57, 95% CI 1.33-4.99, p=0.005) but not middle-HDI countries (OR 1.38, 95% CI 0.76-2.52, p=0.291), compared with high-HDI countries after adjustment. A laparoscopic approach was common in high-HDI countries (1693/2499, 67.7%), but infrequent in low-HDI (41/507, 8.1%) and middle-HDI (132/1540, 8.6%) groups. After accounting for case-mix, laparoscopy was still associated with fewer overall complications (OR 0.55, 95% CI 0.42-0.71, p<0.001) and SSIs (OR 0.22, 95% CI 0.14-0.33, p<0.001). In propensity-score matched groups within low-/middle-HDI countries, laparoscopy was still associated with fewer overall complications (OR 0.23, 95% CI 0.11-0.44) and SSI (OR 0.21, 95% CI 0.09-0.45). CONCLUSION: A laparoscopic approach is associated with better outcomes and availability appears to differ by country HDI. Despite the profound clinical, operational, and financial barriers to its widespread introduction, laparoscopy could significantly improve outcomes for patients in low-resource environments. TRIAL REGISTRATION: NCT02179112
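For context on the propensity-score matched comparison above, here is a hedged sketch of generic 1:1 nearest-neighbour propensity-score matching (logistic regression for the score, matching with replacement). It is an illustration of the technique, not the study's actual analysis code, and the function and variable names are invented for the example:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

def propensity_match(X, treated):
    """1:1 nearest-neighbour matching on the propensity score.

    X: (n, p) covariate matrix (case-mix variables); treated: boolean
    array marking, e.g., laparoscopic cases. Returns (treated_idx,
    matched_control_idx) pairs; controls may be reused (matching with
    replacement).
    """
    treated = np.asarray(treated, dtype=bool)
    ps = LogisticRegression(max_iter=1000).fit(X, treated).predict_proba(X)[:, 1]
    t_idx = np.where(treated)[0]
    c_idx = np.where(~treated)[0]
    # Find, for each treated patient, the control with the closest score.
    nn = NearestNeighbors(n_neighbors=1).fit(ps[c_idx].reshape(-1, 1))
    _, pos = nn.kneighbors(ps[t_idx].reshape(-1, 1))
    return list(zip(t_idx, c_idx[pos[:, 0]]))
```

Outcomes (complications, SSI) would then be compared within the matched pairs, e.g. with conditional logistic regression.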
The impact of surgical delay on resectability of colorectal cancer: An international prospective cohort study
AIM: The SARS-CoV-2 pandemic has provided a unique opportunity to explore the impact of surgical delays on cancer resectability. This study aimed to compare resectability for colorectal cancer patients undergoing delayed versus non-delayed surgery. METHODS: This was an international prospective cohort study of consecutive colorectal cancer patients with a decision for curative surgery (January-April 2020). Surgical delay was defined as an operation taking place more than 4 weeks after treatment decision, in a patient who did not receive neoadjuvant therapy. A subgroup analysis explored the effects of delay in elective patients only. The impact of longer delays was explored in a sensitivity analysis. The primary outcome was complete resection, defined as curative resection with an R0 margin. RESULTS: Overall, 5453 patients from 304 hospitals in 47 countries were included, of whom 6.6% (358/5453) did not receive their planned operation. Of the 4304 operated patients without neoadjuvant therapy, 40.5% (1744/4304) were delayed beyond 4 weeks. Delayed patients were more likely to be older, men, more comorbid, have higher body mass index and have rectal cancer and early stage disease. Delayed patients had higher unadjusted rates of complete resection (93.7% vs. 91.9%, P=0.032) and lower rates of emergency surgery (4.5% vs. 22.5%, P<0.001). After adjustment, delay was not associated with a lower rate of complete resection (OR 1.18, 95% CI 0.90-1.55, P=0.224), which was consistent in elective patients only (OR 0.94, 95% CI 0.69-1.27, P=0.672). Longer delays were not associated with poorer outcomes. CONCLUSION: One in 15 colorectal cancer patients did not receive their planned operation during the first wave of COVID-19. Surgical delay did not appear to compromise resectability, raising the hypothesis that any reduction in long-term survival attributable to delays is likely to be due to micro-metastatic disease.