
    Total arch replacement using moderate hypothermic circulatory arrest and unilateral selective antegrade cerebral perfusion

    Objective: To examine the clinical outcomes and impact of using moderate hypothermic circulatory arrest (MHCA) and unilateral selective antegrade cerebral perfusion (uSACP) in the setting of total aortic arch replacement (TOTAL). Methods: From 2004 to 2012, 733 patients underwent open arch reconstruction with MHCA and SACP. Of these, 145 (20%) underwent TOTAL. Measured outcomes included death, stroke, temporary neurologic dysfunction (TND), and renal failure. Mean follow-up time was 33 months and ranged from 0 to 95 months. Results: Core temperature at the onset of MHCA was 25.8°C. Cardiopulmonary bypass and myocardial ischemic times were 236 minutes and 181 minutes, respectively. Twenty-three patients (16%) underwent emergency repair of acute type A dissection. Fifty-four cases (37%) were reoperative and 52 (34%) were stage I elephant trunk procedures. Concomitant root replacement was performed in 50 (35%) patients, including 20 David V valve-sparing procedures. Mean duration of circulatory arrest was 55 minutes. Operative mortality was 9.7%. Overall incidence of stroke and TND was 2.8% and 5.6%, respectively. Four patients (2.8%) required postoperative dialysis. Seven-year survival was significantly reduced (P = .04) after repair of type A dissection (83.8%) compared with elective surgery (89.5%). Higher temperature during TOTAL was not found to be a significant risk factor for adverse events. Conclusions: TOTAL using MHCA and uSACP can be accomplished with excellent early and late results. MHCA was not associated with adverse neurologic outcomes or higher operative risk, despite prolonged periods of circulatory arrest.
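    The late-survival comparison above (type A dissection repair versus elective arch replacement) is the kind of result a Kaplan-Meier estimate with a log-rank test produces. A minimal sketch using lifelines, with a hypothetical cohort file and assumed column names (months_to_death_or_censor, died, dissection) that are not from the paper:

```python
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

df = pd.read_csv("arch_replacement_cohort.csv")  # hypothetical dataset

dissection = df[df["dissection"] == 1]
elective = df[df["dissection"] == 0]

# Kaplan-Meier survival curves for the two groups
km = KaplanMeierFitter()
km.fit(elective["months_to_death_or_censor"], elective["died"], label="Elective")
ax = km.plot_survival_function()
km.fit(dissection["months_to_death_or_censor"], dissection["died"], label="Type A dissection")
km.plot_survival_function(ax=ax)

# Log-rank test for a difference between the survival curves
result = logrank_test(
    elective["months_to_death_or_censor"], dissection["months_to_death_or_censor"],
    event_observed_A=elective["died"], event_observed_B=dissection["died"],
)
print(result.p_value)
```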

    Impact of varying degrees of renal dysfunction on transcatheter and surgical aortic valve replacement

    Background: Renal impairment portends adverse outcomes in patients undergoing valvular heart surgery. The relationship between renal dysfunction and outcomes in patients undergoing transcatheter aortic valve replacement (TAVR) is incompletely understood. Methods: A retrospective review of 1336 patients undergoing surgical aortic valve replacement (SAVR; 2002-2012) and 321 patients undergoing TAVR (2007-2012) was performed. Patients were divided into 3 glomerular filtration rate (GFR) groups: GFR greater than 60 mL/min, GFR 31 to 60 mL/min, and GFR 30 mL/min or less. Logistic and linear regression analysis was performed to estimate the TAVR effect on outcomes. Risk adjustments were made using the Society of Thoracic Surgeons (STS) predicted risk of mortality (PROM). Results: TAVR patients were older (82 vs 65 years; P < .001), had a poorer ejection fraction (48% vs 53%; P < .001), were more often female (45% vs 41%; P = .23), and had a higher STS PROM (11.9% vs 4.6%; P < .001). In-hospital mortality rates for TAVR and SAVR were 3.5% and 4.1%, respectively (P = .60), a result that marginally favors TAVR after risk adjustment (adjusted odds ratio = .52, P = .06). In SAVR patients, worsening preoperative renal failure was associated with increased in-hospital mortality (P = .004) and hospital (P < .001) and intensive care unit (ICU) (P < .001) lengths of stay. In contrast, worsening renal function did not influence in-hospital mortality (P = .78) or hospital (P = .23) or ICU (P = .88) lengths of stay in TAVR patients. Conclusions: Worsening renal function was associated with increased in-hospital mortality, hospital length of stay, and ICU length of stay in SAVR patients, but not in TAVR patients. This unexpected finding may have important clinical implications in patients with aortic stenosis and preoperative renal dysfunction.
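    The GFR stratification and risk-adjusted comparison described above can be illustrated with a short sketch. The data file, column names (gfr, tavr, died_in_hospital, sts_prom), and the single-covariate adjustment are assumptions for illustration, not the study's actual modeling code:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical extract of the combined SAVR/TAVR cohort
df = pd.read_csv("avr_cohort.csv")

# The three GFR groups used in the abstract: >60, 31-60, and <=30 mL/min
df["gfr_group"] = pd.cut(df["gfr"], bins=[0, 30, 60, float("inf")],
                         labels=["<=30", "31-60", ">60"])

# Logistic regression for in-hospital mortality with TAVR as the exposure,
# risk-adjusted here by STS PROM alone (a simplification of the paper's adjustment)
model = smf.logit("died_in_hospital ~ tavr + sts_prom", data=df).fit()
print(model.summary())
print("Adjusted odds ratio for TAVR:", float(np.exp(model.params["tavr"])))
```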

    Did Steroid Use Enhance the Performance of the Mitchell Batters? The Effect of Alleged Performance Enhancing Drug Use on Offensive Performance from 1995 to 2007

    Introduction: The Mitchell Report to the Commissioner of Baseball sought to characterize the extent to which the use of performance enhancing drugs (PEDs) proliferated through baseball during the last 15 years. While the Report was not primarily initiated to expose individual players, it nonetheless contained detailed accounts of alleged PED abuse by 89 current and former players, including the seasons in which the abuse occurred and the type of abuse (steroids or human growth hormone (HGH)). Previous analyses have largely focused on the impact of PED abuse on individual players (Barry Bonds and Roger Clemens, for instance). The present study integrates data from the Mitchell Report to make inferences about the overall effects of PED abuse on offensive production. Methods: The Lahman database was queried for all offensive seasons from 1995 to 2007 (minimum 50 PA, no pitchers). Runs created per 27 outs (RC27) was used as an estimate of the offensive production of a player in a season. An adjusted RC27 (ADJRC27) was obtained by accounting for career progression effects to reduce the influence of the expected change in performance over time due to age. Information from the Mitchell Report identified each player season as a PED season or a non-PED season. General linear mixed effects models were constructed that modeled ADJRC27 as a function of PED use (Yes or No). Multiple models were considered to assess the PED effect under various assumptions. Results: The baseline model estimated a mean non-steroid ADJRC27 during the study period of 4.58. The effect of steroid use was an additional 0.58 ADJRC27, an increase in production of 12.6% (p = 0.0108). Additional models considered the effect of being a player mentioned in the Mitchell Report, adjustments for baseline performance, and the influential effect of Barry Bonds' performance. The estimated steroid effect ranged from 3.9% to 18.0% among twelve different models. A similar analysis of HGH use showed no evidence of performance improvement. Conclusions: This analysis suggests a significant and substantial performance advantage for players who used steroids during the study period. It is estimated that offensive production increased approximately 12% in steroid users versus non-users. This analysis represents the first attempt to quantify the overall effect of PED abuse on offensive performance in baseball.
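    RC27 is a rate version of Bill James' runs created. The sketch below shows how the basic runs-created formula yields RC27 from the public Lahman Batting table; the exact RC variant and age adjustment used in the study are not specified in the abstract, so this is illustrative only, and the seasons, adj_rc27, ped, and player_id names are assumptions:

```python
import pandas as pd
import statsmodels.formula.api as smf

bat = pd.read_csv("Batting.csv")  # public Lahman batting table

# Basic runs created: RC = (H + BB) * TB / (AB + BB), with outs approximated as AB - H
bat["TB"] = bat["H"] + bat["2B"] + 2 * bat["3B"] + 3 * bat["HR"]
bat["RC"] = (bat["H"] + bat["BB"]) * bat["TB"] / (bat["AB"] + bat["BB"])
bat["RC27"] = 27 * bat["RC"] / (bat["AB"] - bat["H"])

# After merging in Mitchell Report flags and an age adjustment (not shown), a
# mixed model with a random intercept per player mirrors the analysis described:
# model = smf.mixedlm("adj_rc27 ~ ped", data=seasons, groups=seasons["player_id"]).fit()
# print(model.summary())
```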

    The impact of body mass index on morbidity and short- and long-term mortality in cardiac valvular surgery

    Objective: Limited data exist on patients with cardiac cachexia or morbid obesity presenting for valvular heart surgery. The objective of this study was to investigate the relationship between body mass index and morbidity and mortality after valvular surgery. Methods: A retrospective review of 4247 patients undergoing valvular surgery from 1996 to 2008 at Emory University Healthcare Hospitals was performed. Patients were divided into 3 groups: body mass index 24 or less (group 1, n = 1527), body mass index 25 to 35 (group 2, n = 2284), and body mass index 36 or more (group 3, n = 436). Data were analyzed using multivariable regression analysis, adjusted for 10 preoperative covariates. A smooth kernel regression curve was generated using body mass index and in-hospital mortality as variables. Long-term survival comparisons were made using adjusted Cox proportional hazards regression models and Kaplan-Meier product-limit estimates. Kaplan-Meier curves were generated that provide survival estimates for long-term mortality using the Social Security Death Index. Results: Patients in group 3 were significantly younger (group 1, 61.7 ± 16.1 years; group 2, 61.9 ± 13.6; group 3, 57.5 ± 13.0; P < .001) and more likely to be female (group 1, 778/1527 [51.0%]; group 2, 912/2284 [39.9%]; group 3, 240/436 [55.0%]; P < .001). Mean ejection fractions were similar among groups (P = .51). Patients in group 2 had a significantly shorter postoperative length of stay (group 1, 9.6 ± 10.3 days; group 2, 8.7 ± 8.2 days; group 3, 10.8 ± 11.0 days; P < .001). In-hospital mortality for the entire cohort was 5.8% (245/4247), and by group was 111 of 1527 (7.3%) in group 1, 110 of 2284 (4.8%) in group 2, and 24 of 436 (5.5%) in group 3 (P = .006). Actual survival at 1, 3, 5, and 10 years was significantly lower in group 1 (P < .001). A lower body mass index was a significant independent predictor of both in-hospital and long-term mortality. Conclusions: Patients with body mass index 24 or less are at significantly increased risk of in-hospital and long-term mortality after cardiac valvular surgery. This high-risk patient population warrants careful risk stratification and options for less-invasive valve therapies.
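    The BMI grouping and adjusted long-term survival comparison above can be sketched with lifelines. The file name, column names (bmi, years_to_death_or_censor, died, age, female), and the reduced covariate set are assumptions, not the study's 10-covariate adjustment:

```python
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("valve_surgery_cohort.csv")  # hypothetical dataset

# Approximate the three BMI groups: <=24, 25-35, >=36
df["bmi_group"] = pd.cut(df["bmi"], bins=[0, 24, 35, float("inf")],
                         labels=["bmi_le_24", "bmi_25_35", "bmi_ge_36"])

# Indicator columns, with the middle group (BMI 25-35) as the reference
covars = pd.get_dummies(df["bmi_group"]).drop(columns=["bmi_25_35"]).astype(float)
data = pd.concat([df[["years_to_death_or_censor", "died", "age", "female"]], covars], axis=1)

# Adjusted Cox model: hazard ratios for low and high BMI relative to BMI 25-35
cph = CoxPHFitter()
cph.fit(data, duration_col="years_to_death_or_censor", event_col="died")
cph.print_summary()
```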

    New-Onset Atrial Fibrillation Predicts Long-Term Mortality After Coronary Artery Bypass Graft

    Objectives: We sought to investigate the association between new-onset atrial fibrillation after coronary artery bypass graft (CABG) surgery (post-operative atrial fibrillation [POAF]) and long-term mortality in patients with no history of atrial fibrillation. Background: POAF predicts longer hospital stay and greater post-operative mortality. Methods: A total of 16,169 consecutive patients with no history of AF who underwent isolated CABG at our institution between January 1, 1996, and December 31, 2007, were included in the study. All-cause mortality data were obtained from Social Security Administration death records. A multivariable Cox proportional hazards regression model was constructed to determine the independent impact of new-onset POAF on long-term survival after adjusting for several covariates. The covariates included age, sex, race, pre-operative risk factors (ejection fraction, New York Heart Association functional class, history of myocardial infarction, index myocardial infarction, stroke, chronic obstructive pulmonary disease, peripheral arterial disease, smoking, diabetes, renal failure, hypertension, dyslipidemia, creatinine level, dialysis, redo surgery, elective versus emergent CABG, any valvular disorder), post-operative adverse events (stroke, myocardial infarction, acute respiratory distress syndrome, and renal failure), and discharge cardiac medications known to affect survival in patients with coronary disease. Results: New-onset AF occurred in 2,985 (18.5%) patients undergoing CABG. POAF independently predicted long-term mortality (hazard ratio: 1.21; 95% confidence interval: 1.12 to 1.32) during a mean follow-up of 6 years (range 0 to 12.5 years). This association remained true after excluding from the analysis those patients who died in-hospital after surgery (hazard ratio: 1.21; 95% confidence interval: 1.11 to 1.32). Patients with POAF discharged on warfarin experienced reduced mortality during follow-up. Conclusions: In this large cohort of patients, POAF predicted long-term mortality. Warfarin anticoagulation may improve survival in patients with POAF.
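    A multivariable Cox proportional hazards fit of the kind described above can be sketched with lifelines; only a handful of the listed covariates are shown, and every column name (poaf, years_of_followup, died, died_in_hospital, and so on) is an assumption for illustration:

```python
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("cabg_cohort.csv")  # hypothetical dataset

# Adjusted association between new-onset POAF and long-term mortality
cph = CoxPHFitter()
cph.fit(df, duration_col="years_of_followup", event_col="died",
        formula="poaf + age + female + ejection_fraction + diabetes + redo_surgery")
print("Adjusted hazard ratio for POAF:", cph.hazard_ratios_["poaf"])

# Sensitivity analysis noted in the abstract: repeat after excluding patients
# who died in hospital after surgery
cph.fit(df[df["died_in_hospital"] == 0], duration_col="years_of_followup",
        event_col="died", formula="poaf + age + female + ejection_fraction + diabetes")
print("Adjusted hazard ratio excluding in-hospital deaths:", cph.hazard_ratios_["poaf"])
```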

    Prospective Evaluation of Weight-Based Prophylactic Enoxaparin Dosing in Critically Ill Trauma Patients: Adequacy of AntiXa Levels is Improved.

    Venous thromboembolism (VTE) is a leading cause of death in multisystem trauma patients, and the importance of VTE prevention is well recognized. Presently, standard-dose enoxaparin (30 mg BID) is used as chemical prophylaxis, regardless of weight or physiologic status. However, evidence suggests decreased bioavailability of enoxaparin in critically ill patients. Therefore, we hypothesized that a weight-based enoxaparin dosing regimen would provide more adequate prophylaxis (as indicated by antifactor Xa levels) for patients in our trauma intensive care unit (TICU). Data were prospectively collected on TICU patients admitted over a 5-month period who received enoxaparin 0.6 mg/kg (actual body weight) twice daily. Patients were compared with a historical cohort receiving standard dosing. Anti-Xa levels were collected at 11.5 hours (trough, goal ≥ 0.1 IU/mL) after each evening administration. Patient demographics, admission weight, dose, and daily anti-Xa levels were recorded. Patients with renal insufficiency or brain, spine, or spinal cord injury were excluded. Data were collected from 26 patients in the standard-dose group and 37 in the weight-based group. Sixty-four trough anti-Xa measurements were taken in the standard-dose group and 74 in the weight-based group. Evaluating only levels measured after the third dose, the change in enoxaparin dosing from 30 mg to 0.6 mg/kg increased the percentage of patients with goal antifactor Xa levels from 8 per cent to 61 per cent (P < 0.0001). Examining all troughs, the change in dose increased the proportion of patients with a goal anti-Xa level from 19 to 59 per cent (P < 0.0001). Weight-based dosing of enoxaparin in trauma ICU patients yields superior results with respect to adequate anti-Xa levels when compared with standard dosing. These findings suggest that weight-based dosing may provide superior VTE prophylaxis in TICU patients. Evaluation of the effects of this dosing paradigm on actual VTE rate is ongoing at our institution.
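    The dose calculation is simple arithmetic, and the before/after proportions can be compared with a standard test of two proportions. The counts in the 2x2 table below are back-calculated from the reported 19 and 59 per cent figures and trough totals, so they are illustrative rather than the study's raw data:

```python
from scipy.stats import fisher_exact

def weight_based_dose_mg(actual_body_weight_kg: float) -> float:
    """Twice-daily prophylactic enoxaparin dose at 0.6 mg/kg actual body weight."""
    return 0.6 * actual_body_weight_kg

print(weight_based_dose_mg(80))  # 48.0 mg per dose, versus the fixed 30 mg BID regimen

# Approximate 2x2 table for all troughs: at goal (>= 0.1 IU/mL) vs below goal
table = [[12, 52],   # standard 30 mg BID (64 troughs, ~19% at goal)
         [44, 30]]   # weight-based 0.6 mg/kg (74 troughs, ~59% at goal)
odds_ratio, p_value = fisher_exact(table)
print(p_value)
```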

    American Association for the Surgery of Trauma Organ Injury Scale I: spleen, liver, and kidney, validation based on the National Trauma Data Bank.

    BACKGROUND: This study attempts to validate the American Association for the Surgery of Trauma (AAST) Organ Injury Scale (OIS) for spleen, liver, and kidney injuries using the National Trauma Data Bank (NTDB). STUDY DESIGN: All NTDB entries with Abbreviated Injury Scale codes for spleen, liver, and kidney were classified by OIS grade. Injuries were stratified either as an isolated intraabdominal organ injury or in combination with other abdominal injuries. Isolated abdominal solid organ injuries were additionally stratified by presence of severe head injury and survival past 24 hours. The patients in each grading category were analyzed for mortality, operative rate, hospital length of stay, ICU length of stay, and charges incurred. RESULTS: There were 54,148 NTDB entries (2.7%) with Abbreviated Injury Scale-coded injuries to the spleen, liver, or kidney. In 35,897, this was an isolated abdominal solid organ injury. For patients in whom the solid organ in question was not the sole abdominal injury, a statistically significant increase (p ≤ 0.05) in mortality, organ-specific operative rate, and hospital charges was associated with increasing OIS grade; the exception was grade VI hepatic injuries. Hospital and ICU lengths of stay did not show a substantial increase with increasing OIS grade. When isolated organ injuries were examined, there were statistically significant increases (p ≤ 0.05) in all outcomes variables corresponding with increasing OIS grade. Severe head injury appears to influence mortality, but none of the other outcomes variables. Patients with other intraabdominal injuries had quantitative outcomes comparable to those of the isolated abdominal organ injury groups for all OIS grades. CONCLUSIONS: This study validates and quantifies outcomes reflective of increasing injury severity associated with increasing OIS grades for specific solid organ injuries alone, and in combination with other abdominal injuries.
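    Validation of this kind amounts to showing that outcomes worsen monotonically with OIS grade. A short sketch of the stratified summary for one organ, assuming a hypothetical NTDB extract and column names (organ, ois_grade, died, organ_specific_operation, hospital_los, isolated_abdominal_injury) that are not from the study:

```python
import pandas as pd

df = pd.read_csv("ntdb_solid_organ_injuries.csv")  # hypothetical extract

# Isolated splenic injuries, summarized by OIS grade
spleen = df[(df["organ"] == "spleen") & (df["isolated_abdominal_injury"] == 1)]
summary = spleen.groupby("ois_grade").agg(
    n=("died", "size"),
    mortality=("died", "mean"),
    operative_rate=("organ_specific_operation", "mean"),
    mean_hospital_los=("hospital_los", "mean"),
)
print(summary)  # outcomes should rise with increasing OIS grade if the scale is valid
```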