    Discrete Improvement in Racial Disparity in Survival among Patients with Stage IV Colorectal Cancer: a 21-Year Population-Based Analysis

    Purpose Recently, multiple clinical trials have demonstrated improved outcomes in patients with metastatic colorectal cancer. This study investigated whether the improved survival is race dependent. Patients and Methods Overall and cancer-specific survival of 77,490 White and Black patients with metastatic colorectal cancer from the 1988–2008 Surveillance Epidemiology and End Results registry were compared using unadjusted and multivariable-adjusted Cox proportional hazards regression as well as competing risk analyses. Results Median age was 69 years, 47.4% were female and 86.0% were White. Median survival was 11 months overall, with an increase from 8 to 14 months between 1988 and 2008. Overall survival increased from 8 to 14 months for White patients and from 6 to 13 months for Black patients. After multivariable adjustment, the following parameters were associated with better survival: White, female, younger, better-educated and married patients, patients with higher income and living in urban areas, patients with rectosigmoid junction and rectal cancer, undergoing cancer-directed surgery, and having well/moderately differentiated and N0 tumors (p < 0.05 for all covariates). Discrepancies in overall survival based on race did not change significantly over time; however, there was a significant decrease in cancer-specific survival discrepancies over time between White and Black patients, with a hazard ratio of 0.995 (95% confidence interval 0.991–1.000) per year (p = 0.03). Conclusion A clinically relevant overall survival increase was found from 1988 to 2008 in this population-based analysis for both White and Black patients with metastatic colorectal cancer. Although both White and Black patients benefitted from this improvement, a slight discrepancy between the two groups remained.
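    The race-by-year effect reported above (hazard ratio 0.995 per year of diagnosis) is the kind of estimate obtained from a Cox model with a product term between race and diagnosis year. The sketch below illustrates that general approach in Python with the lifelines package; the file name, column names and covariate set are hypothetical placeholders, not the authors' actual SEER extraction or model specification.

```python
# Minimal sketch of a Cox model with a race-by-year interaction,
# analogous to the analysis described above. Column names are
# illustrative placeholders, not the authors' actual variables.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("seer_stage4_crc.csv")  # hypothetical extract, one row per patient

cph = CoxPHFitter()
cph.fit(
    df,
    duration_col="time_months",   # follow-up time in months
    event_col="died",             # 1 = death, 0 = censored
    # black * year_of_dx expands to both main effects plus the interaction;
    # exp(coef) of the interaction term is the per-year change in the
    # Black-vs-White hazard ratio (reported above as 0.995 per year).
    formula="black * year_of_dx + age + female + married + cancer_directed_surgery",
)
cph.print_summary()
```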

    The utility of 6-minute walk distance in predicting waitlist mortality for lung transplant candidates.

    BACKGROUND The lung allocation score (LAS) has led to improved organ allocation for transplant candidates. At present, the 6-minute walk distance (6MWD) is treated as a binary categorical variable of whether or not a candidate can walk more than 150 feet in 6 minutes. In this study, we tested the hypothesis that the 6MWD is presently under-utilized with respect to discriminatory power and that, as a continuous variable, it could better prognosticate risk of waitlist mortality. METHODS A retrospective cohort analysis was performed using the Organ Procurement and Transplantation Network/United Network for Organ Sharing (OPTN/UNOS) transplant database. Candidates listed for isolated lung transplant between May 2005 and December 2011 were included. The population was stratified by 6MWD quartiles and unadjusted survival rates were estimated. Multivariable Cox proportional hazards modeling was used to assess the effect of 6MWD on risk of death. The Scientific Registry of Transplant Recipients (SRTR) Waitlist Risk Model was used to adjust for confounders. The optimal 6MWD for discriminative accuracy in predicting waitlist mortality was assessed by receiver-operating characteristic (ROC) curves. RESULTS Analysis was performed on 12,298 candidates, who were stratified into quartiles by distance walked. Waitlist mortality decreased as 6MWD increased. In the multivariable model, significant variables included 6MWD, male gender, non-white ethnicity and restrictive lung diseases. ROC curve discrimination of 6-month mortality was best at 655 feet. CONCLUSIONS The 6MWD is a significant predictor of waitlist mortality. A cut-off of 150 feet sub-optimally identifies candidates with increased risk of mortality. A cut-off between 550 and 655 feet is more optimal if 6MWD is to be treated as a dichotomous variable. Utilization of 6MWD as a continuous variable could further enhance predictive capabilities.
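    Selecting an optimal cut-off from an ROC curve is commonly done by maximizing a criterion such as Youden's J (sensitivity + specificity − 1). The sketch below is a generic illustration of that step, assuming a hypothetical cohort file with a binary 6-month waitlist death indicator and the walk distance in feet; it is not the authors' analysis code.

```python
# Generic ROC-based cut-off selection for a continuous predictor,
# assuming hypothetical columns `died_6mo` (1 = death within 6 months
# on the waitlist) and `six_mwd_feet` (6-minute walk distance, feet).
import numpy as np
import pandas as pd
from sklearn.metrics import roc_curve, roc_auc_score

df = pd.read_csv("waitlist_cohort.csv")  # hypothetical cohort extract

# Higher 6MWD means lower risk, so use the negated distance as the risk score.
score = -df["six_mwd_feet"].to_numpy()
y = df["died_6mo"].to_numpy()

fpr, tpr, thresholds = roc_curve(y, score)
auc = roc_auc_score(y, score)

# Youden's J: the threshold maximizing sensitivity + specificity - 1.
j = tpr - fpr
best = np.argmax(j)
best_cutoff_feet = -thresholds[best]  # convert the score back to a distance

print(f"AUC = {auc:.3f}, optimal cut-off = {best_cutoff_feet:.0f} feet")
```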

    Does time of surgery influence the rate of false-negative appendectomies?:A retrospective observational study of 274 patients

    Background Multiple disciplines have described an “after-hours effect” relating to worsened mortality and morbidity outside regular working hours. This retrospective observational study aimed to evaluate whether diagnostic accuracy for a common surgical condition worsened outside regular hours. Methods Electronic operative records for all non-infant patients (age > 4 years) operated on at a single centre for presumed acute appendicitis were retrospectively reviewed over a 56-month period (06/17/2012–02/01/2017). The primary outcome measure of unknown diagnosis was compared between procedures performed in regular hours (08:00–17:00) and those performed off hours (17:01–07:59). Pre-clinical biochemistry and pre-morbid status were recorded to determine case heterogeneity between the two groups, along with secondary outcomes of length of stay and complication rate. Results Out of 289 procedures, 274 cases were deemed eligible for inclusion. Of the 133 performed in regular hours, 79% were appendicitis, compared to 74% of the 141 procedures performed off hours. The percentage of patients with an unknown diagnosis was 6% in regular hours compared to 15% off hours (RR 2.48; 95% CI 1.14–5.39). This was accompanied by increased numbers of registrars (residents in training) leading procedures off hours (37% compared to 24% in regular hours). Pre-morbid status, biochemistry, length of stay and post-operative complication rate showed no significant differences. Conclusions This retrospective study suggests that the rate of unknown diagnoses for acute appendicitis increases overnight, potentially reflecting increased numbers of unnecessary procedures being performed off hours due to poorer diagnostic accuracy. Reduced levels of staffing, availability of diagnostic modalities and changes to workforce training may explain this, but further prospective work is required. Potential solutions may include protocolizing the management of common acute surgical conditions and making more use of non-resident on-call senior colleagues.
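    The reported relative risk of 2.48 (95% CI 1.14–5.39) can be approximately reproduced from the group sizes and percentages given above, assuming roughly 8 of 133 in-hours cases and 21 of 141 off-hours cases had an unknown diagnosis (counts back-calculated from the 6% and 15% figures, so they are approximations rather than the authors' raw data). A short worked example using the standard log-scale confidence interval for a risk ratio:

```python
# Relative risk of an unknown diagnosis off hours vs. regular hours,
# with a 95% CI computed on the log scale. Counts are approximations
# derived from the percentages reported in the abstract, not raw data.
import math

a, n1 = 21, 141   # unknown diagnoses / procedures off hours (approx.)
b, n2 = 8, 133    # unknown diagnoses / procedures in regular hours (approx.)

rr = (a / n1) / (b / n2)
se_log_rr = math.sqrt(1/a - 1/n1 + 1/b - 1/n2)
lo = math.exp(math.log(rr) - 1.96 * se_log_rr)
hi = math.exp(math.log(rr) + 1.96 * se_log_rr)

print(f"RR = {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")  # about 2.48 (1.14-5.40)
```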

    System Dynamics to Model the Unintended Consequences of Denying Payment for Venous Thromboembolism after Total Knee Arthroplasty

    Background: The Hospital Acquired Condition Strategy (HACS) denies payment for venous thromboembolism (VTE) after total knee arthroplasty (TKA). The intention is to reduce complications and associated costs, while improving the quality of care by mandating VTE prophylaxis. We applied a system dynamics model to estimate the impact of HACS on VTE rates, and potential unintended consequences such as increased rates of bleeding and infection and decreased access for patients who might benefit from TKA. Methods and Findings: The system dynamics model uses a series of patient stocks, including the number needing TKA, deemed ineligible, receiving TKA, and harmed due to surgical complication. The flow of patients between stocks is determined by a series of causal elements such as rates of exclusion, surgery and complications. The number of patients harmed due to VTE, bleeding, or exclusion was modeled by year by comparing patient stocks under scenarios with and without HACS. The percentage of TKA patients experiencing VTE decreased approximately 3-fold with HACS. This decrease in VTE was offset by an increased rate of bleeding and infection. Moreover, results from the model suggest HACS could exclude 1.5%, or half a million, of the patients who might benefit from knee replacement through 2020. Conclusion: System dynamics modeling indicates HACS will have the intended consequence of reducing VTE rates. However, an unintended consequence of the policy might be increased potential harm resulting from over-administration of prophylaxis, as well as exclusion of a large population of patients who might benefit from TKA.
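    A stock-and-flow model of the kind described above can be sketched as a simple discrete-time simulation in which stocks (patients deemed ineligible, receiving TKA, harmed by VTE or bleeding) are updated each year by flow rates. The toy sketch below illustrates only the structure; every rate and initial value is a made-up placeholder, not a parameter of the published model.

```python
# Toy stock-and-flow simulation illustrating the structure of a system
# dynamics model for TKA under a payment-denial policy. All rates and
# initial values are illustrative placeholders, not the published model.
def simulate(years=10, exclusion_rate=0.015, vte_rate=0.005,
             bleed_rate=0.003, new_candidates_per_year=600_000):
    stocks = {"excluded": 0.0, "received_tka": 0.0,
              "harmed_vte": 0.0, "harmed_bleeding": 0.0}
    for _ in range(years):
        # Flow 1: some candidates are deemed ineligible under the policy.
        excluded = new_candidates_per_year * exclusion_rate
        operated = new_candidates_per_year - excluded
        # Flow 2: complications accumulate among operated patients.
        stocks["excluded"] += excluded
        stocks["received_tka"] += operated
        stocks["harmed_vte"] += operated * vte_rate
        stocks["harmed_bleeding"] += operated * bleed_rate
    return stocks

if __name__ == "__main__":
    for name, value in simulate().items():
        print(f"{name}: {value:,.0f}")
```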

    Challenges of modeling current very large lahars at Nevado del Huila Volcano, Colombia

    Nevado del Huila, a glacier-covered volcano in the south of Colombia’s Cordillera Central, had not experienced any historical eruptions before 2007. In 2007 and 2008, the volcano erupted with phreatic and phreatomagmatic events which produced lahars with flow volumes of up to about 300 million m3, causing severe damage to infrastructure and loss of lives. The magnitude of these lahars and the prevailing potential for similar or even larger events pose significant hazards to local people and make appropriate modeling a real challenge. In this study, we analyze the recent lahars to better understand the main processes and then model possible scenarios for future events. We used lahar inundation depths, travel duration, and flow deposits to constrain the dimensions of the 2007 event and applied LAHARZ and FLO-2D for lahar modeling. Measured hydrographs, geophone seismic sensor data and calculated peak discharges served as input data for the reconstruction of flow hydrographs and for calibration of the models. For model validation, results were compared with field data collected along the Páez and Simbola Rivers. Based on the results of the 2007 lahar simulation, we modeled lahar scenarios with volumes between 300 million and 1 billion m3. The approach presented here represents a feasible solution for modeling high-magnitude flows like lahars and allows an assessment of potential future events and related consequences for population centers downstream of Nevado del Huila.
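    For context, LAHARZ delineates inundation zones from semi-empirical scaling relations between lahar volume V and the maximum inundated cross-sectional area A and planimetric area B (A = 0.05 V^(2/3), B = 200 V^(2/3), following Iverson et al., 1998). The sketch below simply evaluates those relations for scenario volumes in the range mentioned above; it is a back-of-the-envelope illustration, not the calibrated LAHARZ/FLO-2D workflow used in the study.

```python
# Evaluate LAHARZ-style scaling relations A = 0.05 * V^(2/3) and
# B = 200 * V^(2/3) (Iverson et al., 1998) for scenario volumes in the
# range discussed above. A rough illustration only, not the calibrated
# modeling workflow applied to Nevado del Huila.
def laharz_areas(volume_m3: float) -> tuple[float, float]:
    cross_section_m2 = 0.05 * volume_m3 ** (2 / 3)   # max inundated cross-section
    planimetric_m2 = 200.0 * volume_m3 ** (2 / 3)    # total inundated planform area
    return cross_section_m2, planimetric_m2

for volume in (3e8, 6e8, 1e9):  # 300 million to 1 billion m3 scenarios
    a, b = laharz_areas(volume)
    print(f"V = {volume:.1e} m3: A = {a:,.0f} m2, B = {b / 1e6:,.1f} km2")
```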

    Trends in Treatment Patterns and Outcomes for Ductal Carcinoma In Situ

    BACKGROUND The impact of contemporary treatment of pre-invasive breast cancer (ductal carcinoma in situ [DCIS]) on long-term outcomes remains poorly defined. We aimed to evaluate national treatment trends for DCIS and to determine their impact on disease-specific (DSS) and overall survival (OS). METHODS The Surveillance, Epidemiology, and End Results (SEER) registry was queried for patients diagnosed with DCIS from 1991 to 2010. Treatment pattern trends were analyzed using the Cochran-Armitage trend test. Survival analyses were performed using inverse probability weights (IPW)-adjusted competing risk analyses for DSS and Cox proportional hazards regression for OS. All tests performed were two-sided. RESULTS A total of 121,080 DCIS patients were identified. The greatest proportion of patients was treated with lumpectomy and radiation therapy (43.0%), followed by lumpectomy alone (26.5%) and unilateral (23.8%) or bilateral mastectomy (4.5%), with significant shifts over time. The rate of sentinel lymph node biopsy increased from 9.7% to 67.1% for mastectomy and from 1.4% to 17.8% for lumpectomy. Compared with mastectomy, OS was higher for lumpectomy with radiation (hazard ratio [HR] = 0.79, 95% confidence interval [CI] = 0.76 to 0.83, P < .001) and lower for lumpectomy alone (HR = 1.17, 95% CI = 1.13 to 1.23, P < .001). IPW-adjusted 10-year DSS was highest for lumpectomy with radiation therapy (98.9%), followed by mastectomy (98.5%) and lumpectomy alone (98.4%). CONCLUSIONS We identified substantial shifts in treatment patterns for DCIS from 1991 to 2010. When outcomes between locoregional treatment options were compared, we observed greater differences in OS than DSS, likely reflecting both a prevailing patient selection bias as well as clinically negligible differences in breast cancer outcomes between groups.
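    The IPW adjustment described above generally means estimating each patient's probability of receiving the treatment they actually received (a propensity model across the local treatment options) and weighting the survival model by the inverse of that probability. A minimal sketch of that general technique is shown below, assuming hypothetical file, column and covariate names; it is an illustration, not the authors' analysis code.

```python
# Minimal sketch of inverse-probability-of-treatment weighting (IPW)
# followed by a weighted Cox model for overall survival, analogous to
# the analysis described above. Column names are hypothetical placeholders.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from lifelines import CoxPHFitter

df = pd.read_csv("seer_dcis.csv")  # hypothetical extract: one row per patient

covariates = ["age", "year_of_dx", "grade", "tumor_size"]

# Multinomial propensity model for the locoregional treatment options.
propensity = LogisticRegression(max_iter=1000).fit(df[covariates], df["treatment"])
probs = propensity.predict_proba(df[covariates])
class_index = {c: i for i, c in enumerate(propensity.classes_)}
p_received = probs[np.arange(len(df)), df["treatment"].map(class_index).to_numpy()]

# Weight each patient by 1 / P(receiving the treatment actually received).
df["iptw"] = 1.0 / p_received

# Weighted Cox model for overall survival; robust SEs are advisable with weights.
cph = CoxPHFitter()
cph.fit(df, duration_col="time_months", event_col="died",
        weights_col="iptw", robust=True,
        formula="treatment + age + year_of_dx + grade + tumor_size")
cph.print_summary()
```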