
    TCT-142 Clinical Characteristics and Outcomes of Patients Undergoing Transcatheter Aortic Valve Replacement With STS PROM of ≤3%

    Background Transcatheter aortic valve replacement (TAVR) is the preferred treatment for most patients with aortic stenosis deemed at intermediate or higher risk for surgical aortic valve replacement (SAVR). Determination of the risk of SAVR is mainly based on the Society of Thoracic Surgeons (STS) risk calculator. However, federal regulations permit the heart team latitude to offer TAVR for patients with ≤3% predicted risk of mortality (PROM) whose perceived risk is not adequately accounted for by the STS risk model. Limited data are available on the clinical characteristics and outcomes of these patients. Methods The study group involved 2,539 patients who underwent TAVR from 2013 to 2017 within 7 hospitals in 5 Western states in the Providence St. Joseph Health system. The local TAVR site staff completed surveys identifying the clinical factors driving the heart team's decision to proceed with TAVR. Clinical data were also collected per the TVT registry requirements. Results We identified 332 TAVR patients with STS PROM ≤3% and 2,207 patients with STS PROM >3%. The percentage of TAVR patients with an STS PROM ≤3% increased over time, from 5.1% in 2013 to 16.6% in 2017. The most common factors (≥1 possible) influencing the heart team's decision to proceed with TAVR in the ≤3% STS PROM group were frailty (63%), hostile chest (23%), severe lung disease (14%), morbid obesity (10%), and liver disease (8%). The baseline characteristics and outcomes of both groups are listed in the Table. Conclusion The proportion of TAVR patients with STS PROM ≤3% tripled from 2013 to 2017. In comparison to those with STS PROM >3%, they were younger and more often men. The most common reasons driving the decision to favor TAVR over SAVR were frailty, hostile chest, and severe lung disease. TAVR patients with STS PROM ≤3% had shorter hospital stays and were more likely to be alive at 1 year.

    Trends in Diagnosis Related Groups for inpatient admissions and associated changes in payment from 2012 to 2016

    Importance: Hospitals are reimbursed based on Diagnosis Related Groups (DRGs), which are defined, in part, by patients having 1 or more complications or comorbidities within a given DRG family. Hospitals have made substantial investment in efforts to document these complications and comorbidities. Objective: To examine temporal trends in DRGs with a major complication or comorbidity, compare these findings with 2 alternative measures of disease severity, and estimate associated changes in payment. Design, Setting, and Participants: This retrospective cohort study used data from the all-payer National Inpatient Sample for admissions assigned to 1 of the top 20 reimbursed DRG families at US acute care hospitals from January 1, 2012, to December 31, 2016. Data were analyzed from July 10, 2018, to May 29, 2019. Exposures: Quarter year of hospitalization. Main Outcomes and Measures: The primary outcome was the proportion of DRGs with a major complication or comorbidity. Secondary outcomes were comorbidity scores, risk-adjusted mortality rates, and estimated payment. Changes in assigned DRGs, comorbidity scores, and risk-adjusted mortality rates were analyzed by linear regression. Payment changes were estimated for each DRG by calculating the Centers for Medicare & Medicaid Services weighted payment using 2012 and 2016 case mix and hospitalization counts. Results: Between 2012 and 2016, there were 62 167 976 hospitalizations for the 20 highest-reimbursed DRG families; the sample was 32.9% male and 66.8% White, with a median age of 57 years (interquartile range, 31-73 years). Within 15 of these DRG families (75%), the proportion of DRGs with a major complication or comorbidity increased significantly over time. Over the same period, comorbidity scores were largely stable, with a decrease in 6 DRG families (30%), no change in 10 (50%), and an increase in 4 (20%). 
Among the 19 DRG families with a calculable mortality rate, the risk-adjusted mortality rate significantly decreased in 8 (42%), did not change in 9 (47%), and increased in 2 (11%). The observed DRG shifts were associated with at least $1.2 billion in increased payment. Conclusions and Relevance: In this cohort study, between 2012 and 2016, the proportion of admissions assigned to a DRG with a major complication or comorbidity increased for 15 of the top 20 reimbursed DRG families. This change was not accompanied by commensurate increases in disease severity but was associated with increased payment.
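The payment-estimation step described in the abstract above (CMS-weighted payment under the 2012 vs. 2016 case mix, holding hospitalization counts fixed) can be sketched as follows. All weights, counts, shares, and the base rate below are hypothetical illustrations, not figures from the study:

```python
# Hedged sketch: how a shift toward "with MCC" DRG assignment raises weighted
# payment even when admission counts stay fixed. All numbers are hypothetical.

BASE_RATE = 5_000.0  # hypothetical standardized base payment per weight unit

# Hypothetical DRG family: relative weights for the DRG with and without a
# major complication or comorbidity (MCC), and a fixed admission count.
WEIGHT_WITH_MCC = 1.8
WEIGHT_WITHOUT_MCC = 1.1
N_ADMISSIONS = 100_000  # held fixed across both years

def weighted_payment(mcc_share: float) -> float:
    """Case-mix-weighted payment for one DRG family at a given MCC share."""
    avg_weight = mcc_share * WEIGHT_WITH_MCC + (1 - mcc_share) * WEIGHT_WITHOUT_MCC
    return avg_weight * BASE_RATE * N_ADMISSIONS

# Payment change if the MCC share rises from 40% to 50% with counts fixed:
delta = weighted_payment(0.50) - weighted_payment(0.40)
print(f"estimated payment change: ${delta:,.0f}")
```

Applying the same comparison across all 20 DRG families, with the real CMS weights and counts, is what yields the study's aggregate payment-change estimate.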

    The Narcotic Bowel Syndrome: Clinical Features, Pathophysiology, and Management

    Narcotic bowel syndrome (NBS) is a subset of opioid bowel dysfunction characterized by chronic or frequently recurring abdominal pain that worsens with continued or escalating dosages of narcotics. The syndrome is under-recognized and may be becoming more prevalent. In the United States this may be due to the increasing use of narcotics for chronic non-malignant painful disorders and the development of maladaptive therapeutic interactions around their use. NBS can occur in patients with no prior gastrointestinal disorder who receive high dosages of narcotics after surgery or acute painful problems, or among patients with functional GI disorders or other chronic gastrointestinal diseases who are managed by physicians unaware of the hyperalgesic effects of chronic opioids. The evidence for enhanced pain perception is based on: a) activation of excitatory anti-analgesic pathways within a bimodal opioid regulation system, b) descending facilitation of pain at the rostral ventral medulla and pain facilitation via dynorphin and CCK activation, and c) glial cell activation that produces morphine tolerance and enhances opioid-induced pain. Treatment involves early recognition of the syndrome, an effective physician-patient relationship, graded withdrawal of the narcotic according to a specified withdrawal program, and the institution of medications to reduce withdrawal effects.

    Early cost-utility analysis of tissue-engineered heart valves compared to bioprostheses in the aortic position in elderly patients

    __Objectives:__ Aortic valve disease is the most frequent indication for heart valve replacement, with the highest prevalence in the elderly. Tissue-engineered heart valves (TEHV) are foreseen to have important advantages over currently used bioprosthetic heart valve substitutes, most importantly reduced valve degeneration with a subsequent reduction in re-intervention. We performed an early Health Technology Assessment of hypothetical TEHV in elderly patients (≥ 70 years) requiring surgical (SAVR) or transcatheter aortic valve implantation (TAVI) to assess the potential of TEHV and to inform future development decisions. __Methods:__ Using a patient-level simulation model, the potential cost-effectiveness of TEHV compared with bioprostheses was predicted from a societal perspective. Anticipated but currently hypothetical improvements in the performance of TEHV, divided into durability, thrombogenicity, and infection resistance, were explored in scenario analyses to estimate quality-adjusted life-year (QALY) gain, cost reduction, headroom, and budget impact. __Results:__ Durability of TEHV had the highest impact on QALY gain and costs, followed by infection resistance. Improved TEHV performance (− 50% prosthetic valve-related events) resulted in lifetime QALY gains of 0.131 and 0.043, and lifetime cost reductions of €639 and €368, translating to headrooms of €3255 and €2498 per hypothetical TEHV compared to SAVR and TAVI, respectively. National savings in the first decade after implementation varied between €2.8 and €11.2 million (SAVR) and €3.2–€12.8 million (TAVI) for TEHV substitution rates of 25–100%. __Conclusions:__ Despite the relatively short life expectancy of elderly patients undergoing SAVR/TAVI, hypothetical TEHV are predicted to be cost-effective compared to bioprostheses, commercially viable, and a source of national cost savings if biomedical engineers succeed in realising improved durability and/or infection resistance of TEHV.
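The headroom logic used in early HTA, as in the abstract above, can be sketched as follows: the maximum justifiable price premium for a new device is the monetized value of its health gain plus the costs it saves. The willingness-to-pay threshold below is an assumption for illustration, not a value reported in the study, so the output need not match the study's reported headroom figures:

```python
# Hedged sketch of a headroom calculation for early health technology
# assessment. The WTP threshold is assumed, not taken from the study.

WTP_PER_QALY = 20_000.0  # hypothetical societal willingness-to-pay (EUR/QALY)

def headroom(qaly_gain: float, cost_reduction: float) -> float:
    """Maximum price premium per device at which it stays cost-effective."""
    return qaly_gain * WTP_PER_QALY + cost_reduction

# Illustrative: QALY gain and cost reduction vs. SAVR are from the abstract,
# but the assumed WTP makes the result illustrative only.
print(f"headroom vs SAVR: EUR {headroom(0.131, 639.0):,.0f}")
```

Varying `WTP_PER_QALY` over a plausible range is the usual sensitivity analysis, since the headroom scales linearly with the threshold.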

    Multiplex immunofluorescence to measure dynamic changes in tumor-infiltrating lymphocytes and PD-L1 in early-stage breast cancer.

    BACKGROUND: The H&E stromal tumor-infiltrating lymphocyte (sTIL) score and programmed death ligand 1 (PD-L1) SP142 immunohistochemistry assay are prognostic and predictive in early-stage breast cancer, but are operator-dependent and may have insufficient precision to characterize dynamic changes in sTILs/PD-L1 in the context of clinical research. We illustrate how multiplex immunofluorescence (mIF) combined with statistical modeling can be used to precisely estimate dynamic changes in sTIL score, PD-L1 expression, and other immune variables from a single paraffin-embedded slide, thus enabling comprehensive characterization of the activity of novel immunotherapy agents. METHODS: Serial tissue was obtained from a recent clinical trial evaluating loco-regional cytokine delivery as a strategy to promote immune cell infiltration and activation in breast tumors. Pre-treatment biopsies and post-treatment tumor resections were analyzed by mIF (PerkinElmer Vectra) using an antibody panel that characterized tumor cells (cytokeratin-positive), immune cells (CD3, CD8, CD163, FoxP3), and PD-L1 expression. mIF estimates of sTIL score and PD-L1 expression were compared to the H&E/SP142 clinical assays. Hierarchical linear modeling was utilized to compare pre- and post-treatment immune cell expression, accounting for correlation of time-dependent measurements, variation across high-powered magnification views within each subject, and variation between subjects. Simulation methods (Monte Carlo, bootstrapping) were used to evaluate the impact of model and tissue sample size on statistical power. RESULTS: mIF estimates of sTIL and PD-L1 expression were strongly correlated with their respective clinical assays (p < .001). Hierarchical linear modeling resulted in more precise estimates of treatment-related increases in sTIL, PD-L1, and other metrics such as CD8+ tumor nest infiltration.
Statistical precision was dependent on adequate tissue sampling, with at least 15 high-powered fields recommended per specimen. Compared to conventional t-testing of means, hierarchical linear modeling was associated with substantial reductions in the enrollment size required (n = 25 → n = 13) to detect the observed increases in sTIL/PD-L1. CONCLUSION: mIF is useful for quantifying treatment-related dynamic changes in sTILs/PD-L1 and is concordant with clinical assays, but with greater precision. Hierarchical linear modeling can mitigate the effects of intratumoral heterogeneity on immune cell count estimations, allowing for more efficient detection of treatment-related pharmacodynamic effects in the context of clinical trials. TRIAL REGISTRATION: NCT02950259.
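The sampling point in the abstract above (precision depends on the number of high-powered fields per specimen) can be illustrated with a small Monte Carlo sketch: a subject's observed sTIL score is the mean of k noisy field-level measurements, so the field-to-field variance shrinks by 1/k while the subject-to-subject variance does not. All variance parameters below are illustrative assumptions, not estimates from the study:

```python
import random
import statistics

# Hedged sketch: why precision depends on the number of high-powered fields
# (HPFs) sampled per specimen. All variance parameters are illustrative.

random.seed(0)
TRUE_MEAN = 20.0   # hypothetical population mean sTIL score (%)
BETWEEN_SD = 5.0   # subject-to-subject spread of true sTIL scores
WITHIN_SD = 12.0   # field-to-field (intratumoral) heterogeneity

def observed_score(true_score: float, n_fields: int) -> float:
    """Specimen-level score: mean of n_fields noisy field measurements."""
    fields = [random.gauss(true_score, WITHIN_SD) for _ in range(n_fields)]
    return statistics.mean(fields)

def estimate_sd(n_fields: int, n_sims: int = 5000) -> float:
    """Monte Carlo SD of the observed score across simulated subjects."""
    scores = [observed_score(random.gauss(TRUE_MEAN, BETWEEN_SD), n_fields)
              for _ in range(n_sims)]
    return statistics.stdev(scores)

sd_5, sd_15 = estimate_sd(5), estimate_sd(15)
print(f"SD of observed score: {sd_5:.2f} with 5 HPFs, {sd_15:.2f} with 15 HPFs")
```

The shrinking SD is what a hierarchical model exploits: separating within-subject from between-subject variance yields more precise treatment-effect estimates than t-testing the raw specimen means, which is consistent with the smaller enrollment the study reports.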

    A new scoring system to determine thromboembolic risk after heart valve replacement

    Objective— To determine the most important inflammatory and hematologic predictors of thromboembolism (TE) in patients undergoing valve replacement (VR), to be used in conjunction with clinical risk factors for preoperative risk profiling. Methods and Results— Preoperative and immediately postoperative clinical, echocardiographic, hematologic, biochemical, and microbiological parameters were examined prospectively in 370 patients undergoing VR (249 AVR, 93 MVR, 28 DVR). Mean follow-up was 4.4 years (max 6.6 years; total 1566 patient-years), and 96 TE events were documented (28 major and 68 minor). INR data were collected on all patients. Laboratory values were considered elevated if they exceeded the 80th percentile of those of 70 controls with the same distribution of age and gender. IgA antibody to Chlamydia pneumoniae (CP) ≥1:64 was considered indicative of significant infection. Predictors of TE on multivariate analysis following AVR were (hazard ratios): CP infection (2.6), previous TE (2.5), raised eosinophils (2.4), cancer history (2.1), postoperative infection (2.0), hypertension (2.0), CABG × 3/4 (2.0), and diabetes (1.9). Predictors of TE following MVR/DVR were raised mean platelet volume (4.0), raised factor VII (3.1), CP infection (2.7), previous mitral valvotomy (2.5), raised fibrinogen (2.2), and raised reticulocytes (2.0). These risk factors were additive when present in the same patient, enabling a scoring system to be developed that accurately predicted risk of TE based on the number of risk factors. Conclusions— Selected blood tests and clinical risk factors provide a scoring system that accurately predicts TE risk and may guide prosthesis choice and antithrombotic management.
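The additive scoring idea in the abstract above reduces to counting how many of the multivariate risk factors a patient has. The factor list for AVR is taken from the abstract; the scoring function itself is a minimal sketch, not the study's published calibration, which mapped factor counts to observed TE rates:

```python
# Hedged sketch of an additive risk score: count the established risk factors
# present in a patient. Factor names follow the abstract; the score-to-risk
# mapping reported by the study is not reproduced here.

AVR_RISK_FACTORS = [
    "CP infection", "previous TE", "raised eosinophils", "cancer history",
    "postoperative infection", "hypertension", "CABG x3/4", "diabetes",
]

def te_score(patient_factors: set[str]) -> int:
    """Number of established AVR risk factors present in this patient."""
    return sum(factor in patient_factors for factor in AVR_RISK_FACTORS)

# A hypothetical patient with three of the listed factors:
patient = {"hypertension", "diabetes", "previous TE"}
print(te_score(patient))  # → 3
```

Because the hazard ratios were found to be additive, a higher count corresponds to a higher predicted TE risk, which is what makes such a simple tally usable for preoperative profiling.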

    Co-expression of CD39 and CD103 identifies tumor-reactive CD8 T cells in human solid tumors.

    Identifying tumor antigen-specific T cells from cancer patients has important implications for immunotherapy diagnostics and therapeutics. Here, we show that CD103+CD39+ tumor-infiltrating CD8 T cells (CD8 TIL) are enriched for tumor-reactive cells in both primary and metastatic tumors. This CD8 TIL subset is found across six different malignancies and displays an exhausted tissue-resident memory phenotype. CD103+CD39+ CD8 TILs have a distinct T-cell receptor (TCR) repertoire, with T-cell clones expanded in the tumor but present at low frequencies in the periphery. CD103+CD39+ CD8 TILs also efficiently kill autologous tumor cells in an MHC class I-dependent manner. Finally, higher frequencies of CD103+CD39+ CD8 TILs in patients with head and neck cancer are associated with better overall survival. Our data thus describe an approach for detecting tumor-reactive CD8 TILs that will help define mechanisms of existing immunotherapy treatments, and may lead to future adoptive T-cell cancer therapies.