
    Anticoagulant Use, Safety and Effectiveness for Ischemic Stroke Prevention in Nursing Home Residents with Atrial Fibrillation

    Background: Historically, fewer than one-third of nursing home residents with atrial fibrillation were treated with warfarin, the only oral anticoagulant then available. Management of atrial fibrillation has been transformed in recent years by the approval of four direct-acting oral anticoagulants (DOACs) since 2010. Methods: Using the national Minimum Data Set 3.0 linked to Medicare Part A and D claims, we first described contemporary (2011-2016) warfarin and DOAC utilization in the nursing home population (Aim 1). In Aim 2, we linked residents to nursing home- and county-level data to study associations between resident, facility, county, and state characteristics and anticoagulant treatment. Using a new-user, active comparator design, we then compared the incidence of safety (i.e., bleeding), effectiveness (i.e., ischemic stroke), and mortality outcomes between residents initiating DOACs versus warfarin (Aim 3). Results: The proportion of residents with atrial fibrillation receiving treatment increased from 42.3% in 2011 to 47.8% as of December 31, 2016, at which time 48.2% of treated residents received DOACs. Demographic and clinical characteristics of residents using DOACs and warfarin were similar in 2016. Half of the 8,734 DOAC users received standard dosages, and most were treated with apixaban (54.4%) or rivaroxaban (35.8%) in 2016. Compared with warfarin, bleeding rates were lower and ischemic stroke rates were higher for apixaban users. Ischemic stroke and bleeding rates for dabigatran and rivaroxaban were comparable to those for warfarin. Mortality rates were lower versus warfarin for each DOAC. Conclusions: In nursing homes, DOACs are commonly used and confer equal or greater benefit than warfarin.
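
The comparison in Aim 3 amounts to contrasting outcome incidence between two new-user cohorts over their accrued person-time. As a minimal illustration (all numbers below are hypothetical, not the study's data, and a crude ratio like this ignores the confounding adjustment the study performed):

```python
# Illustrative sketch only: crude incidence rates for two new-user
# cohorts, expressed as events per 100 person-years. All numbers
# below are made up for illustration.

def incidence_rate(events, person_years, per=100.0):
    """Crude incidence rate per `per` person-years of follow-up."""
    return per * events / person_years

def rate_ratio(events_a, py_a, events_b, py_b):
    """Crude incidence rate ratio of cohort A versus cohort B."""
    return (events_a / py_a) / (events_b / py_b)

# e.g. 30 bleeds over 1,500 person-years in one cohort versus
# 60 bleeds over 2,000 person-years in the comparator
print(incidence_rate(30, 1500))        # 2.0 per 100 person-years
print(rate_ratio(30, 1500, 60, 2000))  # ~0.67: a lower crude rate in cohort A
```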

    Post-Acute Care Setting, Facility Characteristics, and Post-Stroke Outcomes: A Systematic Review

    OBJECTIVE: To synthesize research comparing post-stroke health outcomes between patients rehabilitated in skilled nursing facilities (SNFs) and inpatient rehabilitation facilities (IRFs), and to evaluate relationships between facility characteristics and outcomes. DATA SOURCES: PubMed and CINAHL searches spanned January 1, 1998 to October 6, 2016 and encompassed MeSH and free-text keywords for stroke, IRF/SNF, and study outcomes. Human and English-language limits were applied. STUDY SELECTION: Observational and experimental studies examining outcomes of adult stroke patients rehabilitated in an IRF or SNF were eligible. Studies had to provide site-of-care comparisons and/or analyses incorporating facility-level characteristics and had to report > 1 primary outcome (discharge setting, functional status, readmission, quality of life, all-cause mortality). Unpublished, single-center, descriptive, and non-US studies were excluded. Articles were reviewed by one author; when eligibility was uncertain, discussion with study coauthors achieved consensus. Fourteen (0.3%) of the screened titles were included. DATA EXTRACTION: The data types, time period, size, design, and primary outcomes of each study were extracted, as were two secondary outcomes (length of IRF/SNF stay, cost) when reported. Effect measures, modeling approaches, methods for confounding adjustment, and potential confounders were also extracted. Data were abstracted by one author and verified for accuracy by a second reviewer. DATA SYNTHESIS: Two studies evaluating community discharge, one study evaluating predicted readmission probability, and three studies evaluating all-cause mortality favored IRFs over SNFs. Functional status comparisons were inconsistent. No studies evaluated quality of life. Two studies reported higher costs in the IRF versus the SNF setting. Although substantial facility-level variation was described, few studies characterized its sources. CONCLUSIONS: The few studies comparing post-stroke outcomes indicated better outcomes (at greater cost) for patients in IRFs versus SNFs. Contemporary research on the role of the post-acute care setting and its attributes in determining health outcomes should be prioritized to inform reimbursement system reform.

    The patient burden of screening mammography recall.

    OBJECTIVE: To evaluate the burden of direct and indirect costs borne by patients recalled after a false-positive screening mammogram. METHODS: Women aged 40-75 years undergoing screening mammography were identified from a U.S. commercial claims database. Women were required to have 12 months of pre-index and 6 months of post-index enrollment to identify utilization and to exclude patients with subsequent cancer diagnoses. Recall was defined as the use of diagnostic mammography or breast ultrasound during the 6 months post-index. Descriptive statistics were presented for recalled and non-recalled patients; differences were compared using the chi-square test. Out-of-pocket costs were totaled by utilization type and in aggregate for all recall utilization. RESULTS: Of 1,723,139 patients with a screening mammogram who were not diagnosed with breast cancer, 259,028 (15.0%) were recalled. Significant demographic differences were observed between recalled and non-recalled patients. The strongest drivers of patient costs were image-guided biopsy (mean $351 among the 11.8% utilizing), diagnostic mammography ($50; 80.1%), and ultrasound ($58; 65.7%), which accounted for 29.9%, 29.0%, and 27.5% of total recall costs, respectively. For many patients the entire cost of recall utilization was covered by the health plan. Total costs were substantially greater among patients with biopsy; one-third of all patients experienced multiple days of recall utilization. CONCLUSION: After a false-positive screening mammogram, recalled women incurred both direct medical costs and indirect time costs. The cost burden for women with employer-based insurance depended on the type of utilization and the extent of health plan coverage. Additional research and technologies are needed to address the entirety of the recall burden in diverse populations of women.
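
The demographic comparison described above uses a chi-square test of independence on contingency tables. A minimal sketch with a hypothetical 2x2 table (made-up counts, not the study's data), using scipy:

```python
import numpy as np
from scipy.stats import chi2_contingency  # chi-square test of independence

# Hypothetical counts: rows are two age groups,
# columns are recalled / not recalled
table = np.array([[12000, 68000],
                  [ 9000, 81000]])

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2={chi2:.1f}, dof={dof}, p={p:.3g}")
```

With sample sizes this large, even modest differences in recall proportions (here 15% versus 10%) yield very small p-values, which is why nearly all demographic contrasts in such claims studies reach statistical significance.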

    Changes in Anticoagulant Utilization Among United States Nursing Home Residents With Atrial Fibrillation From 2011 to 2016

    Background: Nursing home residents with atrial fibrillation are at high risk for ischemic stroke and bleeding events. The most recent national estimate (2004) indicated that less than one-third of this high-risk population was anticoagulated. Whether direct-acting oral anticoagulant (DOAC) use has disseminated into nursing homes and increased anticoagulant use is unknown. Methods and Results: A repeated cross-sectional design was used to estimate the point prevalence of oral anticoagulant use on July 1 and December 31 of calendar years 2011 to 2016 among Medicare fee-for-service beneficiaries with atrial fibrillation residing in long-stay nursing homes. Nursing home residence was determined using Minimum Data Set 3.0 records. Medicare Part D claims for apixaban, dabigatran, edoxaban, rivaroxaban, and warfarin were identified, and point prevalence was estimated by determining whether the supply from the most recent dispensing covered each point-prevalence date. A Cochran-Armitage test was performed for linear trend in prevalence. On December 31, 2011, 42.3% of 33,959 residents (median age 85; Q1 79, Q3 90) were treated with an oral anticoagulant, of whom 8.6% used DOACs. The proportion receiving treatment increased to 47.8% of 37,787 residents as of December 31, 2016 (P < 0.01); 48.2% of the 18,054 treated residents received DOACs. Demographic and clinical characteristics of residents using DOACs and warfarin were similar in 2016. Half of the 8,734 DOAC users received standard dosages, and most were treated with apixaban (54.4%) or rivaroxaban (35.8%) in 2016. Conclusions: Increases in anticoagulant use among US nursing home residents with atrial fibrillation coincided with declining warfarin use and increasing DOAC use.
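
The point-prevalence logic (a resident counts as treated on a given date if the days' supply from the most recent dispensing spans that date) can be sketched as follows; the dates and supply are hypothetical examples, not claims data:

```python
from datetime import date, timedelta

def covered_on(dispense_date, days_supply, prevalence_date):
    """True if the supply from the most recent dispensing spans the
    point-prevalence date, i.e. the person is counted as treated that day."""
    supply_end = dispense_date + timedelta(days=days_supply)
    return dispense_date <= prevalence_date < supply_end

# A 30-day fill on Dec 15 covers Dec 31 of the same year...
print(covered_on(date(2016, 12, 15), 30, date(2016, 12, 31)))  # True
# ...but a 30-day fill on Nov 15 runs out on Dec 15 and does not.
print(covered_on(date(2016, 11, 15), 30, date(2016, 12, 31)))  # False
```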

    Reduction in unplanned hospitalizations associated with a physician-focused intervention to reduce potentially inappropriate medication use among older adults: a population-based cohort study

    BACKGROUND: A multimodal general practitioner-focused intervention in the Local Health Authority (LHA) of Parma, Italy, substantially reduced the prevalence of potentially inappropriate medication (PIM) use among older adults. Our objective was to estimate changes in hospitalization rates associated with the Parma LHA quality improvement initiative that reduced PIM use. METHODS: This population-based longitudinal cohort study was conducted among older residents (> 65 years) using the Parma LHA administrative healthcare database. Crude and adjusted unplanned hospitalization rates were estimated in three periods (pre-intervention: 2005-2008; intervention: 2009-2010; post-intervention: 2011-2014). Multivariable negative binomial models estimated trends in quarterly hospitalization rates among individuals at risk during each period using a piecewise linear spline for time, adjusted for time-dependent and time-fixed covariates. RESULTS: The pre-intervention, intervention, and post-intervention periods included 117,061, 107,347, and 121,871 older adults and had crude hospitalization rates of 146.2 (95% CI: 142.2-150.3), 146.8 (95% CI: 143.6-150.0), and 140.8 (95% CI: 136.9-144.7) per 1,000 persons per year, respectively. The adjusted pre-intervention hospitalization rate was declining by 0.7% per quarter (IRR = 0.993; 95% CI: 0.991-0.995). The hospitalization rate declined more than twice as fast during the intervention period (1.8% per quarter; IRR = 0.982; 95% CI: 0.979-0.985) and was nearly constant post-intervention (IRR = 0.999; 95% CI: 0.997-1.001). Contrasting model predictions for the intervention period (Q1 2009 to Q4 2010), the intervention was associated with an estimated 1,481 avoided hospitalizations. CONCLUSION: In a large population of older adults, a multimodal general practitioner-focused intervention to decrease PIM use was associated with a decline in the unplanned hospitalization rate. Such interventions to reduce high-risk medication use among older adults warrant consideration by health systems seeking to improve health outcomes and reduce high-cost acute care utilization.
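
A piecewise linear spline for time lets the quarterly trend change slope at the period boundaries while remaining continuous. A minimal numpy sketch, simulating noise-free log-rates with the quarterly rate ratios reported above (the knot placements and data are illustrative assumptions; the actual analysis fit adjusted negative binomial models to individual-level data):

```python
import numpy as np

def piecewise_basis(t, knots):
    """Design matrix for a piecewise linear spline in time:
    intercept, overall slope, and one change-in-slope column per knot."""
    cols = [np.ones_like(t), t]
    for k in knots:
        cols.append(np.maximum(0.0, t - k))  # slope change after each knot
    return np.column_stack(cols)

# Quarters 2005-2014; assumed knots at the start of the intervention
# (Q1 2009 -> quarter 16) and of the post period (Q1 2011 -> quarter 24)
t = np.arange(40, dtype=float)
knots = [16.0, 24.0]

# Simulated log-rates matching the reported trends:
# -0.7%/quarter pre, -1.8% during, ~0% post (illustration only)
s = np.log([0.993, 0.982, 0.999])
log_rate = (s[0] * t
            + (s[1] - s[0]) * np.maximum(0.0, t - 16)
            + (s[2] - s[1]) * np.maximum(0.0, t - 24))

X = piecewise_basis(t, knots)
beta, *_ = np.linalg.lstsq(X, log_rate, rcond=None)

# Recovered quarterly rate ratios for each period
print(np.exp(beta[1]))                      # ~0.993 (pre-intervention)
print(np.exp(beta[1] + beta[2]))            # ~0.982 (intervention)
print(np.exp(beta[1] + beta[2] + beta[3]))  # ~0.999 (post-intervention)
```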

    Dermatologist and Patient Preferences in Choosing Treatments for Moderate to Severe Psoriasis

    INTRODUCTION: The objective of this study was to determine the relative importance (RI) of the treatment attributes that psoriasis patients and physicians consider when choosing between biologic therapies, by psoriasis severity. METHODS: A discrete choice experiment (DCE) weighting preferences across eight sets of hypothetical treatments for moderate or severe psoriasis was conducted. The hypothetical treatments were defined by, and varied on, combinations of efficacy, safety, and dosing attributes [frequency/setting/route of administration (ROA)]. RESULTS: When assuming moderate psoriasis in the patient DCE, ROA (RI 29%) and efficacy (RI 27%) drove treatment choices. When assuming severe disease, patients preferred treatments with higher efficacy (RI 36%), and ROA was relatively less important (RI 15%). From the physician perspective, ROA (RI 32%) and efficacy (RI 26%) were most important for moderate psoriasis patients. In the physician model for severe psoriasis, efficacy (RI 42%) was the predominant driver, followed by ROA (RI 22%). Regardless of severity, the probability of loss of response within 1 year was the least important attribute. CONCLUSIONS: The severity of disease is a critical element in psoriasis treatment selection. There is a high level of alignment between physician- and patient-derived preferences in biologic treatment selection for psoriasis. FUNDING: Janssen Pharmaceuticals.
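
In a DCE, an attribute's relative importance is commonly computed as the range of its estimated part-worth utilities divided by the sum of ranges across all attributes. A sketch with hypothetical part-worths (invented for illustration, not the study's estimates):

```python
# Hypothetical part-worth utilities by attribute level; the RI of each
# attribute is its utility range as a share of the total range.

partworths = {
    "efficacy":         [0.0, 0.8, 1.4],  # utilities across attribute levels
    "ROA":              [0.0, 0.5, 1.1],
    "safety":           [0.0, 0.6],
    "loss of response": [0.0, 0.2],
}

ranges = {attr: max(u) - min(u) for attr, u in partworths.items()}
total = sum(ranges.values())
ri = {attr: r / total for attr, r in ranges.items()}

for attr, v in ri.items():
    print(f"{attr}: {100 * v:.0f}%")  # e.g. efficacy has the largest RI here
```

By construction the RI values sum to 100%, so a dominant attribute (like efficacy in the severe-psoriasis models above) mechanically shrinks the shares of the others.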

    Evidence of potential bias in a comparison of beta blockers and calcium channel blockers in patients with chronic obstructive pulmonary disease and acute coronary syndrome: results of a multinational study

    OBJECTIVES: A number of observational studies have reported that, in patients with chronic obstructive pulmonary disease (COPD), beta blockers (BBs) decrease the risk of mortality and COPD exacerbations. To address important methodological concerns with these studies, we compared the effectiveness and safety of cardioselective BBs versus non-dihydropyridine calcium channel blockers (non-DHP CCBs) in patients with COPD and acute coronary syndromes (ACS) using a propensity score (PS)-matched, active comparator, new-user design. We also assessed for potential unmeasured confounding by examining a short-term COPD hospitalisation outcome. SETTING AND PARTICIPANTS: We identified 22,985 patients with COPD and ACS starting cardioselective BBs or non-DHP CCBs across five claims databases from the USA, Italy and Taiwan. PRIMARY AND SECONDARY OUTCOME MEASURES: Stratified Cox regression models were used to estimate HRs for mortality, cardiovascular (CV) hospitalisations and COPD hospitalisations in each database after variable-ratio PS matching. Results were combined with random-effects meta-analyses. RESULTS: Cardioselective BBs were not associated with reduced risk of mortality (HR 0.90; 95% CI 0.78 to 1.02) or CV hospitalisations (HR 1.06; 95% CI 0.91 to 1.23), although statistical heterogeneity was observed across databases. In contrast, a consistent inverse association for COPD hospitalisations was identified across databases (HR 0.54; 95% CI 0.47 to 0.61), which persisted even within the first 30 days of follow-up (HR 0.55; 95% CI 0.37 to 0.82). Results were similar across a variety of sensitivity analyses, including PS trimming, high-dimensional PS matching and restriction to high-risk patients. CONCLUSIONS: This multinational study found a large inverse association between cardioselective BBs and short-term COPD hospitalisations. The persistence of this bias despite state-of-the-art pharmacoepidemiologic methods calls into question the ability of claims data to address confounding in studies of BBs in patients with COPD.
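
The study used variable-ratio PS matching within each database. As a simplified stand-in (not the study's algorithm), greedy 1:1 nearest-neighbor matching on the propensity score with a caliper can be sketched as:

```python
import numpy as np

def greedy_ps_match(ps_treated, ps_control, caliper=0.05):
    """Greedy 1:1 nearest-neighbor matching on the propensity score.
    Returns (treated_index, control_index) pairs within the caliper;
    each control is used at most once. Simplified illustration only."""
    available = list(range(len(ps_control)))
    pairs = []
    for i, p in enumerate(ps_treated):
        if not available:
            break
        # nearest remaining control by absolute PS distance
        j = min(available, key=lambda c: abs(ps_control[c] - p))
        if abs(ps_control[j] - p) <= caliper:
            pairs.append((i, j))
            available.remove(j)
    return pairs

# Hypothetical propensity scores for 50 treated and 100 control patients
rng = np.random.default_rng(0)
ps_t = rng.uniform(0.2, 0.8, 50)
ps_c = rng.uniform(0.1, 0.9, 100)
pairs = greedy_ps_match(ps_t, ps_c)
print(len(pairs))  # number of matched pairs formed
```

In practice, matching is done on the logit of the PS with a caliper tied to its standard deviation, and variable-ratio designs allow several controls per treated patient; this sketch keeps only the core nearest-neighbor idea.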

    Angiotensin blockade therapy and survival in pancreatic cancer: a population study

    Background: Pancreatic cancer (PC) is one of the most aggressive and challenging cancer types to treat effectively, ranking as the fourth-leading cause of cancer death in the United States. We investigated whether exposure to angiotensin II receptor blockers (ARBs) or angiotensin I-converting enzyme (ACE) inhibitors after PC diagnosis is associated with survival. Methods: PC patients were identified by ICD-9 diagnosis and procedure codes in the administrative healthcare database of the Emilia-Romagna Region (3.7 million adult residents), which contains patient data on demographics, hospital discharges, all-cause mortality, and outpatient pharmacy prescriptions. Cox modeling estimated covariate-adjusted mortality hazard ratios for time-dependent ARB and ACE inhibitor exposures after PC diagnosis. Results: 8,158 incident PC patients were identified between 2003 and 2011, among whom 20% had pancreas resection surgery, 36% were diagnosed with metastatic disease, and 7,027 (86%) died by December 2012. Compared with otherwise similar patients, those exposed to ARBs after PC diagnosis experienced 20% lower mortality risk (HR = 0.80; 95% CI: 0.72, 0.89). Those exposed to ACE inhibitors during the first three years of survival after PC diagnosis experienced 13% lower mortality risk (HR = 0.87; 95% CI: 0.80, 0.94), an association that attenuated after three years of survival (HR = 1.14; 95% CI: 0.90, 1.45). Conclusions: The results of this large population study suggest that exposure to ARBs and ACE inhibitors after PC diagnosis is significantly associated with improved survival. ARBs and ACE inhibitors could be important considerations for treating PC patients, particularly those with the worst prognosis and most limited treatment options. Considering that these common FDA-approved drugs are inexpensive to payers and present minimal increased risk of adverse events to patients, there is an urgent need for randomized clinical trials, large simple randomized trials, or pragmatic clinical trials to formally and broadly evaluate the effects of ARBs and ACE inhibitors on survival in PC patients.
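
A time-dependent exposure in a Cox model is typically handled by splitting each patient's follow-up into counting-process rows of (start, stop, exposed). A minimal sketch of that split (hypothetical days; the actual analysis also carried the adjustment covariates on each row):

```python
def split_follow_up(follow_up_days, exposure_intervals):
    """Split one patient's follow-up into counting-process rows
    (start, stop, exposed) for a time-dependent Cox exposure.
    exposure_intervals: sorted, non-overlapping (start, stop) day pairs."""
    rows, cursor = [], 0
    for start, stop in exposure_intervals:
        if cursor < start:
            rows.append((cursor, start, 0))       # unexposed person-time
        rows.append((start, min(stop, follow_up_days), 1))  # exposed person-time
        cursor = min(stop, follow_up_days)
    if cursor < follow_up_days:
        rows.append((cursor, follow_up_days, 0))  # unexposed tail
    return rows

# Hypothetical patient followed 300 days, exposed to an ARB on days 60-180
print(split_follow_up(300, [(60, 180)]))
# [(0, 60, 0), (60, 180, 1), (180, 300, 0)]
```

This structure avoids immortal-time bias: person-time before the first dispensing is correctly attributed to the unexposed state rather than to the exposure group.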