
    Thromboembolism and Mortality in the Tasmanian Atrial Fibrillation Study

    © 2018, The Author(s). Background: Although utilization of anticoagulation in patients with atrial fibrillation (AF) has increased in recent years, contemporary data regarding thromboembolism and mortality incidence rates are limited outside of clinical trials. This study aimed to investigate the impact of the direct oral anticoagulants (DOACs) on the clinical outcomes of patients with AF included in the Tasmanian Atrial Fibrillation Study. Methods: The medical records of all patients with a primary or secondary diagnosis of AF who presented to public hospitals in Tasmania, Australia, between 2011 and 2015 were retrospectively reviewed. We investigated overall thromboembolic events (TEs), ischemic stroke/transient ischemic attack (IS/TIA), and mortality incidence rates in patients admitted to the Royal Hobart Hospital, the main teaching hospital in the state. We compared outcomes in 2 time periods: prior to the availability of DOACs (pre-DOAC; 2011 to mid-2013) and following their general availability after government subsidization (post-DOAC; mid-2013 to 2015). Results: Of the 2390 patients with AF admitted during the overall study period, 942 patients newly prescribed an antithrombotic medication (465 and 477 from the pre-DOAC and post-DOAC time periods, respectively) were followed. We observed a significant decrease in the incidence rates of overall TE (3.2 vs 1.7 per 100 patient-years [PY]; P < .001) and IS/TIA (2.1 vs 1.3 per 100 PY; P = .022) in the post-DOAC compared to the pre-DOAC period. All-cause mortality was significantly lower in the post-DOAC period (2.9 vs 2.2 per 100 PY; P = .028). Increasing age, prior stroke, and admission in the pre-DOAC era were all risk factors for TE, IS/TIA, and mortality in this study population. The risk of IS/TIA was more than doubled (hazard ratio: 2.54; 95% confidence interval: 1.17-5.52) in current smokers compared to ex- and nonsmokers. Conclusion: Thromboembolic event and all-cause mortality rates were lower following the widespread availability of DOACs in this population.
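The event rates above are expressed per 100 patient-years, a standard incidence measure: the number of events divided by total follow-up time, scaled by 100. A minimal sketch of this arithmetic, using illustrative counts rather than the study's raw data:

```python
def incidence_per_100_py(events: int, patient_years: float) -> float:
    """Incidence rate expressed per 100 patient-years of follow-up."""
    return 100.0 * events / patient_years

# Hypothetical counts for illustration (not taken from the study):
# 30 thromboembolic events observed over 1500 patient-years of follow-up.
rate = incidence_per_100_py(30, 1500.0)
print(rate)  # 2.0 events per 100 patient-years
```

Because the denominator is person-time rather than person count, cohorts with different follow-up durations (such as the pre- and post-DOAC periods here) can be compared directly.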

    Changes in Oral Anticoagulant Prescribing for Stroke Prevention in Patients With Atrial Fibrillation

    © 2017 Elsevier Inc. Suboptimal guideline adherence and underuse of anticoagulants in patients with atrial fibrillation (AF) have been reported worldwide. This study aimed to compare anticoagulation practice in Australia during the pre- and post-direct oral anticoagulant (DOAC) eras. Between January 2011 and July 2015, patients with nonvalvular AF (NVAF) admitted to the Royal Hobart Hospital, Tasmania, Australia, were retrospectively reviewed. The pre- and post-DOAC era cohorts included admissions from January 2011 to July 2013 and August 2013 to July 2015, respectively. Overall, 2,118 patients met our inclusion criteria. The overall rate of anticoagulation increased from 52.5% in the pre-DOAC era to 60.7% in the post-DOAC era (p < 0.001). Moreover, prescribing of OACs among high-risk patients improved significantly (63.1% vs 55.2%, p = 0.001). OAC overprescribing in low-risk patients did not change significantly between the 2 cohorts (35.0% vs 42.9% in the pre- and post-DOAC eras, respectively, p = 0.59). In multivariate analysis, DOAC era (odds ratio [OR] 1.40, 95% confidence interval [CI] 1.17 to 1.68) and CHA₂DS₂-VASc ≥ 2 (OR 1.95, 95% CI 1.36 to 2.80) were independent predictors of OAC prescribing in both eras and the whole study period. Conversely, aging and previous bleeding were inversely associated with OAC prescribing. In conclusion, there has been a significant increase in OAC prescribing in the post-DOAC era, potentially driven by the widespread availability of DOACs. However, OAC underuse in high-risk patients and overuse in low-risk patients were apparent throughout our study. These findings highlight the need to identify the drivers of anticoagulant underuse and overuse and address them accordingly.
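The CHA₂DS₂-VASc score used above as a predictor is a standard additive stroke-risk index for nonvalvular AF: 1 point each for congestive heart failure, hypertension, diabetes, vascular disease, age 65-74, and female sex, and 2 points each for age ≥ 75 and prior stroke/TIA. A minimal sketch of those published scoring rules; the function name and interface are illustrative, not part of the study:

```python
def cha2ds2_vasc(age: int, female: bool, chf: bool = False, htn: bool = False,
                 diabetes: bool = False, stroke_tia: bool = False,
                 vascular: bool = False) -> int:
    """Additive CHA2DS2-VASc stroke-risk score for nonvalvular AF."""
    score = 0
    score += 2 if age >= 75 else (1 if 65 <= age < 75 else 0)  # A2 / A
    score += 2 if stroke_tia else 0   # S2: prior stroke/TIA/thromboembolism
    score += 1 if chf else 0          # C: congestive heart failure
    score += 1 if htn else 0          # H: hypertension
    score += 1 if diabetes else 0     # D: diabetes mellitus
    score += 1 if vascular else 0     # V: vascular disease
    score += 1 if female else 0       # Sc: sex category (female)
    return score

# A 76-year-old woman with hypertension scores 2 + 1 + 1 = 4.
print(cha2ds2_vasc(76, female=True, htn=True))  # 4
```

A score of 2 or more is the conventional threshold for classifying a patient as high risk, which is the cutoff the abstract's "high-risk" and "low-risk" groups refer to.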

    Stochastic Dominance Analysis of Soil and Water Conservation in Subsistence Crop Production in the Eastern Ethiopian Highlands: The Case of the Hunde-Lafto Area

    The aim of this paper is to analyze whether investment in soil and water conservation results in a higher yield and income and/or mitigates variability in yield and income for subsistence farm households in the Hunde-Lafto area. Net returns from crop production with and without soil and water conservation (SWC) are compared based on stochastic dominance (SD) criteria. A non-parametric first-order SD and a normalized second-order SD are used for data analysis. Analysis is based on the Soil Conservation Research Program (SCRP) database for the Hunde-Lafto research unit. The results of the analysis suggest that adopting a conservation strategy results in higher grain yield and net return than not adopting one. The normalized second-order SD analysis results do not support the hypothesis that a conservation strategy is unambiguously better than a no-conservation strategy in reducing variability in yield and net return to farmers. However, the conservation strategy has shown second-order dominance at lower levels of yield and income that often correspond to unfavorable rainfall conditions. This makes it a preferred strategy to cope with the most prevalent risk factor of moisture shortage. Therefore, appropriate policies to help and encourage farmers to adopt SWC structures will contribute to improving the welfare of subsistence farm households in the study area and in other similar settings in the country. Designing and implementing SWC techniques that may result in unambiguous second-order SD dominance will further improve the desirability and adoption of conservation measures. Copyright Springer 2005. Keywords: erosion, Ethiopia, net return, soil and water conservation, stochastic dominance; JEL: Q24, renewable resources and conservation.
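First-order stochastic dominance can be checked by comparing empirical CDFs: strategy A dominates strategy B if F_A(x) ≤ F_B(x) at every x, with strict inequality somewhere (second-order dominance applies the same comparison to the integrated CDFs, which is what captures risk reduction). A minimal sketch of the first-order test on hypothetical yield samples, not the SCRP data:

```python
def first_order_dominates(a, b):
    """True if sample distribution `a` first-order stochastically dominates `b`.

    Compares empirical CDFs at every observed value: F_a(x) <= F_b(x)
    everywhere, with strict inequality at some point.
    """
    grid = sorted(set(a) | set(b))

    def ecdf(sample, x):
        # Fraction of observations at or below x.
        return sum(v <= x for v in sample) / len(sample)

    diffs = [ecdf(a, x) - ecdf(b, x) for x in grid]
    return all(d <= 0 for d in diffs) and any(d < 0 for d in diffs)

# Hypothetical yields (t/ha): with conservation vs without.
with_swc = [3.0, 4.0, 5.0]
without_swc = [1.0, 2.0, 3.0]
print(first_order_dominates(with_swc, without_swc))  # True
print(first_order_dominates(without_swc, with_swc))  # False
```

Because first-order dominance requires the CDFs never to cross, it is a demanding criterion; the paper's move to a normalized second-order test reflects that crossing CDFs are common in practice and only partial (lower-tail) dominance may hold.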