
    Real-world comparison of bleeding risks among non-valvular atrial fibrillation patients prescribed apixaban, dabigatran, or rivaroxaban

    Limited real-world data are available regarding the comparative safety of non-vitamin K antagonist oral anticoagulants (NOACs). The objective of this retrospective observational claims cohort study was to compare the risk of bleeding among non-valvular atrial fibrillation (NVAF) patients prescribed apixaban, dabigatran, or rivaroxaban. NVAF patients aged ≄18 years with a 1-year baseline period were included if they were new initiators of NOACs or switched from warfarin to a NOAC. Cox proportional hazards modelling was used to estimate the adjusted hazard ratios of any bleeding, clinically relevant non-major (CRNM) bleeding, and major inpatient bleeding within 6 months of treatment initiation for rivaroxaban and dabigatran compared to apixaban. Among 60,227 eligible patients, 8,785 were prescribed apixaban, 20,963 dabigatran, and 30,529 rivaroxaban. Compared to dabigatran and rivaroxaban patients, apixaban patients had greater proportions of baseline comorbidities and higher CHA2DS2-VASc and HAS-BLED scores. After adjusting for baseline clinical and demographic characteristics, patients prescribed rivaroxaban were more likely to experience any bleeding (HR: 1.35, 95% confidence interval [CI]: 1.26-1.45), CRNM bleeding (HR: 1.38, 95% CI: 1.27-1.49), and major inpatient bleeding (HR: 1.43, 95% CI: 1.17-1.74) than patients prescribed apixaban. Dabigatran patients had bleeding risks similar to those of apixaban patients. In conclusion, NVAF patients treated with rivaroxaban appeared to have an increased risk of any bleeding, CRNM bleeding, and major inpatient bleeding compared to apixaban patients. There was no significant difference in any bleeding, CRNM bleeding, or major inpatient bleeding risk between patients treated with dabigatran and apixaban.
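The hazard ratios reported above come with 95% confidence intervals that, for Wald-type estimates from a Cox model, are symmetric on the log scale. As a quick plausibility check on such reported figures, the point estimate can be recovered as the geometric mean of the interval bounds, and the standard error of the log hazard ratio as the log-scale half-width divided by 1.96. A minimal sketch (the numbers are the any-bleeding result from the abstract above; the check itself is a generic property of Wald-type intervals, not a method described in the study):

```python
import math

def hr_from_ci(lower, upper):
    """Point estimate implied by a Wald-type 95% CI:
    the geometric mean of the bounds (midpoint on the log scale)."""
    return math.exp((math.log(lower) + math.log(upper)) / 2)

def se_log_hr(lower, upper):
    """Standard error of log(HR) implied by a Wald-type 95% CI:
    half-width on the log scale divided by the 1.96 normal quantile."""
    return (math.log(upper) - math.log(lower)) / (2 * 1.96)

# Any-bleeding result for rivaroxaban vs. apixaban: HR 1.35, 95% CI 1.26-1.45
print(round(hr_from_ci(1.26, 1.45), 2))  # 1.35, matching the reported HR
print(round(se_log_hr(1.26, 1.45), 3))   # 0.036
```

If the recovered point estimate diverged noticeably from the reported HR, that would suggest the interval was not a plain Wald-type CI (or that a figure was mistranscribed).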

    Haploidentical vs. sibling, unrelated, or cord blood hematopoietic cell transplantation for acute lymphoblastic leukemia

    The role of haploidentical hematopoietic cell transplantation (HCT) using posttransplant cyclophosphamide (PTCy) for acute lymphoblastic leukemia (ALL) is being defined. We performed a retrospective, multivariable analysis comparing outcomes of HCT approaches by donor for adults with ALL in remission. The primary objective was to compare overall survival (OS) among haploidentical HCTs using PTCy and HLA-matched sibling donor (MSD), 8/8 HLA-matched unrelated donor (MUD), 7/8 HLA-MUD, or umbilical cord blood (UCB) HCT. Comparing haploidentical HCT to MSD HCT, we found that OS, leukemia-free survival (LFS), nonrelapse mortality (NRM), relapse, and acute graft-versus-host disease (aGVHD) were not different, but chronic GVHD (cGVHD) was higher in MSD HCT. Compared with MUD HCT, OS, LFS, and relapse were not different, but MUD HCT had increased NRM (hazard ratio [HR], 1.42; P = .02), grade 3 to 4 aGVHD (HR, 1.59; P = .005), and cGVHD. Compared with 7/8 MUD HCT, LFS and relapse were not different, but 7/8 MUD HCT had worse OS (HR, 1.38; P = .01) and increased NRM (HR, 2.13; P ≀ .001), grade 3 to 4 aGVHD (HR, 1.86; P = .003), and cGVHD (HR, 1.72; P ≀ .001). Compared with UCB HCT, late OS, late LFS, relapse, and cGVHD were not different, but UCB HCT had worse early OS (≀18 months; HR, 1.93; P < .001), worse early LFS (HR, 1.40; P = .007) and increased incidences of NRM (HR, 2.08; P < .001) and grade 3 to 4 aGVHD (HR, 1.97; P < .001). Haploidentical HCT using PTCy showed no difference in survival but less GVHD compared with traditional MSD and MUD HCT and is the preferred alternative donor HCT option for adults with ALL in complete remission.

    Risk Factors for Graft-versus-Host Disease in Haploidentical Hematopoietic Cell Transplantation Using Post-Transplant Cyclophosphamide

    Post-transplant cyclophosphamide (PTCy) has significantly increased the successful use of haploidentical donors with a relatively low incidence of graft-versus-host disease (GVHD). Given its increasing use, we sought to determine risk factors for GVHD after haploidentical hematopoietic cell transplantation (haplo-HCT) using PTCy. Data from the Center for International Blood and Marrow Transplant Research on adult patients with acute myeloid leukemia, acute lymphoblastic leukemia, myelodysplastic syndrome, or chronic myeloid leukemia who underwent PTCy-based haplo-HCT (2013 to 2016) were analyzed and categorized into 4 groups based on myeloablative (MA) or reduced-intensity conditioning (RIC) and bone marrow (BM) or peripheral blood (PB) graft source. In total, 646 patients were identified (MA-BM = 79, MA-PB = 183, RIC-BM = 192, RIC-PB = 192). The incidence of grade 2 to 4 acute GVHD at 6 months was highest in MA-PB (44%), followed by RIC-PB (36%), MA-BM (36%), and RIC-BM (30%) (P = .002). The incidence of chronic GVHD at 1 year was 40%, 34%, 24%, and 20%, respectively (P < .001). In multivariable analysis, there was no impact of stem cell source or conditioning regimen on grade 2 to 4 acute GVHD; however, older donor age (30 to 49 versus <29 years) was significantly associated with higher rates of grade 2 to 4 acute GVHD (hazard ratio [HR], 1.53; 95% confidence interval [CI], 1.11 to 2.12; P = .01). In contrast, PB compared to BM as a stem cell source was a significant risk factor for the development of chronic GVHD (HR, 1.70; 95% CI, 1.11 to 2.62; P = .01) in the RIC setting. There were no differences in relapse or overall survival between groups. Donor age and graft source are risk factors for acute and chronic GVHD, respectively, after PTCy-based haplo-HCT. Our results indicate that in RIC haplo-HCT, the risk of chronic GVHD is higher with PB stem cells, without any difference in relapse or overall survival.

    Synthesis and Preliminary Evaluation of Some Substituted Pyrazoles as Anticonvulsant Agents

    ABSTRACT: A series of 3-[3-(substituted phenyl)-1-isonicotinoyl-1H-pyrazol-5-yl]-2H-chromen-2-one derivatives (4a-k) was synthesized via an appropriate synthetic route and characterized by spectral data. The anticonvulsant activity of the synthesized compounds was evaluated against strychnine-induced seizures in mice. All test compounds were administered at a dose of 100 mg/kg body weight. Three compounds of the series, 4b, 4j, and 4k, exhibited significant anticonvulsant activity comparable to the standard drug phenytoin.

    Effect of transportation on the quality of the donor corneal buttons

    The aim of this study was to assess the effect of transportation on corneoscleral buttons and to correlate it with the weight of ice used during transportation. This was a prospective cross-sectional study. A total of 119 corneas were graded using slit-lamp examination and specular microscopy before and after transportation. Groups were formed based on the weight of ice used during transportation and the number of hours of storage in McCarey-Kaufman (MK) media. The overall median percentage of endothelial cell loss (ECL) was 11.33%. The median percentage of ECL was significantly higher with <500 g of ice than with ≄500 g (13.6% vs. 7.56%; P = 0.006). There was no significant difference in the percentage of ECL after storage in MK media for ≀48 h, ≀72 h, or ≀96 h. Corneal buttons undergo ECL during storage and transportation, which can be minimized by using an appropriate weight of ice during transportation.

    Delineation of mechanistic approaches of rhizosphere microorganisms facilitated plant health and resilience under challenging conditions

    Sustainable agriculture demands the balanced use of inorganic, organic, and microbial biofertilizers for enhanced plant productivity and soil fertility. Plant growth-promoting rhizospheric bacteria can be an excellent biotechnological tool to augment plant productivity in different agricultural setups. We present an overview of microbial mechanisms that directly or indirectly contribute to plant growth, health, and development under highly variable environmental conditions. Rhizosphere microbiomes promote plant growth, suppress pathogens and nematodes, prime plant immunity, and alleviate abiotic stress. The potential of beneficial rhizobacteria to facilitate plant growth is of primary importance, particularly under abiotic and biotic stresses. Such microbes can promote plant health, improve stress tolerance, remediate soil pollutants, and suppress phytopathogens. A better understanding of the microbial traits underlying plant growth promotion can spur the development of innovative microbial-based solutions for the betterment of agriculture. Furthermore, the application of novel scientific approaches to facilitate the design of crop-specific microbial biofertilizers is discussed. In this context, we highlight the use of "multi-omics" methods for assessing the microbiome's impact on plant growth, health, and overall fitness by analyzing biochemical, physiological, and molecular facets. The roles of clustered regularly interspaced short palindromic repeats (CRISPR)-based genome editing and nanotechnology in improving agronomic performance and the rhizosphere microbiome are also briefly discussed. In a nutshell, the paper summarizes the recent vital molecular processes that underlie the different beneficial plant-microbe interactions imperative for enhancing plant fitness and resilience under challenging agricultural conditions.

    Major bleeding risk among non‐valvular atrial fibrillation patients initiated on apixaban, dabigatran, rivaroxaban or warfarin: a “real‐world” observational study in the United States

    BACKGROUND: Limited data are available about the real-world safety of non-vitamin K antagonist oral anticoagulants (NOACs). OBJECTIVES: To compare the major bleeding risk among newly anticoagulated non-valvular atrial fibrillation (NVAF) patients initiating apixaban, warfarin, dabigatran or rivaroxaban in the United States. METHODS AND RESULTS: A retrospective cohort study was conducted to compare the major bleeding risk among newly anticoagulated NVAF patients initiating warfarin, apixaban, dabigatran or rivaroxaban. The study used the Truven MarketScanÂź Commercial & Medicare supplemental US database from 1 January 2013 through 31 December 2013. Major bleeding was defined as bleeding requiring hospitalisation. Cox model estimated hazard ratios (HRs) of major bleeding were adjusted for age, gender, baseline comorbidities and co-medications. Among 29 338 newly anticoagulated NVAF patients, 2402 (8.19%) were on apixaban; 4173 (14.22%) on dabigatran; 10 050 (34.26%) on rivaroxaban; and 12 713 (43.33%) on warfarin. After adjusting for baseline characteristics, initiation on warfarin [adjusted HR (aHR): 1.93, 95% confidence interval (CI): 1.12-3.33, P=.018] or rivaroxaban (aHR: 2.19, 95% CI: 1.26-3.79, P=.005) carried a significantly greater risk of major bleeding vs apixaban. Dabigatran initiation (aHR: 1.71, 95% CI: 0.94-3.10, P=.079) carried a non-significant major bleeding risk vs apixaban. When compared with warfarin, apixaban (aHR: 0.52, 95% CI: 0.30-0.89, P=.018) had significantly lower major bleeding risk. Patients initiating rivaroxaban (aHR: 1.13, 95% CI: 0.91-1.41, P=.262) or dabigatran (aHR: 0.88, 95% CI: 0.64-1.21, P=.446) had a non-significant major bleeding risk vs warfarin. CONCLUSION: Among newly anticoagulated NVAF patients in the real-world setting, initiation with rivaroxaban or warfarin was associated with a significantly greater risk of major bleeding compared with initiation on apixaban. When compared with warfarin, initiation with apixaban was associated with significantly lower risk of major bleeding. Additional observational studies are required to confirm these findings.


    “I Wasted 3 Years, Thinking It’s Not a Problem”: Patient and Health System Delays in Diagnosis of Leprosy in India: A Mixed-Methods Study

    BACKGROUND: Worldwide, leprosy is one of the major causes of preventable disability, and India contributes 60% of the global leprosy burden. With increasing numbers of leprosy cases presenting with grade 2 (visible) disability at diagnosis, we aimed to determine risk factors associated with grade 2 disability among new cases and to explore patients' and providers' perspectives on reasons for late presentation. METHODOLOGY/PRINCIPAL FINDINGS: This was an explanatory mixed-methods study in which the quantitative component, a matched case-control design, was followed by a qualitative component. A total of 70 cases (grade 2 disability) and 140 controls (grade 0) matched for age and sex were randomly sampled from new patients registered between January 2013 and January 2015 in three districts of Maharashtra (Mumbai, Thane and Amaravati) and interviewed using a structured close-ended questionnaire. Eight public health care providers involved in leprosy care and 7 leprosy patients were purposively selected (maximum variation sampling) and interviewed using a structured open-ended interview schedule. Among cases, the overall median (IQR) diagnosis delay was 17.9 (7-30) months; patient and health system delays were 7 (4-16.5) and 5.5 (0.9-12.5) months respectively, significantly longer than the delays among controls. Reasons for delayed presentation identified by the quantitative and qualitative data were: poor awareness of leprosy symptoms; the first health care provider visited often being a private practitioner unaware that free leprosy treatment is available at public health care facilities; and reduced engagement and capacity of the general health care system in leprosy control. CONCLUSIONS: Raising awareness of early leprosy symptoms in communities and among health care providers, engaging private health care providers in early leprosy diagnosis, and increasing the capacity of general health system staff, especially in high-endemic areas that are hotspots for leprosy transmission, may help reduce diagnosis delays.