
    Stroke impact on mortality and psychologic morbidity within the Childhood Cancer Survivor Study.

    Background: Poor socioeconomic and health-related quality of life (HRQOL) outcomes in survivors of childhood cancer can lead to distress and negatively impact the lives of these individuals. The current report highlights the impact of stroke and stroke recurrence on mortality, psychological HRQOL, and socioeconomic outcomes within the Childhood Cancer Survivor Study (CCSS). Methods: The CCSS is a retrospective cohort study with longitudinal follow-up of survivors of pediatric cancer diagnosed between 1970 and 1986. Mortality rates per 100 person-years were calculated across 3 periods: 1) prior to stroke; 2) after first stroke and before recurrent stroke; and 3) after recurrent stroke. Socioeconomic outcomes, the standardized Brief Symptom Inventory-18, the Medical Outcomes Study 36-Item Short Form Health Survey, and the CCSS-Neurocognitive Questionnaire were also assessed. Results: Among 14,358 participants (median age, 39.7 years), 224 had a stroke after their cancer diagnosis (single stroke in 161 patients and recurrent stroke in 63 patients). Based on 2636 deaths, all-cause late mortality rates per 100 person-years were 0.70 (95% CI, 0.68-0.73) prior to stroke, 1.03 (95% CI, 0.73-1.46) after a first stroke, and 2.42 (95% CI, 1.48-3.94) after a recurrent stroke. Among 7304 survivors, those with stroke were more likely to live with a caregiver (single stroke odds ratio [OR], 2.3 [95% CI, 1.4-3.8]; recurrent stroke OR, 5.3 [95% CI, 1.7-16.8]) compared with stroke-free survivors. Stroke negatively impacted task efficiency (single stroke OR, 2.4 [95% CI, 1.4-4.1]; recurrent stroke OR, 3.3 [95% CI, 1.1-10.3]) and memory (single stroke OR, 2.1 [95% CI, 1.2-3.7]; recurrent stroke OR, 3.5 [95% CI, 1.1-10.5]). Conclusions: Stroke and stroke recurrence are associated with increased mortality and negatively impact HRQOL measures in survivors of pediatric cancer.
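    The rate estimates above follow a standard person-time approach. As a minimal sketch (not the CCSS analysis code; the death count and person-years below are hypothetical), a mortality rate per 100 person-years with an exact Poisson 95% CI can be computed as follows:

```python
# Illustrative only: mortality rate per 100 person-years with exact Poisson limits.
from scipy import stats

def mortality_rate_per_100py(deaths: int, person_years: float):
    """Return (rate, lower, upper) per 100 person-years for one follow-up period."""
    rate = 100 * deaths / person_years
    # Exact Poisson confidence limits via the chi-square relationship.
    lower = 100 * stats.chi2.ppf(0.025, 2 * deaths) / (2 * person_years) if deaths > 0 else 0.0
    upper = 100 * stats.chi2.ppf(0.975, 2 * (deaths + 1)) / (2 * person_years)
    return rate, lower, upper

# Hypothetical inputs: 180 deaths observed over 7,400 person-years of follow-up.
print(mortality_rate_per_100py(180, 7400.0))
```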

    Importance of low-relief nursery habitat for reef fishes

    Coastal restoration projects to mitigate environmental impacts have increased global demand for sand resources. Unfortunately, these resources are often extracted from sand/shell banks on the inner continental shelf, resulting in significant alteration or loss of low-relief reefs in coastal oceans. Experimental reefs (oyster shell, limestone rubble, composite) were deployed in the western Gulf of Mexico to assess their potential value as nurseries for newly settled reef fishes. Occurrence, abundance, and species richness of juvenile fishes were significantly higher on all three types of low-relief reefs compared with unconsolidated sediment. Moreover, reefs served as nursery habitat for a range of reef fish taxa (angelfishes, grunts, sea basses, snappers, and triggerfishes). Red snapper (Lutjanus campechanus) was the dominant species present on all experimental reefs (100% occurrence), and mean density of this species was markedly higher on each of the three low-relief reefs (>40.0 individuals/reef) relative to comparable areas over unconsolidated sediment (0.2 individuals). Our results suggest creation or restoration of structurally complex habitat on the inner shelf has the potential to markedly increase early life survival and expedite the recovery of exploited reef fish populations, and therefore may represent a critical conservation tool for increasing recruitment and maintaining reef fish diversity.

    Periprocedural bridging anticoagulation in patients with venous thromboembolism: A registry-based cohort study

    Background: Use of bridging anticoagulation increases a patient's bleeding risk without clear evidence of thrombotic prevention among warfarin-treated patients with atrial fibrillation. Contemporary use of bridging anticoagulation among warfarin-treated patients with venous thromboembolism (VTE) has not been studied. Methods: We identified warfarin-treated patients with VTE who temporarily stopped warfarin for a surgical procedure between 2010 and 2018 at six health systems. Using the 2012 American College of Chest Physicians guideline, we assessed use of periprocedural bridging anticoagulation based on recurrent VTE risk. Recurrent VTE risk and 30-day outcomes (bleeding, thromboembolism, emergency department visit) were each assessed using logistic regression adjusted for multiple procedures per patient. Results: During the study period, 789 warfarin-treated patients with VTE underwent 1529 procedures (median, 2; interquartile range, 1-4). Unadjusted use of bridging anticoagulation was more common in patients at high risk for VTE recurrence (99/171, 57.9%) than in patients at moderate (515/1078, 47.8%) or low risk of recurrence (134/280, 47.9%). Bridging anticoagulation use was higher in high-risk patients compared with low- or moderate-risk patients in both unadjusted (P = .013) and patient-level cluster-adjusted analyses (P = .031). Adherence to American College of Chest Physicians guidelines in high- and low-risk patients did not change during the study period (odds ratio, 0.98 per year; 95% confidence interval, 0.91-1.05). Adverse events were rare and not statistically different between the two treatment groups. Conclusions: Bridging anticoagulation was commonly overused among low-risk patients and underused among high-risk patients treated with warfarin for VTE. Adverse events were rare and did not differ between the two treatment groups.
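    The risk-stratified comparison above relies on logistic regression adjusted for multiple procedures per patient. One way to do this, sketched below with a hypothetical per-procedure file and column names (not the registry's actual code), is a GEE logistic model clustered on patient:

```python
# Minimal sketch: receipt of bridging anticoagulation by VTE recurrence-risk
# category, with clustering by patient to handle repeated procedures.
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

procedures = pd.read_csv("procedures.csv")  # hypothetical: one row per procedure

model = smf.gee(
    "bridged ~ C(recurrence_risk, Treatment(reference='low'))",
    groups="patient_id",                      # clusters: procedures within a patient
    data=procedures,
    family=sm.families.Binomial(),            # logistic link for a binary outcome
    cov_struct=sm.cov_struct.Exchangeable(),
)
print(model.fit().summary())
```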

    Are people with chronic diseases interested in using telehealth? A cross-sectional postal survey

    Background: There is growing interest in telehealth—the use of technology to support the remote delivery of health care and promote self-management—as a potential alternative to face-to-face care for patients with chronic diseases. However, little is known about what precipitates interest in the use of telehealth among these patients. Objective: This survey forms part of a research program to develop and evaluate a telehealth intervention for patients with two exemplar chronic diseases: depression and raised cardiovascular disease (CVD) risk. The survey was designed to explore the key factors that influence interest in using telehealth in these patient groups. Methods: Thirty-four general practices were recruited from two different regions within England. Practice records were searched for patients with (1) depression (aged 18+ years) or (2) 10-year risk of CVD ≥20% and at least one modifiable risk factor (aged 40-74 years). Within each general practice, 54 patients in each chronic disease group were randomly selected to receive a postal questionnaire. Questions assessed five key constructs: sociodemographics, health needs, difficulties accessing health care, technology-related factors (availability, confidence using technology, perceived benefits and drawbacks of telehealth), and satisfaction with prior use of telehealth. Respondents also rated their interest in using different technologies for telehealth (phone, email and Internet, or social media). Relationships between the key constructs and interest in using the three mediums of telehealth were examined using multivariable regression models. Results: Of the 3329 patients who were sent a study questionnaire, 44.40% completed it (872/1740, 50.11% CVD risk; 606/1589, 38.14% depression). Overall, there was moderate interest in using phone-based (854/1423, 60.01%) and email/Internet-based (816/1425, 57.26%) telehealth, but very little interest in social media (243/1430, 16.99%). After adjusting for health needs, access difficulties, technology-related factors, and prior use of telehealth, interest in telehealth had largely no association with sociodemographic variables. For both patient groups and for each of the three technology mediums, the most important constructs related to interest in telehealth were having the confidence to use the associated technology, as well as perceiving greater advantages and fewer disadvantages from using telehealth. To illustrate, among patients with depression, greater confidence using phone technologies (b=.16, 95% CI 0.002-0.33), perceiving more benefits (b=.31, 95% CI 0.21-0.40), and perceiving fewer drawbacks (b=-.23, 95% CI -0.28 to -0.17) were each associated with more interest in using phone-based telehealth technologies. Conclusions: There is widespread interest in using phone-based and email/Internet-based telehealth among patients with chronic diseases, regardless of their health status, access difficulties, age, or many other sociodemographic factors. This interest could be increased by helping patients gain confidence using technologies and through highlighting benefits and addressing concerns about telehealth. While the same pattern exists for social media telehealth, interest in using these technologies is minimal.
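    The associations above come from multivariable regression models relating interest in each telehealth medium to the key constructs. A small illustrative sketch, using hypothetical variable names rather than the survey's actual coding, might look like this:

```python
# Illustrative only: interest in phone-based telehealth regressed on confidence
# with the technology and perceived benefits/drawbacks, adjusting for age and sex.
import pandas as pd
import statsmodels.formula.api as smf

survey = pd.read_csv("telehealth_survey.csv")  # hypothetical per-respondent data

model = smf.ols(
    "interest_phone ~ confidence_phone + perceived_benefits + perceived_drawbacks"
    " + age + C(sex)",
    data=survey,
)
result = model.fit()
print(result.params)       # unstandardized coefficients (b)
print(result.conf_int())   # 95% confidence intervals
```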

    Assessment of an Intervention to Reduce Aspirin Prescribing for Patients Receiving Warfarin for Anticoagulation

    Importance: For some patients receiving warfarin, adding aspirin (acetylsalicylic acid) increases bleeding risk with unclear treatment benefit. Reducing excess aspirin use could be associated with improved clinical outcomes. Objective: To assess changes in aspirin use, bleeding, and thrombosis event rates among patients treated with warfarin. Design, Setting, and Participants: This pre-post observational quality improvement study was conducted from January 1, 2010, to December 31, 2019, at a 6-center quality improvement collaborative in Michigan among 6738 adults taking warfarin for atrial fibrillation and/or venous thromboembolism without an apparent indication for concomitant aspirin. Statistical analysis was conducted from November 26, 2020, to June 14, 2021. Intervention: Primary care professionals for patients taking aspirin were asked whether an ongoing combination aspirin and warfarin treatment was indicated. If not, then aspirin was discontinued with the approval of the managing clinician. Main Outcomes and Measures: Outcomes were assessed before and after intervention for the primary analysis and before and after 24 months before the intervention (when rates of aspirin use first began to decrease) for the secondary analysis. Outcomes included the rate of aspirin use, bleeding, and thrombotic outcomes. An interrupted time series analysis assessed cumulative monthly event rates over time. Results: A total of 6738 patients treated with warfarin (3160 men [46.9%]; mean [SD] age, 62.8 [16.2] years) were followed up for a median of 6.7 months (IQR, 3.2-19.3 months). Aspirin use decreased slightly from a baseline mean use of 29.4% (95% CI, 28.9%-29.9%) to 27.1% (95% CI, 26.1%-28.0%) during the 24 months before the intervention (P < .001 for slope before and after 24 months before the intervention), with an accelerated decrease after the intervention (mean aspirin use, 15.7%; 95% CI, 14.8%-16.8%; P = .001 for slope before and after intervention). In the primary analysis, the intervention was associated with a significant decrease in major bleeding events per month (preintervention, 0.31%; 95% CI, 0.27%-0.34%; postintervention, 0.21%; 95% CI, 0.14%-0.28%; P = .03 for difference in slope before and after intervention). No change was observed in mean percentage of patients having a thrombotic event from before to after the intervention (0.21% vs 0.24%; P = .34 for difference in slope). In the secondary analysis, reducing aspirin use (starting 24 months before the intervention) was associated with decreases in mean percentage of patients having any bleeding event (2.3% vs 1.5%; P = .02 for change in slope before and after 24 months before the intervention), mean percentage of patients having a major bleeding event (0.31% vs 0.25%; P = .001 for change in slope before and after 24 months before the intervention), and mean percentage of patients with an emergency department visit for bleeding (0.99% vs 0.67%; P = .04 for change in slope before and after 24 months before the intervention), with no change in mean percentage of patients with a thrombotic event (0.20% vs 0.23%; P = .36 for change in slope before and after 24 months before the intervention). Conclusions and Relevance: This quality improvement intervention was associated with an acceleration of a preexisting decrease in aspirin use among patients taking warfarin for atrial fibrillation and/or venous thromboembolism without a clear indication for aspirin therapy. Reductions in aspirin use were associated with reduced bleeding. This study suggests that an anticoagulation clinic-based aspirin deimplementation intervention can improve guideline-concordant aspirin use.
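    The interrupted time series analysis described above can be approximated by segmented regression on monthly event rates. The sketch below is illustrative only; the data file and its columns are hypothetical, not the study's data:

```python
# Segmented regression for an interrupted time series: baseline trend plus a
# level change ('post') and slope change ('months_since_intervention') at the
# intervention month.
import pandas as pd
import statsmodels.formula.api as smf

monthly = pd.read_csv("monthly_rates.csv")   # columns: month_index, event_rate, post
monthly["months_since_intervention"] = (
    monthly["month_index"]
    - monthly.loc[monthly["post"] == 1, "month_index"].min()
).clip(lower=0)

its_model = smf.ols(
    "event_rate ~ month_index + post + months_since_intervention", data=monthly
).fit()
print(its_model.summary())
```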

    Accounting students' IT application skills over a 10-year period

    This paper reports on the changing nature of a range of information technology (IT) application skills that students declare on entering an accounting degree over the period from 1996 to 2006. Accounting educators need to be aware of the IT skills students bring with them to university because of the implications this has for learning and teaching within the discipline and the importance of both general and specific IT skills within the practice and craft of accounting. Additionally, IT skills constitute a significant element within the portfolio of employability skills that are increasingly demanded by employers and emphasized within the overall Higher Education (HE) agenda. The analysis of students' reported IT application skills on entry to university, across a range of the most relevant areas of IT use in accounting, suggests that their skills have continued to improve over time. However, there are significant differential patterns of change through the years and within cohorts. The paper addresses the generalizability of these findings and discusses the implications of these factors for accounting educators, including the importance of recognising the differences that are potentially masked by the general increase in skills; the need for further research into the changing nature, and implications, of the gender gap in entrants' IT application skills; and the low levels of entrants' spreadsheet and database skills that are a cause for concern.

    Validation of a Persian version of the OIDP index

    BACKGROUND: Measuring the impacts of oral conditions on quality of life is an important part of oral health needs assessment. For this purpose a variety of oral health-related quality of life instruments have been developed. To use a scale in a new context or with a different group of people, it is necessary to re-establish its psychometric properties. The objectives of this study were to develop and test the reliability and validity of the Persian version of the Oral Impacts on Daily Performances (OIDP) index. METHODS: The Persian version of the OIDP index was developed through a linguistic translation exercise. The psychometric properties of the Persian version of the OIDP were evaluated in terms of face, content, construct and criterion validity, in addition to internal and test-retest reliability. A convenience sample of 285 working adults aged 20–50 living in Mashad was recruited (91% response rate) to evaluate the Persian version. RESULTS: The Persian version of the OIDP had excellent validity and reliability characteristics. Weighted kappa was 0.91 and Cronbach's alpha coefficient was 0.79. The index showed significant associations with self-rated oral and general health status, as well as perceived dental treatment needs, satisfaction with the mouth and prevalence of pain in the mouth (P < 0.001). Overall, 64.9% of subjects had an oral impact on their daily performances. The most prevalent performance affected was eating, followed by major work or role and sleeping. CONCLUSION: The Persian version of the OIDP index is a valid and reliable measure for use in 20 to 50 year old working Iranians.
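    For readers less familiar with the reliability statistics reported above, the sketch below shows one way to compute a weighted kappa (assuming quadratic weights, which the abstract does not specify) and Cronbach's alpha; the scores are made up, not the study's data:

```python
# Illustrative reliability calculations: test-retest agreement and internal consistency.
import numpy as np
from sklearn.metrics import cohen_kappa_score

# Test-retest agreement: weighted kappa on ordinal category scores (toy data).
test = [0, 1, 2, 2, 3, 1, 0, 2]
retest = [0, 1, 2, 3, 3, 1, 0, 2]
print(cohen_kappa_score(test, retest, weights="quadratic"))

def cronbach_alpha(items: np.ndarray) -> float:
    """Internal consistency for an (n_subjects x n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars / total_var)

scores = np.random.default_rng(0).integers(0, 6, size=(30, 8))  # 8 simulated item scores
print(cronbach_alpha(scores))
```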

    Developing a Machine Learning-Based Clinical Decision Support Tool for Uterine Tumor Imaging

    Uterine leiomyosarcoma (LMS) is a rare but aggressive malignancy. On imaging, it is difficult to differentiate LMS from, for example, degenerated leiomyoma (LM), a prevalent but benign condition. We curated a data set of 115 axial T2-weighted MRI images from 110 patients (mean [range] age, 45 [17-81] years) with uterine tumors (UTs) that included five different tumor types. These data were randomly split, stratifying on tumor volume, into training (n=85) and test (n=30) sets. An independent second reader (reader 2) provided manual segmentations for all test set images. To automate segmentation, we applied nnU-Net and explored the effect of training set size on performance by randomly generating subsets with 25, 45, 65 and 85 training set images. We evaluated the ability of radiomic features to distinguish between types of UT individually and when combined through feature selection and machine learning. Using the entire training set, the mean [95% CI] fibroid Dice similarity coefficient (DSC) was 0.87 [0.59-1.00], and the agreement between the two readers was 0.89 [0.77-1.00] on the test set. When classifying degenerated LM versus LMS, we achieve a test set F1-score of 0.80. Classifying UTs based on radiomic features, we identify classifiers achieving F1-scores of 0.53 [0.45-0.61] and 0.80 [0.80-0.80] on the test set for the benign versus malignant and degenerated LM versus LMS tasks, respectively. We show that it is possible to develop an automated method for 3D segmentation of the uterus and UTs that is close to human-level performance with fewer than 150 annotated images. For distinguishing UT types, while we train models that merit further investigation with additional data, reliable automatic differentiation of UTs remains a challenge.
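    Segmentation performance above is reported as the Dice similarity coefficient (DSC). A short, self-contained sketch of the metric on binary 3D masks (illustrative only; the masks here are random, not the study's MRI data):

```python
# Dice similarity coefficient: DSC = 2|A ∩ B| / (|A| + |B|) for binary masks.
import numpy as np

def dice_coefficient(pred: np.ndarray, truth: np.ndarray, eps: float = 1e-8) -> float:
    """Overlap between a predicted and a reference segmentation of the same shape."""
    pred = pred.astype(bool)
    truth = truth.astype(bool)
    intersection = np.logical_and(pred, truth).sum()
    return (2.0 * intersection + eps) / (pred.sum() + truth.sum() + eps)

rng = np.random.default_rng(42)
pred_mask = rng.random((64, 128, 128)) > 0.5   # toy "prediction" volume
truth_mask = rng.random((64, 128, 128)) > 0.5  # toy "reference" volume
print(dice_coefficient(pred_mask, truth_mask))  # ~0.5 expected for random masks
```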

    Outcomes of Direct Oral Anticoagulants with Aspirin Versus Warfarin with Aspirin for Atrial Fibrillation and/or Venous Thromboembolic Disease

    Introduction: The direct oral anticoagulants (DOACs), including apixaban, dabigatran, edoxaban, and rivaroxaban, are increasingly utilized for the management of venous thromboembolic disease (VTE) and/or non-valvular atrial fibrillation (NVAF). Adding aspirin (ASA) to warfarin or DOAC therapy increases bleeding risk. Patients on combination therapy with ASA and an anticoagulant were not well represented in clinical trials comparing DOACs to warfarin. We sought to compare bleeding and thrombotic outcomes with DOACs plus ASA versus warfarin plus ASA in a non-trial setting. Methods: We conducted a retrospective registry-based cohort study of adults on DOAC or warfarin therapy for VTE and/or NVAF. Warfarin-treated patients were followed by six anticoagulation clinics. Four of the six clinics contributed data on their patients who were on DOACs in the Michigan Anticoagulation Quality Improvement Initiative (MAQI2) from January 2009 to June 2021. Patients were excluded if they had a history of heart valve replacement, recent myocardial infarction, or less than 3 months of follow-up. Two propensity-matched cohorts of patients (warfarin+ASA vs DOAC+ASA) were analyzed based on ASA use at the time of study enrollment. The primary outcome was any new bleeding event. Secondary outcomes included new episodes of arterial or venous thrombosis, bleeding event type (major, fatal, life-threatening, central nervous system, and non-major bleeding), emergency room visits, hospitalizations, transfusions, and death. Random chart audits were done to confirm the accuracy of the abstracted data. Event rates were compared using Poisson regression. Results: We identified a total of 1,139 patients on DOACs plus ASA and 4,422 patients on warfarin plus ASA. After propensity matching, we compared two groups of 1,114 matched patients. DOAC-treated patients were predominantly on apixaban (62.3%) and rivaroxaban (30.4%), most often at therapeutic doses (Table 1). Patients were largely (90.5%) on low-dose ASA (≤100 mg). Patient demographics, comorbidities, indication for anticoagulation, history of bleeding or clotting, medications, and duration of follow-up were well balanced after matching. Patients were followed for a median of 11.7 months (interquartile range, 4.4-34 months). Patients treated with DOAC+ASA had 2.4 thrombotic events per 100 patient-years compared with 2.2 thrombotic events per 100 patient-years with warfarin+ASA (P=0.78). There were no significant differences observed between groups by thrombotic subtype (stroke, transient ischemic attack, pulmonary embolism, deep vein thrombosis; Table 1). Bleeding was also similar, with 30.1 bleeding events per 100 patient-years with DOAC+ASA compared with 27.8 bleeds per 100 patient-years with warfarin+ASA (P=0.24). There were no significant differences by bleeding subtype (Table 1). Hospitalizations for clotting occurred less frequently with DOAC+ASA (0.9 hospitalizations per 100 patient-years) compared with warfarin+ASA (1.7 hospitalizations per 100 patient-years; P=0.03). Mortality, transfusions, and healthcare utilization were otherwise similar between the two groups. Conclusions: For patients on a DOAC versus warfarin with ASA for atrial fibrillation and/or venous thromboembolic disease without a recent myocardial infarction or heart valve replacement, bleeding and thrombotic outcomes were similar.
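    Event rates per 100 patient-years compared with Poisson regression, as in the analysis above, can be sketched as follows (hypothetical data frame and column names; not the MAQI2 code). The log of follow-up time enters as an offset, so the treatment coefficient is a log rate ratio for DOAC+ASA versus warfarin+ASA:

```python
# Poisson regression of bleeding-event counts with an exposure offset.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

cohort = pd.read_csv("matched_cohort.csv")  # hypothetical: one row per matched patient
cohort["log_followup_years"] = np.log(cohort["followup_years"])

model = smf.glm(
    "bleeding_events ~ C(treatment, Treatment(reference='warfarin_asa'))",
    data=cohort,
    family=sm.families.Poisson(),
    offset=cohort["log_followup_years"],
).fit()
print(np.exp(model.params))   # rate ratios; multiply underlying rates by 100 for per-100-patient-years
```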