
    Refractive Changes Induced by Strabismus Corrective Surgery in Adults

    Purpose. To investigate refractive changes after strabismus correction procedures among adults. Methods. Retrospective chart review of adult patients who underwent horizontal rectus muscle surgery and had both preoperative and postoperative cycloplegic refraction measurements. The preoperative refraction was mathematically subtracted from the postoperative refraction, and the induced refractive changes were statistically analyzed. Vector analysis was used to examine the magnitude of the toric change. The proportion of clinically significant refractive change was evaluated as well. Results. Thirty-one eyes from 22 subjects met the criteria and were included in the final analysis. A significant postoperative shift of the spherical equivalent towards myopia and a change of the astigmatism in the with-the-rule direction were observed. In a subset of 9 cases, a third cycloplegic refraction measurement demonstrated stable refraction compared with the 1-month postoperative measurement. In 10 cases of single-eye surgery, significant refractive changes were observed only in the operated eye when compared with the sound eye. The surgically induced refractive change was clinically significant (≥0.5 D) in 11 eyes of 9 patients (40.9% of patients). Conclusions. Refractive changes are a significant side effect of horizontal strabismus corrective surgery in adults. Patients should therefore be informed of this risk before surgery and should be rerefracted in the postoperative period.
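The refraction subtraction and toric vector analysis described above can be sketched in power-vector form. This is a minimal sketch only: the abstract does not name the exact vector method, so the Thibos power-vector notation (M, J0, J45) and the sample refractions below are assumptions.

```python
import math

def power_vector(sphere, cyl, axis_deg):
    # Convert a sphero-cylindrical refraction to power-vector form:
    # M = spherical equivalent, (J0, J45) = astigmatic components.
    a = math.radians(axis_deg)
    m = sphere + cyl / 2.0
    j0 = -(cyl / 2.0) * math.cos(2 * a)
    j45 = -(cyl / 2.0) * math.sin(2 * a)
    return m, j0, j45

def induced_change(pre, post):
    # Surgically induced change: postoperative minus preoperative vector
    return tuple(q - p for p, q in zip(power_vector(*pre), power_vector(*post)))

# Invented sample refractions (sphere, cylinder, axis), for illustration only
d_m, d_j0, d_j45 = induced_change((-1.00, -0.50, 180), (-1.50, -0.25, 180))
toric_magnitude = math.hypot(d_j0, d_j45)  # magnitude of the toric change
```

A negative d_m indicates the myopic shift of the spherical equivalent the study reports.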

    Dietary supplement consumption among cardiac patients admitted to internal medicine and cardiac wards

    Background: Dietary supplements may have adverse effects and may interact with conventional medications. They are perceived as “natural” products, free of side effects and requiring no medical consultation. Little is known about consumption of dietary supplements by patients with cardiac diseases. The objective of this study was to investigate dietary supplement consumption among cardiac patients admitted to internal medicine and cardiology wards. Potential drug-dietary supplement interactions were also assessed. Methods: During a period of 6 months, patients with cardiac disease hospitalized in the Internal Medicine and Cardiology Wards at Assaf Harofeh Medical Center were evaluated regarding their dietary supplement consumption. A literature survey examining possible drug-supplement interactions was performed. Results: Of 149 cardiac patients, 45% were dietary supplement consumers. Patients admitted to the Internal Medicine Wards consumed more dietary supplements than those admitted to the Cardiology Division. Dietary supplement consumption was associated with older age (OR = 1.05, p = 0.022), female gender (OR = 2.94, p = 0.014), and routine physical activity (OR = 3.15, p = 0.007). Diabetes mellitus (OR = 2.68, p = 0.020), hematological diseases (OR = 13.29, p = 0.022), and the use of anti-diabetic medications (OR = 4.28, p = 0.001) were independently associated with dietary supplement intake. Sixteen potential moderate interactions between prescribed medications and dietary supplements were found. Conclusions: Consumption of dietary supplements is common among cardiac patients, and more common in those admitted to Internal Medicine Departments than in those admitted to the Cardiology Wards. Given the risk of drug-supplement interactions in patients with cardiac diseases, awareness and knowledge among medical staff regarding dietary supplement intake should be increased.

    Oral Cannabidiol Use in Children With Autism Spectrum Disorder to Treat Related Symptoms and Co-morbidities

    Objective: Children with autism spectrum disorder (ASD) commonly exhibit comorbid symptoms such as aggression, hyperactivity, and anxiety. Several studies on cannabidiol use in ASD are being conducted worldwide; however, these studies are still ongoing, and data on the effects of its use are very limited. In this study we aimed to report the experience of parents who administer, under supervision, oral cannabinoids to their children with ASD. Methods: After obtaining a license from the Israeli Ministry of Health, parents of children with ASD were instructed by a nurse practitioner how to administer oral drops of cannabidiol oil. Information on comorbid symptoms and safety was prospectively recorded biweekly during follow-up interviews. An independent group of specialists analyzed these data for changes in ASD symptoms and drug safety. Results: 53 children at a median age of 11 years (range 4–22) received cannabidiol for a median duration of 66 days (range 30–588). Self-injury and rage attacks (n = 34) improved in 67.6% and worsened in 8.8%. Hyperactivity symptoms (n = 38) improved in 68.4%, did not change in 28.9%, and worsened in 2.6%. Sleep problems (n = 21) improved in 71.4% and worsened in 4.7%. Anxiety (n = 17) improved in 47.1% and worsened in 23.5%. Adverse effects, mostly somnolence and change in appetite, were mild. Conclusion: Parents’ reports suggest that cannabidiol may improve ASD comorbidity symptoms; however, the long-term effects should be evaluated in large-scale studies.

    The Burden of Cryptosporidium Diarrheal Disease among Children < 24 Months of Age in Moderate/High Mortality Regions of Sub-Saharan Africa and South Asia, Utilizing Data from the Global Enteric Multicenter Study (GEMS).

    Background: The importance of Cryptosporidium as a pediatric enteropathogen in developing countries is recognized. Methods: Data from the Global Enteric Multicenter Study (GEMS), a 3-year, 7-site, case-control study of moderate-to-severe diarrhea (MSD) and GEMS-1A (1-year study of MSD and less-severe diarrhea [LSD]) were analyzed. Stools from 12,110 MSD and 3,174 LSD cases among children aged <60 months and from 21,527 randomly-selected controls matched by age, sex and community were immunoassay-tested for Cryptosporidium. Species of a subset of Cryptosporidium-positive specimens were identified by PCR; GP60 sequencing identified anthroponotic C. parvum. Combined annual Cryptosporidium-attributable diarrhea incidences among children aged <24 months for African and Asian GEMS sites were extrapolated to sub-Saharan Africa and South Asian regions to estimate region-wide MSD and LSD burdens. Attributable and excess mortality due to Cryptosporidium diarrhea were estimated. Findings: Cryptosporidium was significantly associated with MSD and LSD below age 24 months. Among Cryptosporidium-positive MSD cases, C. hominis was detected in 77.8% (95% CI, 73.0%-81.9%) and C. parvum in 9.9% (95% CI, 7.1%-13.6%); 92% of C. parvum tested were anthroponotic genotypes. Annual Cryptosporidium-attributable MSD incidence was 3.48 (95% CI, 2.27–4.67) and 3.18 (95% CI, 1.85–4.52) per 100 child-years in African and Asian infants, respectively, and 1.41 (95% CI, 0.73–2.08) and 1.36 (95% CI, 0.66–2.05) per 100 child-years in toddlers. Corresponding Cryptosporidium-attributable LSD incidences per 100 child-years were 2.52 (95% CI, 0.33–5.01) and 4.88 (95% CI, 0.82–8.92) in infants and 4.04 (95% CI, 0.56–7.51) and 4.71 (95% CI, 0.24–9.18) in toddlers. 
We estimate 2.9 and 4.7 million Cryptosporidium-attributable cases annually in children aged <24 months in the sub-Saharan Africa and India/Pakistan/Bangladesh/Nepal/Afghanistan regions, respectively, and ~202,000 Cryptosporidium-attributable deaths (regions combined). An estimated ~59,000 more deaths occurred among Cryptosporidium-attributable diarrhea cases than would have been expected had the cases been Cryptosporidium-negative. Conclusions: The enormous African/Asian Cryptosporidium disease burden warrants investments to develop vaccines, diagnostics, and therapies.
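The extrapolation step described above reduces to scaling a per-100-child-year attributable incidence by a region's person-time at risk. A minimal sketch of that arithmetic, using the African infant MSD incidence reported above and a purely illustrative person-time figure (the 50 million child-years is an invented number, not a GEMS estimate):

```python
def attributable_cases(incidence_per_100_child_years, child_years):
    # Regional burden = (attributable incidence per 100 child-years / 100)
    # multiplied by the region's person-time at risk.
    return incidence_per_100_child_years / 100.0 * child_years

# 3.48 per 100 child-years is the African infant MSD incidence from the
# abstract; 50 million child-years is hypothetical, for illustration only.
cases = attributable_cases(3.48, 50_000_000)
```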

    Reduced hospitalization rates are not associated with increased mortality or readmission rates in an emergency department in Israel

    Background and Aim: In 2011 the Israeli Ministry of Health (MOH) instructed hospitals to limit occupancy in the internal medicine wards to 120%, which was followed by a nationwide reduction in hospitalization rates. We examined how readmission and mortality rates changed in the five years following the changes in occupancy and hospitalization rates. Methods: All visits to the Tel Aviv Medical Center internal Emergency Medicine Department (ED) in 2010, 2014, and 2016 were captured, excluding visits by patients below 16 years of age and patients with incomplete or faulty data. The main outcomes were one-week readmission rates and one-month death rates. The secondary outcomes were admission rate, ED visit length and admission-delay time (minutes), and rates of admission-delayed patients. Results: After exclusion, a total of 168,891 internal medicine ED patients were included in the analysis. Mean age was 58.0 years and 49% were males. During the relevant period (2010–2016), total medical ED visits increased by 11% (53,327, 56,588, and 59,066 in 2010, 2014, and 2016, respectively). Hospitalization rates decreased from 46% in 2010 to 35% in 2015 (p < 0.001), with the most prominent reduction in the elderly population. One-week readmission rates were 6.5%, 6.4%, and 6.7% in 2010, 2014, and 2016, respectively (p = 0.347 and p = 0.21). One-month mortality was similar in 2010 and 2014 (4.4% and 4.5%, p = 0.388) and lower in 2016 (4.1%, p = 0.048 compared with 2010). Average ED visit length increased from 184 min in 2010 to 238 and 262 min in 2014 and 2016 (p < 0.001 for both), and average delay time to ward admission increased from 97 min in 2010 to 179 and 240 min in 2014 and 2016 (p < 0.001 for both). In 2010, 24% of the admitted patients were delayed in the ED for more than 2 h, a proportion that increased to 53% in 2014 and 66% in 2016 (p < 0.001 for both).
Conclusion: Following the 2011 MOH decision to establish a 120% occupancy limit for internal medicine wards, along with natural population growth, significant changes were noted in the work of a large, presumably representative emergency department in Israel. Although a steady increase in total ED visits along with a steady reduction in hospitalization rates was observed, the readmission and mortality rates remained low. The increase in the average length of ED visits and in the delay from ED admission to a ward reflects a higher burden on the ED. The study was not able to establish a causal connection between the MOH directive and the subsequent changes in ED activity. Nonetheless, the study has significant potential implications for policy makers, including the presence of senior ED physicians after hours, the creation of short-stay diagnostic units, and proper adjustments to ED size and personnel.

    A simple-to-use nomogram to predict long term survival of patients undergoing coronary artery bypass grafting (CABG) using bilateral internal thoracic artery grafting technique.

    BACKGROUND: Several risk scores have been created to predict long-term mortality after coronary artery bypass grafting (CABG). Several studies have demonstrated a reduction in long-term mortality following bilateral internal thoracic artery (BITA) grafting compared with single internal thoracic artery grafting. However, these prediction models usually defined long-term survival as survival of up to 5 years. Moreover, none of these models was built specifically for operations incorporating BITA grafting. METHODS: A historical cohort study of all patients who underwent isolated BITA grafting between 1996 and 2011 at Tel Aviv Sourasky Medical Center, a tertiary referral university-affiliated medical center with a 24-bed cardiothoracic surgery department. The study population (N = 2,935) was randomly divided into two groups: a learning group, used to build the prediction model, and a validation group. Cox regression was used to predict death using pre-procedural risk factors (demographic data, patient comorbidities, cardiac characteristics, and patient status). The accuracy (discrimination and calibration) of the prediction model was evaluated. RESULTS: The learning (1,468 patients) and validation (1,467 patients) groups had similar preoperative characteristics and similar survival. Older age, diabetes mellitus, chronic obstructive lung disease, congestive heart failure, chronic renal failure, old MI, ejection fraction ≤30%, pre-operative use of an intra-aortic balloon, and peripheral vascular disease were significant predictors of mortality and were used to build the prediction model. The area under the ROC curves for 5-, 10-, and 15-year survival ranged between 0.742 and 0.762 for the learning group and between 0.766 and 0.770 for the validation group. The prediction model showed good calibration performance in both groups. A nomogram was built to provide a simple-to-use tool for prediction of 5-, 10-, and 15-year survival.
CONCLUSIONS: A simple-to-use validated model can be used to predict 5-, 10-, and 15-year mortality after CABG using the BITA grafting technique.
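A nomogram is essentially a graphical rendering of a fitted Cox model's linear predictor: points assigned per risk factor sum to a predicted survival probability. The sketch below shows that mapping conceptually; the coefficients and baseline survival are invented placeholders, not the paper's fitted values, and the covariate names are hypothetical.

```python
import math

# Hypothetical illustration of how a Cox-model nomogram scores a patient.
# All coefficients and the baseline survival are invented placeholders,
# NOT the fitted values from the paper.
BETA = {"age_decades_over_60": 0.50, "diabetes": 0.40, "ef_le_30": 0.70}
S0_10YR = 0.85  # assumed baseline 10-year survival, for illustration

def predicted_survival(covariates, baseline=S0_10YR):
    # Cox proportional hazards: S(t | x) = S0(t) ** exp(beta . x);
    # a printed nomogram encodes this same points-to-probability mapping.
    lp = sum(BETA[name] * value for name, value in covariates.items())
    return baseline ** math.exp(lp)

# A hypothetical 60-year-old diabetic patient with ejection fraction > 30%
p10 = predicted_survival({"age_decades_over_60": 0.0, "diabetes": 1, "ef_le_30": 0})
```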

    Long-term biochemical progression-free survival following brachytherapy for prostate cancer: Further insight into the role of short-term androgen deprivation and intermediate risk group subclassification.

    Introduction: Brachytherapy is a well-established treatment for localized prostate cancer. Few studies have documented long-term results, specifically biochemical progression-free survival (bPFS), in men treated with brachytherapy alone, with or without short-term androgen deprivation therapy (ADT), or in combination with external beam radiotherapy (EBRT). Our aim was to analyze the long-term bPFS of brachytherapy-treated patients. Materials and Methods: Retrospective analysis of 1457 patients with low- and intermediate-risk prostate cancer treated with brachytherapy alone (1255) or combined with EBRT (202). Six months of ADT was administered to all EBRT-combined patients and for prostate volume downsizing when >55 cc (328). Failure was defined by the Phoenix definition. Kaplan-Meier analysis and multivariate Cox regression estimated and compared 10-year and 15-year bPFS rates. Results: Median follow-up was 6.1 years. Ten- and 15-year bPFS rates of the entire cohort were 93.2% and 89.2%, respectively. On multivariate analysis, PSA density (PSAD), ADT, and clinical stage were significantly associated with failure. The most powerful independent factor was PSAD, with an HR of 3.5 (95% CI, 1.7-7.4) for PSAD above 0.15. No significant difference was found between low- and intermediate-risk patients regardless of treatment regimen. However, comparison of two intermediate-risk groups, Gleason score (GS) 7, PSA … Conclusions: Our retrospective large-scale study suggests that brachytherapy provides excellent long-term bPFS rates in low- and intermediate-risk disease. Combination of brachytherapy with EBRT yields favorable outcomes in GS 7 intermediate-risk patients, and short-term ADT has a positive effect on outcomes in low-risk patients. Further prospective studies are warranted to discriminate the role of adding either EBRT and/or ADT to brachytherapy protocols.
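Failure "by the Phoenix definition" means a PSA rise of at least 2 ng/mL above the post-treatment nadir. A minimal sketch of that rule (the example PSA series are invented, for illustration only):

```python
def phoenix_failure(psa_series):
    # Phoenix (nadir + 2) definition: biochemical failure occurs when
    # PSA rises by >= 2 ng/mL above the lowest value reached so far.
    nadir = float("inf")
    for psa in psa_series:
        nadir = min(nadir, psa)
        if psa >= nadir + 2.0:
            return True
    return False

# Invented follow-up PSA series (ng/mL), for illustration only
relapse = phoenix_failure([1.0, 0.5, 0.3, 2.5])     # 2.2 ng/mL above nadir
controlled = phoenix_failure([1.0, 0.5, 0.8, 0.6])  # no qualifying rise
```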

    Age-Dependent Biomarkers for Prediction of In-Hospital Mortality in COVID-19 Patients

    Background: Several biomarkers and models have been proposed to predict in-hospital mortality among COVID-19 patients. However, these studies have not examined the association in sub-populations. The present study aimed to identify the association between the two most common inflammatory biomarkers in the emergency department and in-hospital mortality in subgroups of patients. Methods: A historical cohort study of adult patients who were admitted to an acute-care hospital between March and December 2020 with a diagnosis of COVID-19 infection. Data on age, sex, Charlson comorbidity index, white blood cell (WBC) count, C-reactive protein (CRP), and in-hospital mortality were collected. The discrimination ability of each biomarker was assessed, and the CHAID method was used to identify the association in subgroups of patients. Results: Overall, 762 patients (median age 70.9 years, 59.7% males) were included in the study. Of them, 25.1% died during hospitalization. In-hospital mortality was associated with higher CRP (median 138 mg/L vs. 85 mg/L, p < 0.001), higher WBC count (median 8.5 vs. 6.6 K/µL, p < 0.001), and higher neutrophil-to-lymphocyte ratio (NLR) (median 9.2 vs. 5.4, p < 0.001). The area under the ROC curve was similar among all biomarkers (WBC 0.643, NLR 0.677, CRP 0.646, p > 0.1 for all comparisons). The CHAID method revealed that WBC count was associated with in-hospital mortality in patients aged 43.1–66.0 years (<11 K/µL: 10.1% vs. 11+ K/µL: 27.9%), NLR in patients aged 66.1–80 years (≤8: 15.7%, >8: 43.3%), and CRP in patients aged 80.1+ years (≤47 mg/L: 18.8%, 47.1–149 mg/L: 43.1%, and 149.1+ mg/L: 71.7% mortality). Conclusions: WBC, NLR, and CRP present similar discrimination abilities; however, each biomarker should be considered as a predictor of in-hospital mortality in a different age group.
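The age-stratified rule produced by the CHAID analysis can be restated directly from the reported splits. The function below simply encodes the abstract's thresholds and the observed subgroup mortality rates; its behavior for ages below 43.1 years is an assumption, since no biomarker split is reported for that group.

```python
def subgroup_mortality(age, wbc, nlr, crp):
    # Encodes the age-stratified CHAID splits reported above; returns the
    # observed in-hospital mortality (%) for the patient's subgroup.
    # Units: age in years, WBC in K/uL, CRP in mg/L.
    if 43.1 <= age <= 66.0:
        return 27.9 if wbc >= 11 else 10.1
    if 66.1 <= age <= 80.0:
        return 43.3 if nlr > 8 else 15.7
    if age > 80.0:
        if crp <= 47:
            return 18.8
        return 43.1 if crp <= 149 else 71.7
    # No split reported for ages < 43.1 years (assumption: return None)
    return None
```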