106 research outputs found

    Effectiveness of Compounded Bioidentical Hormone Replacement Therapy: An Observational Cohort Study

    Background Bioidentical Hormone Replacement Therapy (BHRT) is believed to be a safer and equally effective alternative to conventional hormone therapy for the relief of menopausal symptoms; however, data are needed to support these claims. The objective of this study was to evaluate the effectiveness of compounded BHRT provided in six community pharmacies. Methods This was an observational cohort study of women aged 18 to 89 years who received a compounded BHRT product from January 1, 2003 to April 30, 2010 in six community pharmacies. Data included patient demographics, comorbidities, therapeutic outcomes, and hormone therapies. Women self-rated menopausal symptoms as absent, mild, moderate, or severe. Descriptive statistics were used to characterize the patient population, BHRT use, and adverse events. Symptom severity was compared at baseline and at 3 to 6 months of follow-up using the Wilcoxon signed-rank test. Results Women (n = 296) receiving BHRT at Oakdell Pharmacy had a mean (standard deviation) age of 52 (9) years. The most common BHRT dosage forms were topical (71%) and oral (43%). Compounded BHRT regimens were generally initiated at low doses regardless of route. Women experienced a 25% decrease in emotional lability (p < 0.01), a 25% decrease in irritability (p < 0.01), and a 22% reduction in anxiety (p = 0.01) within 3 to 6 months. They also experienced a 14% reduction in night sweats (p = 0.09) and a 6% reduction in hot flashes (p = 0.50). Conclusions This study demonstrates that compounded BHRT improves mood symptoms. Larger studies are needed to examine the impact on vasomotor symptoms, myocardial infarction and breast cancer.
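    The comparison described in the Methods (paired ordinal symptom ratings at baseline and at 3 to 6 months, tested with the Wilcoxon signed-rank test) can be sketched as follows. This is an illustrative example only: the severity coding and the paired values are invented, not study data.

```python
# Illustrative sketch of the baseline vs. follow-up comparison described above.
# Severity is coded ordinally (0 = absent, 1 = mild, 2 = moderate, 3 = severe);
# the paired values below are fabricated, not study data.
from scipy.stats import wilcoxon

baseline  = [3, 2, 2, 3, 1, 2, 3, 2, 1, 2]   # hypothetical irritability ratings at baseline
follow_up = [2, 1, 2, 2, 1, 1, 2, 2, 0, 1]   # same patients at 3-6 month follow-up

# Wilcoxon signed-rank test on the paired differences (zero differences are
# dropped by the default zero_method="wilcox").
stat, p_value = wilcoxon(baseline, follow_up)
print(f"W = {stat}, p = {p_value:.3f}")
```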

    Utilisation of an operative difficulty grading scale for laparoscopic cholecystectomy

    Background A reliable system for grading the operative difficulty of laparoscopic cholecystectomy would standardise the description of findings and the reporting of outcomes. The aim of this study was to validate a difficulty grading system (Nassar scale), testing its applicability and consistency in two large prospective datasets. Methods Patient and disease-related variables and 30-day outcomes were identified in two prospective cholecystectomy databases: the multi-centre prospective cohort of 8820 patients from the recent CholeS Study and a single-surgeon series of 4089 patients. Operative data and patient outcomes were correlated with the Nassar operative difficulty scale using Kendall's tau for dichotomous variables or Jonckheere–Terpstra tests for continuous variables. An ROC curve analysis was performed to quantify the predictive accuracy of the scale for each outcome, with continuous outcomes dichotomised prior to analysis. Results A higher operative difficulty grade was consistently associated with worse outcomes in both the reference and CholeS cohorts. The median length of stay increased from 0 to 4 days, and the 30-day complication rate from 7.6% to 24.4%, as the difficulty grade increased from 1 to 4/5 (both p < 0.001). In the CholeS cohort, a higher difficulty grade was most strongly associated with conversion to open surgery and 30-day mortality (AUROC = 0.903 and 0.822, respectively). On multivariable analysis, the Nassar operative difficulty scale was a significant independent predictor of operative duration, conversion to open surgery, 30-day complications and 30-day reintervention (all p < 0.001). Conclusion We have shown that an operative difficulty scale can standardise the description of operative findings by surgeons of multiple grades to facilitate audit, training assessment and research. It provides a tool for reporting operative findings, disease severity and technical difficulty, and can be used in future research to reliably compare outcomes according to case mix and intra-operative difficulty.
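    As a rough illustration of the correlation and discrimination analyses described above, the sketch below computes Kendall's tau between an ordinal difficulty grade and a dichotomous outcome, and an AUROC for the grade as a predictor of that outcome. The grades and outcomes are hypothetical, and the Jonckheere–Terpstra test used for continuous outcomes would need a separate implementation or package.

```python
# Minimal sketch, not the study's analysis code: correlate an ordinal operative
# difficulty grade with a binary outcome and quantify its predictive accuracy.
from scipy.stats import kendalltau
from sklearn.metrics import roc_auc_score

# Hypothetical data: Nassar-style grade (1-5) and conversion to open surgery (0/1).
grade     = [1, 1, 2, 2, 3, 3, 4, 4, 5, 5]
converted = [0, 0, 0, 0, 0, 1, 0, 1, 1, 1]

tau, p_value = kendalltau(grade, converted)   # association between grade and outcome
auroc = roc_auc_score(converted, grade)       # discrimination of the grade for the outcome

print(f"Kendall's tau = {tau:.2f} (p = {p_value:.3f}), AUROC = {auroc:.3f}")
```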

    Maximal-effort cytoreductive surgery for ovarian cancer patients with a high tumor burden: variations in practice and impact on outcome

    Background This study aimed to compare the outcomes of two distinct patient populations treated within two neighboring UK cancer centers (A and B) for advanced epithelial ovarian cancer (EOC). Methods A retrospective analysis of all new stage 3 and 4 EOC patients treated between January 2013 and December 2014 was performed. The Mayo Clinic surgical complexity score (SCS) was applied. Cox regression analysis identified the impact of treatment methods on survival. Results The study identified 249 patients (127 at center A and 122 at center B) without significant differences in International Federation of Gynecology and Obstetrics (FIGO) stage (FIGO 4, 29.7% at centers A and B), Eastern Cooperative Oncology Group (ECOG) performance status (ECOG < 2, 89.9% at centers A and B), or histology (serous type in 84.1% at centers A and B). The patients at center A were more likely to undergo surgery (87% vs 59.8%; p < 0.001). The types of chemotherapy and the proportion of patients receiving palliative treatment alone (3.6%) were equivalent between the two centers. The median SCS was significantly higher at center A (9 vs 2; p < 0.001), with greater tumor burden (9 vs 6 abdominal fields involved; p < 0.001), longer median operation times (285 vs 155 min; p < 0.001), and longer hospital stays (9 vs 6 days; p < 0.001), but surgical morbidity and mortality were equivalent. The independent predictors of reduced overall survival (OS) were non-serous histology (hazard ratio [HR] 1.6; 95% confidence interval [CI] 1.04–2.61), ECOG higher than 2 (HR 1.9; 95% CI 1.15–3.13), and palliation alone (HR 3.43; 95% CI 1.51–7.81). Cytoreduction, regardless of timing, had an independent protective impact on OS compared with chemotherapy alone (HR 0.31 for interval surgery and 0.39 for primary surgery), even after adjustment for other prognostic factors. Conclusions Incorporating surgery into initial EOC management, even for patients with a greater tumor burden and more disseminated disease, may require more complex procedures and more resources in terms of theater time and hospital stay, but appears to be associated with a significant prolongation of the patients' overall survival compared with chemotherapy alone.

    Maximal-effort cytoreductive surgery aimed at total macroscopic tumor clearance, combined with platinum-based chemotherapy and targeted agents, is the cornerstone of modern primary epithelial ovarian cancer (EOC) management.1 Although findings have shown high tumor burden to be associated with a less favorable overall outcome than more advantageous tumor dissemination patterns with less disease,2 multiple prospective and retrospective series have long demonstrated a strong positive association between total macroscopic tumor clearance rates and survival, not only on an individual basis but also at the level of large patient cohorts, in which individual tumor biology-related factors are less likely to skew collective survival data.1,3,4,5,6,7,8 The team of Chi et al. recently presented survival data for all advanced EOC patients treated at Memorial Sloan Kettering, categorized by year of primary debulking surgery, based on the implementation of surgical changes in their approach to ovarian cancer debulking. Their study demonstrated that complete gross resection rates, progression-free survival (PFS) and overall survival (OS) increased during the 13-year evaluation period despite operating on higher-stage disease and patients with a greater tumor burden. This was attributed largely to the surgical paradigm shifts implemented specifically to achieve more complete surgical cytoreduction, even for patients with a less favorable disease profile.4 Nevertheless, as with all medical and surgical advances, broader implementation varies greatly nationally and internationally, not only because of differences in available resources, but also because of long-established local practice and broad disparities in overall philosophy as well as in individual and infrastructural expertise.3,6,8,9 Especially for patients with a high tumor burden, in whom the therapeutic effort is often challenged not only by the disease itself but also by the impact that this advanced disease has on the patient, both personal and infrastructural resources and expertise are often stretched, and hence reasonable doubt arises about the limitations and limits of optimal treatment.2,3,6 The current analysis aimed to demonstrate how differences in local practice may influence patient outcome by evaluating not only the surgical patients, but the entire EOC cohort treated at one of two large UK cancer centers, in an attempt to exclude a selection bias toward seemingly more favorable and operable patients7,10,11 and to include all ovarian cancer patients in the denominator, including those women with more adverse tumor profiles and higher tumor loads.
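    The Cox regression adjustment described above (histology, treatment modality and other prognostic factors as predictors of overall survival) can be sketched with the lifelines package. The variables and data below are fabricated for illustration and do not reproduce the study's model.

```python
# Illustrative Cox proportional hazards model of overall survival, loosely
# mirroring the adjustment described above. All values are fabricated.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.DataFrame({
    "os_months":     [12, 30, 45, 8, 60, 22, 15, 40, 18, 55],  # follow-up time
    "death":         [1,  1,  0,  1, 0,  0,  1,  0,  1,  0],   # 1 = died, 0 = censored
    "non_serous":    [0,  0,  1,  1, 0,  0,  1,  0,  1,  0],   # non-serous histology
    "cytoreduction": [1,  1,  1,  0, 1,  0,  0,  1,  0,  1],   # any cytoreductive surgery vs chemotherapy alone
})

cph = CoxPHFitter()
cph.fit(df, duration_col="os_months", event_col="death")
cph.print_summary()   # hazard ratios with 95% confidence intervals
```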

    Inheritance of resistance to fusarium wilt in chickpea

    This preliminary study indicated that resistance to race 2 of fusarium wilt is controlled by two genes: for complete resistance, the first must be present in the homozygous recessive form and the other in the dominant form, whether homozygous or heterozygous. Early wilting results if the other gene is homozygous recessive, and late wilting occurs if both loci are dominant. Differences among chickpea cultivars in the time taken to express the initial symptoms of fusarium wilt were observed.
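    Read literally, the two-gene model above implies a simple classification of F2 genotypes from a dihybrid cross. The sketch below enumerates the sixteen equally likely allele combinations and applies that reading; the gene symbols A/a and B/b are hypothetical, and plants dominant at the first locus but homozygous recessive at the second are classed as early wilting per the "early wilting if the other gene is homozygous recessive" statement.

```python
# Sketch of the two-gene model described above, enumerating F2 progeny of a
# dihybrid cross (AaBb x AaBb). Locus A must be homozygous recessive (aa) and
# locus B must carry a dominant allele for complete resistance; a homozygous
# recessive second locus (bb) gives early wilting, and dominance at both loci
# gives late wilting. Gene symbols are hypothetical.
from collections import Counter
from itertools import product

def phenotype(a_alleles, b_alleles):
    aa = a_alleles.count("a") == 2     # first locus homozygous recessive
    b_dom = "B" in b_alleles           # second locus carries a dominant allele
    if aa and b_dom:
        return "complete resistance"
    if not b_dom:
        return "early wilting"
    return "late wilting"

# All 16 equally likely F2 allele combinations from Aa x Aa and Bb x Bb.
counts = Counter(
    phenotype((a1, a2), (b1, b2))
    for a1, a2 in product("Aa", repeat=2)
    for b1, b2 in product("Bb", repeat=2)
)
print(counts)   # expected 3 : 4 : 9 out of 16 under this reading of the model
```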

    Population‐based cohort study of outcomes following cholecystectomy for benign gallbladder diseases

    Background The aim was to describe the management of benign gallbladder disease and identify characteristics associated with all‐cause 30‐day readmissions and complications in a prospective population‐based cohort. Methods Data were collected on consecutive patients undergoing cholecystectomy in acute UK and Irish hospitals between 1 March and 1 May 2014. Potential explanatory variables influencing all‐cause 30‐day readmissions and complications were analysed by means of multilevel, multivariable logistic regression modelling using a two‐level hierarchical structure with patients (level 1) nested within hospitals (level 2). Results Data were collected on 8909 patients undergoing cholecystectomy from 167 hospitals. Some 1451 cholecystectomies (16·3 per cent) were performed as an emergency, 4165 (46·8 per cent) as elective operations, and 3293 patients (37·0 per cent) had had at least one previous emergency admission, but had surgery on a delayed basis. The readmission and complication rates at 30 days were 7·1 per cent (633 of 8909) and 10·8 per cent (962 of 8909) respectively. Both readmissions and complications were independently associated with increasing ASA fitness grade, duration of surgery, and increasing numbers of emergency admissions with gallbladder disease before cholecystectomy. No identifiable hospital characteristics were linked to readmissions and complications. Conclusion Readmissions and complications following cholecystectomy are common and associated with patient and disease characteristics
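    The two-level modelling described above (patients nested within hospitals) can be sketched with a random-intercept logistic model. The sketch below uses statsmodels' Bayesian mixed GLM on simulated data; the column names, effect sizes and choice of a variational Bayes fit are assumptions for illustration, not the published specification.

```python
# Sketch of a two-level (patients within hospitals) logistic model for 30-day
# readmission, in the spirit of the multilevel modelling described above.
import numpy as np
import pandas as pd
from statsmodels.genmod.bayes_mixed_glm import BinomialBayesMixedGLM

rng = np.random.default_rng(0)
n = 400
df = pd.DataFrame({
    "hospital": rng.integers(0, 20, n),                 # 20 hospitals (level 2)
    "asa_grade": rng.integers(1, 4, n),                 # ASA fitness grade 1-3
    "op_duration_min": rng.normal(70, 25, n).round(),   # operative duration
    "prior_admissions": rng.integers(0, 4, n),          # prior emergency admissions
})
# Simulate readmission risk rising with ASA grade and prior admissions.
logit = -3 + 0.5 * df["asa_grade"] + 0.4 * df["prior_admissions"]
df["readmit_30d"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))

model = BinomialBayesMixedGLM.from_formula(
    "readmit_30d ~ asa_grade + op_duration_min + prior_admissions",  # patient-level fixed effects
    {"hospital": "0 + C(hospital)"},                                 # hospital random intercept
    df,
)
result = model.fit_vb()   # variational Bayes fit; fit_map() is an alternative
print(result.summary())
```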

    The development and validation of a scoring tool to predict the operative duration of elective laparoscopic cholecystectomy

    Background: The ability to accurately predict operative duration has the potential to optimise theatre efficiency and utilisation, thus reducing costs and increasing staff and patient satisfaction. With laparoscopic cholecystectomy being one of the most commonly performed procedures worldwide, a tool to predict operative duration could be extremely beneficial to healthcare organisations. Methods: Data collected in the CholeS study on patients undergoing cholecystectomy in UK and Irish hospitals between April and May 2014 were used to study operative duration. A multivariable binary logistic regression model was produced to identify significant independent predictors of long (> 90 min) operations. The resulting model was converted to a risk score, which was subsequently validated on a second cohort of patients using ROC curves. Results: After exclusions, data were available for 7227 patients in the derivation (CholeS) cohort. The median operative duration was 60 min (interquartile range 45–85), with 17.7% of operations lasting longer than 90 min. Ten factors were found to be significant independent predictors of operative durations > 90 min, including ASA grade, age, previous surgical admissions, BMI, gallbladder wall thickness and CBD diameter. A risk score was then produced from these factors and applied to a cohort of 2405 patients from a tertiary centre for external validation. This returned an area under the ROC curve of 0.708 (SE = 0.013, p < 0.001), with the probability of an operation lasting > 90 min increasing more than eightfold, from 5.1% to 41.8%, between the extremes of the score. Conclusion: The scoring tool produced in this study was found to be significantly predictive of long operative durations on validation in an external cohort. As such, the tool may have the potential to enable organisations to better organise theatre lists and deliver greater efficiencies in care.
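    The derive-then-validate pattern described above (multivariable logistic regression in a derivation cohort, conversion of the model to a point-based risk score, then ROC validation in a separate cohort) is sketched below on simulated data. The predictors, weights and scoring rule are invented for illustration and are not the published score.

```python
# Sketch: fit a logistic model for long (> 90 min) operations, convert the
# coefficients into integer points, then check discrimination of the score in
# a separate validation cohort. All data and weights are simulated.
import numpy as np
import pandas as pd
import statsmodels.api as sm
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)

def simulate(n):
    df = pd.DataFrame({
        "asa_ge3":    rng.integers(0, 2, n),   # ASA grade 3 or above
        "bmi_ge30":   rng.integers(0, 2, n),   # BMI 30 or above
        "thick_wall": rng.integers(0, 2, n),   # thick-walled gallbladder
    })
    logit = -2.0 + 1.0 * df["asa_ge3"] + 0.7 * df["bmi_ge30"] + 0.9 * df["thick_wall"]
    df["long_op"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))
    return df

derivation, validation = simulate(1000), simulate(400)
predictors = ["asa_ge3", "bmi_ge30", "thick_wall"]

model = sm.Logit(derivation["long_op"], sm.add_constant(derivation[predictors])).fit(disp=0)

# Scale and round the coefficients to integer points for a simple additive score.
points = (2 * model.params[predictors]).round().astype(int)
validation["score"] = validation[predictors].mul(points, axis=1).sum(axis=1)

print("points per predictor:\n", points)
print("validation AUROC:", round(roc_auc_score(validation["long_op"], validation["score"]), 3))
```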

    Introduction of Ophiobolus graminis into new polders and its decline


    The Cholecystectomy As A Day Case (CAAD) Score: A Validated Score of Preoperative Predictors of Successful Day-Case Cholecystectomy Using the CholeS Data Set

    Background Day-case surgery is associated with significant patient and cost benefits. However, only 43% of cholecystectomy patients are discharged home the same day. One hypothesis is that day-case cholecystectomy rates, defined as the proportion of patients discharged on the same day as their operation, may be improved by better assessment of patients using standard preoperative variables. Methods Data were extracted from a prospectively collected data set of cholecystectomy patients from 166 UK and Irish hospitals (CholeS). Cholecystectomies performed as elective procedures were divided into main (75%) and validation (25%) data sets. Preoperative predictors were identified, and a risk score for failed day-case surgery was devised using multivariate logistic regression. Receiver operating characteristic curve analysis was used to validate the score in the validation data set. Results Of the 7426 elective cholecystectomies performed, 49% of patients were discharged home the same day. Same-day discharge following cholecystectomy was less likely with older patients (OR 0.18, 95% CI 0.15–0.23), higher ASA scores (OR 0.19, 95% CI 0.15–0.23), complicated cholelithiasis (OR 0.38, 95% CI 0.31–0.48), male gender (OR 0.66, 95% CI 0.58–0.74), previous acute gallstone-related admissions (OR 0.54, 95% CI 0.48–0.60) and preoperative endoscopic intervention (OR 0.40, 95% CI 0.34–0.47). The CAAD score was developed using these variables. When applied to the validation subgroup, a CAAD score of ≤5 was associated with an 80.8% successful day-case cholecystectomy rate, compared with 19.2% for a CAAD score >5 (p < 0.001). Conclusions The CAAD score, which utilises data readily available from clinic letters and electronic sources, can predict same-day discharges following cholecystectomy.
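    As a small illustration of how a preoperative score threshold (the ≤5 vs >5 cut described above) maps to day-case success rates, the sketch below groups fabricated scores by the cut-off and reports the proportion discharged the same day. The scores and outcomes are invented; the real CAAD score is built from weighted preoperative variables such as those listed in the abstract.

```python
# Sketch: apply a CAAD-style cut-off to fabricated scores and compare the
# proportion of same-day discharges in the low-risk (<= 5) and high-risk (> 5) groups.
import pandas as pd

df = pd.DataFrame({
    "caad_score":    [2, 3, 7, 4, 6, 1, 8, 5, 9, 3, 6, 2],   # hypothetical scores
    "same_day_home": [1, 1, 0, 1, 0, 1, 0, 1, 0, 1, 1, 1],   # 1 = discharged same day
})

df["low_risk"] = df["caad_score"] <= 5
rates = df.groupby("low_risk")["same_day_home"].mean()
print(rates)   # proportion discharged the same day within each score group
```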

    The perceived effects of the European working time directive upon training opportunities for specialist registrars in general surgery in the West Midlands

    Background: There is concern in the medical literature that the reduction in work hours resulting from the European Working Time Directive (EWTD) is detrimental to surgical training because of a reduction in workplace-based training opportunities. This is supported by literature suggesting that learning theories applicable to surgical training include social learning and constructivism, and that surgeons are 'hands-on', practical learners. However, there is no conclusive evidence that reduced hours are detrimental to surgical training, and this study aims to explore whether this is indeed the case. Methods: A series of one-to-one semi-structured interviews was performed with Year 5 and 6 Specialist Registrars in General Surgery on the West Midlands Higher Surgical Training scheme. Nine interviews were performed before thematic saturation was reached. Interview transcripts were then thematically analysed in NVivo 9. Results: Participants perceive the EWTD to have reduced training opportunities because of reduced hours, a change to shift working as opposed to 24-hour on-calls, and the introduction of timetabled days off into on-call rotas to make them EWTD-compliant; this time off is largely being used to gain further training opportunities. Trainees are attending courses and undertaking Fellowships to augment their training. There is a difference in opinion as to what constitutes training and what constitutes service provision. Trainees perceive that shift working leads to increased fatigue and disruption to life outside of work. Conclusion: Overall perceptions are of a detrimental effect upon training opportunities for a variety of reasons, which is consistent with the current literature. New theory has been generated regarding the perceptions of service and training activities, and the differing effects of 24-hour on-calls and shifts upon fatigue, which could be explored further with quantitative methodologies.