20 research outputs found

    World Congress Integrative Medicine & Health 2017: Part one

    Is There a Role for Molecular Testing for Low-Risk Differentiated Thyroid Cancer? A Cost-Effectiveness Analysis

    Molecular testing for thyroid nodules has developed rapidly in recent years, aiming to predict the presence of malignancy and aggressive features. While commonly utilized to predict malignancy, its role in guiding the management approach is still developing. The high cost of genetic tests and the long-term course of thyroid cancer limit real-life studies. Objective: To evaluate the cost-effectiveness of molecular testing for low-risk differentiated thyroid cancer (lrDTC). Methods: We developed a Markovian decision tree model of a simulated lrDTC cohort, comparing two management strategies: (I) Conducting genetic tests (GT)—patients are stratified into three risk groups for distant metastasis by the identified molecular markers (low-, intermediate- and high-risk molecular profile) and managed accordingly: low-risk patients undergo hemithyroidectomy (HT), intermediate-risk patients undergo total thyroidectomy (TT), and high-risk patients undergo TT with central neck dissection; (II) Without genetic tests (wGT)—all patients undergo HT, per the ATA recommendations for lrDTC. Outcomes were measured as quality-adjusted life years (QALYs) and the costs of each strategy. Results: GT was found to be cost-effective, leading to a gain of 1.7 QALYs at an additional cost of $327 per patient compared to the wGT strategy. This yielded an incremental cost-effectiveness ratio of $190 per QALY. Sensitivity analysis demonstrated robust results across the variables' ranges. The most impactful variable was the benefit of performing TT rather than HT for intermediate- to high-risk patients. Conclusions: Our model found that molecular testing for lrDTC is cost-effective, allowing management tailored to the patient's personal risk level as reflected in the genetic profile, hence improving outcomes.
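    As a rough check on the arithmetic behind these figures, the short Python sketch below recomputes the incremental cost-effectiveness ratio (ICER) from the two headline values in the abstract ($327 incremental cost, 1.7 incremental QALYs). The function name and the willingness-to-pay threshold are illustrative assumptions, not from the study; the result (about $192/QALY) matches the reported $190/QALY up to rounding.

    # Minimal ICER sketch; function name and threshold are illustrative.
    def icer(delta_cost, delta_qalys):
        """Incremental cost-effectiveness ratio: extra cost per QALY gained."""
        return delta_cost / delta_qalys

    delta_cost = 327.0   # additional cost per patient of GT vs. wGT, USD
    delta_qalys = 1.7    # QALYs gained per patient with GT
    ratio = icer(delta_cost, delta_qalys)
    print(f"ICER = ${ratio:.0f} per QALY")  # ~$192; abstract reports $190 (rounding)

    # A strategy is commonly called cost-effective when its ICER falls below a
    # willingness-to-pay threshold (often $50,000-$150,000 per QALY in the US).
    assert ratio < 50_000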

    A Five-Decade Text Mining Analysis of Cochlear Implant Research: Where We Started and Where We Are Heading

    Background and Objectives: Since its invention in the 1970s, the cochlear implant (CI) has developed substantially. We aimed to assess trends in the published literature to characterize CI research. Materials and Methods: We queried PubMed for all CI-related entries published during 1970–2022. The following data were extracted: year of publication, publishing journal, title, keywords, and abstract text. Search terms covered the patient's age group, etiology of hearing loss, indications for CI, and surgical methodological advancements. Annual publication trends were plotted, and their slopes were calculated by fitting regression lines to the yearly number of publications. Results: Overall, 19,428 CI-related articles were identified. Pediatric CI was the most dominant sub-population among the age groups, with the highest rate and steepest slope over the years (slope 5.2 ± 0.3, p < 0.001). Conclusions: Publications regarding pediatric CI outnumbered all other indications, supporting the rising, pivotal role of CI in the rehabilitation of children with sensorineural hearing loss. Hearing-preservation publications have risen rapidly in recent years, identified as the primary trend of the current era, followed by a sharp rise in robotic surgery, which is evolving and could define the next revolution.
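    The slope-fitting step in the Methods is a straightforward least-squares regression of yearly publication counts on year. Below is a minimal sketch, assuming SciPy and synthetic placeholder counts (not the study's PubMed data), of how a slope with its standard error and p-value is obtained:

    # Fit a regression line to yearly publication counts and report
    # slope +/- standard error, as in the abstract's trend analysis.
    # The counts below are synthetic placeholders, not the study's data.
    import numpy as np
    from scipy.stats import linregress

    years = np.arange(1970, 2023)
    rng = np.random.default_rng(0)
    counts = np.maximum(0.0, 5.2 * (years - 1970) + rng.normal(0, 15, years.size))

    fit = linregress(years, counts)
    print(f"slope = {fit.slope:.1f} +/- {fit.stderr:.1f} papers/year, p = {fit.pvalue:.2g}")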

    Whole-tree water balance and indicators for short-term drought stress in non-bearing 'Barnea' olives

    Drainage-weighing lysimeters allowed monitoring of the water balance components of non-bearing olive (Olea europaea cv. Barnea) trees over a 3-month period that included short-term events of controlled but severe water stress. The objective of the study was to evaluate a variety of soil- and plant-based water status and drought stress monitoring methods on the basis of tree-scale evapotranspiration (ET). As the trees entered into and recovered from water stress, meteorological data, actual ET (ETa), soil water content, and changes in leaf turgor pressure were continuously monitored. Additionally, midday measurements of stem water potential, stomatal conductance, canopy temperature, and quantum yield of PSII photochemistry were conducted. Diurnal (dawn-to-dusk) measurements of all the above were made hourly on days of maximum stress. Shoot elongation rate was measured for periods of stress and recovery. Quantum yield of PSII photochemistry, stomatal conductance, and stem water potential all successfully indicated reductions in whole-tree water consumption beginning at moderate stress levels. These measured parameters fully recovered to the levels of non-stressed trees soon after water application was renewed. Shoot elongation was reduced by 25-30% for the 10-day period during and following drought and recovered thereafter to the levels of non-stressed trees. Whole-tree ETa was reduced by as much as 20% even following full recovery of the leaf-level parameters, suggesting reduced canopy size and growth due to the stress period. Non-destructive, continuous (turgor pressure) and remotely sensed (canopy temperature) methods showed promising potential for monitoring the effects of water stress, in spite of technological and data-interpretation challenges requiring further attention.
    Keywords: Lysimeter; plant growth; transpiration; physiological monitoring; water status; Olea europaea
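    For context, a drainage-weighing lysimeter yields actual evapotranspiration from a simple mass balance: ETa = irrigation + precipitation - drainage - change in stored water, with the storage change read from the lysimeter's weight. The sketch below is a minimal illustration of that balance; the function name and sample values are assumptions for illustration, not the study's measurements.

    # Daily actual ET (mm) from a drainage-weighing lysimeter mass balance.
    # All names and numbers here are illustrative, not from the study.
    def daily_eta_mm(irrigation_mm, precipitation_mm, drainage_mm,
                     mass_change_kg, lysimeter_area_m2):
        """ETa = inputs - drainage - storage change (1 kg/m^2 of water = 1 mm)."""
        storage_change_mm = mass_change_kg / lysimeter_area_m2
        return irrigation_mm + precipitation_mm - drainage_mm - storage_change_mm

    # Example: 6 mm irrigated, no rain, 1 mm drained, tank lost 1.5 kg over 1.5 m^2.
    print(daily_eta_mm(6.0, 0.0, 1.0, -1.5, 1.5))  # -> 6.0 mm of tree water use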