17 research outputs found

    Inter-Observer Variation in the Pathologic Identification of Minimal Extrathyroidal Extension in Papillary Thyroid Carcinoma

    Background: Extrathyroidal extension (ETE) is a significant prognostic factor in papillary thyroid carcinoma (PTC). Minimal extrathyroidal extension (mETE) is characterized by involvement of the sternothyroid muscle or perithyroid soft tissue, and is generally identified by light microscopic examination. Patients with mETE, identified pathologically, are automatically upstaged to pT3. However, the prognostic implications of mETE have been a source of controversy in the literature. Moreover, there is also controversy surrounding the identification of mETE on pathological specimens. The objective of this study was to determine the level of agreement among expert pathologists in the identification of mETE in PTC cases. Methods: Eleven expert pathologists from the United States, Italy, and Canada were asked to perform a review of 69 scanned slides of representative permanent sections of PTC specimens. Each slide was evaluated for the presence of mETE. The pathologists were also asked to list the criteria they use to identify mETE. Results: The overall strength of agreement for identifying mETE was slight (κ = 0.14). Inter-pathologist agreement was best for perithyroidal skeletal muscle involvement (κ = 0.46, moderate agreement) and worst for invasion around thick-walled vascular structures (κ = 0.02, slight agreement). In addition, there was disagreement over the constellation of histologic features that are diagnostic for mETE, which affected overall agreement for diagnosing mETE. Conclusions: Overall agreement for the identification of mETE is poor. Disagreement is a result of both variation in individual pathologists' interpretations of specimens and disagreement on the histologic criteria for mETE. Thus, the utility of mETE in staging and treatment of PTC is brought into question.
    The lack of concordance may explain the apparent lack of agreement regarding the prognostic significance of this pathologic feature.
    Peer Reviewed
    https://deepblue.lib.umich.edu/bitstream/2027.42/140271/1/thy.2015.0508.pd
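    A κ statistic for agreement among many raters making a categorical call, as here, is typically Fleiss' kappa: the mean observed pairwise agreement per slide, corrected for the agreement expected by chance from the marginal category proportions. A minimal sketch of the computation, on made-up toy ratings rather than the study's data:

```python
# Fleiss' kappa for agreement among multiple raters on a categorical
# call (e.g. mETE present/absent). Illustrative sketch; the rating
# matrix below is toy data, not the study's 69 slides.

def fleiss_kappa(ratings):
    """ratings[i][j] = number of raters assigning subject i to category j."""
    n_subjects = len(ratings)
    n_raters = sum(ratings[0])  # raters per subject (assumed constant)
    n_categories = len(ratings[0])

    # Per-subject agreement P_i: fraction of concordant rater pairs
    p_i = [
        (sum(c * c for c in row) - n_raters) / (n_raters * (n_raters - 1))
        for row in ratings
    ]
    p_bar = sum(p_i) / n_subjects

    # Chance agreement P_e from marginal category proportions
    totals = [sum(row[j] for row in ratings) for j in range(n_categories)]
    p_j = [t / (n_subjects * n_raters) for t in totals]
    p_e = sum(p * p for p in p_j)

    return (p_bar - p_e) / (1 - p_e)

# Toy data: 4 slides, 11 raters each, columns = [mETE absent, mETE present]
toy = [[9, 2], [4, 7], [6, 5], [10, 1]]
print(round(fleiss_kappa(toy), 2))  # prints 0.13 - "slight" agreement
```

    Values near 0 mean agreement barely above chance, which is why κ = 0.14 is read as "slight" on the usual Landis–Koch scale.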

    Inter-Observer Variation in the Pathologic Identification of Extranodal Extension in Nodal Metastasis from Papillary Thyroid Carcinoma

    Background: Extranodal extension (ENE) in lymph node metastases has been shown to worsen the prognosis of papillary thyroid cancer (PTC). Despite the clinical significance of ENE, there are no stringent criteria for its microscopic diagnosis, and its identification is subject to inter-observer variability. The objective of this study was to determine the level of agreement among expert pathologists in the identification of ENE in PTC cases. Methods: Eleven expert pathologists from the United States, Italy, and Canada were asked to review 61 scanned slides of representative permanent sections of PTC specimens from Mount Sinai Beth Israel Medical Center in New York. Each slide was evaluated for the presence of ENE. The pathologists were also asked to report the criteria they use to identify ENE. Results: The overall strength of agreement in identifying ENE was only fair (κ = 0.35), and the proportion of observed agreement was 0.68. The proportions of observed agreement for the identification of perinodal structures (fat, nerve, skeletal, and thick-walled vessel involvement) ranged from 0.61 to 0.997. Conclusions: Overall agreement for the identification of ENE is poor. The lack of agreement results from both variation in pathologists' identification of features and disagreement on the histologic criteria for ENE. This lack of concordance may help explain some of the discordant information regarding prognosis in clinical studies when this feature is identified.
    Peer Reviewed
    https://deepblue.lib.umich.edu/bitstream/2027.42/140272/1/thy.2015.0551.pd
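    The "proportion of observed agreement" reported alongside κ is simpler than the kappa itself: for each slide, the fraction of rater pairs making the same call, averaged over slides, with no chance correction. A sketch on toy data (not the study's):

```python
# Proportion of observed agreement: per slide, the fraction of rater
# pairs giving the same binary ENE call, averaged over slides.
# Toy data only; the study used 61 slides and 11 raters.
from itertools import combinations

def observed_agreement(calls_per_slide):
    """calls_per_slide: list of per-slide lists of rater calls (0/1)."""
    proportions = []
    for calls in calls_per_slide:
        pairs = list(combinations(calls, 2))
        concordant = sum(a == b for a, b in pairs)
        proportions.append(concordant / len(pairs))
    return sum(proportions) / len(proportions)

# 3 slides, 4 raters each
toy = [[1, 1, 1, 0], [0, 0, 0, 0], [1, 0, 1, 0]]
print(round(observed_agreement(toy), 2))  # prints 0.61
```

    Because this quantity ignores chance agreement, it can look respectable (0.68 here) even when the chance-corrected κ is only fair (0.35).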

    Large expert-curated database for benchmarking document similarity detection in biomedical literature search

    Document recommendation systems for locating relevant literature have mostly relied on methods developed a decade ago. This is largely due to the lack of a large offline gold-standard benchmark of relevant documents that cover a variety of research fields such that newly developed literature search techniques can be compared, improved and translated into practice. To overcome this bottleneck, we have established the RElevant LIterature SearcH consortium consisting of more than 1500 scientists from 84 countries, who have collectively annotated the relevance of over 180 000 PubMed-listed articles with regard to their respective seed (input) articles. The majority of annotations were contributed by highly experienced, original authors of the seed articles. The collected data cover 76% of all unique PubMed Medical Subject Headings descriptors. No systematic biases were observed across different experience levels, research fields or time spent on annotations. More importantly, annotations of the same document pairs contributed by different scientists were highly concordant. We further show that the three representative baseline methods used to generate recommended articles for evaluation (Okapi Best Matching 25, Term Frequency-Inverse Document Frequency and PubMed Related Articles) had similar overall performances. Additionally, we found that these methods each tend to produce distinct collections of recommended articles, suggesting that a hybrid method may be required to completely capture all relevant articles. The established database server located at https://relishdb.ict.griffith.edu.au is freely available for the downloading of annotation data and the blind testing of new methods. We expect that this benchmark will be useful for stimulating the development of new powerful techniques for title and title/abstract-based search engines for relevant articles in biomedical research.
    Peer reviewed
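    Of the three baselines, Okapi BM25 scores a candidate document by summing, over query terms, an inverse-document-frequency weight times a saturating, length-normalized term frequency. A minimal sketch on a toy corpus (the tuning constants k1 and b are the conventional defaults, not values from the benchmark):

```python
# Minimal Okapi BM25 sketch: rank toy documents against a query that
# plays the role of a seed article's terms. Illustrative only.
import math
from collections import Counter

def bm25_scores(query_terms, docs, k1=1.5, b=0.75):
    """docs: list of token lists; returns one BM25 score per doc."""
    n_docs = len(docs)
    avgdl = sum(len(d) for d in docs) / n_docs
    df = Counter()                     # document frequency per term
    for d in docs:
        df.update(set(d))
    scores = []
    for d in docs:
        tf = Counter(d)                # term frequency in this doc
        score = 0.0
        for term in query_terms:
            if term not in tf:
                continue
            idf = math.log(1 + (n_docs - df[term] + 0.5) / (df[term] + 0.5))
            # Saturating tf, normalized by doc length relative to average
            norm = tf[term] * (k1 + 1) / (
                tf[term] + k1 * (1 - b + b * len(d) / avgdl))
            score += idf * norm
        scores.append(score)
    return scores

docs = [
    "thyroid carcinoma staging criteria".split(),
    "benchmark for literature search".split(),
    "document similarity in biomedical literature search".split(),
]
query = "literature search benchmark".split()
scores = bm25_scores(query, docs)
best = max(range(len(docs)), key=scores.__getitem__)
print(best)  # the second document, which matches all three query terms
```

    The abstract's observation that BM25, TF-IDF and PubMed Related Articles retrieve distinct article sets is what motivates hybrid scoring.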

    The IDENTIFY study: the investigation and detection of urological neoplasia in patients referred with suspected urinary tract cancer - a multicentre observational study

    Objective To evaluate the contemporary prevalence of urinary tract cancer (bladder cancer, upper tract urothelial cancer [UTUC] and renal cancer) in patients referred to secondary care with haematuria, adjusted for established patient risk markers and geographical variation. Patients and Methods This was an international multicentre prospective observational study. We included patients aged ≥16 years, referred to secondary care with suspected urinary tract cancer. Patients with a known or previous urological malignancy were excluded. We estimated the prevalence of bladder cancer, UTUC, renal cancer and prostate cancer, stratified by age, type of haematuria, sex, and smoking. We used a multivariable mixed-effects logistic regression to adjust cancer prevalence for age, type of haematuria, sex, smoking, hospitals, and countries. Results Of the 11 059 patients assessed for eligibility, 10 896 were included from 110 hospitals across 26 countries. The overall adjusted cancer prevalence (n = 2257) was 28.2% (95% confidence interval [CI] 22.3–34.1), bladder cancer (n = 1951) 24.7% (95% CI 19.1–30.2), UTUC (n = 128) 1.14% (95% CI 0.77–1.52), renal cancer (n = 107) 1.05% (95% CI 0.80–1.29), and prostate cancer (n = 124) 1.75% (95% CI 1.32–2.18). The odds ratios for patient risk markers in the model for all cancers were: age 1.04 (95% CI 1.03–1.05; P < 0.001), visible haematuria 3.47 (95% CI 2.90–4.15; P < 0.001), male sex 1.30 (95% CI 1.14–1.50; P < 0.001), and smoking 2.70 (95% CI 2.30–3.18; P < 0.001). Conclusions A better understanding of cancer prevalence across an international population is required to inform clinical guidelines. We are the first to report urinary tract cancer prevalence across an international population in patients referred to secondary care, adjusted for patient risk markers and geographical variation. Bladder cancer was the most prevalent disease. Visible haematuria was the strongest predictor for urinary tract cancer.
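    The odds ratios above come from exponentiating logistic-regression coefficients; a 95% CI is the coefficient ±1.96 standard errors, exponentiated. A sketch of that conversion; the standard error below is back-derived from the reported haematuria interval for illustration, not a value from the study:

```python
# Convert a logistic-regression coefficient (log odds ratio) and its
# standard error into an odds ratio with a 95% Wald CI.
import math

def odds_ratio_ci(beta, se, z=1.96):
    """Return (lower, OR, upper), each rounded to 2 decimals."""
    return tuple(round(math.exp(beta + k * z * se), 2) for k in (-1, 0, 1))

# Visible haematuria: OR 3.47, so beta = log(3.47); the SE of 0.0917 is
# an assumption back-calculated from the reported 2.90-4.15 interval.
lo, or_, hi = odds_ratio_ci(math.log(3.47), 0.0917)
print(or_, (lo, hi))  # prints 3.47 (2.9, 4.15)
```

    On this scale an OR of 3.47 means visible haematuria multiplies the odds (not the probability) of cancer by about 3.5, holding the other markers fixed.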

    Effect of angiotensin-converting enzyme inhibitor and angiotensin receptor blocker initiation on organ support-free days in patients hospitalized with COVID-19

    IMPORTANCE Overactivation of the renin-angiotensin system (RAS) may contribute to poor clinical outcomes in patients with COVID-19. OBJECTIVE To determine whether angiotensin-converting enzyme (ACE) inhibitor or angiotensin receptor blocker (ARB) initiation improves outcomes in patients hospitalized for COVID-19. DESIGN, SETTING, AND PARTICIPANTS In an ongoing, adaptive platform randomized clinical trial, 721 critically ill and 58 non–critically ill hospitalized adults were randomized to receive an RAS inhibitor or control between March 16, 2021, and February 25, 2022, at 69 sites in 7 countries (final follow-up on June 1, 2022). INTERVENTIONS Patients were randomized to receive open-label initiation of an ACE inhibitor (n = 257), ARB (n = 248), ARB in combination with DMX-200 (a chemokine receptor-2 inhibitor; n = 10), or no RAS inhibitor (control; n = 264) for up to 10 days. MAIN OUTCOMES AND MEASURES The primary outcome was organ support–free days, a composite of hospital survival and days alive without cardiovascular or respiratory organ support through 21 days. The primary analysis was a bayesian cumulative logistic model. Odds ratios (ORs) greater than 1 represent improved outcomes. RESULTS On February 25, 2022, enrollment was discontinued due to safety concerns. Among 679 critically ill patients with available primary outcome data, the median age was 56 years and 239 participants (35.2%) were women. Median (IQR) organ support–free days among critically ill patients was 10 (–1 to 16) in the ACE inhibitor group (n = 231), 8 (–1 to 17) in the ARB group (n = 217), and 12 (0 to 17) in the control group (n = 231) (median adjusted odds ratios of 0.77 [95% bayesian credible interval, 0.58-1.06] for improvement for ACE inhibitor and 0.76 [95% credible interval, 0.56-1.05] for ARB compared with control). The posterior probabilities that ACE inhibitors and ARBs worsened organ support–free days compared with control were 94.9% and 95.4%, respectively.
Hospital survival occurred in 166 of 231 critically ill participants (71.9%) in the ACE inhibitor group, 152 of 217 (70.0%) in the ARB group, and 182 of 231 (78.8%) in the control group (posterior probabilities that ACE inhibitor and ARB worsened hospital survival compared with control were 95.3% and 98.1%, respectively). CONCLUSIONS AND RELEVANCE In this trial, among critically ill adults with COVID-19, initiation of an ACE inhibitor or ARB did not improve, and likely worsened, clinical outcomes. TRIAL REGISTRATION ClinicalTrials.gov Identifier: NCT0273570
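    A "posterior probability of worsening" is simply the share of posterior odds-ratio draws below 1 (since OR > 1 means improvement here). A sketch that approximates the ACE-inhibitor posterior with a lognormal matched to the reported median and credible interval; the trial's actual model was a covariate-adjusted bayesian cumulative logistic, so this is only a rough illustration:

```python
# Approximate P(harm) = P(OR < 1) from a lognormal posterior matched to
# the reported summary (median 0.77, 95% CrI 0.58-1.06). Assumption for
# illustration only; not the trial's adjusted model.
import math
import random

random.seed(0)
mu = math.log(0.77)
sigma = (math.log(1.06) - math.log(0.58)) / (2 * 1.96)
draws = [math.exp(random.gauss(mu, sigma)) for _ in range(100_000)]
p_harm = sum(d < 1.0 for d in draws) / len(draws)
print(round(p_harm, 3))  # close to the reported 94.9%
```

    Note that a 95% credible interval spanning 1 (0.58–1.06) can still coexist with a ~95% posterior probability of harm, because most of the posterior mass sits below 1.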

    Farmed Cricket (Acheta domesticus, Gryllus assimilis, and Gryllodes sigillatus; Orthoptera) Welfare Considerations: Recommendations for Improving Global Practice

    Orthoptera, such as crickets, are currently the most reared group of hemimetabolous insects in the insects-as-food-and-feed industry, with over 370 billion individuals slaughtered and/or sold live annually. The most-farmed cricket species is Acheta domesticus; however, there is growing interest in farming at least two additional species, Gryllus assimilis and Gryllodes sigillatus. Crickets are largely being explored for use as human protein and as exotic animal or pet feed, as well as, to a lesser extent, livestock and fish feed. Insect welfare is of growing interest to consumers who are considering incorporating insect protein into their diets, as well as to many producers. However, no studies have considered the welfare concerns of farmed crickets under current industry conditions. Using an established model for assessing farmed insect welfare, we assess potential welfare concerns for the three most-farmed cricket species, including: interspecific interactions (including parasites and pathogens), temperature and humidity, light cycles, electrical shocks, atmospheric gas levels, nutrition and hydration, environmental pollutants, injury and crowding, density, handling-associated stress, genetics and selection, enrichments, transport-related challenges, and stunning, anesthesia, and slaughter/depopulation methods. From our assessment of these factors, we make recommendations for improving cricket welfare now and as the industry continues to grow; in addition, we identify research directions that will improve our understanding of cricket welfare. We conclude by broadly discussing the importance of addressing the welfare challenges presented by the insects-as-food-and-feed industry, both for the animals and for the growth and health of the industry itself.
    Rethink Priorities provided funding to all authors for the researching, writing, and/or editing of this work