
    Survival and risk of adverse events in older patients receiving postoperative adjuvant chemotherapy for resected stages II-IIIA lung cancer: observational cohort study

    Objective To compare the survival and risk of serious adverse events in older patients with stages II-IIIA non-small cell lung cancer treated with or without postoperative platinum-based chemotherapy.

    A complex molecular genetic testing algorithm in the diagnostics of myeloproliferative neoplasms

    Introduction: Mutations in the Janus kinase 2, calreticulin and thrombopoietin receptor genes have been identified in the genetic background of Philadelphia chromosome-negative, "classic" myeloproliferative neoplasms. Aim: The aim of the authors was to identify driver mutations in a large myeloproliferative cohort of 949 patients. Method: A complex array of molecular techniques (qualitative and quantitative allele-specific polymerase chain reactions, fragment analyses, high resolution melting and Sanger sequencing) was applied. Results: All 354 patients with polycythemia vera carried Janus kinase 2 mutations (V617F: 98.6%, exon 12: 1.4%). In essential thrombocythemia (n = 468), the frequency of V617F was 61.3% (n = 287), that of calreticulin 25.2% (n = 118), and that of thrombopoietin receptor mutations 2.1% (n = 10), while 11.3% (n = 53) were triple-negative. A similar distribution was observed in primary myelofibrosis (n = 127): 58.3% (n = 74) V617F, 23.6% (n = 30) calreticulin, 6.3% (n = 8) thrombopoietin receptor mutation-positive and 11.8% (n = 15) triple-negative. Conclusions: The recent discovery of calreticulin gene mutations has enabled a definite molecular diagnosis in around 90% of clonal myeloproliferative cases. Orv. Hetil., 2014, 155(52), 2074-2081.

    Distinct clinical characteristics of myeloproliferative neoplasms with calreticulin mutations

    Somatic insertions/deletions in the calreticulin gene have recently been discovered to be causative alterations in myeloproliferative neoplasms. A combination of qualitative and quantitative allele-specific polymerase chain reaction, fragment-sizing, high resolution melting and Sanger-sequencing was applied for the detection of three driver mutations (in Janus kinase 2, calreticulin and myeloproliferative leukemia virus oncogene genes) in 289 cases of essential thrombocythemia and 99 cases of primary myelofibrosis. In essential thrombocythemia, 154 (53%) Janus kinase 2 V617F, 96 (33%) calreticulin, 9 (3%) myeloproliferative leukemia virus oncogene gene mutation-positive and 30 triple-negative (11%) cases were identified, while in primary myelofibrosis 56 (57%) Janus kinase 2 V617F, 25 (25%) calreticulin, 7 (7%) myeloproliferative leukemia virus oncogene gene mutation-positive and 11 (11%) triple-negative cases were identified. Patients positive for the calreticulin mutation were younger and had higher platelet counts compared to Janus kinase 2 mutation-positive counterparts. Calreticulin mutation-positive patients with essential thrombocythemia showed a lower risk of developing venous thrombosis, but no difference in overall survival. Calreticulin mutation-positive patients with primary myelofibrosis had a better overall survival compared to that of the Janus kinase 2 mutation-positive (P=0.04) or triple-negative cases (P=0.01). Type 2 calreticulin mutation occurred more frequently in essential thrombocythemia than in primary myelofibrosis (P=0.049). In essential thrombocythemia, the calreticulin mutational load was higher than the Janus kinase 2 mutational load (P<0.001), and increased gradually in advanced stages. Calreticulin mutational load influenced blood counts even at the time point of diagnosis in essential thrombocythemia. 
We confirm that the calreticulin mutation is associated with distinct clinical characteristics and explore the relationships between mutation type, mutational load and clinical outcome.
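As a quick consistency check, the driver-mutation counts reported above can be tallied against the two cohort sizes; this is a minimal sketch using only the numbers given in the abstract:

```python
# Sanity-check the driver-mutation counts reported in the abstract
# against the cohort sizes: essential thrombocythemia (ET, n=289)
# and primary myelofibrosis (PMF, n=99). All counts come from the text.

et = {"JAK2 V617F": 154, "CALR": 96, "MPL": 9, "triple-negative": 30}
pmf = {"JAK2 V617F": 56, "CALR": 25, "MPL": 7, "triple-negative": 11}

assert sum(et.values()) == 289   # the four groups partition the ET cohort
assert sum(pmf.values()) == 99   # ...and likewise the PMF cohort

for name, counts, n in [("ET", et, 289), ("PMF", pmf, 99)]:
    for mutation, k in counts.items():
        print(f"{name} {mutation}: {k}/{n} = {100 * k / n:.1f}%")
```

The computed frequencies match the rounded percentages in the abstract, confirming that the four mutation categories are mutually exclusive in this dataset.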

    Centre selection for clinical trials and the generalisability of results: a mixed methods study.

    BACKGROUND: The rationale for centre selection in randomised controlled trials (RCTs) is often unclear but may have important implications for the generalisability of trial results. The aims of this study were to evaluate the factors which currently influence centre selection in RCTs and to consider how generalisability considerations inform current and optimal practice. METHODS AND FINDINGS: A mixed methods approach consisting of a systematic review and meta-summary of centre selection criteria reported in RCT protocols funded by the UK National Institute for Health Research (NIHR) and initiated between January 2005 and January 2012, and an online survey on current and optimal centre selection, distributed to professionals in the 48 UK Clinical Trials Units and 10 NIHR Research Design Services. The survey design was informed by the systematic review and by two focus groups conducted with trialists at the Birmingham Centre for Clinical Trials. 129 trial protocols were included in the systematic review, with a total target sample size in excess of 317,000 participants. The meta-summary identified 53 unique centre selection criteria. 78 protocols (60%) provided at least one criterion for centre selection, but only 31 (24%) explicitly acknowledged generalisability. This is consistent with the survey findings (n = 70), where fewer than a third of participants reported generalisability as a key driver of centre selection in current practice. This contrasts with trialists' views on optimal practice, where generalisability in terms of clinical practice, population characteristics and economic results was a prime consideration for 60% (n = 42), 57% (n = 40) and 46% (n = 32) of respondents, respectively. CONCLUSIONS: Centres are rarely enrolled in RCTs with an explicit view to external validity, although trialists acknowledge that generalisability should ideally be more prominent in centre selection. There is a need to operationalise 'generalisability' and incorporate it at the design stage of RCTs so that results are readily transferable to 'real world' practice.

    Decrease in treatment intensity predicts worse outcome in patients with locally advanced head and neck squamous cell carcinoma undergoing radiochemotherapy

    PURPOSE: Radiochemotherapy (RCT) is an effective standard therapy for locally advanced head and neck squamous cell carcinoma (LA-HNSCC). Nonetheless, toxicity is common, with patients often requiring dose modifications. METHODS: To investigate associations of RCT toxicities according to CTCAE version 5.0, and of subsequent therapy modifications, with short- and long-term treatment outcomes, we studied all 193 patients with HNSCC who received RCT (70 Gy + platinum agent) at an academic center between 03/2010 and 04/2018. RESULTS: During RCT, 77 patients (41%, 95% CI 34-49) developed at least one grade ≥ 3 toxicity, including seven grade 4 and three fatal grade 5 toxicities. The most frequent any-grade toxicities were xerostomia (n = 187), stomatitis (n = 181), dermatitis (n = 174), and leucopenia (n = 98). Eleven patients (6%) had their radiotherapy schedule modified (mean radiotherapy dose reduction = 12 Gy), and 120 patients (64%) had chemotherapy modifications (permanent discontinuation: n = 67, pause: n = 34, dose reduction: n = 7, change to another chemotherapy: n = 10). Objective response rates to RCT were 55% and 88% in patients with and without radiotherapy modifications (p = 0.003), and 84% and 88% in patients with and without chemotherapy modifications (p = 0.468), respectively. Five-year progression-free survival estimates were 20% and 50% in patients with and without radiotherapy modifications (p < 0.001), and 53% and 40% in patients with and without chemotherapy modifications (p = 0.88), respectively. CONCLUSIONS: Reductions of radiotherapy dose were associated with impaired long-term outcomes, whereas reductions in chemotherapy intensity were not. This suggests that toxicities during RCT should be managed primarily by modifying chemotherapy rather than radiotherapy.

    The fate of Böhler's angle in conservatively-treated displaced intra-articular calcaneal fractures

    Purpose: Although the predictive value of Böhler's angle for outcome remains a subject of debate, the initial angle at the time of trauma still guides treatment. Changes in Böhler's angle during follow-up are frequently reported following surgical treatment of displaced intra-articular calcaneal fractures (DIACF). The aim of the present study was to determine the changes in Böhler's angle, as a measure of secondary fracture displacement, following conservative management of DIACF. Methods: Thirty-eight patients with a total of 44 displaced intra-articular calcaneal fractures treated conservatively, with a minimum of two lateral radiographs during follow-up, were analysed. Böhler's angle at different follow-up times was measured by three observers. The change in angle was compared with the angle at trauma, and the influence of trauma mechanism and common calcaneal fracture classifications was determined. Results: The results showed a significant decline in Böhler's angle over time in conservatively treated patients, of more than 11° on average at a mean follow-up of 29.2 weeks. This decrease was not related to gender, the initial angle, or the Essex-Lopresti or Sanders classification. A statistically significantly greater decrease was detected in high-energy trauma compared with low-energy trauma. Conclusion: Conservative treatment of displaced intra-articular calcaneal fractures remains a viable option, yet significant secondary displacement over time should be taken into account, as reflected in a decrease in Böhler's angle of 11° up to one year following trauma.

    Weekend admission to hospital has a higher risk of death in the elective setting than in the emergency setting: a retrospective database study of national health service hospitals in England

    Background: Although acute hospitals offer a twenty-four hour, seven day a week service, levels of staffing are lower over the weekend and some healthcare processes may be less readily available. Whilst emergency admission to hospital at the weekend is thought to be associated with an increased risk of death, the extent to which this applies to elective admissions is less well known. We investigated the risk of death in elective and emergency patients admitted at the weekend versus on weekdays. Methods: Retrospective statistical analysis of routinely collected acute hospital admissions in England, covering all patient discharges from all acute hospitals over one year (April 2008-March 2009), using a logistic regression model which adjusted for a range of patient case-mix variables, seasonality and weekend admission, separately for elective and emergency admissions (excluding zero-day-stay emergency admissions discharged alive). Results: Of the 1,535,267 elective admissions, 91.7% (1,407,705) were admitted on a weekday and 8.3% (127,562) at the weekend. Mortality following elective weekday admission was 0.52% (7,276/1,407,705), compared with 0.77% (986/127,562) following weekend admission. Of the 3,105,249 emergency admissions, 76.3% (2,369,316) were admitted on a weekday and 23.7% (735,933) at the weekend. Mortality following emergency weekday admission was 6.53% (154,761/2,369,316), compared with 7.06% (51,922/735,933) following weekend admission. After case-mix adjustment, weekend admission was associated with an increased risk of death, especially in the elective setting (elective odds ratio 1.32, 95% confidence interval 1.23 to 1.41, versus emergency odds ratio 1.09, 95% confidence interval 1.05 to 1.13). Conclusions: Weekend admission appears to be an independent risk factor for dying in hospital, and this risk is more pronounced in the elective setting. Given the planned nature of elective admissions, as opposed to the unplanned nature of emergency admissions, it seems less likely that this increased risk in the elective setting is attributable to unobserved patient risk factors. Further work is required to understand the relationship between weekend processes of care and mortality, especially in the elective setting.
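The crude (unadjusted) odds ratios can be reproduced directly from the raw counts reported above; they differ from the case-mix-adjusted figures in the abstract, as expected, since the adjustment accounts for patient characteristics and seasonality. A minimal sketch using only the reported numbers:

```python
# Crude weekend-vs-weekday odds ratios of in-hospital death, computed
# from the raw counts in the abstract. These are unadjusted, so they
# differ from the case-mix-adjusted odds ratios reported by the study
# (elective 1.32, emergency 1.09).

def odds_ratio(deaths_a, n_a, deaths_b, n_b):
    """Odds of death in group A divided by odds of death in group B."""
    odds_a = deaths_a / (n_a - deaths_a)
    odds_b = deaths_b / (n_b - deaths_b)
    return odds_a / odds_b

# Elective: weekend (986/127,562) vs weekday (7,276/1,407,705)
or_elective = odds_ratio(986, 127_562, 7_276, 1_407_705)
# Emergency: weekend (51,922/735,933) vs weekday (154,761/2,369,316)
or_emergency = odds_ratio(51_922, 735_933, 154_761, 2_369_316)

print(f"crude elective OR:  {or_elective:.2f}")   # ~1.50 before adjustment
print(f"crude emergency OR: {or_emergency:.2f}")  # ~1.09
```

The gap between the crude elective odds ratio (about 1.50) and the adjusted value of 1.32 illustrates how much of the raw weekend effect is absorbed by the case-mix variables in the study's logistic regression model.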

    The homocysteine controversy

    Mild to moderate hyperhomocysteinemia has been identified as a strong predictor of cardiovascular disease, independent of classical atherothrombotic risk factors. In the last decade, a number of large intervention trials using B vitamins have been performed and have shown no benefit of homocysteine-lowering therapy in high-risk patients. In addition, Mendelian randomization studies have failed to convincingly demonstrate that a genetic polymorphism commonly associated with higher homocysteine levels (methylenetetrahydrofolate reductase 677 C>T) is a risk factor for cardiovascular disease. Together, these findings have cast doubt on the role of homocysteine in cardiovascular disease pathogenesis, and the homocysteine hypothesis has turned into a homocysteine controversy. In this review, we attempt to find solutions to this controversy. First, we explain that the Mendelian randomization analyses have limitations that preclude final conclusions. Second, several characteristics of the intervention trials limit the interpretation and generalizability of their results. Finally, the possibility that homocysteine lowering is in itself beneficial but is offset by adverse side effects of B vitamins on atherosclerosis deserves serious attention. As we explain, such side effects may relate to direct adverse effects of the B-vitamin regimen (in particular, the use of high-dose folic acid) or to proinflammatory and proproliferative effects of B vitamins on advanced atherosclerotic lesions.