618 research outputs found

    Root morphology and seed and leaf ionomic traits in a Brassica napus L. diversity panel show wide phenotypic variation and are characteristic of crop habit

    Background: Mineral nutrient uptake and utilisation by plants are controlled by many traits relating to root morphology, ion transport, sequestration and translocation. The aims of this study were to determine the phenotypic diversity in root morphology and leaf and seed mineral composition of a polyploid crop species, Brassica napus L., and how these traits relate to crop habit. Traits were quantified in a diversity panel of up to 387 genotypes: 163 winter, 127 spring, and seven semiwinter oilseed rape (OSR) habits, 35 swede, 15 winter fodder, and 40 exotic/unspecified habits. Root traits of 14-day-old seedlings were measured in a ‘pouch and wick’ system (n = ~24 replicates per genotype). The mineral composition of 3–6 rosette-stage leaves, and of mature seeds, was determined on compost-grown plants from a designed experiment (n = 5) by inductively coupled plasma-mass spectrometry (ICP-MS). Results: Seed size explained a large proportion of the variation in root length. Winter OSR and fodder habits had longer primary and lateral roots than spring OSR habits, and generally lower mineral concentrations. A comparison of the ratios of elements in leaf and seed parts revealed differences in translocation processes between crop habits, including those likely to be associated with crop selection for OSR seeds with lower levels of sulphur-containing glucosinolates. Combining root, leaf and seed traits in a discriminant analysis provided the most accurate characterisation of crop habit, illustrating the interdependence of plant tissues. Conclusions: High-throughput morphological and composition phenotyping reveals complex interrelationships between mineral acquisition and accumulation linked to genetic control within and between crop types (habits) in B. napus. 
Despite the species' recent genetic ancestry (<10 ky), root morphology and leaf and seed composition traits could potentially be used in crop improvement, if suitable markers can be identified and if these correspond with suitable agronomy and quality traits.
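The abstract notes that combining root, leaf and seed traits in a discriminant analysis gave the most accurate characterisation of crop habit. As a loose illustration only (a nearest-centroid classifier, not the discriminant analysis the authors used, and with invented trait values), the idea of pooling traits from several tissues to classify habit might be sketched as:

```python
# Hypothetical, simplified trait vectors: primary root length (cm),
# a leaf mineral concentration, and a seed mineral concentration.
# Illustrative numbers only; not taken from the study.
training = {
    "winter OSR": [[12.1, 8.0, 3.1], [11.5, 7.6, 3.0], [12.8, 8.3, 3.3]],
    "spring OSR": [[9.2, 9.5, 4.0], [8.8, 9.9, 4.2], [9.5, 9.1, 3.9]],
}

def centroid(rows):
    """Per-trait mean of a group of trait vectors."""
    return [sum(col) / len(rows) for col in zip(*rows)]

def classify(sample, groups):
    """Assign a sample to the habit with the nearest trait centroid
    (squared Euclidean distance)."""
    best, best_d = None, float("inf")
    for habit, rows in groups.items():
        c = centroid(rows)
        d = sum((a - b) ** 2 for a, b in zip(sample, c))
        if d < best_d:
            best, best_d = habit, d
    return best

# A long-rooted, low-mineral plant classifies as winter type.
print(classify([11.9, 7.8, 3.2], training))  # → winter OSR
```

A real analysis would standardise each trait and fit discriminant functions across all six habits; this sketch only shows why combining tissues can sharpen the separation between crop types.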

    Wheat-barley hybridization – the last forty years

    Abstract Several useful alien gene transfers have been reported from related species into wheat (Triticum aestivum), but very few publications have dealt with the development of wheat/barley (Hordeum vulgare) introgression lines. An overview is given here of wheat × barley hybridization over the last forty years, including the development of wheat × barley hybrids, and of addition and translocation lines with various barley cultivars. A short summary is also given of the wheat × barley hybrids produced with other Hordeum species. The meiotic pairing behaviour of wheat × barley hybrids is presented, with special regard to the detection of wheat–barley homoeologous pairing using the molecular cytogenetic technique GISH. The effect of in vitro multiplication on the genome composition of intergeneric hybrids is discussed, and the production and characterization of the latest wheat/barley translocation lines are presented. An overview of the agronomical traits (β-glucan content, earliness, salt tolerance, sprouting resistance, etc.) of the newly developed introgression lines is given. The exploitation and possible use of wheat/barley introgression lines for the most up-to-date molecular genetic studies (transcriptome analysis, sequencing of flow-sorted chromosomes) are also discussed.

    Chemotherapy-Induced Amenorrhea in Patients With Breast Cancer With a BRCA1 or BRCA2 Mutation

    PURPOSE: To determine the likelihood of long-term amenorrhea after treatment with chemotherapy in women with breast cancer who carry a BRCA1 or BRCA2 mutation. PATIENTS AND METHODS: We conducted a multicenter survey of 1,954 young women with a BRCA1 or BRCA2 mutation who were treated for breast cancer. We included premenopausal women who were diagnosed with invasive breast cancer between 26 and 47 years of age. We determined the age of onset of amenorrhea after breast cancer for women who were and were not treated with chemotherapy, alone or with tamoxifen. We considered chemotherapy-induced amenorrhea to have occurred when the patient experienced ≥ 2 years of amenorrhea, commencing within 2 years of initiating chemotherapy, with no resumption of menses. RESULTS: Of the 1,426 women who received chemotherapy, 35% experienced long-term amenorrhea. Of the 528 women who did not receive chemotherapy, 5.3% developed long-term amenorrhea. The probabilities of chemotherapy-induced amenorrhea were 7.2% for women diagnosed before age 30 years, 33% for women age 31 to 44 years, and 79% for women diagnosed after age 45 years (P trend < .001). The probability of induced amenorrhea was higher for women who received tamoxifen than for those who did not (52% v 29%; P < .001). CONCLUSION: Age at treatment and use of tamoxifen are important predictors of chemotherapy-induced amenorrhea in women who carry a BRCA1 or BRCA2 mutation. The risk of induced long-term amenorrhea does not seem to be greater among mutation carriers than among women who do not carry a mutation.

    In-depth investigation of the molecular pathogenesis of bladder cancer in a unique 26-year old patient with extensive multifocal disease: A case report

    Background. The molecular characteristics and the clinical disease course of bladder cancer (BC) in young patients remain largely unresolved. All BC patients are monitored according to an intensive surveillance protocol, and we aimed to gain more insight into the molecular pathways of bladder tumors in young patients, which could ultimately contribute to patient stratification, improve patient quality of life and reduce associated costs. We also determined whether biomarker-based surveillance would be feasible. Case Presentation. We report a unique case of a 26-year-old Caucasian male with recurrent non-muscle-invasive bladder tumors occurring at a high frequency, and analyzed multiple tumors (maximal pTaG2) and urine samples from this patient. Analysis included FGFR3 mutation detection, FGFR3 and TP53 immunohistochemistry, microsatellite analysis of markers on chromosomes 8, 9, 10, 11 and 17, and a genome-wide single nucleotide polymorphism (SNP) array. All analyzed tumors contained a mutation in FGFR3 and were associated with FGFR3 overexpression. None of the tumors showed overexpression of TP53. We found a deletion on chromosome 9 in the primary tumor, which was confirmed by the SNP array showing regions of loss on chromosome 9. Detection of all recurrences was possible by urinary FGFR3 mutation analysis. Conclusions. Our findings suggest that the BC disease course is determined not only by a patient's age but also by the molecular characteristics of the tumor. The tumors of this young patient contained genetic changes typical of tumors of older patients, implying a clinical disease course comparable to that of older patients. We demonstrate that FGFR3 mutation analysis on voided urine is a simple, non-invasive method that could serve as a feasible follow-up approach for this young patient presenting with an FGFR3-mutant tumor.

    Utilisation of an operative difficulty grading scale for laparoscopic cholecystectomy

    Background A reliable system for grading the operative difficulty of laparoscopic cholecystectomy would standardise the description of findings and the reporting of outcomes. The aim of this study was to validate a difficulty grading system (Nassar scale), testing its applicability and consistency in two large prospective datasets. Methods Patient- and disease-related variables and 30-day outcomes were identified in two prospective cholecystectomy databases: the multi-centre prospective cohort of 8820 patients from the recent CholeS Study and a single-surgeon series containing 4089 patients. Operative data and patient outcomes were correlated with the Nassar operative difficulty scale, using Kendall’s tau for dichotomous variables or Jonckheere–Terpstra tests for continuous variables. A ROC curve analysis was performed to quantify the predictive accuracy of the scale for each outcome, with continuous outcomes dichotomised prior to analysis. Results A higher operative difficulty grade was consistently associated with worse patient outcomes in both the reference and CholeS cohorts. The median length of stay increased from 0 to 4 days, and the 30-day complication rate from 7.6 to 24.4%, as the difficulty grade increased from 1 to 4/5 (both p < 0.001). In the CholeS cohort, a higher difficulty grade was most strongly associated with conversion to open surgery and 30-day mortality (AUROC = 0.903 and 0.822, respectively). On multivariable analysis, the Nassar operative difficulty scale was a significant independent predictor of operative duration, conversion to open surgery, 30-day complications and 30-day reintervention (all p < 0.001). Conclusion We have shown that an operative difficulty scale can standardise the description of operative findings by surgeons of multiple grades, facilitating audit, training assessment and research. It provides a tool for reporting operative findings, disease severity and technical difficulty, and can be utilised in future research to reliably compare outcomes according to case mix and intra-operative difficulty.
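The AUROC values reported above (e.g. 0.903 for conversion to open surgery) summarise how well the difficulty grade ranks patients by outcome. As a minimal sketch with invented data (the grades and outcomes below are not from the study), the area under the ROC curve can be computed from the rank-sum identity:

```python
def auroc(labels, scores):
    """Area under the ROC curve via the Mann-Whitney identity:
    the probability that a randomly chosen positive case receives a
    higher score than a randomly chosen negative case (ties count half)."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = 0.0
    for p in pos:
        for n in neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(pos) * len(neg))

# Hypothetical example: Nassar difficulty grade (1-5) as the score,
# conversion to open surgery as the outcome (1 = converted).
grades    = [1, 1, 2, 2, 3, 3, 4, 4, 5, 5]
converted = [0, 0, 0, 0, 0, 1, 0, 1, 1, 1]
print(round(auroc(converted, grades), 3))  # → 0.917
```

An AUROC near 1 means higher grades almost always precede the adverse outcome; 0.5 means the grade carries no ranking information for that outcome.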

    Shoulder muscle endurance: the development of a standardized and reliable protocol

    Abstract Background: Shoulder muscle fatigue has been proposed as a possible link to explain the association between repetitive arm use and the development of rotator cuff disorders. To our knowledge, no standardized clinical endurance protocol has been developed to evaluate the effects of muscle fatigue on shoulder function. Such a test could improve the clinical examination of individuals with shoulder disorders. Therefore, the purpose of this study was to establish a reliable protocol for the objective assessment of shoulder muscle endurance. Methods: An endurance protocol was developed on a stationary dynamometer (Biodex System 3). The endurance protocol was performed in isotonic mode with the resistance set at 50% of each subject's peak torque as measured for shoulder external rotation (ER) and internal rotation (IR). Each subject performed 60 continuous repetitions of IR/ER rotation. The endurance protocol was performed by 36 healthy individuals on two separate occasions at least two days apart. Maximal isometric shoulder strength tests were performed before and after the fatigue protocol to evaluate the effects of the endurance protocol and its reliability. Paired t-tests were used to evaluate the reduction in shoulder strength due to the protocol, while intraclass correlation coefficients (ICC) and minimal detectable change (MDC) were used to evaluate its reliability. Results: Maximal isometric strength was significantly decreased after the endurance protocol (P < 0.001). The total work performed during the last third of the protocol was significantly less than during the first third (P < 0.05). The test-retest reliability of the post-fatigue strength measures was excellent (ICC > 0.84). Conclusions: Changes in muscular performance observed during and after the muscular endurance protocol suggest that the protocol did result in muscular fatigue. Furthermore, this study established that the effects of fatigue induced by the proposed isotonic protocol were reproducible over time. The protocol was performed without difficulty by all volunteers and took less than 10 minutes to perform, suggesting that it might be feasible for clinical practice. This protocol could be used to induce local muscular fatigue in order to evaluate the effects of fatigue on shoulder kinematics, or to evaluate changes in shoulder muscle endurance following rehabilitation.
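Test-retest reliability of the post-fatigue strength measures was reported as ICC > 0.84. The abstract does not state which ICC form was used; as a minimal sketch, one common choice, ICC(2,1) (two-way random effects, absolute agreement, single measure), can be computed from ANOVA mean squares, here with invented strength scores:

```python
def icc_2_1(data):
    """ICC(2,1): two-way random effects, absolute agreement, single
    measurement. `data` is a list of rows, one per subject, each
    holding that subject's score from each test session."""
    n = len(data)      # subjects
    k = len(data[0])   # sessions
    grand = sum(sum(r) for r in data) / (n * k)
    row_means = [sum(r) / k for r in data]
    col_means = [sum(r[j] for r in data) / n for j in range(k)]
    ss_rows = k * sum((m - grand) ** 2 for m in row_means)
    ss_cols = n * sum((m - grand) ** 2 for m in col_means)
    ss_total = sum((x - grand) ** 2 for r in data for x in r)
    ss_err = ss_total - ss_rows - ss_cols
    msr = ss_rows / (n - 1)                 # between-subjects mean square
    msc = ss_cols / (k - 1)                 # between-sessions mean square
    mse = ss_err / ((n - 1) * (k - 1))      # residual mean square
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# Hypothetical post-fatigue isometric strength scores (Nm) from
# the two test days; illustrative values only.
scores = [[55, 57], [48, 50], [62, 60], [40, 43], [51, 52]]
print(round(icc_2_1(scores), 2))  # → 0.96
```

Values above roughly 0.75 are conventionally read as good-to-excellent reliability, consistent with the ICC > 0.84 reported for this protocol.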

    Cancer risk management strategies and perceptions of unaffected women 5 years after predictive genetic testing for BRCA1/2 mutations

    In a French national cohort of unaffected female carriers/non-carriers of a BRCA1/2 mutation, long-term preventive strategies and breast/ovarian cancer risk perceptions were followed for up to 5 years after test result disclosure, using self-administered questionnaires. The response rate was 74%. Carriers (N=101) were younger (average age±SD=37±10) than non-carriers (N=145; 42±12). Four management strategies comprised 88% of the decisions made by the unaffected carriers: 50% opted for breast surveillance alone, based on either magnetic resonance imaging (MRI) and other imaging (31%) or mammography alone (19%); 38% opted for risk-reducing salpingo-oophorectomy (RRSO) combined with breast surveillance, based on either MRI and other imaging (28%) or mammography alone (10%). The other three strategies were: risk-reducing mastectomy (RRM) and RRSO (5%), RRM alone (2%), and neither RRM/RRSO nor surveillance (6%). The results obtained for various age groups are presented here. Non-carriers often opted for screening despite their low cancer risk. Result disclosure increased carriers' short-term perceptions of high breast/ovarian cancer risk (P ≤ 0.02) and decreased non-carriers' short- and long-term perceptions (P < 0.001). During follow-up, perceptions of high breast cancer risk increased with time among those who had no RRM and decreased in the opposite case; perceptions of high ovarian cancer risk increased further with time among those who had no RRSO and decreased in the opposite case; RRSO did not affect breast cancer risk perceptions. Informed decision-making involves letting women know whether opting for RRSO and breast MRI surveillance is as effective in terms of survival as RRM plus RRSO.

    Practice change in chronic conditions care: an appraisal of theories

    Background Management of chronic conditions can be complex and burdensome for patients, and complex and costly for health systems. Outcomes could be improved and costs reduced if proven clinical interventions were better implemented, but the complexity of chronic care services appears to make clinical change particularly challenging. Explicit use of theories may improve the success of clinical change in this area of care provision. Whilst theories to support implementation of practice change are apparent in the broad healthcare arena, the most applicable theories for the complexities of practice change in chronic care have not yet been identified. Methods We developed criteria to review the usefulness of change implementation theories for informing chronic care management and applied them to an existing list of theories used more widely in healthcare. Results Criteria related to the following characteristics of chronic care: breadth of the field; multi-disciplinarity; micro, meso and macro program levels; need for field-specific research on implementation requirements; and need for measurement. 
Six theories met the criteria to the greatest extent: the Consolidated Framework for Implementation Research; Normalization Process Theory and its extension, the General Theory of Implementation; two versions of the Promoting Action on Research Implementation in Health Services framework; and Sticky Knowledge. None fully met all the criteria. Involvement of several care provision organizations and groups, involvement of patients and carers, and policy-level change are not well covered by most of the theories. However, adaptation may be possible to include multiple groups, including patients and carers, and separate theories may be needed for policy change. Ways of qualitatively assessing theory constructs are available, but quantitative measures are currently partial and under development for all the theories. Conclusions Theoretical bases are available to structure clinical change research in chronic condition care. The theories will, however, need to be adapted and supplemented to account for the particular features of care in this field, particularly in relation to the involvement of multiple organizations and groups, including patients, and in relation to policy influence. Quantitative measurement of theory constructs may present difficulties.

    Search for the standard model Higgs boson at LEP
