25 research outputs found
The impact of herbicide tolerant oilseed rape in selected agro-ecosystems
A range of field studies of cross pollination between herbicide tolerant and conventional oilseed rape crops and plots were conducted to quantify the effects of variety, distance, pollen source and sink size, and intervening crop on levels of outcrossing. Experiments investigating the influence of variety on outcrossing showed that hybrid oilseed rape varieties containing high proportions of male sterile plants (varietal associations) were pollinated at higher frequencies than standard fully fertile varieties. Studies conducted using various sizes of genetically modified herbicide tolerant (GMHT) pollen sources showed that small GMHT feral populations cross pollinated with crops in close proximity, and that the levels of contamination obtained depended on the genotype of the conventional crop. Long range cross pollination of male sterile oilseed rape receptor plots showed that pollination events were measurable at up to 600 m from the large GMHT pollen source. Results from a study of cross pollination between mixed populations of GMHT oilseed rape plants and conventional varieties also demonstrated that a hybrid rape type (varietal association) was cross pollinated at considerably higher levels than an open pollinated variety and a fully restored hybrid variety. Evidence gathered in the experiment suggested that, over a wide range of initial GMHT contamination rates, the final proportion of GMHT seed in the total population was a constant fraction of the initial contamination rate. Outcrossing data were used to compare negative exponential and inverse power law models for their fit to the observed relationship between cross pollination and distance from source. Results showed that the inverse power law provided a better fit to the data. This indicates that dispersal described by the inverse power law is more likely to lead to cross pollination at both short and long distances from the pollen source than dispersal described by the negative exponential model.
The consequences for regulation and risk assessment of the likely ecological behaviour of GMHT traits implied by these dispersal curves are discussed. The effects of the herbicides used in herbicide tolerant and conventional oilseed rape on weed populations were compared in a single season. Results suggested that the herbicides have different activity spectra, resulting in different surviving weed species in HT treatments. The change in active ingredient and in the timing of herbicide application in HT winter oilseed rape crops will likely change which weed species are controlled and which escape treatment. Levels of weed biomass recorded prior to harvest of the oilseed rape crop showed that there may be differences between treatments in the quantity of seed returned to the seedbank from the range of weed species present. Limited data on the behaviour of herbicide tolerant volunteers showed that single and putative double-tolerant plants were as susceptible as conventional oilseed rape volunteers to the normal selective herbicides used in cereal crops. Data from a number of elements of the studies on GM contamination rates, weed control, and seed bank estimates were used to develop a simple population projection model. The model used a Markov process to examine the fate of volunteer and feral populations of oilseed rape comprising a mixture of conventional and herbicide tolerant types. Results from the model indicated that the prevalence of the GMHT trait in the weed or feral population was more sensitive to the efficacy of control practices used in the rotation than to the levels of cross-pollination and competition between the herbicide tolerant and conventional varieties in the mixed population.
Thus, although the cross-pollination studies suggested that varietal associations are more likely to be cross-pollinated by GM pollen than fully fertile varieties, the projection model suggested that the resulting differences in the prevalence of the GM trait in volunteer and feral populations may not be very large. Population projections from the model are compared with results from other modelling studies that have used more complex simulation approaches.
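The model comparison described above can be illustrated with a short log-scale fit of both candidate dispersal curves. The numbers below are synthetic and purely illustrative, not the study's data:

```python
import numpy as np

# Illustrative (synthetic) outcrossing frequencies by distance from the
# GMHT pollen source; NOT the study's data.
distance = np.array([1.0, 5.0, 10.0, 50.0, 100.0, 300.0, 600.0])  # metres
outcross = np.array([0.20, 0.05, 0.02, 0.004, 0.002, 0.0008, 0.0004])

log_y = np.log(outcross)

# Negative exponential, y = a * exp(-b * d): linear in d on a log scale.
b_exp, log_a_exp = np.polyfit(distance, log_y, 1)
fit_exp = log_a_exp + b_exp * distance

# Inverse power law, y = a * d**(-b): linear in log(d) on a log scale.
b_pow, log_a_pow = np.polyfit(np.log(distance), log_y, 1)
fit_pow = log_a_pow + b_pow * np.log(distance)

# Residual sum of squares on the log scale for each model; frequencies
# span several orders of magnitude, so the log scale weights the tail.
rss_exp = np.sum((log_y - fit_exp) ** 2)
rss_pow = np.sum((log_y - fit_pow) ** 2)
```

The fat tail of the power law is what allows measurable cross pollination at both short and long distances, whereas the exponential decays too quickly to account for long-range events.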
Rasch scaling procedures for informing development of a valid Fetal Surveillance Education Program multiple-choice assessment
Abstract
Background: It is widely recognised that deficiencies in fetal surveillance practice continue to contribute significantly to the burden of adverse outcomes. This has prompted the development of evidence-based clinical practice guidelines by the Royal Australian and New Zealand College of Obstetricians and Gynaecologists and an associated Fetal Surveillance Education Program to deliver the associated learning. This article describes initial steps in the validation of a corresponding multiple-choice assessment of the relevant educational outcomes through a combination of item response modelling and expert judgement.
Methods: The Rasch item response model was employed for item and test analysis and to empirically derive the substantive interpretation of the assessment variable. This interpretation was then compared to the hierarchy of competencies specified a priori by a team of eight subject-matter experts. Classical Test Theory analyses were also conducted.
Results: A high level of agreement between the hypothesised and derived variable provided evidence of construct validity. Item and test indices from Rasch analysis and Classical Test Theory analysis suggested that the current test form was of moderate quality. However, the analyses made clear the required steps for establishing a valid assessment of sufficient psychometric quality. These steps included: increasing the number of items from 40 to 50 in the first instance, reviewing ineffective items, targeting new items to specific content and difficulty gaps, and formalising the assessment blueprint in light of empirical information relating item structure to item difficulty.
Conclusion: The application of the Rasch model for criterion-referenced assessment validation with an expert stakeholder group is herein described. Recommendations for subsequent item and test construction are also outlined in this article.
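The dichotomous Rasch model referred to above can be sketched as follows. This is a minimal illustration of the model's response function only, not the study's estimation code (item calibration in practice uses dedicated psychometric software):

```python
import numpy as np

def rasch_prob(theta, b):
    """Dichotomous Rasch model: probability that a candidate of
    ability theta answers an item of difficulty b correctly.
    Depends only on the difference theta - b (both on a logit scale),
    which places persons and items on a common measurement scale."""
    return 1.0 / (1.0 + np.exp(-(np.asarray(theta) - b)))

# A candidate whose ability equals the item difficulty has a 50%
# chance of success; a harder item (larger b) lowers the probability.
p_matched = rasch_prob(0.0, 0.0)  # exactly 0.5
p_hard = rasch_prob(0.0, 1.0)     # lower than p_matched
```

Placing item difficulties and person abilities on this single logit scale is what allows the empirically derived item hierarchy to be compared directly with the competency hierarchy specified by the expert panel.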
Targeting DNA Damage Response and Replication Stress in Pancreatic Cancer
Background and aims:
Pancreatic cancer (PC) remains recalcitrant to therapy, cementing its status as the most lethal malignancy; it is projected to become the second leading cause of cancer death in our society. The study aim was to investigate the association between DNA damage response (DDR), replication stress and novel therapeutic response in PC, in order to develop a biomarker-driven therapeutic strategy targeting DDR and replication stress in PC.
Methods:
We interrogated the transcriptome, genome, proteome and functional characteristics of 61 novel PC patient-derived cell lines to define therapeutic strategies targeting DDR and replication stress. Validation was performed in patient-derived xenografts and human PC organoids.
Results:
Patient-derived cell lines faithfully recapitulate the epithelial component of pancreatic tumors, including previously described molecular subtypes. Biomarkers of DDR deficiency, including a novel signature of homologous recombination deficiency, co-segregated with response to platinum (P < 0.001) and PARP inhibitor therapy (P < 0.001) in vitro and in vivo. We generated a novel signature of replication stress that predicts response to ATR (P < 0.018) and WEE1 inhibitor (P < 0.029) treatment in both cell lines and human PC organoids. Replication stress was enriched in the squamous subtype of PC (P < 0.001) but was not associated with DDR deficiency.
Conclusions:
Replication stress and DDR deficiency are independent of each other, creating therapeutic opportunities in DDR-proficient PC and in the post-platinum setting.
Effect of Hydrocortisone on Mortality and Organ Support in Patients With Severe COVID-19: The REMAP-CAP COVID-19 Corticosteroid Domain Randomized Clinical Trial.
Importance: Evidence regarding corticosteroid use for severe coronavirus disease 2019 (COVID-19) is limited. Objective: To determine whether hydrocortisone improves outcome for patients with severe COVID-19. Design, Setting, and Participants: An ongoing adaptive platform trial testing multiple interventions within multiple therapeutic domains, for example, antiviral agents, corticosteroids, or immunoglobulin. Between March 9 and June 17, 2020, 614 adult patients with suspected or confirmed COVID-19 were enrolled and randomized within at least 1 domain following admission to an intensive care unit (ICU) for respiratory or cardiovascular organ support at 121 sites in 8 countries. Of these, 403 were randomized to open-label interventions within the corticosteroid domain. The domain was halted after results from another trial were released. Follow-up ended August 12, 2020. Interventions: The corticosteroid domain randomized participants to a fixed 7-day course of intravenous hydrocortisone (50 mg or 100 mg every 6 hours) (n = 143), a shock-dependent course (50 mg every 6 hours when shock was clinically evident) (n = 152), or no hydrocortisone (n = 108). Main Outcomes and Measures: The primary end point was organ support-free days (days alive and free of ICU-based respiratory or cardiovascular support) within 21 days, where patients who died were assigned -1 day. The primary analysis was a bayesian cumulative logistic model that included all patients enrolled with severe COVID-19, adjusting for age, sex, site, region, time, assignment to interventions within other domains, and domain and intervention eligibility. Superiority was defined as the posterior probability of an odds ratio greater than 1 (threshold for trial conclusion of superiority >99%). 
Results: After excluding 19 participants who withdrew consent, there were 384 patients (mean age, 60 years; 29% female) randomized to the fixed-dose (n = 137), shock-dependent (n = 146), and no (n = 101) hydrocortisone groups; 379 (99%) completed the study and were included in the analysis. The mean age for the 3 groups ranged between 59.5 and 60.4 years; most patients were male (range, 70.6%-71.5%); mean body mass index ranged between 29.7 and 30.9; and patients receiving mechanical ventilation ranged between 50.0% and 63.5%. For the fixed-dose, shock-dependent, and no hydrocortisone groups, respectively, the median organ support-free days were 0 (IQR, -1 to 15), 0 (IQR, -1 to 13), and 0 (-1 to 11) days (composed of 30%, 26%, and 33% mortality rates and 11.5, 9.5, and 6 median organ support-free days among survivors). The median adjusted odds ratio and bayesian probability of superiority were 1.43 (95% credible interval, 0.91-2.27) and 93% for fixed-dose hydrocortisone, respectively, and were 1.22 (95% credible interval, 0.76-1.94) and 80% for shock-dependent hydrocortisone compared with no hydrocortisone. Serious adverse events were reported in 4 (3%), 5 (3%), and 1 (1%) patients in the fixed-dose, shock-dependent, and no hydrocortisone groups, respectively. Conclusions and Relevance: Among patients with severe COVID-19, treatment with a 7-day fixed-dose course of hydrocortisone or shock-dependent dosing of hydrocortisone, compared with no hydrocortisone, resulted in 93% and 80% probabilities of superiority with regard to the odds of improvement in organ support-free days within 21 days. However, the trial was stopped early and no treatment strategy met prespecified criteria for statistical superiority, precluding definitive conclusions. Trial Registration: ClinicalTrials.gov Identifier: NCT02735707
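The reported probabilities of superiority can be roughly reproduced from the published credible intervals under a normal approximation to the posterior of the log odds ratio. This is only a back-of-the-envelope check, not the trial's actual Bayesian cumulative logistic model:

```python
from math import log
from statistics import NormalDist

def prob_superiority(or_hat, ci_lo, ci_hi):
    """Approximate posterior P(OR > 1), assuming an approximately
    normal posterior on the log odds ratio reconstructed from a
    95% credible interval. A rough consistency check on the reported
    probabilities, NOT the trial's analysis model."""
    mu = log(or_hat)
    sigma = (log(ci_hi) - log(ci_lo)) / (2 * 1.96)
    return 1.0 - NormalDist(mu, sigma).cdf(0.0)

# Fixed-dose hydrocortisone: OR 1.43 (0.91-2.27) -> roughly 93%
p_fixed = prob_superiority(1.43, 0.91, 2.27)
# Shock-dependent:           OR 1.22 (0.76-1.94) -> roughly 80%
p_shock = prob_superiority(1.22, 0.76, 1.94)
```

Both approximations land close to the trial's reported 93% and 80%, which is expected when the posterior on the log odds scale is near-normal; the full model additionally adjusts for age, sex, site, region, time, and co-assignment to other domains.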
Genome-wide association meta-analysis of corneal curvature identifies novel loci and shared genetic influences across axial length and refractive error
Abstract: Corneal curvature, a highly heritable trait, is a key clinical endophenotype for myopia, a major cause of visual impairment and blindness in the world. Here we present a trans-ethnic meta-analysis of corneal curvature GWAS in 44,042 individuals of Caucasian and Asian ancestry, with replication in 88,218 UK Biobank participants. We identified 47 loci (of which 26 are novel), with population-specific signals as well as signals shared across ethnicities. Some identified variants showed precise scaling of corneal curvature with eye elongation (i.e. axial length) to maintain eyes in emmetropia (e.g. HDAC11/FBLN2 rs2630445, RBP3 rs11204213); others were associated with myopia with little pleiotropic effect on eye elongation. Implicated genes are involved in extracellular matrix organization, developmental processes of the body and eye, connective tissue cartilage, and glycosylation protein activities. Our study provides insights into population-specific novel genes for corneal curvature and their pleiotropic effects in regulating eye size or conferring susceptibility to myopia.
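The meta-analysis step combines per-cohort GWAS effect estimates for each variant. A generic inverse-variance-weighted fixed-effect combination (the default in standard tools such as METAL) can be sketched as below; whether this study used exactly this weighting is an assumption here, not stated in the abstract:

```python
import numpy as np

def ivw_meta(betas, ses):
    """Inverse-variance-weighted fixed-effect meta-analysis of a single
    SNP's effect sizes across cohorts. Each cohort's estimate is
    weighted by the reciprocal of its squared standard error, so more
    precise cohorts contribute more. A generic sketch, not the study's
    exact pipeline."""
    betas = np.asarray(betas, dtype=float)
    ses = np.asarray(ses, dtype=float)
    w = 1.0 / ses ** 2
    beta_meta = np.sum(w * betas) / np.sum(w)
    se_meta = np.sqrt(1.0 / np.sum(w))
    return beta_meta, se_meta

# Hypothetical per-cohort estimates for one variant (illustrative only).
beta, se = ivw_meta([0.10, 0.30], [0.05, 0.05])
```

With equal standard errors the pooled estimate is the simple mean, and the pooled standard error is smaller than any single cohort's, which is what gives the combined 44,042-person analysis its power to detect the novel loci.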
Effect of angiotensin-converting enzyme inhibitor and angiotensin receptor blocker initiation on organ support-free days in patients hospitalized with COVID-19
IMPORTANCE Overactivation of the renin-angiotensin system (RAS) may contribute to poor clinical outcomes in patients with COVID-19.
OBJECTIVE To determine whether angiotensin-converting enzyme (ACE) inhibitor or angiotensin receptor blocker (ARB) initiation improves outcomes in patients hospitalized for COVID-19.
DESIGN, SETTING, AND PARTICIPANTS In an ongoing, adaptive platform randomized clinical trial, 721 critically ill and 58 non–critically ill hospitalized adults were randomized to receive an RAS inhibitor or control between March 16, 2021, and February 25, 2022, at 69 sites in 7 countries (final follow-up on June 1, 2022).
INTERVENTIONS Patients were randomized to receive open-label initiation of an ACE inhibitor (n = 257), ARB (n = 248), ARB in combination with DMX-200 (a chemokine receptor-2 inhibitor; n = 10), or no RAS inhibitor (control; n = 264) for up to 10 days.
MAIN OUTCOMES AND MEASURES The primary outcome was organ support–free days, a composite of hospital survival and days alive without cardiovascular or respiratory organ support through 21 days. The primary analysis was a bayesian cumulative logistic model. Odds ratios (ORs) greater than 1 represent improved outcomes.
RESULTS On February 25, 2022, enrollment was discontinued due to safety concerns. Among 679 critically ill patients with available primary outcome data, the median age was 56 years and 239 participants (35.2%) were women. Median (IQR) organ support–free days among critically ill patients was 10 (–1 to 16) in the ACE inhibitor group (n = 231), 8 (–1 to 17) in the ARB group (n = 217), and 12 (0 to 17) in the control group (n = 231) (median adjusted odds ratios of 0.77 [95% bayesian credible interval, 0.58-1.06] for improvement for ACE inhibitor and 0.76 [95% credible interval, 0.56-1.05] for ARB compared with control). The posterior probabilities that ACE inhibitors and ARBs worsened organ support–free days compared with control were 94.9% and 95.4%, respectively. Hospital survival occurred in 166 of 231 critically ill participants (71.9%) in the ACE inhibitor group, 152 of 217 (70.0%) in the ARB group, and 182 of 231 (78.8%) in the control group (posterior probabilities that ACE inhibitor and ARB worsened hospital survival compared with control were 95.3% and 98.1%, respectively).
CONCLUSIONS AND RELEVANCE In this trial, among critically ill adults with COVID-19, initiation of an ACE inhibitor or ARB did not improve, and likely worsened, clinical outcomes.
TRIAL REGISTRATION ClinicalTrials.gov Identifier: NCT0273570
Stable, Covalent Attachment of Laminin to Microposts Improves the Contractility of Mouse Neonatal Cardiomyocytes
The mechanical output of contracting cardiomyocytes, the muscle cells of the heart, relates to healthy and disease states of the heart. Culturing cardiomyocytes on arrays of elastomeric microposts can enable inexpensive and high-throughput studies of heart disease at the single-cell level. However, cardiomyocytes weakly adhere to these microposts, which limits the possibility of using biomechanical assays of single cardiomyocytes to study heart disease. We hypothesized that a stable covalent attachment of laminin to the surface of microposts improves cardiomyocyte contractility. We cultured cells on polydimethylsiloxane microposts with laminin covalently bonded with the organosilanes 3-glycidoxypropyltrimethoxysilane and 3-aminopropyltriethoxysilane with glutaraldehyde. We measured displacement of microposts induced by the contractility of mouse neonatal cardiomyocytes, which attach better than mature cardiomyocytes to substrates. We observed time-dependent changes in contractile parameters such as micropost deformation, contractility rates, contraction and relaxation speeds, and the times of contractions. These parameters were affected by the density of laminin on microposts and by the stability of laminin binding to micropost surfaces. Organosilane-mediated binding resulted in higher laminin surface density and laminin binding stability. 3-glycidoxypropyltrimethoxysilane provided the highest laminin density but did not provide stable protein binding with time. Higher surface protein binding stability and strength were observed with 3-aminopropyltriethoxysilane with glutaraldehyde. In cultured cardiomyocytes, contractility rate, contraction speeds, and contraction time increased with higher laminin stability.
Given these variations in contractile function, we conclude that binding laminin to microposts via 3-aminopropyltriethoxysilane with glutaraldehyde improves contractility, observed as an increase in beating rate and contraction speed similar to that occurring during the postnatal maturation of cardiomyocytes. This approach is promising for future studies seeking to mimic in vivo tissue environments.
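Micropost displacement is converted to contractile force by treating each post as an end-loaded cantilever. The relation below is standard in the micropost literature (Euler-Bernoulli bending of a cylindrical post), but the dimensions and modulus used here are illustrative assumptions, not the paper's values:

```python
from math import pi

def micropost_force(delta_m, E_pa, diameter_m, length_m):
    """Force exerted by a cell on a cylindrical elastomeric micropost,
    modelled as a cantilever deflected at its tip:
    F = k * delta, with bending stiffness k = 3*E*I / L^3 and
    second moment of area I = pi*D^4 / 64 for a circular cross-section.
    All inputs in SI units; the example values below are illustrative."""
    I = pi * diameter_m ** 4 / 64.0      # second moment of area (m^4)
    k = 3.0 * E_pa * I / length_m ** 3   # bending spring constant (N/m)
    return k * delta_m

# Example: PDMS post (E ~ 2.5 MPa), 3 um diameter, 9 um tall,
# deflected 1 um by a beating cardiomyocyte.
F = micropost_force(1e-6, 2.5e6, 3e-6, 9e-6)  # on the order of tens of nN
```

Because stiffness scales as D^4/L^3, small fabrication variations in post geometry change the force calibration substantially, which is one reason per-array calibration matters in these assays.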
SARS-CoV-2 RNAemia predicts clinical deterioration and extrapulmonary complications from COVID-19
Background: The determinants of coronavirus disease 2019 (COVID-19) disease severity and extrapulmonary complications (EPCs) are poorly understood. We characterized relationships between severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) RNAemia and disease severity, clinical deterioration, and specific EPCs.
Methods: We used quantitative and digital polymerase chain reaction (qPCR and dPCR) to quantify SARS-CoV-2 RNA from plasma in 191 patients presenting to the emergency department with COVID-19. We recorded patient symptoms, laboratory markers, and clinical outcomes, with a focus on oxygen requirements over time. We collected longitudinal plasma samples from a subset of patients. We characterized the role of RNAemia in predicting clinical severity and EPCs using elastic net regression.
Results: Of SARS-CoV-2-positive patients, 23.0% (44 of 191) had viral RNA detected in plasma by dPCR, compared with 1.4% (2 of 147) by qPCR. Most patients with serial measurements had undetectable RNAemia within 10 days of symptom onset, reached maximum clinical severity within 16 days, and achieved symptom resolution within 33 days. Initially RNAemic patients were more likely to manifest severe disease (odds ratio, 6.72 [95% confidence interval, 2.45-19.79]), worsening of disease severity (2.43 [1.07-5.38]), and EPCs (2.81 [1.26-6.36]). RNA loads were correlated with maximum severity (r = 0.47 [95% confidence interval, .20-.67]).
Conclusions: dPCR is more sensitive than qPCR for the detection of SARS-CoV-2 RNAemia, which is a robust predictor of eventual COVID-19 severity and oxygen requirements, as well as EPCs. Because many COVID-19 therapies are initiated on the basis of oxygen requirements, RNAemia on presentation might serve to direct early initiation of appropriate therapies for the patients most likely to deteriorate.
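Elastic net regression, as used in the Methods above, combines L1 and L2 penalties so that correlated clinical predictors are shrunk together while uninformative ones are driven toward zero. The numpy sketch below fits a penalised logistic model by (sub)gradient descent; the hyperparameters and synthetic data are illustrative, not the study's model or cohort:

```python
import numpy as np

def elastic_net_logistic(X, y, alpha=0.1, l1_ratio=0.5, lr=0.1, n_iter=2000):
    """Logistic regression with an elastic net penalty
    alpha * (l1_ratio * |w|_1 + (1 - l1_ratio)/2-style L2 term),
    fitted by plain (sub)gradient descent. A minimal sketch of the
    kind of penalised model used to relate RNAemia and covariates to
    severity; hyperparameters are illustrative assumptions."""
    n, p = X.shape
    w, b = np.zeros(p), 0.0
    for _ in range(n_iter):
        prob = 1.0 / (1.0 + np.exp(-(X @ w + b)))   # predicted P(severe)
        grad_w = X.T @ (prob - y) / n               # logistic loss gradient
        # Elastic net penalty: L1 subgradient + L2 gradient.
        grad_w += alpha * (l1_ratio * np.sign(w) + (1.0 - l1_ratio) * w)
        grad_b = np.mean(prob - y)
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

# Synthetic example: only feature 0 (say, a hypothetical log RNA load)
# actually drives the outcome; the other two are noise.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
y = (X[:, 0] > 0).astype(float)
w, b = elastic_net_logistic(X, y)
```

On this toy data the fitted weight on the informative feature dominates the noise features, mirroring how the study's regression isolates RNAemia as a predictor among many candidate markers.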