Spontaneous dehydrocoupling in peri-substituted phosphine-borane adducts
This work was financially supported by the EPSRC and COST action CM1302 SIPs. Bis(borane) adducts Acenap(PiPr2·BH3)(PRH·BH3) (Acenap = acenaphthene-5,6-diyl; 4a, R = Ph; 4b, R = ferrocenyl, Fc; 4c, R = H) were synthesised by the reaction of excess H3B·SMe2 with either the phosphino-phosphonium salts [Acenap(PiPr2)(PR)]+Cl− (1a, R = Ph; 1b, R = Fc) or the bis(phosphine) Acenap(PiPr2)(PH2) (3). Bis(borane) adducts 4a–c were found to undergo dihydrogen elimination at room temperature; this spontaneous, catalyst-free phosphine-borane dehydrocoupling yields the BH2-bridged species Acenap(PiPr2)(µ-BH2)(PR·BH3) (5a, R = Ph; 5b, R = Fc; 5c, R = H). Thermolysis of 5c results in loss of the terminal borane moiety to afford Acenap(PiPr2)(µ-BH2)(PH) (14). Single-crystal X-ray structures of 3, 4b and 5a–c are reported.
Effects of damping-off caused by Rhizoctonia solani anastomosis group 2-1 on roots of wheat and oil seed rape quantified using X-ray computed tomography and real-time PCR
Rhizoctonia solani is a plant-pathogenic fungus that causes significant establishment and yield losses to several important food crops globally. This is the first application of high-resolution X-ray micro computed tomography (X-ray μCT) and real-time PCR to study host–pathogen interactions in situ and to elucidate the mechanism of Rhizoctonia damping-off disease caused by R. solani anastomosis group (AG) 2-1 over a 6-day period in wheat (Triticum aestivum cv. Gallant) and oil seed rape (OSR, Brassica napus cv. Marinka). Temporal, non-destructive analysis of root system architectures was performed using RooTrak and validated by the destructive method of root washing. Disease was assessed visually and related to pathogen DNA quantified in soil using real-time PCR. At similar initial DNA concentrations in soil, R. solani AG 2-1 was capable of causing significant damage to the developing root systems of both wheat and OSR. Disease caused reductions in primary root number, root volume, root surface area, and convex hull, with these traits affected less in the monocotyledonous host. Wheat was more tolerant of the pathogen, exhibited fewer symptoms and developed more complex root systems. In contrast, R. solani caused earlier damage and maceration of the taproot of the dicot, OSR. Disease severity was related to pathogen DNA accumulation in soil only for OSR; however, reductions in root traits were significantly associated with both disease and pathogen DNA. The method offers the first steps in advancing current understanding of soil-borne pathogen behavior in situ at the pore scale, which may lead to the development of mitigation measures to combat the disease in the field.
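The abstract above relates visually assessed disease to pathogen DNA quantified in soil by real-time PCR. A minimal sketch of the usual standard-curve approach to that quantification is given below; all Ct values, DNA amounts and curve parameters here are hypothetical, not taken from the study.

```python
import numpy as np

# Hypothetical standard curve: Ct values measured for known amounts of R. solani DNA (pg).
standard_dna_pg = np.array([10_000, 1_000, 100, 10, 1])
standard_ct = np.array([18.2, 21.6, 25.1, 28.4, 31.9])

# Fit Ct = slope * log10(DNA) + intercept.
slope, intercept = np.polyfit(np.log10(standard_dna_pg), standard_ct, 1)
efficiency = 10 ** (-1 / slope) - 1  # amplification efficiency implied by the slope

def dna_from_ct(ct):
    """Interpolate DNA quantity (pg) from a sample Ct using the fitted standard curve."""
    return 10 ** ((ct - intercept) / slope)

sample_cts = np.array([23.7, 26.2, 29.8])  # hypothetical Ct values from soil extracts
print(dna_from_ct(sample_cts), f"efficiency ≈ {efficiency:.2f}")
```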
Time trends in service provision and survival outcomes for patients with renal cancer treated by nephrectomy in England 2000-2010.
OBJECTIVE: To describe temporal trends in nephrectomy practice and outcomes for English patients with renal cell carcinoma (RCC). PATIENTS AND METHODS: Adult RCC nephrectomy patients treated between 2000 and 2010 were identified in the National Cancer Data Repository and Hospital Episode Statistics, and followed up until death or 31 December 2015 (n = 30 763). We estimated the annual frequency of each nephrectomy type, the numbers of hospitals and surgeons, and their case volumes. We analysed short-term surgical outcomes as well as 1- and 5-year relative survival. RESULTS: The annual number of RCC nephrectomies increased by 66% during the study period. The number of hospitals performing nephrectomy decreased by 24%, whilst the median annual hospital volume increased from 10 to 23 (P < 0.01). The number of surgeons increased by 27% (P < 0.01), doubling the median number of consultants per hospital. The proportion of minimally invasive surgery (MIS) nephrectomies rose from 1% to 46%, whilst the proportion of nephron-sparing surgeries (NSS) increased from 5% to 16%, with 29% of all T1 disease treated by partial nephrectomy in 2010 (P < 0.01). The 30-day mortality rate halved from 2.4% to 1.1% and 90-day mortality decreased from 4.9% to 2.6% (P < 0.01). The 1-year relative survival rate increased from 86.9% to 93.4%, whilst the 5-year relative survival rate rose from 68.2% to 81.2% (P < 0.01). Improvements were most notable in patients aged ≥65 years and those with T3 and T4 disease. CONCLUSIONS: Surgical RCC management has changed considerably, with nephrectomy centralisation and increased use of NSS and MIS. In parallel, we observed significant improvements in short- and long-term survival, particularly for elderly patients and those with locally advanced disease.
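The relative-survival measure reported above is the ratio of observed survival in the cohort to the survival expected in an age- and sex-matched general population. A toy calculation with made-up numbers illustrates the arithmetic:

```python
# Relative survival = observed cohort survival / expected survival in a matched population.
# Both figures below are invented for illustration, not taken from the study.
observed_1yr = 0.90   # fraction of the nephrectomy cohort alive at 1 year
expected_1yr = 0.964  # expected 1-year survival of a matched general population
print(f"1-year relative survival ≈ {observed_1yr / expected_1yr:.1%}")
```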
Impact of Hospital Nephrectomy Volume on Intermediate to Long-term Survival in Renal Cell Carcinoma
OBJECTIVE
To evaluate the relationship between hospital volume and intermediate and long-term patient survival for patients undergoing nephrectomy for renal cell carcinoma (RCC).
PATIENTS & METHODS
Adult RCC patients treated with nephrectomy between 2000 and 2010 were identified from the English Hospital Episode Statistics and National Cancer Data Repository. Patients with nodal or metastatic disease were excluded. Hospitals were categorised as low (<20/yr), medium (20-39/yr) or high (≥40/yr) volume based on their annual number of RCC nephrectomies.
Multivariable Cox regressions were used to calculate hazard ratios for all-cause mortality by hospital volume, adjusting for patient, tumour and surgical characteristics. We assessed conditional survival over three follow-up periods: short (30d-1yr), intermediate (1-3yr) and long (3-5yr). We additionally explored whether associations between volume and outcomes varied by tumour stage.
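As a rough illustration of the modelling approach described above, the sketch below fits a Cox proportional hazards model over the short-term conditional window using the lifelines library. The data are simulated and the column names are hypothetical; this is not the study's analysis code.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 2000

# Synthetic stand-in for the patient-level data (column names are illustrative only).
df = pd.DataFrame({
    "hosp_volume": rng.choice(["low", "medium", "high"], size=n),
    "age": rng.normal(65, 10, size=n).round(),
})
baseline_hazard = 0.0008
hr_high, hr_medium = 0.66, 0.85  # effects built into the simulation
rate = baseline_hazard * np.where(df["hosp_volume"] == "high", hr_high,
                                  np.where(df["hosp_volume"] == "medium", hr_medium, 1.0))
df["time_days"] = rng.exponential(1 / rate)
df["died"] = (df["time_days"] <= 5 * 365).astype(int)
df["time_days"] = df["time_days"].clip(upper=5 * 365)

# Conditional survival for the short-term window (30 days to 1 year):
# keep patients still alive at 30 days and censor follow-up at 365 days.
short = df[df["time_days"] >= 30].copy()
short["event_1yr"] = ((short["died"] == 1) & (short["time_days"] <= 365)).astype(int)
short["time_1yr"] = short["time_days"].clip(upper=365)

# Indicator coding for hospital volume, with low volume as the reference group.
short["medium_volume"] = (short["hosp_volume"] == "medium").astype(int)
short["high_volume"] = (short["hosp_volume"] == "high").astype(int)

cph = CoxPHFitter()
cph.fit(short[["time_1yr", "event_1yr", "medium_volume", "high_volume", "age"]],
        duration_col="time_1yr", event_col="event_1yr")
print(cph.hazard_ratios_)  # a high_volume estimate near 0.66 would mirror the reported HR
```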
RESULTS
12,912 patients were included. Patients in high volume hospitals had a 34% reduction in mortality risk up to one year compared with those in low volume hospitals (HR 0.66, 95% CI 0.53-0.83, p<0.01). Assuming causality, treatment in high volume hospitals was associated with one fewer death for every 71 patients treated. The benefit of nephrectomy centralisation did not differ by T stage (p=0.17). No significant association between hospital volume and survival was observed beyond the first year.
CONCLUSIONS
RCC nephrectomy in high volume hospitals was associated with improved survival for up to one year after treatment. Our results contribute new insights regarding the value of nephrectomy centralisation.
The Urology Foundation
Addenbrooke's Charitable Trust
Royal College of Surgeons of England
The paradox of high greenness and poor health in rural Central Appalachia.
While many studies have found positive correlations between greenness and human health, rural Central Appalachia is an exception. The region has high greenness levels but poor health. The purpose of this commentary is to provide a possible explanation for this paradox: three sets of factors overwhelming or attenuating the health benefits of greenness. These include environmental factors (e.g., steep topography and limited access to green space used for outdoor recreation), social factors (e.g., chronic poverty, a declining coal industry, and limited access to healthcare), and psychological and behavioral factors (e.g., perceptions about health behaviors, healthcare, and greenness). The influence of these factors on the expected health benefits of greenness should be treated as working hypotheses for future research. Policymakers and public health officials should ensure that greenness-based interventions account for contextual factors and other determinants of health so that these interventions deliver the expected health benefits.
Yield Losses and Control by Sedaxane and Fludioxonil of Soilborne Rhizoctonia, Microdochium, and Fusarium Species in Winter Wheat
Soilborne Rhizoctonia, Microdochium, and Fusarium species are major causal agents of seedling and stem-base diseases of wheat. Currently, seed treatments are considered the most effective solution for their control. Rhizoctonia solani anastomosis groups (AGs) 2-1 and 5, R. cerealis, Microdochium, and Fusarium spp. were used in a series of field experiments to determine their capability to cause soilborne and stem-base disease and to quantify their comparative losses in the establishment and yield of wheat. The effectiveness of, and response to, a seed treatment formulated with 10 g sedaxane and 5 g fludioxonil per 100 kg of seed against these soilborne pathogens were also determined. Our results showed that damping-off caused by soilborne R. cerealis was associated with significant reductions in emergence and establishment, resulting in stunted growth and low plant numbers. The pathogen also caused sharp eyespot, associated with reductions in the ear partitioning index. R. solani AG 2-1 and AG 5 were weakly pathogenic and failed to cause significant damping-off, root rot, or stem-base disease in wheat. Fusarium graminearum and F. culmorum applied as soilborne inoculum failed to cause severe disease. Microdochium spp. caused brown foot rot disease, and soilborne M. nivale reduced wheat emergence. Applications of sedaxane and fludioxonil increased plant emergence and reduced damping-off, early stem-base disease, and brown foot rot, thus providing protection against multiple soilborne pathogens. R. cerealis reduced the thousand grain weight by 3.6%, whereas seed treatment with fludioxonil and sedaxane against soilborne R. cerealis or M. nivale resulted in a 4% yield increase.
Population dynamics of Rhizoctonia, Oculimacula, and Microdochium species in soil, roots, and stems of English wheat crops
This study aimed to elucidate the population dynamics of Rhizoctonia, Oculimacula, and Microdochium species, which cause the stem-base disease complex of sharp eyespot, eyespot, and brown foot rot in cereals. Pathogen DNA in soil, root, and stem fractions, and disease expression, were quantified in 102 English wheat fields over two seasons. Weather data for each site were collected to determine patterns correlated with the assessed diseases. Oculimacula spp. (66%) and R. solani AG 2-1 (63%) were most frequently detected in soil, followed by R. cerealis (54%) and Microdochium spp. (33%). Oculimacula spp. (89%) and R. cerealis (56%) predominated on roots and in soil but were not associated with root rot symptoms, suggesting that these species used soil and roots for survival and as an inoculum source. M. nivale was more frequently detected than M. majus on stems up to GS 21–30 and co-occurred on plant samples with O. acuformis. O. yallundae had a higher DNA concentration than O. acuformis in the lower 5 cm basal region at GS 37–45. R. cerealis predominated in the upper 15 cm above the stem base beyond stem extension. Brown foot rot caused by Microdochium spp. was favoured by cool, wet autumns and winters and dominated in English wheat. Eyespot and sharp eyespot disease indices, caused by Oculimacula spp. and R. cerealis respectively, correlated with wet, humid springs and summers. Results suggested that stem-base pathogens generally coexisted; however, their abundance in time and space was influenced by favourable weather patterns and host development, with niche differentiation after stem extension.
Investigation of the Proteolytic Functions of an Expanded Cercarial Elastase Gene Family in Schistosoma mansoni
Schistosome parasites are a major cause of disease in the developing world. The larval stage of the parasite transitions between an intermediate snail host and a definitive human host in a dramatic fashion, burrowing out of the snail and subsequently penetrating human skin. This process is facilitated by secreted proteases. In Schistosoma mansoni, cercarial elastase is the predominant secreted protease and is essential for host skin invasion. Genomic analysis reveals a greatly expanded cercarial elastase gene family in S. mansoni. Despite sequence divergence, SmCE isoforms show similar expression profiles throughout the S. mansoni life cycle and have largely similar substrate specificities, suggesting that the majority of protease isoforms are functionally redundant and that their expansion therefore serves to increase gene dosage. However, activity-based profiling also indicates that a subset of SmCE isoforms are activated prior to the parasite's exit from its intermediate snail host, suggesting that the protease may also have a role in this process.
The Matsu Wheel: A Cloud-Based Framework for Efficient Analysis and Reanalysis of Earth Satellite Imagery
Project Matsu is a collaboration between the Open Commons Consortium and NASA focused on developing open source technology for cloud-based processing of Earth satellite imagery, with practical applications to aid in natural disaster detection and relief. Project Matsu has developed an open source cloud-based infrastructure to process, analyze, and reanalyze large collections of hyperspectral satellite image data using OpenStack, Hadoop, MapReduce and related technologies. We describe a framework for efficient analysis of large amounts of data called the Matsu "Wheel." The Matsu Wheel is currently used to process incoming hyperspectral satellite data produced daily by NASA's Earth Observing-1 (EO-1) satellite. The framework scans for new data and applies batches of analytics to the data as they flow in. In the Matsu Wheel, the data only need to be accessed and preprocessed once, regardless of the number or types of analytics, and new analytics can easily be slotted into the existing framework. The Matsu Wheel system provides a significantly more efficient use of computational resources than alternative methods when the data are large, have high-volume throughput, may require heavy preprocessing, and are typically used for many types of analysis. We also describe our preliminary Wheel analytics, including an anomaly detector for rare spectral signatures or thermal anomalies in hyperspectral data and a land cover classifier that can be used for water and flood detection. Each of these analytics can generate visual reports accessible via the web for the public and interested decision makers. The resulting products of the analytics are also made accessible through an Open Geospatial Consortium (OGC)-compliant Web Map Service (WMS) for further distribution. The Matsu Wheel allows many shared data services to be performed together to make efficient use of resources for processing hyperspectral satellite image data and other large datasets, such as environmental datasets, that may be analyzed for many purposes.
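A minimal sketch of the scanning-wheel idea described above follows: each newly arrived scene is preprocessed once, and every registered analytic then runs against the same in-memory result. All names, data layout and band counts here are hypothetical illustrations, not the Matsu API.

```python
# Registry of analytics; each receives the already-preprocessed scene.
ANALYTICS = []

def analytic(fn):
    """Register an analytic so the wheel applies it to every new scene."""
    ANALYTICS.append(fn)
    return fn

@analytic
def spectral_anomaly_report(scene):
    # Placeholder for an anomaly detector over rare spectral or thermal signatures.
    return {"analytic": "anomaly", "scene": scene["id"], "bands_checked": len(scene["bands"])}

@analytic
def land_cover_report(scene):
    # Placeholder for a land-cover / flood-detection classifier.
    return {"analytic": "land_cover", "scene": scene["id"]}

def preprocess(raw):
    """Read and calibrate a raw scene once; stubbed out with an illustrative band list."""
    return {"id": raw["id"], "bands": list(range(raw["n_bands"]))}

def wheel_pass(incoming, seen):
    """One revolution of the wheel: preprocess each unseen scene once, run all analytics on it."""
    for raw in incoming:
        if raw["id"] in seen:
            continue
        scene = preprocess(raw)                 # single read/preprocess per scene
        yield [fn(scene) for fn in ANALYTICS]   # every analytic reuses the same scene
        seen.add(raw["id"])

# Hypothetical daily batch of incoming scenes.
incoming = [{"id": "scene_001", "n_bands": 242}, {"id": "scene_002", "n_bands": 242}]
for reports in wheel_pass(incoming, seen=set()):
    print(reports)
```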
Independent Predictors of Repeat Emergency Room Presentations: Insights from a Cohort of 1066 Consecutive Patients with Non-Cardiac Chest Pain Generating 4770 Repeat Presentations
Background and importance: Chest pain (CP) is one of the most frequent presentations to the emergency department (ED), a large proportion of which is non-cardiac chest pain (NCCP). Repeat attendances to the ED are common and impose a considerable burden on overstretched departments. Objective: Our aim was to determine drivers of repeat ED presentations using NCCP as the primary cause of the index presentation. Design, setting and participants: This was a retrospective cohort study of 1066 consecutive presentations with NCCP to a major urban hospital ED in North England. The Index of Multiple Deprivation (IMD), a postcode-derived validated index of deprivation, was computed. The Charlson comorbidity index (CCI) was determined by reference to known comorbidity variables. Repeat presentation to the ED of any hospital nationally was determined using a national linked database (population 53.5 million). Independent predictors of ED re-presentation were computed using logistic regression analysis. Results: Median age was 43 (IQR 28-59), and 50.8% were male. Furthermore, 27.8%, 8.1% and 3.8% suffered from chronic obstructive pulmonary disease (COPD), hypertension and diabetes mellitus, respectively. The most frequent diagnoses, using ICD-10 coding, were non-cardiac chest pain (55.1%), followed by respiratory conditions (14.7%). The one-year incidence of adjudicated myocardial infarction, urgent or emergency coronary revascularisation, and all-cause death was 0.6%, 2% and 5.3%, respectively. There was a total of 4770 repeat ED presentations in the year prior to or following the index presentation with NCCP in this cohort. Independent (multivariate) predictors of frequent re-presentation (defined as ≥2 re-presentations) were a history of COPD (OR [odds ratio] 2.06, p = 0.001), previous MI (OR 3.6, p = 0.020) and a Charlson comorbidity index ≥1 (OR 1.51, p = 0.030). The frequency of previous MI was low, as only 3% had sustained one. Conclusions: This analysis indicates that COPD and complex health care needs (represented by a high CCI), but not socio-economic deprivation, should be health policy targets for lessening repeat ED presentations. What is already known on this topic: Repeat presentations with non-ischaemic chest pain are common, placing a considerable burden on emergency departments. What this study adds: COPD and complex health care needs, denoted by the Charlson comorbidity index, are implicated as drivers of repeat presentation to the emergency department. Socio-economic deprivation was not an independent predictor of re-presentation. How might this study affect research, practice, or policy: Community-based support for COPD and complex health care needs may reduce the frequency of ED attendance.
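The sketch below illustrates the kind of logistic regression used above to obtain adjusted odds ratios for re-presentation. The data are simulated and the column names are invented; it is not the study's analysis code.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 1066

# Synthetic stand-in for the cohort; column names are illustrative only.
df = pd.DataFrame({
    "copd": rng.binomial(1, 0.278, n),
    "prior_mi": rng.binomial(1, 0.03, n),
    "cci_ge1": rng.binomial(1, 0.40, n),
    "imd_quintile": rng.integers(1, 6, n),
})
# Simulated outcome: frequent re-presentation (>=2 repeat ED attendances),
# with log-odds loosely echoing the reported odds ratios.
logit = (-1.0 + np.log(2.06) * df["copd"] + np.log(3.6) * df["prior_mi"]
         + np.log(1.51) * df["cci_ge1"])
df["frequent_rep"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X = sm.add_constant(df[["copd", "prior_mi", "cci_ge1", "imd_quintile"]])
fit = sm.Logit(df["frequent_rep"], X).fit(disp=False)

# Exponentiated coefficients give odds ratios with 95% confidence intervals.
summary = pd.DataFrame({"OR": np.exp(fit.params),
                        "CI_low": np.exp(fit.conf_int()[0]),
                        "CI_high": np.exp(fit.conf_int()[1]),
                        "p": fit.pvalues})
print(summary)
```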