Lotus tenuis tolerates combined salinity and waterlogging: maintaining O2 transport to roots and expression of an NHX1-like gene contribute to regulation of Na+ transport
Salinity and waterlogging interact to reduce growth in most crop and pasture species. The combination of these stresses often causes a large increase in the rate of Na+ and Cl− transport to shoots; however, the mechanisms responsible for this are largely unknown. To identify mechanisms contributing to the adverse interaction between salinity and waterlogging, we compared two Lotus species with contrasting tolerances when grown under saline (200 mM NaCl) and O2-deficient (stagnant) treatments. Measurements of radial O2 loss (ROL) under stagnant conditions indicated that more O2 reaches the root tips of Lotus tenuis than of Lotus corniculatus. Better internal aeration would contribute to maintaining Na+ and Cl− transport processes in roots of L. tenuis exposed to stagnant-plus-NaCl treatments. L. tenuis root Na+ concentrations after stagnant-plus-NaCl treatment (200 mM) were 17% higher than those of L. corniculatus, with 55% of the total plant Na+ accumulated in roots, compared with only 39% for L. corniculatus. L. tenuis thus accumulated more Na+ in roots, presumably in vacuoles, thereby reducing transport to the shoot (25% lower than in L. corniculatus). A candidate gene for vacuolar Na+ accumulation, an NHX1-like gene, was cloned from L. tenuis and its identity established via sequencing and yeast complementation. Transcript levels of NHX1 in L. tenuis roots under the stagnant-plus-NaCl treatment were the same as under aerated NaCl, whereas L. corniculatus roots had reduced transcript levels. Enhanced O2 transport to roots enables regulation of Na+ transport processes in L. tenuis roots, contributing to tolerance of combined salinity and waterlogging stresses.
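As a minimal illustration of the partitioning metric reported above (the share of whole-plant Na+ retained in roots), the following Python sketch computes root partition fractions from root and shoot Na+ contents. The numbers are invented for illustration and are not measurements from this study.

# Hypothetical worked example of the Na+ partitioning metric: the fraction of
# whole-plant Na+ retained in roots, computed from tissue Na+ content per plant.

def root_partition_fraction(root_na_umol, shoot_na_umol):
    """Fraction of total plant Na+ accumulated in roots."""
    return root_na_umol / (root_na_umol + shoot_na_umol)

# Illustrative Na+ contents (umol per plant), not measured values
l_tenuis = root_partition_fraction(root_na_umol=550.0, shoot_na_umol=450.0)
l_corniculatus = root_partition_fraction(root_na_umol=390.0, shoot_na_umol=610.0)

print(f"L. tenuis roots: {l_tenuis:.0%} of plant Na+")              # ~55%
print(f"L. corniculatus roots: {l_corniculatus:.0%} of plant Na+")  # ~39%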
Hyperosmotic priming of Arabidopsis seedlings establishes a long-term somatic memory accompanied by specific changes of the epigenome
Background: In arid and semi-arid environments, drought and soil salinity usually occur at the beginning and end of a plant's life cycle, offering a natural opportunity for the priming of young plants to enhance stress tolerance in mature plants. Chromatin marks, such as histone modifications, provide a potential molecular mechanism for priming plants to environmental stresses, but whether transient exposure of seedlings to hyperosmotic stress leads to chromatin changes that are maintained throughout vegetative growth remains unclear.
Results: We have established an effective protocol for hyperosmotic priming in the model plant Arabidopsis, which includes a transient mild salt treatment of seedlings followed by an extensive period of growth in control conditions. Primed plants are identical to non-primed plants in growth and development, yet they display reduced salt uptake and enhanced drought tolerance after a second stress exposure. ChIP-seq analysis of four histone modifications revealed that the priming treatment altered the epigenomic landscape; the changes were small, but they were specific for the treated tissue, varied in number and direction depending on the modification, and preferentially targeted transcription factors. Notably, priming leads to shortening and fractionation of H3K27me3 islands. This effect fades over time but is still apparent after a ten-day growth period in control conditions. Several genes with priming-induced differences in H3K27me3 showed altered transcriptional responsiveness to the second stress treatment.
Conclusion: Experience of transient hyperosmotic stress by young plants is stored in a long-term somatic memory comprising differences of chromatin status, transcriptional responsiveness and whole plant physiology.
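The "shortening and fractionation" of H3K27me3 islands described in the Results can be quantified directly from called peak intervals. The Python sketch below is a minimal, hypothetical illustration (not the pipeline used in this study): it measures island lengths and counts how many primed-sample islands fall within a single control island; the coordinates are invented.

# A minimal sketch, assuming H3K27me3 domains ("islands") are available as
# (chrom, start, end) intervals called from ChIP-seq data. It summarises
# island length and counts fragments within a reference island, i.e. the
# "shortening and fractionation" readout. Interval values are illustrative.

from statistics import median

def island_lengths(islands):
    """Lengths (bp) of H3K27me3 islands given as (chrom, start, end)."""
    return [end - start for _, start, end in islands]

def fragments_within(reference, islands):
    """Count primed-sample islands falling inside one control island."""
    chrom, start, end = reference
    return sum(1 for c, s, e in islands if c == chrom and s >= start and e <= end)

control = [("chr1", 1000, 9000)]                         # one contiguous island
primed  = [("chr1", 1200, 3400), ("chr1", 5000, 8600)]   # shorter, fragmented

print("median control length:", median(island_lengths(control)))   # 8000
print("median primed length:",  median(island_lengths(primed)))    # ~2900
print("fragments per control island:", fragments_within(control[0], primed))  # 2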
Natural variation of Arabidopsis root architecture reveals complementing adaptive strategies to potassium starvation
Root architecture is a highly plastic and environmentally responsive trait that enables plants to counteract nutrient scarcities with different foraging strategies. Under potassium (K) deficiency (low K), seedlings of the Arabidopsis (Arabidopsis thaliana) reference accession Columbia (Col-0) show a strong reduction of lateral root elongation. To date, it is not clear whether this is a direct consequence of the lack of K as an osmoticum or a triggered response to maintain the growth of other organs under limiting conditions. In this study, we made use of natural variation within Arabidopsis to look for novel root architectural responses to low K. A comprehensive set of 14 differentially responding root parameters was quantified in K-starved and K-replete plants. We identified a phenotypic gradient that links two extreme strategies of morphological adaptation to low K, arising from a major tradeoff between main root (MR) and lateral root elongation. Accessions adopting strategy I (e.g. Col-0) maintained MR growth but compromised lateral root elongation, whereas strategy II genotypes (e.g. Catania-1) arrested MR elongation in favor of lateral branching. K resupply and histochemical staining resolved the temporal and spatial patterns of these responses. Quantitative trait locus analysis of K-dependent root architectures within a Col-0 × Catania-1 recombinant inbred line population identified several loci, each of which determined a particular subset of root architectural parameters. Our results indicate the existence of genomic hubs in the coordinated control of root growth under stress conditions and provide resources to facilitate the identification of the underlying genes.
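For readers unfamiliar with QTL mapping in a recombinant inbred line (RIL) population, the Python sketch below illustrates the principle with a single-marker scan: each marker splits the RILs by parental allele (Col-0 vs Catania-1) and the association with a root trait is scored as a LOD value. The data are simulated and the code illustrates the general method only, not the analysis used in this study.

# A minimal sketch of a single-marker QTL scan on simulated RIL data.
import numpy as np

rng = np.random.default_rng(0)
n_lines, n_markers = 100, 20
genotypes = rng.integers(0, 2, size=(n_lines, n_markers))  # 0 = Col-0, 1 = Cat-1 allele
# Simulated root trait: marker 7 explains part of the variance
trait = 5.0 + 1.5 * genotypes[:, 7] + rng.normal(0, 1.0, n_lines)

def lod_score(marker_geno, phenotype):
    """LOD for one marker: residual variance with vs without the marker effect."""
    rss0 = np.sum((phenotype - phenotype.mean()) ** 2)
    resid = [phenotype[marker_geno == g] - phenotype[marker_geno == g].mean()
             for g in (0, 1)]
    rss1 = sum(np.sum(r ** 2) for r in resid)
    return (len(phenotype) / 2.0) * np.log10(rss0 / rss1)

lods = [lod_score(genotypes[:, m], trait) for m in range(n_markers)]
print("peak marker:", int(np.argmax(lods)), "LOD = %.1f" % max(lods))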
The Histone Deacetylase Complex (HDC) 1 protein of Arabidopsis thaliana has the capacity to interact with multiple proteins including histone 3-binding proteins and histone 1 variants
Intrinsically disordered proteins can adopt multiple conformations, thereby enabling interaction with a wide variety of partners. They often serve as hubs in protein interaction networks. We have previously shown that the Histone Deacetylase Complex (HDC) 1 protein from Arabidopsis thaliana interacts with histone deacetylases and quantitatively determines histone acetylation levels, transcriptional activity and several phenotypes, including ABA sensitivity during germination, vegetative growth rate and flowering time. HDC1-type proteins are ubiquitous in plants, but they contain no known structural or functional domains. Here we explored the protein interaction spectrum of HDC1. In addition to binding histone deacetylases, HDC1 directly interacted with core histone H3-binding proteins and co-repressor-associated proteins, but not with H3 or the co-repressors themselves. Surprisingly, HDC1 was also able to interact with variants of the linker histone H1. Truncation of HDC1 to the ancestral core sequence narrowed the spectrum of interactions and of phenotypic outputs but maintained binding to an H3-binding protein and to H1. The results indicate a potential link between H1 and histone-modifying complexes.
Biodesalination: an emerging technology for targeted removal of Na+ and Cl− from seawater by cyanobacteria
Although desalination by membrane processes is a possible solution to the problem of freshwater supply, related cost and energy demands prohibit its use on a global scale. Hence, there is an emerging necessity for alternative, energy- and cost-efficient methods for water desalination. Cyanobacteria are oxygen-producing, photosynthetic bacteria that actively grow in vast blooms both in fresh and seawater bodies. Moreover, cyanobacteria can grow with minimal nutrient requirements and under natural sunlight. Taking these observations together, a consortium of five British universities was formed to test the principle of using cyanobacteria as ion exchangers for the specific removal of Na+ and Cl− from seawater. This project consisted of the isolation and characterisation of candidate strains, with a central focus on their potential to be osmotically and ionically adaptable. The selection panel resulted in the identification of two euryhaline strains, one of freshwater (Synechocystis sp. strain PCC 6803) and one of marine origin (Synechococcus sp. strain PCC 7002) (Robert Gordon University, Aberdeen). Other work packages were as follows. Genetic manipulations potentially allowed for the expression of a light-driven, Cl−-selective pump in both strains, thereby enhancing the bioaccumulation of specific ions within the cell (University of Glasgow). Characterisation of surface properties under different salinities (University of Sheffield) ensured that cell–liquid separation efficiency would be maximised post-treatment; secretion of mucopolysaccharides into the medium during cell growth was also monitored. Work at Newcastle University focused on the social acceptance of this scenario, together with an assessment of the potential risks through the generation and application of a Hazard Analysis and Critical Control Points plan. Finally, researchers at Imperial College London designed the process, from biomass production to water treatment and the generation of a model photobioreactor. This multimodal approach has produced promising first results, and further optimisation is expected to result in mass scaling of this process.
Effects of Standard Labor-Wear on Swimming and Treading Water
We tested the hypothesis that occupational clothing would impair performance during swimming. The sub-questions included: (1) Will the standard work wear of a railway worker or laborer impede swimming ability? (2) Will this clothing impact the individual's ability to tread water? We addressed the research questions with three hypotheses. Analysis showed statistically significant p-values, and all three null hypotheses were rejected in favor of the three research hypotheses, providing strong evidence that standard labor-wear had adverse effects on 11.43 meter/12.5 yard swim time, water-treading time and rating of perceived exertion (RPE) during water treading. The mean swim time more than doubled when the subjects wore standard labor-wear, and their average RPE increased from 11.6 in standard swimwear to 17.1 in standard labor-wear. It may be beneficial for those workers who work near water to be exposed to educational programs that allow in-water experiences so they develop an understanding of their abilities in, and respect for, the water.
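As an illustration of the kind of paired comparison underlying these results, the Python sketch below runs a paired t-test on swim times for the same swimmers in swimwear versus standard labor-wear. The times are invented and do not reproduce the study data.

# A minimal sketch of a paired comparison of swim times (seconds) for the
# same swimmers in swimwear vs standard labor-wear. Values are illustrative.
from scipy import stats

swimwear_s  = [14.2, 15.8, 13.9, 16.5, 15.1, 14.8, 17.0, 15.5]
laborwear_s = [31.0, 33.5, 29.8, 36.2, 32.4, 30.9, 38.1, 34.0]

t_stat, p_value = stats.ttest_rel(laborwear_s, swimwear_s)
ratio = sum(laborwear_s) / sum(swimwear_s)

print(f"mean increase: x{ratio:.1f}")                  # swim time roughly doubles
print(f"paired t = {t_stat:.1f}, p = {p_value:.2g}")   # reject H0 if p < 0.05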
Report of the NIH task force on research standards for chronic low back pain
Abstract
Despite rapidly increasing intervention, functional disability due to chronic low back pain (cLBP) has increased in recent decades. We often cannot identify mechanisms to explain the major negative impact cLBP has on patients' lives. Such cLBP is often termed non-specific and may be due to multiple biologic and behavioral etiologies. Researchers use varied inclusion criteria, definitions, baseline assessments, and outcome measures, which impede comparisons and consensus. The NIH Pain Consortium therefore charged a Research Task Force (RTF) to draft standards for research on cLBP. The resulting multidisciplinary panel recommended using 2 questions to define cLBP; classifying cLBP by its impact (defined by pain intensity, pain interference, and physical function); use of a minimal data set to describe research participants (drawing heavily on the PROMIS methodology); reporting responder analyses in addition to mean outcome scores; and suggestions for future research and dissemination. The Pain Consortium has approved the recommendations, which investigators should incorporate into NIH grant proposals. The RTF believes these recommendations will advance the field, help to resolve controversies, and facilitate future research addressing the genomic, neurologic, and other mechanistic substrates of chronic low back pain. We expect the RTF recommendations will become a dynamic document and undergo continual improvement.
Perspective: A Task Force was convened by the NIH Pain Consortium with the goal of developing research standards for chronic low back pain. The results included recommendations for definitions, a minimal data set, reporting outcomes, and future research. Greater consistency in reporting should facilitate comparisons among studies and the development of phenotypes.
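To illustrate the recommendation to report responder analyses alongside mean outcome scores, the Python sketch below counts participants reaching a hypothetical threshold of at least 30% reduction in pain intensity. The scores and threshold are illustrative only and are not specifics of the RTF recommendations.

# A minimal sketch of a responder analysis contrasted with a mean-change
# summary. All scores are invented.
baseline = [7, 6, 8, 5, 7, 9, 6, 8]   # 0-10 pain intensity before treatment
followup = [3, 6, 7, 2, 6, 4, 5, 8]   # after treatment

changes = [(b - f) / b for b, f in zip(baseline, followup)]
responders = sum(1 for c in changes if c >= 0.30)       # >= 30% improvement

mean_change = sum(b - f for b, f in zip(baseline, followup)) / len(baseline)
print(f"mean reduction: {mean_change:.1f} points")
print(f"responders: {responders}/{len(baseline)} ({responders/len(baseline):.0%})")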
Computed tomography-osteoabsorptiometry for assessing the density distribution of subchondral bone as a measure of long-term mechanical adaptation in individual joints
To estimate the subchondral mineralisation patterns that represent the long-term loading history of individual joints, a method employing computed tomography (CT) has been developed that permits repeated examination of living joints. The method was tested on 5 knee, 3 sacroiliac, 3 ankle and 5 shoulder joints and then compared with X-ray densitometry. A CT-absorptiometric presentation and maps of the area distribution of subchondral bone density were derived using an image analyser. Comparison of the results from X-ray densitometry and CT-absorptiometry revealed almost identical distributions of subchondral bone density. The method may be used to examine subchondral mineralisation, as a measure of the mechanical adaptability of joints, in the living subject.
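The densitogram idea can be sketched in a few lines: CT attenuation values (Hounsfield units) sampled over the subchondral bone are binned into density classes to yield a mineralisation map and per-class area fractions. The Python sketch below uses invented values and thresholds and only illustrates the principle, not the image-analysis procedure used here.

# A minimal sketch: bin Hounsfield units over a joint surface into density
# classes, as for a false-colour densitogram. Values are illustrative only.
import numpy as np

# Hypothetical 2D grid of Hounsfield units over a joint surface
hu_map = np.array([[ 600,  750,  900, 1100],
                   [ 650,  850, 1200, 1300],
                   [ 700, 1000, 1250, 1400],
                   [ 620,  800,  950, 1150]])

class_edges = [700, 900, 1100, 1300]          # hypothetical HU class boundaries
density_classes = np.digitize(hu_map, class_edges)

print(density_classes)                        # 0 = lowest, 4 = highest mineralisation
print("area fraction per class:",
      np.bincount(density_classes.ravel(), minlength=5) / hu_map.size)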