    Local adaptations to frost in marginal and central populations of the dominant forest tree Fagus sylvatica L. as affected by temperature and extreme drought in common garden experiments

    Local adaptations to environmental conditions are of high ecological importance as they determine distribution ranges and likely affect species responses to climate change. Increased environmental stress (warming, extreme drought) due to climate change, in combination with decreased genetic mixing due to isolation, may lead to stronger local adaptations in geographically marginal than in central populations. We experimentally assessed local adaptations of three marginal and four central populations of Fagus sylvatica L., the dominant native forest tree, to frost over winter and in spring (late frost). We determined the frost hardiness of buds and roots by relative electrolyte leakage in two common garden experiments. The experiment at the cold site included a continuous warming treatment; the experiment at the warm site included a preceding summer drought manipulation. In both experiments we found evidence for local adaptation to frost, with stronger signs of local adaptation in marginal populations. Winter frost killed many of the potted individuals at the cold site, with higher survival in the warming treatment and in the populations originating from colder environments. However, we found no difference in the winter frost tolerance of buds among populations, implying that bud survival was not the main determinant of mortality. Bud late frost tolerance in April differed between populations at the warm site, mainly because of phenological differences in bud break. The increased spring frost tolerance of plants that had experienced drought stress in the preceding summer could also be explained by shifts in phenology. Stronger local adaptations to climate in geographically marginal than in central populations imply the potential for adaptation to climate at range edges. In times of climate change, however, it remains to be tested whether locally adapted populations at range margins can successfully adapt further to changing conditions.
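    The frost-hardiness measure named above, relative electrolyte leakage (REL), is conventionally computed as the conductivity of the bathing solution after the frost treatment divided by the conductivity after complete cell destruction (e.g., by autoclaving), expressed in percent. A minimal sketch of that calculation, assuming this standard definition; the function name and example values are illustrative, not taken from the study:

        def relative_electrolyte_leakage(cond_after_frost: float,
                                         cond_after_autoclaving: float) -> float:
            """Relative electrolyte leakage (REL) in percent.

            cond_after_frost: conductivity of the bathing solution after the
                freezing treatment (membranes partially ruptured).
            cond_after_autoclaving: conductivity after autoclaving the same
                sample (total electrolyte release, the 100%-damage reference).
            """
            if cond_after_autoclaving <= 0:
                raise ValueError("total conductivity must be positive")
            return 100.0 * cond_after_frost / cond_after_autoclaving

        # Illustrative values (microsiemens/cm), not data from the study:
        print(relative_electrolyte_leakage(42.0, 120.0))  # -> 35.0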

    Assessment of Health-Related Quality of Life after TBI: Comparison of a Disease-Specific (QOLIBRI) with a Generic (SF-36) Instrument

    Psychosocial, emotional, and physical problems can emerge after traumatic brain injury (TBI), potentially impacting health-related quality of life (HRQoL). Until now, however, neither the discriminatory power of disease-specific (QOLIBRI) and generic (SF-36) HRQoL instruments nor their correlates have been compared in detail. These aspects, as well as some psychometric item characteristics, were studied in a sample of 795 TBI survivors. The Shannon H′ index of absolute informativity, an indicator of an instrument’s power to differentiate between individuals within a specific group or health state, was investigated. Psychometric performance of the two instruments was predominantly good, and was generally higher and more homogeneous for the QOLIBRI than for the SF-36 subscales. Notably, the SF-36 “Role Physical,” “Role Emotional,” and “Social Functioning” subscales showed less satisfactory discriminatory power than all other dimensions or the sum scores of both instruments. The absolute informativity of the disease-specific and generic HRQoL instruments differed significantly across the groups defined by the different correlates. When the focus is on how a certain subscale or sum score differentiates between individuals in one specific dimension or health state, the QOLIBRI can be recommended as the preferable instrument.
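    The discriminatory-power measure named above, Shannon's H, is the entropy of the observed score distribution: H = −Σ pᵢ log₂ pᵢ over the score categories, so informativity is higher when respondents are spread across many categories rather than lumped into a few. A minimal sketch under that reading; the function and example scores are illustrative, not the study's data or code:

        import math
        from collections import Counter

        def shannon_h(scores) -> float:
            """Shannon H (in bits) of the observed score distribution.

            Higher H means the instrument spreads respondents across more
            score categories, i.e. discriminates better between individuals.
            """
            counts = Counter(scores)
            n = len(scores)
            return -sum((c / n) * math.log2(c / n) for c in counts.values())

        # Illustrative subscale scores (not study data): a scale that lumps
        # everyone into two categories is less informative than one that
        # spreads them out.
        print(shannon_h([1, 1, 1, 2, 2, 2]))  # ~1.0 bit
        print(shannon_h([1, 2, 3, 4, 5, 6]))  # ~2.58 bits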

    Challenges in Ceramic Science: A Report from the Workshop on Emerging Research Areas in Ceramic Science

    In March 2012, a group of researchers met to discuss emerging topics in ceramic science and to identify grand challenges in the field. By the end of the workshop, the group had reached a consensus on eight challenges for the future: understanding rare events in ceramic microstructures, understanding the phase-like behavior of interfaces, predicting and controlling heterogeneous microstructures with unprecedented functionalities, controlling the properties of oxide electronics, understanding defects in the vicinity of interfaces, controlling ceramics far from equilibrium, accelerating the development of new ceramic materials, and harnessing order within disorder in glasses. This paper reports the outcomes of the workshop and provides descriptions of these challenges.

    Modularization of biochemical networks based on classification of Petri net t-invariants

    Background: Structural analysis of biochemical networks is a growing field in bioinformatics and systems biology. The availability of an increasing amount of biological data from molecular biological networks promises a deeper understanding but confronts researchers with the problem of combinatorial explosion. The amount of qualitative network data is growing much faster than the amount of quantitative data, such as enzyme kinetics. In many cases it is even impossible to measure quantitative data because of limitations of experimental methods, or for ethical reasons. Thus, a huge amount of qualitative data, such as interaction data, is available, but it was not sufficiently used for modeling purposes, until now. New approaches have been developed, but the complexity of data often limits the application of many of the methods. Biochemical Petri nets make it possible to explore static and dynamic qualitative system properties. One Petri net approach is model validation based on the computation of the system's invariant properties, focusing on t-invariants. T-invariants correspond to subnetworks, which describe the basic system behavior.

    With increasing system complexity, the basic behavior can only be expressed by a huge number of t-invariants. According to our validation criteria for biochemical Petri nets, the necessary verification of the biological meaning, by interpreting each subnetwork (t-invariant) manually, is not possible anymore. Thus, an automated, biologically meaningful classification would be helpful in analyzing t-invariants, and supporting the understanding of the basic behavior of the considered biological system.

    Methods: Here, we introduce a new approach to automatically classify t-invariants to cope with network complexity. We apply clustering techniques such as UPGMA, Complete Linkage, Single Linkage, and Neighbor Joining in combination with different distance measures to get biologically meaningful clusters (t-clusters), which can be interpreted as modules. To find the optimal number of t-clusters to consider for interpretation, the cluster validity measure, Silhouette Width, is applied.

    Results: We considered two different case studies as examples: a small signal transduction pathway (pheromone response pathway in Saccharomyces cerevisiae) and a medium-sized gene regulatory network (gene regulation of Duchenne muscular dystrophy). We automatically classified the t-invariants into functionally distinct t-clusters, which could be interpreted biologically as functional modules in the network. We found differences in the suitability of the various distance measures as well as the clustering methods. In terms of a biologically meaningful classification of t-invariants, the best results are obtained using the Tanimoto distance measure. Considering clustering methods, the obtained results suggest that UPGMA and Complete Linkage are suitable for clustering t-invariants with respect to the biological interpretability.

    Conclusion: We propose a new approach for the biological classification of Petri net t-invariants based on cluster analysis. Due to the biologically meaningful data reduction and structuring of network processes, large sets of t-invariants can be evaluated, allowing for model validation of qualitative biochemical Petri nets. This approach can also be applied to elementary mode analysis.
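    The pipeline in the Methods section maps onto standard tooling: represent each t-invariant by its binary transition support, compare supports with the Tanimoto (Jaccard) distance, cluster hierarchically (UPGMA corresponds to average linkage), and pick the number of t-clusters by maximising the silhouette width. A minimal sketch of that pipeline using SciPy and scikit-learn; the toy invariant matrix is invented, not from either case study:

        import numpy as np
        from scipy.spatial.distance import pdist, squareform
        from scipy.cluster.hierarchy import linkage, fcluster
        from sklearn.metrics import silhouette_score

        # Toy t-invariant supports (rows: invariants, cols: transitions);
        # made-up data, not from either case study in the paper.
        T = np.array([
            [1, 1, 0, 0, 0, 0],
            [1, 1, 1, 0, 0, 0],
            [0, 0, 0, 1, 1, 0],
            [0, 0, 0, 1, 1, 1],
            [0, 0, 1, 0, 1, 1],
        ], dtype=bool)

        # Tanimoto distance on binary vectors is SciPy's Jaccard distance.
        dist = pdist(T, metric="jaccard")
        Z = linkage(dist, method="average")  # "average" == UPGMA; "complete" also used in the paper

        # Choose the number of t-clusters by maximising the silhouette width.
        best_k, best_s = 2, -1.0
        for k in range(2, len(T)):
            labels = fcluster(Z, t=k, criterion="maxclust")
            s = silhouette_score(squareform(dist), labels, metric="precomputed")
            if s > best_s:
                best_k, best_s = k, s

        print(best_k, fcluster(Z, t=best_k, criterion="maxclust"))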

    An alternative approach to risk rank chemicals on the threat they pose to the aquatic environment

    This work presents a new and unbiased method of risk ranking chemicals based on the threat they pose to the aquatic environment. The study ranked 12 metals, 23 pesticides, 11 other persistent organic pollutants (POPs), 13 pharmaceuticals, 10 surfactants and similar compounds, and 2 nanoparticles (71 in total) of concern against one another by comparing their median UK river water concentrations with their median ecotoxicity effect concentrations. To complement this, and to give an assessment of potential wildlife impacts, risk ranking was also carried out by comparing the lowest 10th percentile of the effects data with the highest 90th percentile of the exposure data. In other words, risk was pared down to just toxicity versus exposure. Further modifications included incorporating bioconcentration factors, using only recent water measurements, and excluding either lethal or sub-lethal effects. The top ten chemicals based on the medians, which emerged as having the highest risk to organisms in UK surface waters using all the ecotoxicity data, were copper, aluminium, zinc, ethinylestradiol (EE2), linear alkylbenzene sulfonate (LAS), triclosan, manganese, iron, methomyl and chlorpyrifos. By way of contrast, using current UK environmental quality standards as the comparator to median UK river water concentrations would have placed 6 different chemicals in the top ten. This approach revealed big differences in relative risk; for example, zinc presented a million times greater risk than metoprolol, and LAS a 550 times greater risk than nanosilver. With the exception of EE2, most pharmaceuticals were ranked as having a relatively low risk. Open Access funded by the Natural Environment Research Council. We would like to thank the UK's Department for Environment, Food and Rural Affairs for funding this project (CB0462). The views expressed here are those of the authors alone. We would also like to thank colleagues at Brunel University and CEH for their advice on the project.
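    The ranking reduces each chemical to a risk quotient, exposure concentration over effect concentration: medians of both for the headline ranking, and 90th-percentile exposure against 10th-percentile effects for the precautionary variant. A minimal sketch under that reading of the method; the function names and concentrations below are invented, not the study's data:

        import numpy as np

        def risk_quotients(exposure_ug_l, effect_ug_l):
            """Return (median-based, precautionary) risk quotients.

            exposure_ug_l: measured river-water concentrations for one chemical.
            effect_ug_l: ecotoxicity effect concentrations for the same chemical.
            A larger quotient means exposure sits closer to (or above) the
            concentrations at which effects are observed.
            """
            exposure = np.asarray(exposure_ug_l, dtype=float)
            effect = np.asarray(effect_ug_l, dtype=float)
            median_rq = np.median(exposure) / np.median(effect)
            # Precautionary variant: high-end exposure vs low-end effects.
            precautionary_rq = np.percentile(exposure, 90) / np.percentile(effect, 10)
            return median_rq, precautionary_rq

        # Made-up concentrations (ug/L), not the study's data:
        chemicals = {
            "chemical_A": ([0.2, 0.5, 1.1, 0.8], [10.0, 25.0, 40.0]),
            "chemical_B": ([0.01, 0.02, 0.05], [0.5, 1.2, 3.0]),
        }
        ranked = sorted(chemicals, key=lambda c: risk_quotients(*chemicals[c])[0], reverse=True)
        print(ranked)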

    Measurement of tt̄ normalised multi-differential cross sections in pp collisions at √s = 13 TeV, and simultaneous determination of the strong coupling strength, top quark pole mass, and parton distribution functions


    An embedding technique to determine ττ backgrounds in proton-proton collision data

    An embedding technique is presented to estimate standard model ττ backgrounds from data with minimal simulation input. In the data, the muons are removed from reconstructed μμ events and replaced with simulated τ leptons with the same kinematic properties. In this way, a set of hybrid events is obtained that does not rely on simulation except for the decay of the τ leptons. The challenges in describing the underlying event or the production of associated jets in the simulation are avoided. The technique described in this paper was developed for CMS. Its validation and the inherent uncertainties are also discussed. The demonstration of the performance of the technique is based on a sample of proton-proton collisions collected by CMS in 2017 at √s = 13 TeV corresponding to an integrated luminosity of 41.5 fb⁻¹.
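    The kinematic core of the embedding idea can be illustrated in a few lines: keep each muon's three-momentum but recompute the energy for the τ mass before the τ leptons are handed to the decay simulation. The toy sketch below covers only this four-vector replacement step and is an illustration, not CMS code; the function name and input format are invented:

        import math

        M_TAU = 1.77686  # tau mass in GeV

        def embed_taus(mu_momenta):
            """Replace each muon three-momentum (px, py, pz) with a tau of
            identical three-momentum, recomputing the energy for the tau mass.
            Returns four-vectors (E, px, py, pz); the tau decays would then
            be taken from simulation, as described in the abstract."""
            hybrids = []
            for (px, py, pz) in mu_momenta:
                e = math.sqrt(px * px + py * py + pz * pz + M_TAU * M_TAU)
                hybrids.append((e, px, py, pz))
            return hybrids

        # Made-up muon momenta (GeV) for one mu-mu event:
        mu_pair = [(30.0, 10.0, 5.0), (-25.0, -8.0, 2.0)]
        print(embed_taus(mu_pair))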

    Search for new particles in events with energetic jets and large missing transverse momentum in proton-proton collisions at √s = 13 TeV

    A search is presented for new particles produced at the LHC in proton-proton collisions at √s = 13 TeV, using events with energetic jets and large missing transverse momentum. The analysis is based on a data sample corresponding to an integrated luminosity of 101 fb⁻¹, collected in 2017-2018 with the CMS detector. Machine learning techniques are used to define separate categories for events with narrow jets from initial-state radiation and events with large-radius jets consistent with a hadronic decay of a W or Z boson. A statistical combination is made with an earlier search based on a data sample of 36 fb⁻¹, collected in 2016. No significant excess of events is observed with respect to the standard model background expectation determined from control samples in data. The results are interpreted in terms of limits on the branching fraction of an invisible decay of the Higgs boson, as well as constraints on simplified models of dark matter, on first-generation scalar leptoquarks decaying to quarks and neutrinos, and on models with large extra dimensions. Several of the new limits, specifically for spin-1 dark matter mediators, pseudoscalar mediators, colored mediators, and leptoquarks, are the most restrictive to date.

    Combined searches for the production of supersymmetric top quark partners in proton-proton collisions at √s = 13 TeV

    A combination of searches for top squark pair production using proton-proton collision data at a center-of-mass energy of 13 TeV at the CERN LHC, corresponding to an integrated luminosity of 137 fb⁻¹ collected by the CMS experiment, is presented. Signatures with at least 2 jets and large missing transverse momentum are categorized into events with 0, 1, or 2 leptons. New results for regions of parameter space where the kinematical properties of top squark pair production and top quark pair production are very similar are presented. Depending on the model, the combined result excludes a top squark mass up to 1325 GeV for a massless neutralino, and a neutralino mass up to 700 GeV for a top squark mass of 1150 GeV. Top squarks with masses from 145 to 295 GeV, for neutralino masses from 0 to 100 GeV, with a mass difference between the top squark and the neutralino in a window of 30 GeV around the mass of the top quark, are excluded for the first time with CMS data. The results of these searches are also interpreted in an alternative signal model of dark matter production via a spin-0 mediator in association with a top quark pair. Upper limits are set on the cross section for mediator particle masses of up to 420 GeV.

    Bose-Einstein correlations of charged hadrons in proton-proton collisions at √s = 13 TeV

    Bose-Einstein correlations of charged hadrons are measured over a broad multiplicity range, from a few particles up to about 250 reconstructed charged hadrons, in proton-proton collisions at √s = 13 TeV. The results are based on data collected using the CMS detector at the LHC during runs with a special low-pileup configuration. Three analysis techniques with different degrees of dependence on simulations are used to remove the non-Bose-Einstein background from the correlation functions. All three methods give consistent results. The measured lengths of homogeneity are studied as functions of particle multiplicity as well as average pair transverse momentum and mass. The results are compared with data from both CMS and ATLAS at √s = 7 TeV, as well as with theoretical predictions.
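    Measurements of this kind are commonly expressed as a correlation function C(q) of the pair momentum difference, built as a ratio of signal pairs to a Bose-Einstein-free reference sample and then fitted with, for example, an exponential parameterization C(q) = C[1 + λ exp(−qR)](1 + δq), where R gives the length of homogeneity. A minimal fit sketch under that common parameterization; the synthetic data and the specific functional form are illustrative assumptions, not the paper's exact procedure:

        import numpy as np
        from scipy.optimize import curve_fit

        def c2(q, C, lam, R, delta):
            """Exponential BEC parameterization; R plays the role of the
            length of homogeneity (q in GeV, R in 1/GeV here)."""
            return C * (1.0 + lam * np.exp(-q * R)) * (1.0 + delta * q)

        # Synthetic correlation data (illustrative, not CMS measurements):
        q = np.linspace(0.02, 2.0, 100)
        truth = c2(q, 1.0, 0.6, 5.0, 0.01)
        rng = np.random.default_rng(7)
        measured = truth + rng.normal(0.0, 0.01, q.size)

        popt, pcov = curve_fit(c2, q, measured, p0=[1.0, 0.5, 4.0, 0.0])
        C_fit, lam_fit, R_fit, delta_fit = popt
        # Convert R from 1/GeV to femtometres: hbar*c ~ 0.1973 GeV*fm.
        print(f"lambda = {lam_fit:.2f}, R = {R_fit * 0.1973:.2f} fm")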